Product release

Announcing the new Elasticsearch JavaScript client (RC1)

This is not an April Fools' Day joke! In the past months, we have worked hard to refactor the current JavaScript client, and we couldn't be happier to announce that we now have the first release candidate of a new JavaScript client! In this blog post, we'll explain why we refactored the client and the technical decisions that have been made. We plan to release the final version of the client a few days after the final release of Elasticsearch 7.0.

The new client will be released under the elastic npm organization, so you can install it with:

npm install @elastic/elasticsearch

Why a new Elasticsearch npm module?

Since as far back as version 5.0, Elastic has released its software with the same version number across the whole stack, so if you install Elasticsearch v6.4.0, you will use Kibana v6.4.0, and so on. All of the official Elasticsearch clients follow the same release train, except for elasticsearch-js, which is currently at version v15.4.1.

The versioning problem

One of the main issues with the old JavaScript client was the versioning strategy and the fact that we were shipping the entire API in a single package. Many solutions were considered; here are a few, and why they weren’t a good fit:

  • Restart publishing old versions of the module and force users to use them. Needless to say, it wasn’t a good idea: there would have been a lot of cognitive overhead and dozens of issues related to version handling.
  • Start publishing versions 50.x and 60.x of the module, where the major version roughly maps to the major version of Elasticsearch (50.x maps to 5.x, 60.x maps to 6.x, and so on). It might have fixed the issue, but it does not feel right: the versioning scheme would have been strange to newcomers, and it feels weird to use such large version numbers.
  • Publish many modules with the major number in the module name, elasticsearch-js-6 for example. This would have worked, but it would have created a dispersion of packages on npm, and the overall user experience wouldn’t have been very good.
  • Publish a new scoped package inside the npm organization, @elastic/elasticsearch, resulting in brand new version handling. This fixes the versioning problem, but scoped packages are harder to find, and the current elasticsearch module has many downloads every week, so people would continue to use it.

None of those solutions would have worked well enough.

We had an Engineering All-Hands meeting in Dublin in October 2018, and we discussed at length how to fix this. You can ask the rest of the clients team about how much I personally mulled over this!

Then, one night, after the right amount of Guinness, I woke up with an idea. I wrote it down and went back to sleep. The following day, I presented the solution to the rest of the team, and everyone agreed that it was the winner!

The package reorganization

Given that we couldn’t get rid of the elasticsearch module on npm because of how npm works and because it would have broken too many users, the solution we agreed to adopt is the following:

Publish a new scoped package under the @elastic organization on npm, @elastic/elasticsearch, with the new codebase and the correct version numbers. Then, the current unscoped package will directly depend on the scoped module, transforming it into a “pointer” to the latest and greatest version.

Note, this last step will not happen immediately to avoid confusion and help users transition from the old client to the new one.
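
For example, once that switch happens, the unscoped package’s manifest would essentially reduce to a dependency on the scoped one (an illustrative sketch, not the actual file):

```json
{
  "name": "elasticsearch",
  "dependencies": {
    "@elastic/elasticsearch": "^7.0.0"
  }
}
```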

Why should I use the new JS client?

Great question! There are many reasons why you should use the new client. Here are seven:

  1. The client has been rewritten from scratch using a new, modern syntax. It’s thus more intuitive and easier to use.
  2. The API has been redesigned to offer a consistent developer experience between callbacks and promises. It offers several utilities for handling complex situations, such as an extend method to build tailor-made APIs, and a child method that lets you create multiple client instances with different options that share the same connection pool.
  3. The overall performance has been improved, including both startup time and execution time. Since more and more services are moving to serverless, we think it is important to offer the best startup performance and help our users save time and money.
  4. All of the public APIs of Elasticsearch are supported out of the box; no matter which API you are using, you will not be required to use external plugins.
  5. You can handle any Elasticsearch response as a Node.js stream, in case you need to pipe it directly into an HTTP response.
  6. Automatic node resurrection by default. If a node is marked as dead, the client will automatically try to resurrect it.
  7. TypeScript is supported out of the box. The client offers type definitions for all its internals and request parameters. Request bodies and responses are shipped as any because there is not an official spec yet (but we are working on it). Furthermore, because they are shipped as any, you can easily override them with your own typing definitions.
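
The child method mentioned in point 2 can be sketched in plain JavaScript. The classes below are illustrative stand-ins, not the client’s actual internals: each child gets its own options, but every instance shares the parent’s connection pool.

```javascript
// Minimal sketch of the "child client" idea.
class ConnectionPool {
  constructor () {
    this.connections = []
  }
}

class Client {
  constructor (options = {}, pool = null) {
    this.options = options
    // Reuse the given pool if there is one, otherwise create a new one.
    this.pool = pool || new ConnectionPool()
  }

  // Returns a new client with merged options and the *same* pool.
  child (overrides = {}) {
    return new Client({ ...this.options, ...overrides }, this.pool)
  }
}

const parent = new Client({ requestTimeout: 30000 })
const child = parent.child({ requestTimeout: 1000 })

console.log(child.pool === parent.pool) // true: shared connection pool
console.log(child.options.requestTimeout) // 1000
```

Because the pool is shared, the child reuses the parent’s open connections instead of opening its own, which is what makes this pattern cheap.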

Breaking changes are coming

If you were already using the previous version of this client — i.e. the one you used to install with npm install elasticsearch — you will encounter some breaking changes.

Don’t panic!

We understand that breaking changes are not ideal, but every breaking change has been carefully considered, and we believe each was the right thing to do to advance the client. Furthermore, the new codebase has been rewritten with modern JavaScript in mind and has been carefully designed to be easy to maintain and to offer a more intuitive developer experience.

Breaking changes

  • The minimum supported version of Node.js is v6.
  • Everything has been rewritten using ES6 classes to help users extend the defaults more easily.
  • There is no longer an integrated logger. The client is now an event emitter that emits the following events: request, response, sniff, and resurrect.
  • The code is no longer shipped with all the versions of the API, but only with that of the package’s major version. This means that if you are using Elasticsearch v6, you will be required to install @elastic/elasticsearch@6, and so on.
  • The internals are completely different, so if you used to tweak them a lot, you will need to refactor your code. The public API should be almost the same.
  • No more browser support. This module is intended for Node.js only.
  • An API call no longer returns just the body for promises, or the body, statusCode, and headers as separate arguments for callbacks. The new result object contains the body, statusCode, headers, warnings, and meta, for both callbacks and promises.

    // before
    const body = await client.search({
      index: 'my-index',
      body: { foo: 'bar' }
    })
    client.search({
      index: 'my-index',
      body: { foo: 'bar' }
    }, (err, body, statusCode, headers) => {
      if (err) console.log(err)
    })
    
    // after
    const { body, statusCode, headers, warnings } = await client.search({
      index: 'my-index',
      body: { foo: 'bar' }
    })
    client.search({
      index: 'my-index',
      body: { foo: 'bar' }
    }, (err, { body, statusCode, headers, warnings }) => {
      if (err) console.log(err)
    })
        
  • Errors: there is no longer a custom error class for every HTTP status code (such as BadRequest or NotFound); there is a single ResponseError instead. Every error class has been renamed and is now suffixed with Error.
  • Errors that have been removed: RequestTypeError, Generic, and all the status code specific errors (such as BadRequest or NotFound).
  • Errors that have been added: ConfigurationError (in case of bad configurations) and ResponseError, which contains all the data you may need to handle the specific error, such as statusCode, headers, body, and message.
  • Errors that have been renamed:
    • RequestTimeout (408 statusCode) ⇒ TimeoutError
    • ConnectionFault ⇒ ConnectionError
    • NoConnections ⇒ NoLivingConnectionsError
    • Serialization ⇒ SerializationError
    • Deserialization ⇒ DeserializationError
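
Based on the description above, handling the new ResponseError might look like the following. This is a sketch using a stand-in error object, since the real class lives inside the client package; the fakeSearch function is hypothetical.

```javascript
// Stand-in for a rejected client call, carrying the fields a
// ResponseError exposes: statusCode, headers, body, and message.
function fakeSearch () {
  const err = new Error('Response Error')
  err.name = 'ResponseError'
  err.statusCode = 404
  err.headers = { 'content-type': 'application/json' }
  err.body = { error: 'index_not_found_exception' }
  return Promise.reject(err)
}

async function run () {
  try {
    await fakeSearch()
  } catch (err) {
    // One class covers every HTTP error status,
    // so branch on statusCode instead of the class name.
    if (err.name === 'ResponseError' && err.statusCode === 404) {
      return 'not found'
    }
    throw err
  }
}
```

The key difference from the old client is the branching style: instead of catching NotFound or BadRequest classes, you inspect the statusCode on the single ResponseError.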
  • You must specify the port number in the configuration. In the previous version you could specify the host and port in a variety of ways; with the new client there is only one way, via the node parameter.
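
Based on the description above, the difference looks roughly like this (a configuration sketch; the old-client forms shown are just two of the several it accepted):

```javascript
// before: the old `elasticsearch` module accepted the host and port
// in several forms, for example:
//   new Client({ host: 'localhost:9200' })
//   new Client({ host: { host: 'localhost', port: 9200 } })

// after: the new client takes a single `node` URL, port included
const { Client } = require('@elastic/elasticsearch')
const client = new Client({ node: 'http://localhost:9200' })
```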
  • The plugins option has been removed; if you want to extend the client, you should now use the client.extend API.

    // before
    const { Client } = require('elasticsearch')
    const client = new Client({ plugins: [...] })
    
    // after
    const { Client } = require('@elastic/elasticsearch')
    const client = new Client({ ... })
    client.extend(...)
        
  • There is now a clear distinction between API-related parameters and client-related configurations: the ignore, headers, requestTimeout, and maxRetries parameters are no longer part of the API object, and you should specify them in a second options object.

    // before
    const body = await client.search({
      index: 'my-index',
      body: { foo: 'bar' },
      ignore: [404]
    })
    client.search({
      index: 'my-index',
      body: { foo: 'bar' },
      ignore: [404]
    }, (err, body, statusCode, headers) => {
      if (err) console.log(err)
    })
    
    // after
    const { body, statusCode, headers, warnings } = await client.search({
      index: 'my-index',
      body: { foo: 'bar' }
    }, {
      ignore: [404]
    })
    client.search({
      index: 'my-index',
      body: { foo: 'bar' }
    }, {
      ignore: [404]
    }, (err, { body, statusCode, headers, warnings }) => {
      if (err) console.log(err)
    })
        
  • The transport.request method no longer accepts the query key; use the querystring key instead (which can be a string or an object). Furthermore, if you need to send a bulk-like request, use the bulkBody key instead of the body key. In this method as well, the client-specific parameters should be passed as a second object.

    // before
    const body = await client.transport.request({
      method: 'GET',
      path: '/my-index/_search',
      body: { foo: 'bar' },
      query: { bar: 'baz' },
      ignore: [404]
    })
    client.transport.request({
      method: 'GET',
      path: '/my-index/_search',
      body: { foo: 'bar' },
      query: { bar: 'baz' },
      ignore: [404]
    }, (err, body, statusCode, headers) => {
      if (err) console.log(err)
    })
    
    // after
    const { body, statusCode, headers, warnings } = await client.transport.request({
      method: 'GET',
      path: '/my-index/_search',
      body: { foo: 'bar' },
      querystring: { bar: 'baz' }
    }, {
      ignore: [404]
    })
    client.transport.request({
      method: 'GET',
      path: '/my-index/_search',
      body: { foo: 'bar' },
      querystring: { bar: 'baz' }
    }, {
      ignore: [404]
    }, (err, { body, statusCode, headers, warnings }) => {
      if (err) console.log(err)
    })
        

Release Candidate

As mentioned at the beginning of this post, the currently available client is a release candidate. This means that if you find a bug or want to propose a new feature, please don’t hesitate to reach out!

If you want to know more, you can open an issue in the client repository, post a question in Discuss, or ping @delvedor on Twitter. Happy coding!