
Configure access to LLMs

Applies to: Elastic Stack and Serverless

Elastic's AI features work with the out-of-the-box Elastic Managed LLMs or with third-party LLMs configured using one of the available connectors.

Elastic Managed LLMs are available out of the box in Kibana deployments and serverless projects with an appropriate subscription or feature tier. They provide immediate access to generative AI features without any setup, manual configuration, or API key management.

Alternatively, you can configure and use third-party LLM connectors, such as OpenAI, Azure, or Amazon Bedrock.

To learn more about security and data privacy, refer to Elastic Managed LLMs, which provides details on the models the connector uses.

Important

Using Elastic Managed LLMs incurs additional costs. Refer to Elastic Cloud pricing for more information.

Follow the connector-specific guides to connect to one or more third-party LLM providers.

Applies to: Elastic Stack (unavailable in Serverless)

You can also set up third-party LLM connectors as preconfigured connectors by editing the kibana.yml file. This allows you to enable a connector for multiple spaces at once, without performing setup in the Kibana UI for each space.

If you use a preconfigured connector for your LLM, we recommend adding the exposeConfig: true parameter to the connector's entry in the xpack.actions.preconfigured section of the kibana.yml config file. This parameter makes debugging easier by adding configuration information to the debug logs, including which LLM the connector uses.
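As a concrete illustration, the following is a minimal sketch of a preconfigured OpenAI connector in kibana.yml. The connector ID, display name, model, and API key are illustrative placeholders, not required values; adjust them to match your provider and deployment.

xpack.actions.preconfigured:
  my-openai-connector:                  # illustrative connector ID
    name: OpenAI (preconfigured)        # display name shown in Kibana
    actionTypeId: .gen-ai               # connector type for OpenAI-compatible LLMs
    exposeConfig: true                  # include connector config in debug logs
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
      defaultModel: gpt-4o              # assumed model; use one your account can access
    secrets:
      apiKey: <your-api-key>            # replace with a real key, stored securely

Because the connector is defined in kibana.yml, it is available in every space without per-space setup in the Kibana UI.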