OpenAI connector and action

The OpenAI connector uses axios to send a POST request to an OpenAI provider, either OpenAI or Azure OpenAI. The request is sent when the connector runs, for example when you invoke it with the run connector API.
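
For orientation, the call the connector makes is roughly equivalent to the following axios sketch. The endpoint URL, model, and API key here are placeholders, and Azure OpenAI uses a deployment-specific URL and an api-key header instead of a bearer token, so treat this as an illustration rather than the connector's exact implementation.

import axios from "axios";

// Illustration only: a POST to the OpenAI chat completions endpoint, similar in
// shape to the request the connector sends. URL, model, and key are placeholders.
async function callOpenAi(apiKey: string): Promise<string> {
  const response = await axios.post(
    "https://api.openai.com/v1/chat/completions",
    {
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: "Hello world" }],
    },
    {
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
    }
  );
  return response.data.choices[0].message.content;
}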

Create connectors in Kibana

You can create connectors in Stack Management > Connectors.

Connector configuration

OpenAI connectors have the following configuration properties:

Name
The name of the connector.
OpenAI provider
The OpenAI API provider, either OpenAI or Azure OpenAI.
URL
The OpenAI request URL.
Default model
(optional) The default model to use for requests. This option is available only when the provider is OpenAI.
API key
The OpenAI or Azure OpenAI API key for authentication.
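
If you prefer to script connector creation instead of using the UI, the properties above map onto the Kibana create connector API. The sketch below assumes the .gen-ai connector type ID and the apiProvider, apiUrl, defaultModel, and apiKey field names; verify them against your Kibana version before relying on it.

// Sketch: create an OpenAI connector with the Kibana create connector API.
// The ".gen-ai" connector type ID and the config/secrets field names are
// assumptions; confirm them for your Kibana version.
async function createOpenAiConnector(kibanaUrl: string, auth: string, apiKey: string) {
  const response = await fetch(`${kibanaUrl}/api/actions/connector`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "kbn-xsrf": "true",              // required by Kibana APIs
      Authorization: auth,             // e.g. "ApiKey <base64>" or "Basic <base64>"
    },
    body: JSON.stringify({
      name: "OpenAI connector",
      connector_type_id: ".gen-ai",
      config: {
        apiProvider: "OpenAI",         // or "Azure OpenAI"
        apiUrl: "https://api.openai.com/v1/chat/completions",
        defaultModel: "gpt-3.5-turbo", // OpenAI provider only
      },
      secrets: {
        apiKey,                        // the OpenAI or Azure OpenAI API key
      },
    }),
  });
  return response.json();
}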

Test connectors

You can test connectors with the run connector API or as you’re creating or editing the connector in Kibana.

The OpenAI actions have the following configuration properties.

Body

A JSON payload sent to the OpenAI API URL. For example:

{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "Hello world"
    }
  ]
}
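
You can also send this payload through the run connector API. The sketch below assumes a run sub-action whose subActionParams.body parameter carries the JSON payload as a string; confirm the parameter names against your Kibana version.

// Sketch: run the OpenAI connector via the Kibana run connector API.
// The "run" sub-action and its "body" parameter are assumptions here; confirm
// them against the connector's documented action parameters.
async function runOpenAiConnector(kibanaUrl: string, auth: string, connectorId: string) {
  const body = JSON.stringify({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello world" }],
  });
  const response = await fetch(
    `${kibanaUrl}/api/actions/connector/${connectorId}/_execute`,
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "kbn-xsrf": "true",
        Authorization: auth,
      },
      body: JSON.stringify({
        params: {
          subAction: "run",
          subActionParams: { body }, // the JSON payload, serialized as a string
        },
      }),
    }
  );
  return response.json();
}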

Connector networking configuration

Use the Action configuration settings to customize connector networking configurations, such as proxies, certificates, or TLS settings. You can set configurations that apply to all your connectors or use xpack.actions.customHostSettings to set per-host configurations.

Token usage dashboard

Once you’ve created an OpenAI connector, you can monitor its token usage using the OpenAI Token Usage dashboard. Select the connector in Stack Management > Connectors to view its details, then click the View OpenAI Usage Dashboard for "<Name>" Connector link to open the dashboard.

To view the dashboard, you need at least read and view_index_metadata privileges for the .kibana-event-log-* index and the Read feature privilege for Kibana. You can set up a role with these minimum privileges and assign it to non-admin users who need to view this dashboard.
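
A minimal sketch of creating such a role with the Kibana create or update role API follows; the role name, the use of the base read Kibana privilege, and the all-spaces assignment are illustrative assumptions, so adapt them to your own space and feature-privilege setup.

// Sketch: create a role with the minimum privileges described above, using the
// Kibana create or update role API. Role name, base "read" Kibana privilege, and
// the "*" spaces assignment are assumptions.
async function createDashboardViewerRole(kibanaUrl: string, auth: string) {
  const response = await fetch(`${kibanaUrl}/api/security/role/openai-usage-viewer`, {
    method: "PUT",
    headers: {
      "Content-Type": "application/json",
      "kbn-xsrf": "true",
      Authorization: auth,
    },
    body: JSON.stringify({
      elasticsearch: {
        indices: [
          {
            names: [".kibana-event-log-*"],
            privileges: ["read", "view_index_metadata"],
          },
        ],
      },
      kibana: [
        {
          base: ["read"], // read access to Kibana features, including dashboards
          spaces: ["*"],
        },
      ],
    }),
  });
  return response.status; // 204 on success
}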