Integrations
Using Elasticsearch as a vector database for Azure OpenAI On Your Data
Explore how to quickly set up and ingest data into Elasticsearch for use as a vector database with Azure OpenAI On Your Data, enabling you to chat with your private data.
Elasticsearch open inference API adds Azure AI Studio support
Elasticsearch open inference API adds support for embeddings generated from models hosted on Azure AI Studio, and for completion tasks from large language models such as Meta-Llama-3-8B-Instruct.
Elasticsearch open inference API adds support for Azure OpenAI chat completions
Elasticsearch open inference API adds support for Azure OpenAI chat completions, providing full developer access to the Azure AI ecosystem.
Elasticsearch open inference API adds support for Azure OpenAI embeddings
Elasticsearch open inference API adds support for Azure OpenAI embeddings to be stored in the world's most downloaded vector database.
Elasticsearch delivers performance increase for users running the Elastic Search AI Platform on Arm-based architectures
Preview benchmarking shows up to 37% better performance on Microsoft Cobalt 100 Arm-based VMs.
ES|QL queries to Java objects
How to perform ES|QL queries with the Java client.
Evolution of the Elasticsearch .NET Client
From NEST to Elastic.Clients.Elasticsearch
Elasticsearch open inference API adds support for OpenAI chat completions
Learn how OpenAI chat completions and Elasticsearch can be used to summarize, translate, or answer questions about any text.
RAG (Retrieval Augmented Generation) with LlamaIndex, Elasticsearch and Mistral
Learn to implement a RAG system using LlamaIndex, Elasticsearch, and a locally running Mistral model.