The Elastic Microsoft SQL connector is a connector for Microsoft SQL databases.
Availability and prerequisites
This connector is available as a native connector in Elastic versions 8.8.0 and later. To use this connector as a native connector, satisfy all native connector requirements.
This connector is also available as a connector client from the Python connectors framework. To use this connector, satisfy all connector client requirements.
This connector is in beta and is subject to change. The design and code are less mature than official GA features and are being provided as-is with no warranties. Beta features are not subject to the support SLA of official GA features.
To use this connector as a native connector, use the Connector workflow. See Native connectors.
To use this connector as a connector client, see Connector clients and frameworks.
Users require the sysadmin server role.
For additional operations, see Usage.
The following are compatible with Elastic connector frameworks:
- Microsoft SQL Server versions 2017, 2019
- Azure SQL
- Amazon RDS for SQL Server
When using the connector client workflow, these fields initially use the default configuration set in the connector source code. Note that this data source uses the generic_database.py connector source code, with additional code specific to this data source in mssql.py. These configurable fields are rendered with their respective labels in the Kibana UI. Once connected, users can update these values in Kibana.
The following configuration fields are required to set up the connector:
- The server host address where the Microsoft SQL Server instance is hosted.
- The port where the Microsoft SQL Server instance is hosted.
- The username of the account for Microsoft SQL Server.
- The password of the account to be used for Microsoft SQL Server.
- Name of the Microsoft SQL Server database.
- Comma-separated list of tables. The Microsoft SQL connector will fetch data from all tables present in the configured database if the value is *. Default value is *.
- Name of the Microsoft SQL Server schema.
- Whether SSL verification is enabled.
- Content of the SSL certificate. If SSL is disabled, the ssl_ca value will be ignored. Example certificate:
  -----BEGIN CERTIFICATE----- MIID+jCCAuKgAwIBAgIGAJJMzlxLMA0GCSqGSIb3DQEBCwUAMHoxCzAJBgNVBAYT ... 7RhLQyWn2u00L7/9Omw= -----END CERTIFICATE-----
- Whether host validation is enabled.
- The number of rows to fetch on each request to Microsoft SQL Server.
- The number of retry attempts after a failed request to Microsoft SQL Server.
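As a sketch of how these fields fit together, the snippet below assembles the host, port, username, password, and database values into a SQLAlchemy-style connection URL. The build_mssql_url helper, the mssql+pytds scheme, and the example values are illustrative assumptions, not the framework's actual internals:

```python
from urllib.parse import quote_plus

def build_mssql_url(host, port, username, password, database):
    """Assemble the configuration fields into a single connection URL,
    escaping characters that are not URL-safe."""
    return (
        f"mssql+pytds://{quote_plus(username)}:{quote_plus(password)}"
        f"@{host}:{port}/{quote_plus(database)}"
    )

# Illustrative values only.
print(build_mssql_url("127.0.0.1", 1433, "admin", "Str0ng!Pass", "demo_db"))
# mssql+pytds://admin:Str0ng%21Pass@127.0.0.1:1433/demo_db
```

Escaping the username and password matters because characters such as ! or @ would otherwise break the URL.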
Deployment using Docker
Follow these instructions to deploy the Microsoft SQL connector using Docker.
Step 1: Download sample configuration file
Download the sample configuration file. You can either download it manually or run the following command:
curl https://raw.githubusercontent.com/elastic/connectors-python/main/config.yml --output ~/connectors-python-config/config.yml
Remember to update the --output argument value if your directory name is different, or you want to use a different config file name.
Step 2: Update the configuration file for your self-managed connector
Update the configuration file with the following settings to match your environment:
Use mssql as the service_type value. Don’t forget to uncomment "mssql" in the sources section of the configuration file.
If you’re running the connector service against a Dockerized version of Elasticsearch and Kibana, your config file will look like this:
elasticsearch:
  host: http://host.docker.internal:9200
  username: elastic
  password: <YOUR_PASSWORD>
connector_id: <CONNECTOR_ID_FROM_KIBANA>
service_type: mssql
sources:
  # UNCOMMENT "mssql" below to enable the Microsoft SQL connector
  #mongodb: connectors.sources.mongo:MongoDataSource
  #s3: connectors.sources.s3:S3DataSource
  #dir: connectors.sources.directory:DirectoryDataSource
  #mysql: connectors.sources.mysql:MySqlDataSource
  #network_drive: connectors.sources.network_drive:NASDataSource
  #google_cloud_storage: connectors.sources.google_cloud_storage:GoogleCloudStorageDataSource
  #azure_blob_storage: connectors.sources.azure_blob_storage:AzureBlobStorageDataSource
  #postgresql: connectors.sources.postgresql:PostgreSQLDataSource
  #oracle: connectors.sources.oracle:OracleDataSource
  #mssql: connectors.sources.mssql:MSSQLDataSource
Note that the config file you downloaded might contain more entries, so you will need to manually copy/change the settings that apply to you.
Normally you’ll only need to update service_type to run the connector service.
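Before starting the service, it can help to sanity-check the edited file. The check_config helper below is a hypothetical, line-based illustration (a real tool would parse the YAML properly):

```python
def check_config(text):
    """Return a list of problems found in the config text; an empty list
    means the two mssql-related edits above were made. Line-based
    illustration only, not part of the connectors framework."""
    problems = []
    if "service_type: mssql" not in text:
        problems.append("service_type is not set to mssql")
    if not any(line.strip().startswith("mssql:") for line in text.splitlines()):
        problems.append('the "mssql" entry in sources is still commented out')
    return problems

good = "service_type: mssql\nsources:\n  mssql: connectors.sources.mssql:MSSQLDataSource\n"
print(check_config(good))  # []
```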
Step 3: Run the Docker image
Run the Docker image with the Connector Service using the following command:
docker run \
  -v ~/connectors-python-config:/config \
  --network "elastic" \
  --tty \
  --rm \
  docker.elastic.co/enterprise-search/elastic-connectors:8.8.0.0-SNAPSHOT \
  /app/bin/elastic-ingest \
  -c /config/config.yml
Refer to this guide in the Python framework repository for more details.
Documents and syncs
- Tables with no primary key defined are skipped.
- If the sys.dm_db_index_usage_stats table is not available for a specific table and database, then all data in that table will be synced.
- Permissions are not synced. All documents indexed to an Elastic deployment will be visible to all users with access to that Elastic deployment.
- Filtering rules are not available in the present version. Currently, filtering is controlled by ingest pipelines.
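To illustrate why a primary key is required, the sketch below derives a stable document ID from a row's primary-key values; tables without one yield no ID and are skipped. This is a simplified illustration, not the connector's actual implementation:

```python
def row_to_doc_id(table, row, primary_keys):
    """Build a stable Elasticsearch document ID from a row's primary-key
    values; return None when the table defines no primary key, in which
    case the table is skipped."""
    if not primary_keys:
        return None
    return table + "_" + "_".join(str(row[k]) for k in primary_keys)

print(row_to_doc_id("employees", {"id": 42, "name": "Ada"}, ["id"]))  # employees_42
print(row_to_doc_id("notes", {"text": "hi"}, []))                     # None
```

A stable ID is what lets repeated syncs update the same Elasticsearch document instead of creating duplicates.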
See Content extraction.
Connector client operations
The connector framework enables operators to run functional tests against a real data source. Refer to Connector testing for more details.
To perform E2E testing for the Microsoft SQL connector, run the following command:
make ftest NAME=mssql
For faster tests, add the DATA_SIZE=small argument:
make ftest NAME=mssql DATA_SIZE=small
Known issues
There are no known issues for this connector. See Known issues for any issues affecting all connectors.
Framework and source
This connector is included in the Python connectors framework.
This connector uses the generic database connector source code (branch 8.8, compatible with Elastic 8.8).
View additional code specific to this data source (branch 8.8, compatible with Elastic 8.8).