Elastic Notion Connector reference

The Notion connector is built with the Elastic connectors Python framework and is available as a self-managed connector client. View the source code for this connector (branch main, compatible with Elastic 8.15).

Availability and prerequisites

This connector was introduced in Elastic 8.13.0, available as a self-managed connector client.

To use this connector, satisfy all connector client prerequisites. Importantly, you must deploy the connectors service on your own infrastructure. You have two deployment options:

  • Run the connectors service from Docker
  • Run the connectors service from source

This connector is in beta and is subject to change. The design and code are less mature than official GA features and are provided as-is with no warranties. Beta features are not subject to the support SLA of official GA features.

Usage

To use this connector in the UI, select the Notion tile when creating a new connector under Search → Connectors.

For additional operations, see Using connectors.

Connecting to Notion

To connect to Notion, create an internal integration for your Notion workspace; the connector accesses workspace resources using the integration’s Internal Integration Secret Token. Configure the integration with the following settings:

  1. Grant READ permission for the content, comment, and user capabilities of the integration on the Capabilities tab.
  2. Manually add the integration as a connection to the top-level pages in the workspace. Sub-pages inherit the connections of the parent page automatically. A quick way to verify the integration token is shown after this list.
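
The following generic check against Notion’s public API (it is not part of the connector setup) returns the integration’s bot user when the token is valid. The Notion-Version header value is one example of a supported API version:

# Returns the integration's bot user if the token is valid
curl https://api.notion.com/v1/users/me \
-H "Authorization: Bearer <NOTION_INTERNAL_INTEGRATION_TOKEN>" \
-H "Notion-Version: 2022-06-28"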

Deploy with Docker

You can deploy the Notion connector as a self-managed connector client using Docker. Follow these instructions.

Step 1: Download sample configuration file

Download the sample configuration file. You can either download it manually or run the following command:

curl https://raw.githubusercontent.com/elastic/connectors/main/config.yml.example --output ~/connectors-config/config.yml

Remember to update the --output argument value if your directory name is different, or if you want to use a different config file name.

Step 2: Update the configuration file for your self-managed connector

Update the configuration file with the following settings to match your environment:

  • elasticsearch.host
  • elasticsearch.api_key
  • connectors

If you’re running the connector service against a Dockerized version of Elasticsearch and Kibana, your config file will look like this:

# When connecting to your cloud deployment you should edit the host value
elasticsearch.host: http://host.docker.internal:9200
elasticsearch.api_key: <ELASTICSEARCH_API_KEY>

connectors:
  -
    connector_id: <CONNECTOR_ID_FROM_KIBANA>
    service_type: notion
    api_key: <CONNECTOR_API_KEY_FROM_KIBANA> # Optional. If not provided, the connector will use the elasticsearch.api_key instead

Using the elasticsearch.api_key is the recommended authentication method. However, you can also use elasticsearch.username and elasticsearch.password to authenticate with your Elasticsearch instance.
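
If you don’t yet have an API key, one way to create one is with Elasticsearch’s create API key endpoint, shown below as a generic sketch (the key name is an illustrative placeholder; adjust the host and credentials for your environment, or generate a key from the Kibana UI instead). A key created this way without role descriptors inherits the privileges of the user that created it, and the response includes the new key’s id, api_key, and encoded values:

# Creates an API key named "notion-connector-api-key" (placeholder name)
curl -X POST "http://localhost:9200/_security/api_key" \
-u elastic:<PASSWORD> \
-H "Content-Type: application/json" \
-d '{"name": "notion-connector-api-key"}'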

Note: You can change other default configurations by simply uncommenting specific settings in the configuration file and modifying their values.

Step 3: Run the Docker image

Run the Docker image with the Connector Service using the following command:

docker run \
-v ~/connectors-config:/config \
--network "elastic" \
--tty \
--rm \
docker.elastic.co/enterprise-search/elastic-connectors:8.15.0.0 \
/app/bin/elastic-ingest \
-c /config/config.yml

Refer to DOCKER.md in the elastic/connectors repo for more details.

Find all available Docker images in the official registry.

We also have a quickstart self-managed option using Docker Compose, so you can spin up all required services at once: Elasticsearch, Kibana, and the connectors service. Refer to this README in the elastic/connectors repo for more information.

Configuration

Note the following configuration fields:

Notion Secret Key (required)

Secret token assigned to your integration for a particular workspace. Example:

  • zyx-123453-12a2-100a-1123-93fd09d67394

Databases (required)

Comma-separated list of database names to be fetched by the connector. If the value is *, the connector will fetch all databases available in the workspace. Examples:

  • database1, database2
  • *

Pages (required)

Comma-separated list of page names to be fetched by the connector. If the value is *, the connector will fetch all pages available in the workspace. Examples:

  • *
  • Page1, Page2

Index Comments

Toggle to enable fetching and indexing of comments from the Notion workspace for the configured pages, databases, and corresponding child blocks. Default value is False.

Enabling comment indexing could impact connector performance due to increased network calls, which is why it is disabled by default.

Content Extraction

Refer to content extraction.

Documents and syncs

The connector syncs the following objects and entities:

  • Pages

    • Includes metadata such as page name, id, last updated time, etc.
  • Blocks

    • Includes metadata such as title, type, id, content (in case of file block), etc.
  • Databases

    • Includes metadata such as name, id, records, size, etc.
  • Users

    • Includes metadata such as name, id, email address, etc.
  • Comments

    • Includes the content and metadata such as id, last updated time, created by, etc.
    • Note: Comments are excluded by default.
Note the following limitations:

  • Files bigger than 10 MB won’t be extracted.
  • Permissions are not synced. All documents indexed to an Elastic deployment will be visible to all users with access to the relevant Elasticsearch index.
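
After a sync completes, you can spot-check what was ingested by querying the attached index directly. This is a generic Elasticsearch search request, not specific to this connector; replace <INDEX_NAME> with the index attached to your connector and adjust the host and authentication for your environment:

# Returns up to 5 documents from the connector's index
curl -X GET "http://localhost:9200/<INDEX_NAME>/_search?size=5&pretty" \
-H "Authorization: ApiKey <ELASTICSEARCH_API_KEY>"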

Sync rules

Basic sync rules are identical for all connectors and are available by default.

Advanced sync rules

A full sync is required for advanced sync rules to take effect.

The following section describes advanced sync rules for this connector. Advanced sync rules filter data in Notion before it is indexed into Elasticsearch, and are defined through a source-specific DSL JSON snippet.

Advanced sync rules for Notion take the following parameters:

  1. searches: Notion’s search filter to search by title.
  2. query: Notion’s database query filter to fetch a specific database.

Examples

Example 1

Indexing every page where the title contains Demo Page:

{
  "searches": [
    {
      "filter": {
        "value": "page"
      },
      "query": "Demo Page"
    }
  ]
}

Example 2

Indexing every database where the title contains Demo Database:

{
  "searches": [
    {
      "filter": {
        "value": "database"
      },
      "query": "Demo Database"
    }
  ]
}

Example 3

Indexing every database where the title contains Demo Database and every page where the title contains Demo Page:

{
  "searches": [
    {
      "filter": {
        "value": "database"
      },
      "query": "Demo Database"
    },
    {
      "filter": {
        "value": "page"
      },
      "query": "Demo Page"
    }
  ]
}

Example 4

Indexing all pages in the workspace:

{
  "searches": [
    {
      "filter": {
        "value": "page"
      },
      "query": ""
    }
  ]
}

Example 5

Indexing all the pages and databases connected to the workspace:

{
  "searches":[
    {
      "query":""
    }
  ]
}

Example 6

Indexing all rows of a database where the Task completed column (a checkbox property) is set to true:

{
  "database_query_filters": [
    {
      "filter": {
          "property": "Task completed",
          "checkbox": {
            "equals": true
          }
      },
      "database_id": "database_id"
    }
  ]
}

Example 7

Indexing all rows of a specific database:

{
  "database_query_filters": [
    {
      "database_id": "database_id"
    }
  ]
}

Example 8

Indexing all blocks defined in searches and database_query_filters:

{
  "searches":[
    {
      "query":"External tasks",
      "filter":{
        "value":"database"
      }
    },
    {
      "query":"External tasks",
      "filter":{
        "value":"page"
      }
    }
  ],
  "database_query_filters":[
    {
      "database_id":"notion_database_id1",
      "filter":{
        "property":"Task completed",
        "checkbox":{
          "equals":true
        }
      }
    }
  ]
}

In this example, the filter object for database_query_filters follows the filter syntax defined in the Notion documentation.

Connector Client operations

End-to-end Testing

The connector framework enables operators to run functional tests against a real data source, using Docker Compose. You don’t need a running Elasticsearch instance or Notion source to run this test.

Refer to Connector testing for more details.

To perform E2E testing for the Notion connector, run the following command:

$ make ftest NAME=notion

For faster tests, add the DATA_SIZE=small flag:

make ftest NAME=notion DATA_SIZE=small

By default, DATA_SIZE=MEDIUM.

Known issues

  • Updates to new pages may not be reflected immediately in the Notion API.

    This could lead to these pages not being indexed by the connector, if a sync is initiated immediately after their addition. To ensure all pages are indexed, initiate syncs a few minutes after adding pages to Notion.

  • Notion’s Public API does not support linked databases.

    Linked databases in Notion are copies of a database that can be filtered, sorted, and viewed differently. To fetch the information in a linked database, you need to target the original source database. For more details refer to the Notion documentation.

  • Documents' properties objects are serialized as strings under details.

    Notion’s schema for properties is not consistent, and can lead to document_parsing_exceptions if indexed to Elasticsearch as an object. For this reason, the properties object is instead serialized as a JSON string, and stored under the details field. If you need to search a sub-object from properties, you may need to post-process the details field in an ingest pipeline to extract your desired subfield(s); a minimal pipeline sketch follows this list.
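
As an illustration only, the following ingest pipeline sketch uses Elasticsearch’s built-in json processor to parse the serialized details string back into an object. The pipeline name and the details_parsed target field are hypothetical placeholders; in practice you would typically keep only the specific subfields you need, since re-indexing the whole parsed object can reintroduce the mapping problems that the string serialization avoids:

# Defines a pipeline that parses the "details" JSON string into a "details_parsed" object (placeholder names)
curl -X PUT "http://localhost:9200/_ingest/pipeline/notion-parse-details" \
-H "Content-Type: application/json" \
-d '{
  "processors": [
    {
      "json": {
        "field": "details",
        "target_field": "details_parsed",
        "ignore_failure": true
      }
    }
  ]
}'

To apply it at ingest time, you could set this pipeline as the index’s default_pipeline setting, or reference it from an existing pipeline attached to the index.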

Refer to Known issues for a list of known issues for all connectors.

Troubleshooting

See Troubleshooting.

Security

See Security.