Update a Saved Object type
This page describes how to upgrade existing Saved Object type definitions: transitioning legacy types to model versions and adding new model versions to types that already use them. It does not cover updating Saved Object instances via the client.
Model versions are identified by a single integer. The first version must be 1; each new version increments by one with no gaps.
Valid:
const myType: SavedObjectsType = {
  name: 'test',
  modelVersions: {
    1: modelVersion1,
    2: modelVersion2,
  },
  // ...other mandatory properties
};
Invalid:
const myType: SavedObjectsType = {
  name: 'test',
  modelVersions: {
    2: modelVersion2, // invalid: first version must be 1
    4: modelVersion3, // invalid: skipped version 3
  },
  // ...other mandatory properties
};
If you are updating a legacy Saved Object type that does not yet use model versions, you must establish a baseline first. This is a two-step process so that Serverless can roll back safely if needed.
The first PR must define the current, existing shape of the type's documents.
- No mapping changes — Do not change any existing mappings; only add the required schemas.
- Deploy first — This PR must be merged and released in Serverless before you open a second PR with your real changes.
Please refer to Create: Initial model version for more details on how to define the initial model version.
If your type used the legacy migrations property and already defined schemas, you can reuse the latest schema for the initial model version.
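As an illustration, a minimal baseline could look like the sketch below. The single description attribute is hypothetical; the schemas (built with schema from @kbn/config-schema, as in the later examples) should simply mirror the shape your documents already have, reusing the latest legacy schema if one exists.

const myType: SavedObjectsType = {
  name: 'test',
  namespaceType: 'single',
  modelVersions: {
    // Version 1 is a baseline: it performs no changes and only describes
    // the shape the documents already have today.
    1: {
      changes: [],
      schemas: {
        forwardCompatibility: schema.object(
          { description: schema.string() },
          { unknowns: 'ignore' }
        ),
        create: schema.object({ description: schema.string() }),
      },
    },
  },
  // Existing mappings, unchanged.
  mappings: {
    properties: {
      description: { type: 'text' },
    },
  },
};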
When the type already defines modelVersions, add a new model version for your change. Do not modify existing versions. The new version must be the next consecutive integer and must include the appropriate changes and updated create and forwardCompatibility schemas. See Structure: Structure of a model version for the available change types and schema options.
You must add a new model version whenever mappings change. The migration logic uses the presence of a new model version (and its mappings_addition or other mapping-related changes) to determine that it needs to update the index mappings for that type.
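In practice, registering the new version only means adding the next integer key to the existing modelVersions map; a minimal sketch, assuming the type is currently at version 1:

const myType: SavedObjectsType = {
  name: 'test',
  modelVersions: {
    1: modelVersion1, // existing version: never modify it
    2: modelVersion2, // new version carrying your changes and updated schemas
  },
  // ...other mandatory properties
};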
See the use-case examples below for adding fields, backfilling defaults, and removing fields.
These examples show migration scenarios supported by the model version system.
More complex scenarios (e.g. field mutation by copy/sync) can be implemented with the current tooling, but without higher-level support from Core, much of the sync and compatibility work falls on the type owner and is not documented here.
The type is at model version 1 with two indexed fields, foo and bar. You want to add a non-indexed field, dolly, with no default value.
Version 1:
const myType: SavedObjectsType = {
  name: 'test',
  namespaceType: 'single',
  modelVersions: {
    1: {
      changes: [],
      schemas: {
        forwardCompatibility: schema.object(
          { foo: schema.string(), bar: schema.string() },
          { unknowns: 'ignore' }
        ),
        create: schema.object(
          { foo: schema.string(), bar: schema.string() },
        ),
      },
    },
  },
  mappings: {
    properties: {
      foo: { type: 'text' },
      bar: { type: 'text' },
    },
  },
};
Add version 2 with no changes; only extend the schemas to include dolly:
const modelVersion2: SavedObjectsModelVersion = {
  changes: [],
  schemas: {
    forwardCompatibility: schema.object(
      { foo: schema.string(), bar: schema.string(), dolly: schema.string() },
      { unknowns: 'ignore' }
    ),
    create: schema.object(
      { foo: schema.string(), bar: schema.string(), dolly: schema.string() },
    ),
  },
};
Same as above but dolly must be indexed. Add a mappings_addition change and update the root mappings:
const modelVersion2: SavedObjectsModelVersion = {
  changes: [
    {
      type: 'mappings_addition',
      addedMappings: {
        dolly: { type: 'text' },
      },
    },
  ],
  schemas: {
    forwardCompatibility: schema.object(
      { foo: schema.string(), bar: schema.string(), dolly: schema.string() },
      { unknowns: 'ignore' }
    ),
    create: schema.object(
      { foo: schema.string(), bar: schema.string(), dolly: schema.string() },
    ),
  },
};

// And update the type's root mappings:
mappings: {
  properties: {
    foo: { type: 'text' },
    bar: { type: 'text' },
    dolly: { type: 'text' },
  },
},
To add an indexed field with a default value backfilled into existing documents, add both a data_backfill change and a mappings_addition change:
const modelVersion2: SavedObjectsModelVersion = {
  changes: [
    {
      type: 'data_backfill',
      transform: (document) => {
        return { attributes: { dolly: 'default_value' } };
      },
    },
    {
      type: 'mappings_addition',
      addedMappings: {
        dolly: { type: 'text' },
      },
    },
  ],
  schemas: {
    forwardCompatibility: schema.object(
      { foo: schema.string(), bar: schema.string(), dolly: schema.string() },
      { unknowns: 'ignore' }
    ),
    create: schema.object(
      { foo: schema.string(), bar: schema.string(), dolly: schema.string() },
    ),
  },
};
Update the root mappings to include dolly as in the previous example.
For a non-indexed field with a default, use only the data_backfill change (no mappings_addition or root mapping update).
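A sketch of that variant, combining the pieces shown above (same hypothetical dolly field and default value):

const modelVersion2: SavedObjectsModelVersion = {
  changes: [
    {
      // Backfill the default; no mappings_addition and no root mapping
      // update, because dolly is not indexed.
      type: 'data_backfill',
      transform: (document) => {
        return { attributes: { dolly: 'default_value' } };
      },
    },
  ],
  schemas: {
    forwardCompatibility: schema.object(
      { foo: schema.string(), bar: schema.string(), dolly: schema.string() },
      { unknowns: 'ignore' }
    ),
    create: schema.object(
      { foo: schema.string(), bar: schema.string(), dolly: schema.string() },
    ),
  },
};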
Removing a field must be done in two releases to preserve rollback safety:
- Release N — Application still uses the field.
- Release N+1 — Application stops using the field; remove it from the forwardCompatibility and create schemas so it is no longer returned, but do not delete data yet.
- Release N+2 — Add a data_removal change to delete the field from documents.
If you deleted the data in N+1 and then rolled back to N, the old version would still expect the field, and its data would already be lost.
Version N+1 — stop returning the field:
const modelVersion2: SavedObjectsModelVersion = {
  changes: [],
  schemas: {
    // 'removed' is no longer part of the schemas
    forwardCompatibility: schema.object(
      { kept: schema.string() },
      { unknowns: 'ignore' }
    ),
    create: schema.object(
      { kept: schema.string() },
    ),
  },
};
Version N+2 — remove data:
const modelVersion3: SavedObjectsModelVersion = {
  changes: [
    {
      type: 'data_removal',
      removedAttributePaths: ['removed'],
    },
  ],
  schemas: {
    forwardCompatibility: schema.object(
      { kept: schema.string() },
      { unknowns: 'ignore' }
    ),
    create: schema.object(
      { kept: schema.string() },
    ),
  },
};
The root mappings can still list the removed field (Elasticsearch does not support removing mapping fields without reindexing). You can flag it with a mappings_deprecation change so it can be cleaned up when supported.
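For example, a later model version could flag the stale mapping as in the sketch below. The deprecatedMappings property name is an assumption here; check Structure: Structure of a model version for the exact shape of the mappings_deprecation change.

const modelVersion4: SavedObjectsModelVersion = {
  changes: [
    {
      // Flags the 'removed' root mapping as unused so it can be cleaned up
      // once Elasticsearch supports removing mapped fields.
      type: 'mappings_deprecation',
      deprecatedMappings: ['removed'],
    },
  ],
  schemas: {
    forwardCompatibility: schema.object(
      { kept: schema.string() },
      { unknowns: 'ignore' }
    ),
    create: schema.object(
      { kept: schema.string() },
    ),
  },
};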