Create an inference endpoint (Added in 8.11.0)

PUT /_inference/{inference_id}

Path parameters

  • inference_id (string, required): The unique identifier of the inference endpoint to create.

Body Required (application/json)

  • service (string, required): The service type to use for the inference endpoint.
  • service_settings (object, required): Settings specific to the chosen service.
  • task_settings (object): Settings specific to the inference task performed by the endpoint.

Responses

  • 200 application/json
    Response attributes (object): service (string), service_settings (object), task_settings (object),
    inference_id (string), task_type (string; one of the supported task types, for example "sparse_embedding")
PUT /_inference/{inference_id}
curl \
 -X PUT http://api.example.com/_inference/{inference_id} \
 -H "Content-Type: application/json" \
 -d '{"service":"string","service_settings":{},"task_settings":{}}'
Request examples
{
  "service": "string",
  "service_settings": {},
  "task_settings": {}
}
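A concrete request might look like the following sketch, which creates a sparse embedding endpoint backed by the elser service. The inference ID my-elser-endpoint and the num_allocations/num_threads values in service_settings are illustrative assumptions rather than values mandated by the API; each service documents its own supported service_settings, and some services also accept the task type in the path (for example PUT /_inference/sparse_embedding/{inference_id}).

curl \
 -X PUT http://api.example.com/_inference/my-elser-endpoint \
 -H "Content-Type: application/json" \
 -d '{
   "service": "elser",
   "service_settings": {
     "num_allocations": 1,
     "num_threads": 1
   }
 }'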
Response examples (200)
{
  "service": "string",
  "service_settings": {},
  "task_settings": {},
  "inference_id": "string",
  "task_type": "sparse_embedding"
}
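Once created, the endpoint's stored configuration can typically be read back with a GET on the same path. The sketch below assumes the my-elser-endpoint ID from the request example above.

curl \
 -X GET http://api.example.com/_inference/my-elser-endpoint \
 -H "Content-Type: application/json"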