Create a DeepSeek inference endpoint
Generally available
Path parameters
- task_type: The type of the inference task that the model will perform. Values are completion or chat_completion.
- deepseek_inference_id: The unique identifier of the inference endpoint.
Query parameters
- timeout (optional): Specifies the amount of time to wait for the inference endpoint to be created.
PUT /_inference/{task_type}/{deepseek_inference_id}
curl \
--request PUT 'http://api.example.com/_inference/{task_type}/{deepseek_inference_id}' \
--header "Authorization: $API_KEY" \
--header "Content-Type: application/json" \
--data '{"service":"deepseek","service_settings":{"api_key":"string","model_id":"string","url":"string"}}'