Closed
Description
Confirm this is a Node library issue and not an underlying OpenAI API issue
- This is an issue with the Node library
Describe the bug

The `service_tier` parameter should accept `"priority"` (per the API docs), but the type definition for `ChatCompletionCreateParamsBase.service_tier` only allows `'auto' | 'default' | 'flex' | 'scale' | null`. As a result, passing `"priority"` causes a TypeScript error and forces a type-cast workaround.
To Reproduce

- Install the latest `openai` (`npm i openai@<version>`) in a TypeScript project.
- Add the code below and run `tsc` (or your IDE's type-checker):

  ```ts
  import { OpenAI } from 'openai';

  const openai = new OpenAI({ ... });

  await openai.chat.completions.create({
    model: "gpt-4.1",
    messages: [{ role: "user", content: "Hello" }],
    service_tier: "priority", // should be accepted
  });
  ```

- Observe the compiler error:

  ```
  No overload matches this call.
    Overload 1 of 3, '(body: ChatCompletionCreateParamsNonStreaming, options?: RequestOptions | undefined): APIPromise<ChatCompletion>', gave the following error.
      ...
      Types of property 'service_tier' are incompatible.
        Type '"priority" | undefined' is not assignable to type '"default" | "auto" | "flex" | "scale" | null | undefined'.
  ```
Code snippets

```ts
/**
 * Specifies the latency tier to use for processing the request.
 */
service_tier?: 'auto' | 'default' | 'flex' | 'scale' | null;
```
OS
All
Node version
Node v22.11.0
Library version
openai v5.7.0