Simplifying the OpenAI provider to use multiple model providers #1248
Conversation
    @@ -125,19 +128,24 @@ export function ChatSettings(props: ChatSettingsProps): JSX.Element {
      }

      setLmLocalId(server.chat.lmLocalId);
      setEmLocalId(server.config.embeddings_provider_id ?? '');
I think this is the line causing the local model ID to be set to the universal model ID (including the provider prefix). We need to add `emLocalId` to the `server.chat: ProvidersInfo` object, which is returned from `use-server-info.ts`.
    const [clmProvider, setClmProvider] =
      useState<AiService.ListProvidersEntry | null>(null);
    const [showLmLocalId, setShowLmLocalId] = useState<boolean>(false);
    const [showEmLocalId, setShowEmLocalId] = useState<boolean>(false);
This looks like a duplicate of line 60.
Replaced by PR #1264
Description
The OpenAI model interface has been widely adopted by many model providers (DeepSeek, vLLM, etc.), and this PR enables accessing these models through the OpenAI provider. Existing OpenAI models remain accessible via the same interface.
This PR also updates the related documentation on using these models through the OpenAI provider.
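For background on why this works: these providers expose the same REST API shape as OpenAI, so a client only needs a different base URL and key. A minimal sketch with the official `openai` Node client, where the base URL, key variable, and model name are placeholders rather than values from this PR:

```ts
import OpenAI from 'openai';

// Any OpenAI-compatible provider is reachable by swapping the base URL.
// The endpoint and model name below are illustrative placeholders.
const client = new OpenAI({
  baseURL: 'https://api.example-provider.com/v1',
  apiKey: process.env.PROVIDER_API_KEY,
});

const response = await client.chat.completions.create({
  model: 'example-chat-model',
  messages: [{ role: 'user', content: 'Hello!' }],
});
console.log(response.choices[0].message.content);
```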
Demo
See the new model usage and the required settings shown below; note the new "OpenAI::general interface":

For any OpenAI model:

For DeepSeek models:

For models deployed with vLLM:

Embedding Models
First, tested to confirm that the OpenAI models work as intended with no changes to the code:

Second, modified the check so that the interface accepts any OpenAI embedding model as input, and tested that it works with OpenAI models as before.
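As a minimal sketch of what that embedding call looks like through the OpenAI client (the model name is one of OpenAI's published embedding models, used here only as an example):

```ts
import OpenAI from 'openai';

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Request an embedding vector for a short text.
const result = await client.embeddings.create({
  model: 'text-embedding-3-small',
  input: 'Jupyter AI embeddings test',
});
console.log(result.data[0].embedding.length); // vector dimensionality
```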
Code Completion
Code completion is now supported as well:
