
Instructions on using the Bing API with LiteLLM to function as an OpenAI-compatible API #62

Open
cracksauce opened this issue Oct 29, 2023 · 2 comments

Comments

@cracksauce

Hi there, is anyone willing to share whether you can get the Bing AI API set up for use as an OpenAI-compatible API?

I may be able to use LiteLLM, but I'm having trouble figuring it out. They just released a server to host the LiteLLM API, so it could (I think) be used either locally or online via a proxy server, roughly like the sketch below?
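For context, this is roughly the flow I have in mind (a minimal sketch only; the port, model name, and whether Bing is actually a supported LiteLLM provider are all assumptions on my part):

```python
# Sketch: start the LiteLLM proxy in a terminal, e.g.
#   litellm --model gpt-3.5-turbo
# then point the regular OpenAI client at it.
import openai  # openai<1.0 style, current as of this issue

openai.api_key = "anything"              # the local proxy ignores this
openai.api_base = "http://0.0.0.0:8000"  # assumed default proxy address

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # whatever model the proxy is configured with
    messages=[{"role": "user", "content": "Hello via the LiteLLM proxy"}],
)
print(response["choices"][0]["message"]["content"])
```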

I'd be grateful if someone could explain or provide a brief tutorial, since I'm struggling to figure this out. Thank you!

@ishaan-jaff

Hi @cracksauce, I'm the maintainer of LiteLLM - happy to help with this.

  • Why use the Bing API?

@cracksauce
Author

@ishaan-jaff Thanks for the reply!

In short, the ability to use a GPT-4-capable LLM with products that are costly to run through the OpenAI API (particularly autonomous agents that make a ton of API calls). It'd be amazing to use Bing's GPT-4 model, since it's free with a Microsoft account.

I'm particularly interested in testing newer agent products like MemGPT+AutoGen and XAgent on long, complex tasks. It would probably need a wrapper that transforms the API endpoints into chat/completions, etc. (rough sketch below). Thoughts?
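Something like this is what I mean by a wrapper (a rough sketch only; `ask_bing` is a hypothetical placeholder, since I don't know of an official Bing chat client to call, and the response fields just mimic OpenAI's chat completion schema):

```python
# Sketch: expose an OpenAI-style /v1/chat/completions endpoint and forward
# the prompt to some Bing chat backend.
import time
import uuid

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class ChatRequest(BaseModel):
    model: str
    messages: list[dict]


async def ask_bing(prompt: str) -> str:
    # Hypothetical: replace with whatever actually talks to Bing chat.
    return f"(Bing reply to: {prompt})"


@app.post("/v1/chat/completions")
async def chat_completions(req: ChatRequest):
    prompt = req.messages[-1]["content"]
    answer = await ask_bing(prompt)
    # Shape the response like OpenAI's chat completion object so agent
    # frameworks expecting the OpenAI API can consume it unchanged.
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": req.model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": answer},
                "finish_reason": "stop",
            }
        ],
    }
```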
