fix support for litserve>0.2.4 (#1994)
Conversation
Do we expect the serving to work on macOS? That seems to be failing.

Yeah, it passes locally.
Love this @ali-alshaar7!! Thank you so much 🚀
Also, can we set the version to >=0.2.7? That way we will know if LitGPT stops working with a new version of LitServe again.
@aniketmaurya do you see any reason why it may hang/fail on Mac?

Reran with debugging enabled @Borda! Probably some issue with LitGPT setting up the devices. Ali mentioned that it works locally. Will debug this in depth later today.
I get this error locally:

Investigating more, cc @ali-alshaar7 @k223kim.
@aniketmaurya Did you pull the main branch? That's coming from my Gemma PR.

Note that the useful CI relation is opposite to >= dependencies: if we want to declare a …

Right @t-vi, we should aim for a test to ensure this!
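Such a test could install the declared minimum version explicitly in CI, so the lower bound is actually exercised instead of only the latest release. A hypothetical workflow fragment (step name, pinned version, and test path are assumptions, not taken from this PR):

```yaml
# Hypothetical CI step: pin LitServe to the declared minimum so a stale
# lower bound fails loudly rather than CI silently testing only the newest release.
- name: Test against minimum LitServe
  run: |
    pip install "litserve==0.2.7"
    pytest tests/test_serve.py
```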
It did not help, so let's revert it back and try to debug it...

Seems like the server process is not terminated properly on macOS …

Hey, thanks for looking into this. I did add a …
I think uvicorn runs in the main process and the rest of the FastAPI logic lives in the child processes, so if the child processes die then the server won't be reachable. Maybe try using this logic from the LitServe tests for running and stopping the tests?

Yeah, so if we kill all the children, we should free the port and be able to spin up a new server, so we won't have this issue, right? That's what …

Yup, you're right! Theoretically that's how things should go.

@aniketmaurya would be nice if …
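The kill-the-children idea can be sketched with the standard library alone. This is an illustrative stand-in, not the actual LitServe test helper; the dummy server, queue handshake, and timeouts are all assumptions:

```python
import multiprocessing as mp
import socket
import time

def _dummy_server(q: "mp.Queue") -> None:
    # Stand-in for a uvicorn/LitServe worker: bind a free port and hold it open.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    q.put(srv.getsockname()[1])  # report the bound port back to the parent
    time.sleep(30)  # pretend to serve until killed

def stop_server(proc: mp.Process, timeout: float = 5.0) -> None:
    # Terminate the child so the OS releases its port; escalate if it ignores SIGTERM.
    proc.terminate()
    proc.join(timeout)
    if proc.is_alive():
        proc.kill()
        proc.join(timeout)

def demo_port_is_freed() -> bool:
    q: "mp.Queue" = mp.Queue()
    proc = mp.Process(target=_dummy_server, args=(q,))
    proc.start()
    port = q.get(timeout=10)  # wait until the child has actually bound its socket
    stop_server(proc)
    # Rebinding the same port proves the child no longer holds it,
    # so a fresh server could be spun up in its place.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("127.0.0.1", port))
    s.close()
    return True

if __name__ == "__main__":
    print(demo_port_is_freed())
```

If the children are cleaned up like this, each test gets a fresh port and a fresh server, which is exactly the flakiness-avoidance the discussion is after.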
LitGPT CI fails with the latest LitServe (>=0.2.4); this PR fixes that.
The real issue was that LitGPT was always failing with LitServe, but the tests were flaky and passed anyway. Post 0.2.4, LitServe added a mechanism to check whether the inference worker is alive before starting the uvicorn worker, and that made the LitGPT CI fail.