context_window_size hard coded to 1000 in prompts.py #260
JEHollierJr started this conversation in General
Replies: 1 comment
-
Hi, this number is actually overridden automatically based on the number of tokens allowed in the context window by the individual models themselves (50% input / 50% output).
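In rough terms the override works like this. The sketch below is only illustrative; the `MODEL_MAX_TOKENS` table and the `context_window_for` helper are placeholder names, not the actual code in the repo:

```python
# Illustrative sketch only -- the dictionary, constant, and function names
# are placeholders, not the project's actual identifiers.
MODEL_MAX_TOKENS = {
    "gpt-3.5-turbo": 4096,
    "gpt-4": 8192,
    "gpt-4-32k": 32768,
}

DEFAULT_CONTEXT_WINDOW = 1000  # fallback when the model's limit is unknown


def context_window_for(model_name: str) -> int:
    """Reserve half of the model's full context for input, half for output."""
    full_context = MODEL_MAX_TOKENS.get(model_name)
    if full_context is None:
        return DEFAULT_CONTEXT_WINDOW
    return full_context // 2


print(context_window_for("gpt-4"))  # -> 4096
```

So the hard-coded 1000 only acts as a conservative fallback; models with a known, larger context window get half of their full limit for input.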
-
I see the following setting the size of the context window in prompts.py, line 62:

```python
# default batch size, assuming all LLMs have min 2048 full context (50% in / 50% out)
self.context_window_size = 1000
```
Can this be made configurable, or set to the maximum supported by each model? Some of the newer models have a larger context window that would be useful to be able to access.
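For illustration, something along these lines is what I have in mind. The `CONTEXT_WINDOW_SIZE` environment variable and the constructor argument below are hypothetical, not options the project currently exposes:

```python
import os
from typing import Optional

DEFAULT_CONTEXT_WINDOW = 1000  # the current hard-coded default


class Prompts:
    """Placeholder class standing in for the object defined in prompts.py."""

    def __init__(self, context_window_size: Optional[int] = None):
        if context_window_size is not None:
            # an explicit argument wins
            self.context_window_size = context_window_size
        else:
            # otherwise fall back to an environment variable, then the default
            self.context_window_size = int(
                os.environ.get("CONTEXT_WINDOW_SIZE", DEFAULT_CONTEXT_WINDOW)
            )


p = Prompts(context_window_size=8192)
print(p.context_window_size)  # -> 8192
```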