Custom context length for every model

Allow setting a custom context length for each model. For example, when selecting GPT-4 Turbo, I would like to set the context length to 64k instead of the default 128k.

Reason (longer context lengths lose data):

https://twitter.com/GregKamradt/status/1722386725635580292
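A user-configurable cap like the one requested could be enforced client-side by trimming history before a request is sent. The sketch below is hypothetical, not the app's actual implementation; `estimate_tokens` is a stand-in for a real tokenizer, and the 64k/128k figures come from the request above.

```python
# Hypothetical sketch: cap the tokens sent to a model at a user-chosen
# context length (e.g. 64k) instead of the model's full window (e.g. 128k).
# Token counting here is a rough whitespace approximation, not a real tokenizer.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~1 token per whitespace-separated word."""
    return len(text.split())

def trim_to_context(messages: list[str], max_tokens: int) -> list[str]:
    """Keep the most recent messages that fit within max_tokens."""
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):  # walk newest first
        cost = estimate_tokens(msg)
        if total + cost > max_tokens:
            break  # oldest messages are dropped first
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore chronological order

history = ["one two three", "four five", "six seven eight nine"]
print(trim_to_context(history, max_tokens=6))  # oldest message is dropped
```

With a per-model setting, `max_tokens` would simply come from the user's override (64_000) rather than the model's advertised maximum.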

Status: Completed
Board: 💡 Feature Request
Tags: Billing/Usage Management
Date: Over 2 years ago
