Allow setting a custom context length for each model. For example, when selecting GPT-4 Turbo, I would like to set the context length to 64k instead of the default 128k.
Reason: with a longer context length, data gets lost.
Completed
Feature Request
Billing/Usage Management
Over 2 years ago