Context length limit reached warning issue

Thanks for adding the warning for when the context length limit is reached. However, it doesn't seem to be working correctly for GPT-4 Turbo. I've attached two images: one shows that the current context length is ~2,680 tokens, which is only 2.09% of the 128k limit; the other shows the message displayed when I click the "Context length limit reached" warning. It states: "you have reached the context length limit of the gpt-turbo-preview model for this chat. You can still continue to chat, but the model will start to forget the old messages. The first 1 messages are forgotten."

At ~2,680 tokens, the context limit wouldn't be reached even on models with much smaller context windows.
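For reference, a minimal sketch of the check the warning presumably should perform, using the numbers from the screenshots (the variable names and threshold logic are assumptions, not the app's actual code):

```python
# Hypothetical sketch of the context-limit check behind the warning.
# 128k is the context window of gpt-4-turbo-preview; the token count
# is the approximate value shown in the attached screenshot.
CONTEXT_LIMIT = 128_000   # tokens
current_tokens = 2_680    # from the screenshot

usage = current_tokens / CONTEXT_LIMIT
print(f"{usage:.2%}")     # 2.09% -- nowhere near the limit

# The warning should only fire once the conversation exceeds the limit:
limit_reached = current_tokens >= CONTEXT_LIMIT
print(limit_reached)      # False
```

So with this straightforward comparison, no warning (and no message forgetting) should be triggered at ~2% usage.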


Status: Closed
Board: 💡 Feature Request
Date: About 2 years ago
