We should update the OpenRouter integration so that, instead of defaulting to a chat completion call to check the API key, we use the list models call. This will let us retrieve all available model information directly.
With this change, users can easily select from the available free models without needing to configure a separate custom model for free mode. This will significantly improve overall app usability.
Since OpenRouter follows the OpenAI specification, we can make the feature more robust by:
1. Providing the LLM gateway (OpenAI-compatible) URL and API key.
2. Retrieving the model list.
3. Allowing users to select their desired model from that list.
This would also be useful for other LLM gateway providers, such as Vercel.
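The steps above could be sketched roughly as follows. This is only a sketch against the OpenAI-compatible `GET /models` endpoint; the base URL, helper names, and the assumption that free OpenRouter models carry a `:free` suffix in their IDs are illustrative, not part of any existing implementation.

```python
import json
import urllib.request


def parse_model_ids(payload: dict) -> list[str]:
    """Extract model IDs from an OpenAI-style list response:
    {"object": "list", "data": [{"id": "..."}, ...]}"""
    return [m["id"] for m in payload.get("data", [])]


def list_models(base_url: str, api_key: str) -> list[str]:
    """Fetch available model IDs from an OpenAI-compatible /models endpoint.

    base_url is the gateway root, e.g. "https://openrouter.ai/api/v1"
    (hypothetical wiring; the real app may use its own HTTP client).
    """
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_model_ids(json.load(resp))


def free_models(model_ids: list[str]) -> list[str]:
    """Filter to free-tier models, assuming OpenRouter's ':free' ID suffix."""
    return [mid for mid in model_ids if mid.endswith(":free")]
```

A successful response here doubles as the API-key check, so no chat completion call is needed, and the returned IDs can populate the model picker directly.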

Open
Feature Request
3 months ago