LLM gateway - GET MODEL

Hi,

We already have a one-click setup for OpenRouter models.

Could we get something similar for the Vercel AI Gateway and other LLM gateway providers too?
In the custom model settings there could be a "GET MODEL" option that displays each model, its context window, and its price.

Vercel:
GET /models

Response:

SyncPage[Model](data=[Model(id='alibaba/qwen-3-14b', created=1755815280, object='model', owned_by='alibaba', name='Qwen3-14B', description='Qwen3 is the latest generation of large language models in Qwen series, offering a comprehensive suite of dense and mixture-of-experts (MoE) models. Built upon extensive training, Qwen3 delivers groundbreaking advancements in reasoning, instruction-following, agent capabilities, and multilingual support', context_window=40960, max_tokens=16384, type='language', tags=['reasoning', 'tool-use'], pricing={'input': '0.00000006', 'output': '0.00000024'}), …], object='list')
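As a rough illustration of what the proposed "GET MODEL" option could do with such a response, here is a minimal sketch that turns `/models` entries into display rows of model / context window / price. It assumes the gateway returns OpenAI-style JSON with the fields shown in the Vercel example above (`id`, `context_window`, `pricing.input`/`pricing.output` as per-token dollar strings); the helper name and sample data are hypothetical.

```python
def format_model_rows(models):
    """Turn /models entries into model / context window / price display rows."""
    rows = []
    for m in models:
        pricing = m.get("pricing", {})
        rows.append({
            "model": m["id"],
            "context_window": m.get("context_window"),
            # per-token prices are tiny; scale to per-1M tokens for readability
            "input_per_1m": float(pricing.get("input", 0)) * 1_000_000,
            "output_per_1m": float(pricing.get("output", 0)) * 1_000_000,
        })
    return rows

# Sample entry mirroring the Vercel response above
sample = [{
    "id": "alibaba/qwen-3-14b",
    "context_window": 40960,
    "pricing": {"input": "0.00000006", "output": "0.00000024"},
}]

for row in format_model_rows(sample):
    print(f"{row['model']}: {row['context_window']} ctx, "
          f"${row['input_per_1m']:.2f}/M in, ${row['output_per_1m']:.2f}/M out")
```

The same rows could feed whatever table the custom model settings screen already uses for OpenRouter.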


Status: Open
Board: 💡 Feature Request
Date: 3 months ago
