Support for Ollama Cloud as a custom model/API provider

I would like TypingMind to support connecting to Ollama Cloud via its official OpenAI-compatible API endpoint (https://ollama.com/v1/). Ollama Cloud offers hosted versions of models like Llama, Gemma, and Nemotron, but TypingMind’s proxy does not currently support this endpoint, and browser CORS policies block direct access.

Please add proxy and/or native support for Ollama Cloud’s API endpoint so that users can use Ollama Cloud models within TypingMind just like OpenAI or Mistral models. This would provide easy, paid hosted access to the latest open-source models.
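For reference, here is a minimal sketch of what a direct call to the endpoint would look like, assuming a standard OpenAI-compatible `/chat/completions` route; the model id and API key below are placeholders, not confirmed values:

```python
import json
import urllib.request

# Build (but do not send) a chat-completions request against Ollama Cloud's
# OpenAI-compatible endpoint. Model id and key are placeholders.
BASE_URL = "https://ollama.com/v1"
API_KEY = "YOUR_OLLAMA_API_KEY"  # placeholder

payload = {
    "model": "llama3.1",  # hypothetical model id for illustration
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)

# urllib.request.urlopen(req) would send this from a server or CLI,
# but the same call issued from a browser is blocked by CORS --
# which is why proxy support in TypingMind is needed.
print(req.full_url)
```

This is the same request shape TypingMind already sends for OpenAI, so routing it through the existing proxy with a different base URL should be enough.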

Thank you!

Status: Open
Board: 💡 Feature Request
Date: 1 day ago
