8 comments

  • oceanplexian an hour ago

    The Anthropic API was already supported by llama.cpp (the project Ollama ripped off, and which it typically lags by 3-6 months in features), which already works perfectly well with Claude Code by setting a single environment variable.
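    A sketch of that setup, assuming llama.cpp's `llama-server` binary and its Anthropic-compatible endpoint; the model path and port are hypothetical placeholders:

```shell
# Start llama.cpp's HTTP server with a local GGUF model (path is a placeholder)
llama-server -m ./local-model.gguf --port 8080 &

# Point Claude Code at the local server instead of api.anthropic.com
export ANTHROPIC_BASE_URL="http://127.0.0.1:8080"
claude
```

    `ANTHROPIC_BASE_URL` is the environment variable Claude Code reads to override the API endpoint; everything else is standard llama.cpp server usage.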

    • xd1936 an hour ago

      And they reference that announcement and related information in the second line.

  • eli an hour ago

    There are already various proxies to translate between OpenAI-style models (local or otherwise) and an Anthropic endpoint that Claude Code can talk to. Is the advantage here just one less piece of infrastructure to worry about?
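    The core of what those proxies do is a schema translation. A minimal sketch of the request-side half, converting an Anthropic `/v1/messages` body into an OpenAI `/v1/chat/completions` body (a real proxy also maps tools, streaming events, and the response in the other direction):

```python
def anthropic_to_openai(body: dict) -> dict:
    """Translate an Anthropic Messages request body to OpenAI chat format."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first chat message.
    if "system" in body:
        messages.append({"role": "system", "content": body["system"]})
    for msg in body.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten text blocks.
        if isinstance(content, list):
            content = "".join(
                block["text"] for block in content if block.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": body["model"],  # a proxy would remap model names here
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }
```

    Tool-call blocks are where these proxies tend to break down (hence the "double-translated tool names" complaint below): tool definitions and tool-use blocks have different shapes in the two schemas and must be mapped carefully in both directions.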

    • g4cg54g54 an hour ago

      Sidetracking here, but does anyone have one that _actually_ works?

      In particular, I'd like to call Claude models (hosted in OpenAI schema by a reseller) through a proxy that exposes the Anthropic format to my `claude` CLI, but nothing seems to fully line things up (double-translated tool names, for example).

      The reseller is abacus.ai. I've tried BerriAI/litellm, musistudio/claude-code-router, ziozzang/claude2openai-proxy, 1rgs/claude-code-proxy, and fuergaosi233/claude-code-proxy.

  • d0100 an hour ago

    Does this UI work with Open Code?

  • dosinga 2 hours ago

    This is cool. Not sure it's the first Claude Code-style coding agent that runs against Ollama models, though. Goose, opencode, and others have been able to do that for a while, no?

  • mchiang 2 hours ago

    Hey, thanks for sharing. I had to go to the Twitter feed to find the GitHub link:

    https://github.com/21st-dev/1code