LLM Proxy for Agent Containers

(github.com)

3 points | by kalib_tweli 8 hours ago

1 comment

  • kalib_tweli 8 hours ago

    LLM proxy for containerized AI agents. The daemon proxies LLM API calls and remote MCP tool calls through a unix socket. The runtime drives the agent loop inside the container. Credentials never cross the socket boundary.
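    The credential-isolation idea described above can be sketched in a few lines: the container-side client only ever speaks plain JSON over the unix socket, while the host-side daemon holds the API key and attaches it before forwarding upstream. Everything below (socket path, key, message shape) is a hypothetical illustration, not the project's actual protocol.

    ```python
    import json
    import os
    import socket
    import threading

    SOCK_PATH = "/tmp/agent-proxy.sock"  # hypothetical socket path
    API_KEY = "sk-secret"  # lives only in the daemon process, never in the container

    def serve_one(srv):
        """Host-side daemon: accepts one JSON request on the unix socket,
        attaches credentials, and would forward the result to the LLM API."""
        conn, _ = srv.accept()
        req = json.loads(conn.recv(4096).decode())
        # Credential injection happens here, on the host side of the boundary.
        upstream = {"authorization": f"Bearer {API_KEY}", **req}
        # A real daemon would forward `upstream` to the provider; we echo a stub.
        conn.sendall(json.dumps({"ok": True, "model": upstream["model"]}).encode())
        conn.close()

    def agent_call(payload):
        """Container-side client: talks to the daemon, never sees API_KEY."""
        c = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        c.connect(SOCK_PATH)
        c.sendall(json.dumps(payload).encode())
        resp = json.loads(c.recv(4096).decode())
        c.close()
        return resp

    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)  # bind/listen before starting the thread to avoid a race
    srv.listen(1)
    t = threading.Thread(target=serve_one, args=(srv,))
    t.start()
    resp = agent_call({"model": "gpt-x", "messages": [{"role": "user", "content": "hi"}]})
    t.join()
    srv.close()
    print(resp)
    ```

    In a real deployment the socket file would be bind-mounted into the container, so the agent loop can reach the daemon without any network egress or credentials in its environment.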