3 points | by kalib_tweli 8 hours ago
1 comment
LLM proxy for containerized AI agents. The daemon proxies LLM API calls and remote MCP tool calls through a unix socket. The runtime drives the agent loop inside the container. Credentials never cross the socket boundary.
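A minimal sketch of the credential-isolation idea described above, assuming hypothetical names (the socket path, message shape, and `attach_credentials` helper are illustrative, not the project's actual API): the daemon side holds the API key and injects it only after a request has crossed the unix socket, so the agent runtime inside the container never handles the secret.

```python
import json
import os
import socket
import threading

# Hypothetical socket path; the real daemon's path will differ.
SOCKET_PATH = "/tmp/agent-proxy-demo.sock"


def attach_credentials(request: dict, api_key: str) -> dict:
    """Daemon-side step: inject the API key before forwarding upstream.
    The containerized agent never sees this value."""
    forwarded = dict(request)
    forwarded.setdefault("headers", {})["Authorization"] = f"Bearer {api_key}"
    return forwarded


def handle_one(srv: socket.socket, api_key: str) -> None:
    """Accept one JSON request over the unix socket and reply with the
    header *names* that would be forwarded (secret value never echoed)."""
    conn, _ = srv.accept()
    with conn:
        request = json.loads(conn.recv(65536).decode())
        forwarded = attach_credentials(request, api_key)
        # In a real daemon, the upstream LLM API call would happen here.
        reply = {"forwarded_headers": sorted(forwarded["headers"])}
        conn.sendall(json.dumps(reply).encode())


if __name__ == "__main__":
    if os.path.exists(SOCKET_PATH):
        os.unlink(SOCKET_PATH)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCKET_PATH)
    srv.listen(1)
    t = threading.Thread(target=handle_one, args=(srv, "sk-secret"))
    t.start()

    # Container-side runtime: sends a credential-free request.
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as cli:
        cli.connect(SOCKET_PATH)
        cli.sendall(json.dumps({"model": "some-model", "headers": {}}).encode())
        print(cli.recv(65536).decode())

    t.join()
    srv.close()
    os.unlink(SOCKET_PATH)
```

The container only ever holds the socket file descriptor; because the key is added on the daemon side of the boundary, compromising the agent process reveals no credentials.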