I’ve been obsessed with the "last mile" bottleneck in AI agents. Traditional web automation relies on CDP (Chrome DevTools Protocol) to serialize the DOM or take screenshots. This "capture-encode-transmit" loop is computationally ruinous and introduces latencies that often exceed the reasoning time of the underlying LLM.
I built Glazyr Viz (originally Neural Chromium) by forking Chromium to integrate the agent directly into the browser’s compositor subsystem (Viz).
Key Technicals:
Zero-Copy Vision: We establish a Shared Memory (SHM) segment between the Viz process and the agent using shm_open. This allows the agent to access raw frame buffers with sub-16ms latency—essentially syncing with the 60Hz refresh rate.
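To make the zero-copy idea concrete, here's a toy stand-in: the real project uses shm_open between the Viz process and the agent, and Python's multiprocessing.shared_memory wraps the same POSIX primitive. The segment name and frame size here are made up for illustration.

```python
# Toy sketch of the shared-memory handoff. The real pipeline maps the
# compositor's frame buffer; the name "glazyr_demo" and the 64x64 RGBA
# size are illustrative, not the project's actual values.
from multiprocessing import shared_memory

FRAME_BYTES = 4 * 64 * 64  # one 64x64 RGBA frame

# "Compositor" side: create the segment and write a frame into it.
producer = shared_memory.SharedMemory(create=True, size=FRAME_BYTES,
                                      name="glazyr_demo")
producer.buf[:FRAME_BYTES] = bytes([0xAB]) * FRAME_BYTES

# "Agent" side: attach to the same segment by name. No pixels travel
# over a socket; both mappings point at the same physical pages.
consumer = shared_memory.SharedMemory(name="glazyr_demo")
first_byte = consumer.buf[0]
print(hex(first_byte))  # the agent sees the compositor's write directly

consumer.close()
producer.close()
producer.unlink()
```

The payoff is that "reading a frame" becomes a pointer dereference rather than a serialize-encode-decode round trip, which is where the sub-16ms budget comes from.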
Hardening (ThinLTO/CFI): Security is usually a performance tax, but here we hit a "Performance Crossover": coupling Control Flow Integrity (CFI) with ThinLTO (LLVM 19) yielded a 40% speedup in JS execution and an 85.8% reduction in P99 latency jitter.
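For concreteness, this kind of build is toggled through GN args. The flag names below match upstream Chromium's conventions; the exact set Glazyr Viz ships is an assumption on my part.

```
# Hypothetical args.gn fragment (upstream Chromium exposes these toggles;
# the precise combination used here is an assumption):
use_thin_lto = true    # ThinLTO: cross-module optimization at link time
is_cfi = true          # Clang Control Flow Integrity
use_cfi_icall = true   # CFI checks on indirect calls
use_cfi_cast = true    # CFI checks on bad casts
```

The intuition behind the crossover: CFI needs whole-program visibility anyway, and ThinLTO provides it, so the devirtualization and cross-module inlining it unlocks can outweigh the cost of the CFI checks.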
Context Density: Instead of raw markdown, we deliver structured context via a vision.json schema. On our "Big Iron" (GCE n2-standard-8) cluster, we’re hitting 177 TPS.
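To give a flavor of what "structured context" means here, a single frame's payload looks something like the following. This is a simplified, hypothetical example; the field names are illustrative, not the actual schema.

```json
{
  "frame_id": 1024,
  "timestamp_ms": 1718000000000,
  "viewport": { "width": 1280, "height": 720 },
  "elements": [
    {
      "role": "button",
      "text": "Checkout",
      "bbox": [412, 96, 524, 132],
      "clickable": true
    }
  ]
}
```

The point is density: the model gets roles, geometry, and affordances directly instead of burning tokens re-deriving them from markdown or pixels.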
Economic Layer (x402): I wanted a way for machines to pay machines. We use the x402 (Payment Required) status on Base (USDC). It’s gasless via EIP-7702 delegation, so agents can pay-per-frame without holding ETH.
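The payment flow is easiest to see as code. Here's a hedged sketch of the client side, with fake request and signing functions standing in for the real transport and wallet layers; the X-PAYMENT header and helper names are assumptions for illustration, not the actual protocol surface.

```python
# Sketch of an HTTP 402 "pay-per-frame" handshake. request_fn and
# sign_payment are stand-ins for the real HTTP and wallet layers; the
# X-PAYMENT header name is an assumption.

def fetch_frame(request_fn, sign_payment):
    """Request a frame; if the server answers 402, attach a signed
    payment authorization and retry once."""
    status, body = request_fn(headers={})
    if status == 402:
        # body describes what the server wants (amount, asset, chain).
        # With EIP-7702-style delegation the agent signs an authorization
        # that a paymaster settles on Base, so the agent never holds ETH.
        payment = sign_payment(body)
        status, body = request_fn(headers={"X-PAYMENT": payment})
    return status, body

# Tiny fake server: demands payment on the first call, serves on the second.
def fake_request(headers):
    if "X-PAYMENT" in headers:
        return 200, b"<frame bytes>"
    return 402, {"amount": "0.001", "asset": "USDC", "chain": "base"}

status, body = fetch_frame(fake_request,
                           sign_payment=lambda req: "signed:" + req["asset"])
print(status)  # 200
```

The retry-on-402 shape is the whole trick: payment is just another header negotiation, so agents can meter consumption per frame without any wallet UX in the loop.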
Why this matters:
If you're building agents for high-frequency tasks (live trading, dynamic UI auditing, or real-time navigation), standard scrapers are too brittle and slow. Glazyr Viz treats the browser as a high-speed sensory organ rather than a slow external portal.
Try it: I’ve published the orchestration layer to NPM. You can spin up a local node and hook your agent into it immediately:
npx -y glazyrviz
I’d love to hear from anyone working on low-level rendering or agentic infrastructure. Happy to dive into the shm_open implementation or the ThinLTO pass details in the comments.
Actually, for the early beta testers here on HN, I’m upping the sponsorship to 500 free frames. That should give you plenty of headroom to stress-test the Zero-Copy pipeline on some heavier sites.
No ETH needed; just run the npx command and the Paymaster handles the rest on Base.
For anyone wanting the deep dive on the binary hardening and Viz subsystem patches I mentioned, I've published the full technical analysis here: https://glazyrviz.blogspot.com/2026/02/inside-zero-copy-engi....
Also, I’ve officially bumped the sponsorship pool to 500 free frames for everyone testing npx -y glazyrviz today.
This sounds really great. I'm going to try it.