Tooling like this is why I really want to build my own harness to replace Claude Code. I've been building a few different custom tools that would be nice as part of one single harness, so I don't have to tweak configurations across all my different environments, projects, and even OSes. It gets tiresome, and Claude even has separate "memories" on different devices, making the experience even more inconsistent.
I've actually had the same itch and decided to give it a go ... So far I'm one year into the project, have learned a ton, and highly recommend to anyone who'd listen: try writing your own harness. It can be fun, it can be intoxicating, it can also be boring and mundane. But you'll learn so much along the way, even if you thought you were already well versed.
Pi is very extensible, and could possibly serve as a good foundation to build on.
Is it Pi LLM you're referring to? I've heard "Pi" referenced twice now, and now I'm curious. I do have unused Pis, though not Raspberry Pi 5s...
https://github.com/badlogic/pi-mono/tree/main/packages/codin...
Since prompt caching won't work across different models, how is this approach better than dropping a PR for the other harnesses to review?
Sorry, I may be misunderstanding the question.
The way this works is that it stores workstreams and session state in a local SQLite DB, and links each ctx session to the exact local Claude Code and/or Codex raw session log it came from (also stored locally).
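The linkage described above can be sketched roughly as follows. This is a hypothetical illustration, assuming a simple two-table layout; the table names, columns, and log path are my own inventions, not the actual ctx schema.

```python
import sqlite3

# Illustrative sketch: workstreams plus sessions, where each session row
# points back at the raw harness log it came from. Names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE workstreams (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE sessions (
        id            INTEGER PRIMARY KEY,
        workstream_id INTEGER REFERENCES workstreams(id),
        harness       TEXT CHECK (harness IN ('claude-code', 'codex')),
        raw_log_path  TEXT NOT NULL  -- local raw session log this row links to
    )
""")
conn.execute("INSERT INTO workstreams (name) VALUES ('refactor-auth')")
conn.execute(
    "INSERT INTO sessions (workstream_id, harness, raw_log_path) VALUES (?, ?, ?)",
    (1, "claude-code", "/tmp/example-session.jsonl"),  # illustrative path
)

# Resuming a workstream means looking up which raw log to reopen.
row = conn.execute(
    "SELECT harness, raw_log_path FROM sessions WHERE workstream_id = 1"
).fetchone()
print(row)
```

The point of the design is that the SQLite DB only stores the mapping; the full conversation history stays in the harness's own log files.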
What do you mean by prompt caching?
Prompt caching is done on the provider side. If you send two requests to a provider in short succession and the beginning of your second request is the same as your first (for example, because your second request is the continuation of an ongoing chat), the repeated tokens are much less expensive the second time.
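As a concrete illustration of the provider-side mechanism: Anthropic's Messages API exposes this via a `cache_control` marker on a content block, which flags the request prefix up to that point as cacheable. The sketch below only builds the request payload (no network call); the model name and prompt text are placeholders, and the pricing benefit is as described above, not shown here.

```python
# A long shared prefix (e.g. a system prompt reused across turns) is what
# benefits from caching: repeated prefix tokens cost far less on reuse.
system_prompt = "You are a helpful coding assistant. " * 50

request = {
    "model": "claude-sonnet-placeholder",  # placeholder model name
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": system_prompt,
            # Marks everything up to and including this block as cacheable;
            # a second request with the same prefix hits the cache.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    "messages": [{"role": "user", "content": "Continue our chat from before."}],
}
print(request["system"][0]["cache_control"])
```

Because the cache is keyed on the exact token prefix at the provider, it survives across client tools but not across different models or providers.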
Obviously, your tool does not provide this. But I think GP is undervaluing the UX advantages of having your conversation history.
Yes that's it. I actually just ask codex/claude code to look up the session id when I want to resume sessions cross harness, it's just jsonl files locally so it can access the full conversation history when needed.
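Since the session logs are plain JSONL on disk, reading one back is trivial. A minimal sketch, assuming one JSON record per line; the record shape and any on-disk location (e.g. Claude Code's per-project logs under `~/.claude/projects/`) are assumptions, not a documented format.

```python
import json
from pathlib import Path

def load_session(path: Path) -> list[dict]:
    """Parse one JSONL session log into a list of records (one per line)."""
    records = []
    with path.open() as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                records.append(json.loads(line))
    return records

# Usage (illustrative path):
# records = load_session(Path("~/.claude/projects/myproj/abc123.jsonl").expanduser())
```

This is all a cross-harness resume really needs: find the right file, parse it, and hand the history to whichever tool is picking the session up.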
Have you considered making it possible to share a stream/context? As an export/import function.
I wrote a tool for myself to copy (and archive) the claude/codex conversations: github.com/rkuska/carn
Thanks
That's interesting. I hadn't considered it at this point, but it sounds potentially useful.