2 comments

  • ajbd 5 hours ago

    This is interesting, and is solving a real problem. The contradiction resolution approach makes a lot of sense. It feels like a step towards letting models decide when they're out of their depth.

    I do have a couple of questions though:

    1. How does consolidation handle partial updates vs. full contradictions? (e.g., "budget is $50k" -> "$50k but can flex to $60k")
    2. What's the overhead of the nightly consolidation pass at scale?
    3. Does the system ever surface uncertainty when two facts are recent but contradictory, or does it always prefer the newer one?
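    To make question 3 concrete, here's a sketch of the naive last-write-wins behavior I'd worry about, with the alternative I'm asking about: flag uncertainty when two contradictory facts arrive too close together for recency to be a trustworthy signal. (This is my guess at the shape of the problem, not something from the post; `Fact`, `resolve`, and the one-hour window are all made up.)

    ```python
    # Hypothetical sketch: last-write-wins resolution that falls back to
    # flagging uncertainty when two contradictory facts are close in time.
    from dataclasses import dataclass

    @dataclass
    class Fact:
        key: str          # e.g. "budget"
        value: str        # e.g. "$50k"
        timestamp: float  # seconds since epoch

    def resolve(a: Fact, b: Fact, window: float = 3600.0):
        """Return (winner, uncertain). If the two facts landed within
        `window` seconds of each other, recency is a weak signal, so
        surface the contradiction instead of silently picking one."""
        newer, older = (a, b) if a.timestamp >= b.timestamp else (b, a)
        uncertain = (newer.timestamp - older.timestamp) < window
        return newer, uncertain
    ```

    With something like this, `resolve(Fact("budget", "$50k", 0.0), Fact("budget", "$60k", 500.0))` would pick the newer fact but also report that the pick is uncertain, which the model could then surface to the user.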

    The 99.2% -> 49.2% gap seems dramatic, so I'm interested to see how other memory systems perform when they submit to the benchmark.

  • Divergence42 11 hours ago

    [dead]