29 comments

  • williamstein 8 hours ago

    Title says "open source", but the Business Source License (BSL) is not an Open Source Initiative (OSI) approved open-source license.

    • taylorsatula 8 hours ago

      Fixed! The BSL file (to my understanding) was a copy of the license, and it's a HashiCorp document, so it had their title on it.

      However, someone earlier today put me onto the concept of AGPL licenses, so I changed MIRA over to AGPL. It still aligns with my overall intent of protecting my significant effort from someone coming in and Flappy Bird-ing it, while still making it freely available to anyone who wants to access it, modify it, anything.

    • api 8 hours ago

      The correct term for things like BSL is “source available.”

      • CamperBob2 8 hours ago

        The "correct term" for things like BSL is whatever they want to call it, as long as no trademarks are being infringed.

    • shevy-java 7 hours ago

      I see this used more and more. It seems companies want to fake stuff now, aka claiming to be open source when they are not.

      DHH also claims he is super open source when in reality he already soul-sent to the big tech bros:

      https://world.hey.com/dhh/the-o-saasy-license-336c5c8f

      We also had this recently with Arduino. I don't understand why companies try to go that way. To me it is not an open source licence - it is a closed source business licence, just with a different name.

      • taylorsatula 6 hours ago

        (As I said above I changed to an AGPL earlier today but I'll speak to my BSL logic)

        I liked BSL because the code ~was~ proprietary for a time, so someone couldn't duplicate the software I've worked so hard on, paywall it, and put me out of business. I'm a one-man development operation and a strong gust of wind could blow me over. I also liked that BSL naturally decayed into a permissive open source license after a timeout: I'd get a head start, but users could still use and modify it from day one as long as they didn't charge money for it.

        • nawtagain 5 hours ago

          Totally fair - but just call it Source Available then.

          Open Source has a specific definition and this license does not conform to that definition.

          Stating it is open source creates a bait and switch effect with people who understand this definition, get excited, then realize this project is not actually open source.

          • eadwu 4 hours ago

            "Open Source has a specific definition and this license does not conform to that definition."

            To be fair, this wouldn't be an issue if Open Source stuck with "Debian Free Software". If you really want to call it a bait and switch, open source did it first.

          • taylorsatula 4 hours ago

            That’s fair. It’s OSI now but I get what you’re saying broadly.

      • skeledrew 5 hours ago

        > already soul-sent to the big tech bros

        I'm not seeing the justification for this comment. If anything that license, like the BSL, is aimed at keeping the small guy who worked on X in business so they can profit from their work (always need to put food on the table) while also sharing its innards with the world.

        • aeon_ai 4 hours ago

          Same.

          If you’re able to self host and run the tool for any use, it’s effectively a free, extensible, modifiable software solution.

          Copyleft licenses are as restrictive as the license DHH put out with Fizzy. I'm an Apache 2.0 or MIT licensing OSS advocate myself, but it's difficult to argue that it's worse than or equal to a fully closed SaaS solution.

          It’s not even remotely close to one of these bullshit “ee” OSS licenses

  • kgeist 7 hours ago

    I tried making something similar a while ago, and the main problem was that long-term memory makes it easy to move the AI into a bad state where it overfixates on something (context poisoning), or decides to refuse talking to me completely. So in the end, I added a command that wipes out all memory, and ended up using it all the time.

    Maybe I was doing it wrong. The question is: how do you prevent the AI from falling into a corrupt state from which it cannot get out?

    • taylorsatula 6 hours ago

      I use a two-step generation process which avoids both memory explosion in the context window and the one-turn-behind problem.

      When a user sends a message I: generate a vector of the user message -> pull in semantically similar memories -> filter and rank them -> send an API call with the memories from the last turn that were 'pinned' plus the top 10 memories just surfaced. That first API call's job is to intelligently pick the actually worthwhile memories and 'pin' them until the next turn -> then do the main LLM call with an up-to-date, thinned list of memories.
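      A toy sketch of that two-step flow (embed() below is a bag-of-words stand-in for a real embedding model, and analysis_pass() stands in for the first API call; neither is MIRA's actual code):

```python
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words "embedding"; a real system would use a vector model.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(memories, message, pinned, top_k=10):
    """Vectorize the message, rank stored memories by similarity, and
    combine last turn's pinned memories with the top fresh hits."""
    q = embed(message)
    ranked = sorted(memories, key=lambda m: cosine(q, embed(m)), reverse=True)
    return pinned + [m for m in ranked[:top_k] if m not in pinned]

def analysis_pass(candidates, message, keep=3):
    """Stand-in for the first API call, whose job is to pick the memories
    actually worth pinning until the next turn."""
    words = set(message.lower().split())
    return [m for m in candidates if words & set(m.lower().split())][:keep]

memories = [
    "User is going to Albania in January",
    "User prefers dark roast coffee",
    "User runs a one-man development operation",
]
message = "When is my Albania trip happening"
candidates = retrieve(memories, message, pinned=[])
pinned = analysis_pass(candidates, message)
# `pinned` is the thinned memory list the main LLM call would then receive.
```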

      Reading the prompt itself that the analysis model carries is probably easier than listening to my abstract description: https://github.com/taylorsatula/mira-OSS/blob/main/config/pr...

      I can't say with confidence that this is ~why~ I don't run into the model getting super flustered and crashing out, though I'm familiar with what you're talking about.

  • JonChesterfield 9 hours ago

    A deleted comment pointed out that LICENSE.pdf is a screenshot from HashiCorp. That's pretty weird; I raised an issue for it.

    • taylorsatula 8 hours ago

      Issue closed! Thanks! I changed the license from BSL to AGPL.

  • lukasb 2 hours ago

    I'm playing around with it, and it's very cool! One issue is that fingerprint expansion doesn't always work. E.g., I have a memory "Going to Albania in January for a month-long stay in Tirana", and asking "Do I need a visa for my trip?" didn't turn up anything, even using the expansion "visa requirements trip destination travel documents..."

    What would you think about adding another column that is used for matching that is a superset of the actual memory, basically reusing the fingerprint expansion prompt?
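    A rough sketch of that suggestion: store_memory() writes an extra match_text column holding the memory plus its expansion, and search matches against it. The hand-written expansion strings below stand in for what the fingerprint expansion prompt would generate; none of these names are MIRA's actual API.

```python
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words "embedding"; a real system would use a vector model.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def store_memory(db, text, expansion):
    # Keep the original memory, but match against a superset column that
    # includes the expansion generated at write time.
    db.append({"text": text, "match_text": text + " " + expansion})

def search(db, query):
    # Rank rows by similarity against match_text, return the raw memory.
    q = embed(query)
    return max(db, key=lambda row: cosine(q, embed(row["match_text"])))["text"]

db = []
store_memory(db, "Going to Albania in January for a month-long stay in Tirana",
             "visa requirements travel documents passport trip abroad")
store_memory(db, "Prefers dark roast coffee",
             "beverage caffeine morning drink")
result = search(db, "Do I need a visa for my trip")
```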

  • Avicebron 3 hours ago

    Decay based memory scoring is a cool idea (if I'm understanding it correctly). Did you take it as an interpretation of Hebbian plasticity?
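    For what it's worth, one common shape for decay-based scoring (not necessarily what MIRA does; the half-life and boost values below are made up) is exponential time decay plus Hebbian-style reinforcement on retrieval:

```python
def memory_score(strength, hours_since_access, half_life_hours=72.0):
    # Exponential decay: the score halves every half_life_hours unless
    # the memory is accessed again.
    return strength * 0.5 ** (hours_since_access / half_life_hours)

def reinforce(strength, boost=1.0, cap=10.0):
    # Hebbian-flavored reinforcement: each retrieval strengthens the
    # memory, so frequently used memories decay from a higher baseline.
    return min(strength + boost, cap)

fresh = memory_score(1.0, 0)    # just accessed
stale = memory_score(1.0, 72)   # one half-life later: half the score
```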

  • skeledrew 5 hours ago

    I would love to see how this shakes out using purely free/open models (for example via OpenRouter).

  • TheCraiggers 3 hours ago

    I'm curious why some features are disabled if you're using local models. Sorry if this is a dumb question.

    • reilly3000 3 hours ago

      Probably for context conservation? I’m happy to get 80k with a decent model too on my 4090.

  • chaosharmonic 5 hours ago

    > This is my TempleOS.

    This is easily one of my favorite descriptive details I've ever seen in a README.

    • taylorsatula 4 hours ago

      :D I’d also like to thank David Hahn for obsessively (and arguably compulsively) learning about a topic way out of his depth and then manifesting it till the cops took him away.

  • oidar 9 hours ago

    If you are on a Claude Pro/Max plan, can you use it by signing into your account? Or is it API only for now?

    • taylorsatula 7 hours ago

      No, I wish. That would be really cool functionality, but to my knowledge it's not possible. BUT I could be wrong, and I'd be more than happy to implement that support if someone gives me the information needed to integrate it.

  • z3ratul163071 an hour ago

    hope this is not python.. hope this is not python.. ..opens repo ..python ¯\_(ツ)_/¯

  • maxwell-neumann 9 hours ago

    This is really fascinating. I’d love to see a demo.

    • taylorsatula 8 hours ago

      There is a live hosted instance at miraos.org where you can make an account and chat with MIRA through a web frontend. For now, during this phase of people discovering it, I'm eating the token costs, so it's 100% free to access and chat with.

  • idiotsecant 9 hours ago

    I looked at your web hosted version. It seemed really easy to get it to hang when its search tools raise an exception.

    • taylorsatula 8 hours ago

      If it throws an actual error please let me know by lodging it as an issue in the GitHub repo and I'll modify the code. I'm hanging around the house tonight to fix bugs people uncover.

      EDIT: Thanks for the feedback! I was able to pinpoint it to a change I made earlier today to allow simultaneous OAI endpoints alongside the native Claude support. When on a model via a 3rd-party provider, certain parts of a tool call were being stripped. Not any more! Pushed an update.