Peerweb: Decentralized website hosting via WebTorrent

(peerweb.lol)

377 points | by dtj1123 2 months ago

136 comments

  • xd1936 2 months ago

    Fun! I wish WebTorrent had caught on more. I've always thought it had a worthy place in the modern P2P conversation.

    In 2020, I messed around with a PoC for what hosting and distributing Linux distros could look like using WebTorrent[1]. The protocol project as a whole has a lovely and brilliant design but has stayed mostly stagnant in recent years. There are only a couple of WebRTC-enabled torrent trackers that have remained active and stable.

    1. https://github.com/leoherzog/LinuxExchange

    • r14c 2 months ago

      I think the issue has generally been that WebTorrent doesn't work enough like the real thing to do its job properly. There are huge BitTorrent-based streaming media networks out there - illicit, sure, but it's a proven technology. If browsers had real torrent clients, we would be having a very different conversation imo

      I don't remember the WebTorrent issue numbers off the top of my head, but there are a number of long-standing issues that seem blocked on WebRTC limitations.

      • embedding-shape 2 months ago

        I think we still have the same blocker as we had back when WebTorrent first appeared; browsers cannot be real torrent clients and open connections without some initial routing for the discovery, and they cannot open bi-directional unordered connections between two browsers.

        If we could, say, do peer discovery via Bluetooth and open sockets directly from a browser page, we could in theory have local-first websites running in the browser that do P2P connections straight between browsers.

        • miki123211 2 months ago

          Could you run some kind of hybrid DHT where part of it was Webrtc and part was plain HTTP(S) / WebSocket?

          There are some nodes (desktop clients with UPNP, dedicated servers) that can accept browser connections. Those nodes could then help you exchange offers/answers to give you connections with the Webrtc-only ones, and those could facilitate offer/answer exchanges with their peers in turn.

          It'd be dog-slow compared to the single-udp-packet-in, single-udp-packet-out philosophy of traditional mainline DHT, but I don't see why the idea couldn't work in principle.

          I think a much bigger problem is content discovery and update distribution. You can't really do decentralized search because it'd very quickly get sybil-attacked to death. You'd always need some kind of centralized, trusted content index, but not necessarily one hosted on a centralized server. If you could have a reliable way to go from a pubkey to the latest hash signed by that pubkey in a decentralized way, + E.G. a Sqlite extension to get pages on-demand via WebTorrent, that would get you a long way towards solving the problem.

          • namibj 2 months ago

            What you ask for exists; it updates through a version counter. It just works on mainline DHT, btw.

        • Seattle3503 2 months ago

          If a tracker could be connected to via WebRTC and had additional STUN functionality, would that suffice? Are there additional WebRTC limitations?

          > they cannot open bi-directional unordered connections between two browsers.

          Last I checked, DataChannels were bidirectional

          • embedding-shape 2 months ago

            Yes, but it's STUN that sucks. If the software ships with a public (on the internet) relay/STUN server for connecting the two clients, it won't work if either isn't connected to the internet, even though the clients could still be on the same network and reach each other.

            • westurner 2 months ago

              /? STUN: https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...

              There is a Native Sockets spec draft that only Chrome implements;

              "Direct Sockets API": https://developer.chrome.com/docs/iwa/direct-sockets :

              > The Direct Sockets API addresses this limitation by enabling Isolated Web Apps (IWAs) to establish direct TCP and UDP connections without a relay server. With IWAs, thanks to additional security measures—such as strict Content Security Policy (CSP) and cross-origin isolation— this API can be safely exposed.

              Though there's UPNP XML, it lacks auth for port forwarding permissions. There's also IPV6.

              Similar: "Breaking the QR Limit: The Discovery of a Serverless WebRTC Protocol – Magarcia" https://news.ycombinator.com/item?id=46829296 re: Quick Share, Wi-Fi Direct, Wi-Fi Aware, BLE Beacons, BSSIDs and the Geolocation API

            • jychang 2 months ago

              That seems like a nonissue for the purposes of this discussion though, in terms of user uptake. Tiktok and Facebook and other websites aren't exactly focused on serving to people on the same network.

      • 1vuio0pswjnm7 2 months ago

        "If browsers had real torrent clients we would be having a very different conversation imo"

        The elinks text-only browser has a "real" torrent client

      • khimaros 2 months ago
    • cranberryturkey 2 months ago
      • FireInsight 2 months ago

        Can't seem to find any mentions of this online from over a week ago, not much commentary either, mostly stuff that smells like advertising / astroturfing. Hmm...

      • palata 2 months ago

        I had never heard of this! Pretty cool!

      • xd1936 2 months ago

        Oh wow

    • bluedino 2 months ago

      Was there ever a web-based Jigdo?

      • xd1936 2 months ago

        This was a fun rabbit hole. Thanks for educating me!

  • woctordho 2 months ago

    What a pity that although WebTorrent support was merged into the libtorrent master branch years ago, it's not merged into the stable branch yet, and therefore doesn't work out of the box in clients like qBittorrent.

    • palata 2 months ago

      If it was, would it mean that qBittorrent would share with web clients by default? My understanding was that it's not the same protocol, so I'm guessing that a client like qBittorrent would have to choose to "bridge" between both protocols, right?

    • nektro 2 months ago

      this is the real reason it hasn't caught on

  • mcjiggerlog 2 months ago

    This is cool - I actually worked on something similar way back in the day: https://github.com/tom-james-watson/wtp-ext. It avoided the need to have any kind of intermediary website entirely.

    The cool thing was it worked at the browser level using experimental libdweb support, though that has unfortunately since been abandoned. You could literally load URLs like wtp://tomjwatson.com/blog directly in your browser.

  • kamranjon 2 months ago

    I think one of the values of (what appears to be) AI generated projects like this is that they can make me aware of the underlying technology that I might not have heard about - for example WebTorrent: https://webtorrent.io/faq

    Pretty cool! Not sure what this offers over WebTorrent itself, but I was happy to learn about its existence.

  • DJBunnies 2 months ago

    Every time I try these they never work, including this one.

    I’m not sure what the value prop is over just using a torrent client?

    Maybe when they’re less buggy they’ll become a thing.

    • Sephr 2 months ago

      I'm planning to eventually launch an open source platform with the same name (peerweb.com) that I hope will be vastly more usable, with a distributed anti-abuse protocol, automatic asset distribution prioritization for highly-requested files, streaming UGC APIs (e.g. start uploading a video and immediately get a working sharable link before upload completion), proper integration with site URLs (no ugly uuids etc. visible or required in your site URLs), and adjustable latency thresholds to failover to normal CDNs whenever peers take too long to respond.

      I put the project on hiatus years ago but I'm starting it back up soon! My project is not vibe coded and has thus far been manually architected with a deep consideration for both user and site owner expectations in the web ecosystem.

      • coxmi 2 months ago

        This sounds really interesting, do you have any more info or a repo to follow?

        • Sephr 2 months ago

          I'll post about the repo and more on my blog once I'm ready.

    • palata 2 months ago

      Well this is supposed to load a website in the browser like a "normal" website (doesn't work for me, stuck on "Connecting to peers...").

      Just using a torrent client means that you have to download the website locally with a torrent client, and then open it in your browser. Most people wouldn't do that.

    • bawolff 2 months ago

      If it actually worked i could certainly see the value prop of not making users download a separate program. Generally downloading a separate program is a pretty big ask.

  • misir 2 months ago

    I wonder if these colors are a kind of watermark hardcoded as system instructions. Almost all slopware made using Claude has the same color palette. So much for a random token generator being this consistent

    • orbital-decay 2 months ago

      https://en.wikipedia.org/wiki/Mode_collapse

      Ask any modern (post-GPT-2) LLM about a random color/name/city repeatedly a few dozen times, and you'll see it's not that random. You can influence this with a prompt, obviously, but if the prompt stays the same each time, the output is always very similar despite the existence of thousands of valid alternatives. Which is the case for any vibecoded thing that doesn't specify the color palette, in particular.

      This effect is largely responsible for slop (as in annoying stereotypes). It's fixable in principle, but there's fairly little research, and I don't see the big AI shops caring enough.

    • 2 months ago
      [deleted]
    • karanSF 2 months ago

      Emojis on every line are an AI tell. The times I do use AI (shhhh...) I always remove them and tweak the language a bit.

      • netule 2 months ago

        Before LLMs became big, I used emojis in my PRs and merge requests for fun and to break up the monotony a bit. Now I avoid them, lest I be accused of being a bot.

      • rudhdb773b 2 months ago

        Isn't it mostly ChatGPT that does that?

        Grok almost never uses emojis.

    • IhateAI 2 months ago

      Yep, and I refuse to use sites that look like this. Lovable-built frontend/landing pages have a similar feel. Instant loss of trust and desire to try it out.

      • bawolff 2 months ago

        It's interesting - AI has a certain style. You can see it in pictures and even text content. It does instantly get my guard up.

      • j45 2 months ago

        That's interesting - do you think that's because it's familiar to you?

        Would it be the case for folks who have no idea what Lovable is?

        Familiar UI is similar to what Tailwind or Bootstrap offers; do they do something different to keep it fresh?

        Average internet users/consumers are likely used to the default Shopify checkout.

        • IhateAI 2 months ago

          It's probably more of a me "problem". But I'm sure there are plenty of others who share my sentiment. It doesn't really have anything to do with it being familiar - familiar can be good - but what I'm talking about is a familiar ugliness and lack of intention.

          The Stripe or Shopify checkout is familiar, but it only became familiar because it was well designed and people wanted to keep using it.

          Also, when it's obvious someone used an LLM, it bleeds into my overall opinion of the product, whether the product is good or not. I assume less effort was put into the project, which is probably a fair assumption.

  • bawolff 2 months ago

    > Enhanced security with DOMPurify integration!

    > XSS Protection - All HTML sanitized with DOMPurify

    > Malicious Code Removal - Dangerous tags and attributes filtered

    > Sandboxed Execution - Sites run in isolated iframe environment

    I don't think that makes much sense. You probably just want the iframe sandbox, rather than removing all JS. Or, ideally, put the torrent hash in the subdomain to use the same-origin policy.
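
    The subdomain idea works because the same-origin policy keys on scheme + host + port, so giving each torrent its own host isolates its storage and scripts from every other hosted site. A sketch (the gateway domain here is hypothetical):

```javascript
// Hypothetical gateway: each site is served from <infohash>.peerweb.example,
// so two hosted sites can never touch each other's cookies, storage, or DOM.
function siteOrigin(infoHash) {
  return new URL(`https://${infoHash}.peerweb.example/`).origin;
}

// Different infohashes yield different origins, hence full isolation,
// without needing DOMPurify to strip the site's own JavaScript.
const a = siteOrigin('1e14b1ba');
const b = siteOrigin('90c020bd');
```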

  • sroerick 2 months ago

    This is pretty interesting!

    I think serving video is a particularly interesting use of WebTorrent. It would be good if you could add this as a front end to make sites basically DDoS-proof: you host a regular site, but with a JS front end that serves the site P2P as traffic grows.

    • NewsaHackO 2 months ago

      I think it is very difficult (and dangerous to the host) to serve user-uploaded videos at scale, particularly from a moderation standpoint. The problem is even worse if everyone is anonymous. There is a reason YouTube has such a monopoly on personal video hosting. Maybe developments in AI moderation will make it more palatable in the future.

      • t-3 2 months ago

        The "host" is the user in this case. Every user that watches the video, shares the video. Given that discovery doesn't appear to be a part of this platform, any links would undoubtedly be shared "peer-to-peer" as well, so if you aren't looking at illegal things and don't have friends sending you illegal things to watch, it's perfectly safe.

        • lgats 2 months ago

          webtorrent!

      • sroerick 2 months ago

        What I'm suggesting is more in the context of self-hosting - a JS wrapper which would make it easy to host a video with plain HTML while preventing bandwidth issues.

        • zkhrv 2 months ago

          Hey, that's pretty much the project I've been thinking of doing for a while!

          I really dislike the monopoly YouTube has on online video, but other options or self-hosting can become really expensive due to bandwidth (especially if your video suddenly goes viral). I think P2P (over WebRTC, for browser compatibility) has potential for creating a solution.

          Roughly:

          * You have a Website with some <video>s you want to share hosted somewhere—doesn't matter if it's dynamic, static, hosted on a CDN or a VPS, as long as it imports a few scripts and embellishes the <video> tag with a few details.

          * Base case: you want to publish a video, you host the video. That means you host a server at home that is running the hosting software (maybe WebTorrent based, maybe something custom). As high-speed fiber Internet becomes more common, hosting video from a home network becomes more feasible (unless ISPs decide to cock-block it).

          * A signaling server establishes a P2P connection (WebRTC datachannels) between a visitor of your Website and your video-hosting home server.

          * If you have a fast enough Internet connection (100+ Mbps) and low traffic, I don't see why the base case shouldn't work (other than network connectivity problems due to complicated NATs and such). If you have a surge of simultaneous traffic, those that came earlier can offload pressure on the home server by seeding the video chunks they have already downloaded. Theoretically, infinite scalability without bandwidth or hardware bottlenecks (but likely coordination woes in practice).
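
          The offloading step in the last bullet amounts to a source-selection policy: prefer peers that already hold a chunk, and fall back to the origin home server only when no peer has it. A toy sketch of that policy (all names and the data shape are illustrative):

```javascript
// Each source advertises which piece indices it holds.
// Prefer peers (cheap for the origin) over the origin home server.
function pickSource(pieceIndex, origin, peers) {
  const peer = peers.find((p) => p.pieces.has(pieceIndex));
  return peer || origin;
}

// The home server holds everything; an early visitor seeds what it has so far.
const origin = { name: 'home-server', pieces: new Set([0, 1, 2, 3]) };
const earlyVisitor = { name: 'peer-1', pieces: new Set([0, 1]) };
```

          Under a traffic surge, most requests hit pieces that earlier visitors already cached, so the home server's upstream link only carries each piece a handful of times.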

          But there's more!

          Say there's another person on the Web hosting their videos in the same way. If I like their video, I can re-host it on my own home server and let their signaling server know about it. Now there are two "persistent" video servers hosting the video that a visitor can download from. If I trust this person, I can choose to automatically re-host all their past and future videos.

          Moderation isn't a problem, because you explicitly choose which videos to re-host, or because you re-host (future) videos of people you trust.

          The more people re-host videos, the better their availability, download speed, and latency if hosts are geographically distributed.

          Further ideas of pooling signaling servers and home servers into networks to enable other possible niceties (though likely with a more substantial moderation burden) ...

          I've tried a really basic proof of concept of the base case (across a few countries on a mobile network!), and it worked!

          Currently, I'm looking to talk with anyone who is interested in any of this :)

    • stanac 2 months ago

      There is PeerTube for video content.

      • sroerick 2 months ago

        I like Peertube a lot, and I didn't realize until just now that they had a form of P2P distributed distribution which uses WebRTC. But it would be great to be able to do that with a static site, without deploying a whole framework. Just a simple JS wrapper which could sit on top of a <video> element would be amazing

  • ajnavarro 2 months ago

    I built something similar a while back: Distribyted Gate: it turns any magnet link into a browsable webpage.

    The key difference is the approach: it uses a Service Worker as an embedded HTTP server in the browser. This means files are loaded on-demand rather than requiring full downloads upfront. The SW intercepts fetch requests and streams chunks directly from the torrent swarm.
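
    On-demand loading of this kind rests on simple arithmetic: an HTTP request's byte range maps to a piece index and an offset within that piece, so the Service Worker only has to fetch the pieces that cover the request. Roughly (an illustrative sketch, not Distribyted's actual code):

```javascript
// Map an inclusive byte range [start, end] onto fixed-length torrent pieces.
function piecesForRange(start, end, pieceLength) {
  const first = Math.floor(start / pieceLength);
  const last = Math.floor(end / pieceLength);
  const pieces = [];
  for (let i = first; i <= last; i++) pieces.push(i);
  // offsetInFirst: where the requested bytes begin inside the first piece.
  return { pieces, offsetInFirst: start % pieceLength };
}
```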

    Live demos using some PeerWeb demo sites:

    - Chess: https://gate.distribyted.com/?tid=1e14b1ba7fcd03e5f165d53ed8...

    - Functionality test page: https://gate.distribyted.com/?tid=90c020bd252639622a14895a0f...

    Code: https://github.com/distribyted/gate

    Caveat: This is a proof of concept, so stability varies and it works best on Chromium-based browsers.

  • logicallee 2 months ago

    I tried this; the "Functionality test page" is stuck on "Loading peer web site... connecting to peers". I can't load any website from this.

    https://imgur.com/gallery/loaidng-peerweb-site-uICLGhK

    • davidcollantes 2 months ago

      Yes, none work for me. They either don’t have peers, or the few ones are on a very slow network.

  • littlecranky67 2 months ago

    Cool. Some people complained about broken demos, I uploaded the mdwiki.info [1] website unaltered and seems to work fine [0]. MDwiki is a single .html file that fetches custom markdown via ajax relative to the html file and renders it via Javascript.

    [0]: https://peerweb.lol/?orc=b549f37bb4519d1abd2952483610b8078e6...

    [1]: https://dynalon.github.io/mdwiki/

    • Timwi 2 months ago

      Why is it called MDwiki? It's clearly not a wiki.

      • littlecranky67 2 months ago

        The idea is to host it on github, and people send changes to the content via pull requests (vs. editing like in wikipedia). There is no backend, just plain files.

        • Timwi 2 months ago

          I see. I would personally have counted GitHub as a backend.

      • jmercouris 2 months ago

        Sure, in a sense, but “wiki” actually just means “quick”.

  • SLWW 2 months ago

    I can't imagine that Peerweb has much in the way of stopping certain types of material from being uploaded.

    • 2 months ago
      [deleted]
    • j45 2 months ago

      Smaller sites likely have a smaller footprint

    • estimator7292 2 months ago

      [flagged]

      • ericyd 2 months ago

        This response feels disproportionate to the comment's content

        • rainonmoon 2 months ago

          And also just… misguided? I don’t particularly think of neo-Nazis when I think of people who advocate against CSAM.

          • SLWW 2 months ago

            We all know that CSAM is one of the first things that gets uploaded to these sorts of platforms.

            If advocating against CSAM = fascism, then I'll be the first to say that I'm a Nazi fascist. o7

            • tombert 2 months ago

              In high school, an acquaintance of mine made the website "e-imagesite.com" [1]. It was a very easy-to-use image uploading site (and honestly less irritating than ImageShack and predated imgur). It was just being hosted on HostGator, I believe, and written in PHP and used jQuery.

              I believe he had to eventually shut it down because people kept uploading horrifying stuff to it, and it was never even that popular. Child porn and bestiality were constantly being uploaded and I don't think he liked having to constantly report stuff to the FBI.

              After building a proper comment section for my blog (including tripcodes!), I've thought about making my own "chan" site, since I think that could be fun, but I am completely terrified of people uploading horrible stuff that I would be forced to sift through and moderate pretty frequently. User submissions open up a huge legal can of worms and I am not sure that's a path that I'm willing to commit myself going down.

              When there's strong anonymity, I suspect that this problem could be even worse.

              It's a little depressing, because decentralized and distributed computing is one of the most interesting parts of computer science to me, but it feels like whenever I mention anything about it, people immediately assume piracy or illicit material.

              [1] https://web.archive.org/web/20090313063155/http://www.e-imag...

              • 2 months ago
                [deleted]
            • rainonmoon 2 months ago

              Yeah, I’m fully in support of a decentralised web but the internet is old enough now that being naive about this stuff has become equivalent to being maliciously incompetent. Without designing for things like community or self-governance and moderation, you’re designing for trouble. Thinking about ways to healthily cultivate a peer-to-peer web doesn’t make someone a Nazi, it makes them a responsible member of a community.

    • b00ty4breakfast 2 months ago

      you can't stop someone from verbally describing certain objectionable material, therefore we should regulate the medium thru which sound travels and suck up all the oxygen on the planet. it's the only way to save the children

      • 2 months ago
        [deleted]
      • palata 2 months ago

        You're so right! I had never thought of that! We should remove all moderation everywhere, everything should be legal everywhere all the time! /s

        • SLWW 2 months ago

          I wonder who censored my other response to another commenter (I said that if being against hosting CSA means I'm a "n4z1", then I'm OK with that derogatory label).

          Though the irony of being censored while talking about how CSA should be censored is very funny to me.

          These pro-speech absolutists (who are usually libertarian) miss that the NAP is clearly violated when any abuse is hosted or disseminated. Additionally, while speech should never be censored, everything must be permissible if all media is speech. You can have a society that censors multimedia but not the caption/description attached; other allowances (for other types of "speech") always lead to the most degenerate people having a voice while everyone else is punished for calling them out.

        • 2 months ago
          [deleted]
  • gnarbarian 2 months ago

    love this. I've been working on something similar for months now

    https://metaversejs.github.io/peercompute/

    it's a gpgpu decentralized heterogeneous hpc p2p compute platform that runs in the browser

  • turtleyacht 3 months ago
    • dang 2 months ago

      Thanks! We'll put that link in the toptext.

  • kkfx 2 months ago

    In the past, ZeroNet was performant enough to realistically share websites, but it's abandoned (ZeroNet Conservancy exists, but no active peers seem to exist). This allows a client to use a website without installing anything, which is nice, but how to get things visible initially is, well... a human challenge...

  • j45 2 months ago

    In its own reimagined way from what’s possible in 2026, this could kick off a new kind of geocities.

  • BrouteMinou 2 months ago

    Nice, I clicked on the first demo, and I got stuck at connecting with peers.

    I like the idea though.

  • Omodaka9375 2 months ago

    Hi, Omodaka here - thanks for checking out PeerWeb. I forgot to turn on the client serving the demos - should be better now ;)

    This is meant to work with the PeerWeb App, a more secure, stripped-down desktop torrent client that you can use to share your websites and host them through peerweb.lol. I still haven't released the desktop client, but I might.

    The point of this is for everyone to host their content without needing servers, and it's a great learning experience. Security is a very big caveat here, so this is in no way the final, secure version.

  • dana321 2 months ago

    None of the demo sites work for me.

    Probably needs more testing and debugging.

  • keepamovin 2 months ago

    I'm glad to see this was not unexpectedly fast to load. Would not want to upset those distributed expectations! I wonder if there's a business model in selling speed on a robust network that is on average too slow. Is there any way to incentivize more nodes through micropayments distributed from people who pay for their site to be served faster?

    Ultimately I guess the distributed web is felled by economics thus far.

    • Omodaka9375 2 months ago

      Running a dedicated, trusted tracker swarm would do.

  • 1vuio0pswjnm7 2 months ago

    No Javascript

    https://github.com/Omodaka9375/peerweb

    https://github.com/Omodaka9375/peerweb/releases/expanded_ass...

    If the address is a hash perhaps it could contain a public key

    • 1vuio0pswjnm7 2 months ago

      In theory the hash could contain a public key. In practice it does not

      Design decisions

    • Omodaka9375 2 months ago

      Nope just a magnet link

  • 2 months ago
    [deleted]
  • karel-3d 2 months ago

    the problem is always updating regularly.

    I liked BitTorrent Sync, but it was always closed source, and now it's part of something called Resilio.

  • palata 2 months ago

    I have been intrigued by WebTorrent for a while. From my experience downloading Linux distros over Torrent, I know that it works really well when many people contribute.

    But I have never had a successful experience with WebTorrent, presumably because it is less popular and I have never found a use-case where enough peers were sharing?

  • khimaros 2 months ago

    reminds me a bit of ZeroNet, which still has a maintained fork somewhere out there https://github.com/zeronet-conservancy/zeronet-conservancy/

  • Uptrenda 2 months ago

    I feel like if it were combined with federated caching servers, it would actually work. Then you would have persistence, and the P2P part helps take load off popular content. There are now P2P databases that seem to operate this way, combining the best of both worlds.

  • Reptur 2 months ago

    Heads up: I left a Peerweb link open overnight (~8 hours) hoping it would load. It consumed ~15GB of memory and triggered an OOM kill, crashing the browser. Browser: LibreWolf (latest). OS: Fedora 43.

  • bricss 2 months ago

    Somebody has to revive Nullsoft WASTE p2p from 2003 tho

  • cyrusradfar 2 months ago

    OT: Can someone vibe-code Geocities back to life?

    • 800xl 2 months ago

      Check out neocities.org

      • cyrusradfar 2 months ago

        you made my life. Thank you life long internet friend.

    • ipaddr 2 months ago

      That would take forever. If you can get the domain I'll hand code it in perl.

      • awesome_dude 2 months ago

        <marquee><blink>Neat!!</blink></marquee>

    • AreShoesFeet000 2 months ago

      give me the tokens.

  • kruhft 2 months ago

    This is probably going to be taken down, like my site that used WebTorrent was.

    dropclickpaste.com is for sale. kruhft.at.gmail.com

  • fooker 2 months ago

    What do you all think of the chances that we have decentralized AI infrastructure like this at some point?

  • rickcarlino 2 months ago

    Similar project I vibe coded a few weeks ago: "Gnutella/Limewire but WebRTC".

    https://github.com/RickCarlino/hazelhop

    It works, though probably needs some cleanup and security review before being used seriously (thus no running public instance).

  • likiiio 2 months ago

    Can sanitization be disabled? I.e., can this be used to access static websites as-is?

  • dcreater 2 months ago

    Good, important idea. Unfortunately bad, low effort vibe coded execution

    • j45 2 months ago

      Still a shipped idea, driven by someone. The author has some other interesting ideas.

      • dcreater 2 months ago

        does it matter if it shipped if no one uses it?

  • journal 2 months ago

    i wish stuff like this was more like double-click, agree, and use. they always make it complicated, to the point where you're spending time trying to figure out whether you should keep spending more time on it.

  • supernes 2 months ago

    Is the tracker down? Can't open the demos...

  • dpweb 2 months ago

    Useless if it takes > 5 sec. to load a page

    • TuringTest 2 months ago

      You never lived the 90's

      • alfiedotwtf 2 months ago

        lol.

        Not only did it take > 5 seconds to load a page, images loaded progressively, two at a time at best, over the next minute or so - if there were no errors during transfer!

  • als0 2 months ago

    Why does every sentence have an emoji?

  • wackget 2 months ago

    Nice idea. Shame absolutely everything about the website screams AI slop.

  • m00dy 2 months ago

    the connection overhead is way too much for the modern world.

  • tiku 2 months ago

    Napster.. so what happens if peerweb.lol goes down?

  • maximgeorge 2 months ago

    [dead]

  • TaoWay 2 months ago

    [dead]

  • vyr 2 months ago

    [dead]

  • ethepax 2 months ago

    [dead]

  • elbci 3 months ago

    I don't get it, I upload my files to your site, then I send my friends links to your site? How is this not a single point of failure?

    • toomuchtodo 2 months ago

      IPFS [1] unfortunately requires a gateway (whether remote or running locally). If you can use content identifiers that are supported by web primitives, you get the distributed nature without the IPFS scaffolding. Content is versioned by hash, although I haven't checked whether mutable torrents [2] [3] are used in this implementation. Searching distributed hash tables for torrent metadata, cryptographically signed by the publisher, remains a requirement imho.

      Bittorrent, in my experience, "just works," whether you're relying on a torrent server or a magnet link to join a swarm and retrieve data. So, this is an interesting experiment in the IPFS, torrent, filecoin distributed content space.

      [1] https://ipfs.tech/

      [2] https://news.ycombinator.com/item?id=29920271

      [3] https://www.bittorrent.org/beps/bep_0046.html

      • amelius 2 months ago

        You don't hear much these days about IPFS, but I can remember one big problem with it was illegal content and how to deal with it.

        • grumbel 2 months ago

          It's worse than just illegal content. Copyright doesn't allow you to redistribute anything without the permission of the copyright holder. IPFS however has no means to track the author or the license of content.

          That means even distributing a piece of perfectly legal open source becomes illegal. Unlike a tarball, or even a torrent, where you can bundle content and license, IPFS allows addressing individual files or blocks, thus stripping the license from the content, which most licenses forbid. This does not even require an intentional action by the user; it happens automatically when partial content lands in your cache.

    • dang 2 months ago

      [sorry for the weird timestamps - the OP was submitted a while ago and I just re-upped it.]

    • dtj1123 3 months ago

      This isn't my site, nor do I have any opinions on the implementation here. I do however find the idea of serving web pages via torrent interesting.

      • elbci 3 months ago

        p2p storage, as in torrents or IPFS or whatever, is the part that we've kinda solved already. Serving/searching/addressing without (centralized) DNS is still missing for an (urgently needed) p2p censorship-resistant internet. Unfortunately this guy just uses some buzzwords to offer nothing new - why would I share links to that site instead of sharing torrent magnet links?

        • recursivegirth 2 months ago

          Thinking about this a little bit... could we use a blockchain ledger as an authoritative source for DNS records?

          User's can publish their DNS + pub key to the append-only blockchain, signed with their private key.

          Use a torrent file to connect to an initial tracker to download the blockchain.

          Once the blockchain is downloaded, every computer would have a full copy of the DNS database and could use that for discoverability.

          I have no experience with blockchains or building trackers, so maybe this is a dumb idea.

        • sroerick 2 months ago

          This is a great point.

          One issue I've had with IPFS is that there's nothing baked into the protocol to maintain peer health, which really limits the ability to keep the swarm connected and healthy.

          • theendisney 2 months ago

            I used to add webseeds, but clients seem to love just downloading from there rather than from my conventional seeding.

            Some new ideas are needed in this space.

        • dtj1123 3 months ago

          You make a good point.

    • 3 months ago
      [deleted]