Canvas Fingerprinting

(browserleaks.com)

99 points | by janandonly 2 days ago ago

122 comments

  • thisislife2 2 days ago

    PaleMoon browser has a canvas.poisondata config setting (also available in the Preferences GUI) which prevents such fingerprinting. However, note that these things are not used in isolation - browser fingerprinting is done by collecting various other data too, and when collated together, these can provide enough uniqueness to identify (or categorise) your browser into some specific group. We are fighting a losing battle for privacy.

    • kmeisthax 2 days ago

      Hot take: instead of making fingerprints more unique, we should be striving to make fingerprints as identical as possible, so they stop being useful as fingerprints. In the case of Canvas 2D, that would mean standardized bit-level definitions of what all the canvas operations do, so all the browsers do the same thing.
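      To make that concrete, here's a toy sketch (in Python rather than browser JS, with made-up pixel data) of why bit-exact rendering matters: fingerprinters hash the canvas readback, so even a one-bit rasterization difference between GPUs yields a completely different fingerprint.

```python
import hashlib

# Stand-in for the pixel buffer a canvas readback would return.
pixels_a = bytes(range(256)) * 4

# Same scene, but one GPU rounds a single anti-aliased pixel differently.
pixels_b = bytearray(pixels_a)
pixels_b[100] ^= 0x01  # flip one low-order bit

# Fingerprinters hash the readback, so the digests diverge completely.
fp_a = hashlib.sha256(pixels_a).hexdigest()
fp_b = hashlib.sha256(bytes(pixels_b)).hexdigest()
print(fp_a == fp_b)  # False: only bit-exact rendering gives identical fingerprints
```

      Conversely, if every browser produced the same bits for the same drawing commands, every machine would hash to the same value and the technique would yield zero bits of identifying information.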

      • modeless a day ago

        Unfortunately this could only be done by forbidding web content from using GPUs at all. GPUs are extremely diverse compared to CPUs. It is not feasible to standardize bit-exact operations on diverse GPU hardware while maintaining reasonable performance.

        Some will say that we should forbid web content from using GPUs. But that would push apps and their users to native platforms that don't even try to minimize fingerprinting at a technical level, instead either relying on legal enforcement by monopoly gatekeepers (with all the problems that implies), or providing no protection against fingerprinting at all.

        • shiomiru a day ago

          > Some will say that we should forbid web content from using GPUs. But that would push apps and their users to native platforms that don't even try to minimize fingerprinting at a technical level

          Conflating "websites" with "apps" is what led to this mess in the first place. You can't just "try to minimize fingerprinting"; you either prevent it or you don't. And by the time you expose a GPU to the web, you're well inside the latter category.

          IMO the only solution to the tracking epidemic is making boundaries between the two clear. Just like some random blog can't read my GPS location without asking first, it shouldn't be given access to other tracking vectors without user consent either.

          • modeless a day ago

            > You can't just "try to minimize fingerprinting"

            Plainly false. You can minimize the number of bits of entropy in the fingerprint even in situations where a couple of bits are unavoidable, and you can mitigate fingerprinting methods by detection and/or blocking. Browsers do this today.

            The web is crucial as the only free platform for distributing software to a huge chunk of consumer devices. Apple would love to strengthen their iOS app distribution monopoly by forbidding sophisticated web apps. That's why they have dragged their feet implementing more advanced web standards and limited their capabilities when they do implement them (for example making fullscreen mode unusable for games).

            • shiomiru a day ago

              > Plainly false. You can minimize the number of bits of entropy in the fingerprint even in situations where a couple of bits are unavoidable,

              A single API may just yield a couple of bits, but it adds up when there are hundreds of APIs, with new ones introduced every week. And you don't need that many bits to uniquely identify someone.
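              As a rough illustration of how fast it adds up (the per-API entropy values below are invented for the example): about 33 bits are enough to single out one person among 8 billion, and a handful of leaky APIs gets you there quickly.

```python
import math

# Hypothetical entropy estimates per API, in bits (illustrative only).
api_bits = {
    "user_agent": 8.0,
    "screen_size": 4.0,
    "timezone": 3.0,
    "canvas_hash": 10.0,
    "installed_fonts": 7.0,
    "webgl_renderer": 5.0,
}
total_bits = sum(api_bits.values())  # 37 bits from just six APIs

# ~33 bits suffice to uniquely identify one of 8 billion people.
bits_needed = math.log2(8_000_000_000)
print(f"{total_bits:.0f} bits collected, {bits_needed:.1f} bits needed")
```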

              But sure, leaking a few bits here and there might as well be unavoidable when two of the three major browser vendors are ad companies and preventing it isn't a priority. (See the saga about Google and 3rd-party cookies.)

              > and you can mitigate fingerprinting methods by detection and/or blocking. Browsers do this today.

              You can mitigate a finite set of fingerprinting methods that you know of. It becomes exponentially harder with every new tracking vector that is enabled by default, especially when the expectation is that things Just Work.

              (For example, blocking canvas readout breaks canvas-based image resizing on lots of websites that use the first result from stackoverflow.)

              > The web is crucial as the only free platform for distributing software to a huge chunk of consumer devices. Apple would love to strengthen their iOS app distribution monopoly by forbidding sophisticated web apps. That's why they have dragged their feet implementing more advanced web standards and limited their capabilities when they do implement them (for example making fullscreen mode unusable for games).

              Respectfully, I don't see a pressing need to solve the issue of "you don't own Apple devices you pay for" by stuffing every possible API under the sun into the browser.

              Besides, I'm not advocating against sophisticated web apps; I just wish browsers applied the principle of least privilege when adding features ripe for abuse. e.g. maybe I would allow GPU access for a web-based 3D game whose developer I trust, but not some random blog that will use it to either fingerprint me or run a cryptominer.

              • modeless a day ago

                > See the saga about Google and 3rd-party cookies

                This is a pet peeve of mine. I haven't seen a sane take on this anywhere. Getting rid of 3rd party cookies to prevent tracking has been a priority for Google for many years. Everyone thinks they haven't done it because they hate privacy or something; nothing could be further from the truth. They have been blocked on disabling 3rd party cookies because of antitrust concerns coming from other ad companies who object to being blocked from tracking users.

                • shiomiru 20 hours ago

                  Antitrust concerns which wouldn't have had any weight if not for yet another tracking mechanism that Google had intended to add in place of third party cookies.[0]

                  It's not because Google "hates privacy", it's because Google operates to generate profit, and it does so from targeted advertising.

                  [0]: https://www.eff.org/deeplinks/2021/03/googles-floc-terrible-...

                  • modeless 19 hours ago

                    See, you have it backwards. It's exactly the opposite. FLoC etc were designed to mitigate the antitrust concerns by replacing 3rd party cookie tracking for the other ad companies, allowing 3rd party cookie deprecation to proceed. By blocking FLoC, activists made it impossible for Google to deprecate 3rd party cookies in Chrome, as that would guarantee a loss in antitrust court.

                    Google themselves never needed FLoC for their own ads business. Their search and video ad businesses don't need 3rd party tracking to be successful. Google has the most first party data; users literally tell Google their intent directly by typing it into the search box. Advertising on 3rd party sites is a small minority of Google's revenue, and the part of that attributable to cross site tracking is even smaller.

                    But Google had to provide something to replace cookie tracking for the other ad companies that don't have the first party data Google has. Those ad companies rely on 3rd party cookies to compete with Google. If Google blocked 3rd party cookies in Chrome with no replacement they would instantly be sued for leveraging their browser market share to kill their competition in the ads market, and they would lose big.

      • amy-petrik-214 18 hours ago

        As COVID has taught us ("if only we had all obeyed the stay-at-home order", which would have eliminated COVID and a chunk of dozens or hundreds of other infectious diseases), relying on the bulk of mankind to "get with the plan" is a fool's errand; any proposal that depends on it is foolish for being blind to that fact.

        That is to say, if we all have a "same browser" then there will be people with "not same", there will be divergences, it will be a mess.

        A very simple solution is polymorphic fingerprinting - the fingerprinters get a fingerprint, a good-looking one, it's just different each time. Even better, think of when the Russians poisoned ammunition stocks in Vietnam with bad bullets. They couldn't make them all bad or the whole batch would be tossed; I think they arrived at 1:10. The idea is to make fingerprinting as ugly as possible - up to, and no further than, the point where they are forced to cook up something even more evil.

      • rsync a day ago

        Hotter take: we should provide random answers to all of these measures.

        My user agent and canvas size and gpu capabilities should be unique on every single request.

        (Within reason, of course)

      • 2 days ago
        [deleted]
  • WillyWonkaJr 2 days ago

    You can test your browser here: https://coveryourtracks.eff.org/ Brave works super well for me. Chrome not so much.

    • whitehexagon 2 days ago

      This seems like a battle the end user can never win. The only idea I have is that everyone would run an identical, specially crafted, 'browser-vm' / container? that presents identical metrics and values for any amount of system query.

      Imagine the data starting to pour in as every client presents itself as a 68000 HTML5-only browser, with no React rubbish or JavaScript support. We might end up with a simpler, faster internet again. Plus a lot of big tech and ad companies scratching their heads.

      • xyzal a day ago

        You can change, between sessions, some metrics that contribute to the fingerprint - metrics the fingerprinting party assumes are stable over time.

        There is no utility for the fingerprinter in getting your unique fingerprint when that fingerprint won't ever be registered again.

        If you use Firefox, look up the CanvasBlocker extension. It can even make the fingerprint appear consistent when first party domain is the same.

    • dngit 2 days ago

      Thanks. I'm a bit confused by this tool. I ran it and got this result:

      > Our tests indicate that you have strong protection against Web tracking.

      Which sounds like a good thing. Then it follows with:

      > Your browser has a nearly-unique fingerprint

      This sounds like a bad thing to me because uniqueness makes me more identifiable. Do I interpret it correctly? What's the value I should look for?

    • aucisson_masque a day ago

      I suggest you try here as well; it's a commercial product used in the real world.

      https://fingerprint.com/

      I tried Safari, Firefox and Brave on it. Only Brave failed it, every time.

      I went looking on Google and found several threads about people complaining that Brave's anti-fingerprinting doesn't work very well. For instance https://community.brave.com/t/fingerprinting-protection-no-l...

  • aucisson_masque a day ago

    If I open the website it says I have 99.9% uniqueness. So I'm trackable, right?

    Yet when I open it again in another window, it says 100% uniqueness, but it's a different signature than previously.

    I think that means it can't track me, because the signature would have to stay the same for tracking to work - yet I'm using regular Firefox, nothing fancy.

    Is that technique really effective, or am I just wrong about the signature not being the same?

    • figglestar 21 hours ago

      It sounds like you have Firefox's fingerprint-resist feature enabled. This confusion occurs every time this topic comes up, because there are two different strategies used to attack the problem and people end up measuring them both the same way.

      You can try to be so generic that the attributes are meaningless - the meatspace version of this would be everyone wearing a Guy Fawkes mask, so they all have the same face and you can't tell individuals apart. Or, you can wear a new generic face every single day so that nothing you did yesterday connects to you: a bland, ephemeral identity.

      Tor uses the former method (or tries to): everyone is up to something, but you can't tell them apart because they all have the same face/browser attributes. Firefox's fingerprint resist is the second method: normally-identifying values are fuzzed repeatedly, so that while each signature is identifiable, you won't be using it for long enough for them to be connected to each other. Both strategies have their merits.

    • trod123 a day ago

      This is just one of many aspects of fingerprinting. Canvas is only a single element.

      Each fingerprint component can be used to correlate the others, including TLS channel options (which generally last for the entire session the website is open) and a network header that remains unchanged across NAT.

      Given tens of thousands of potentially fingerprintable elements, you only need 5 relatively unique elements to distinguish people at roughly 1 in a million; each additional element multiplies that uniqueness.
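      The arithmetic behind that claim, sketched out (assuming, hypothetically, independent attributes each shared by only 1 in 16 users):

```python
# Five independent attributes, each narrowing the crowd 16x,
# multiply out to roughly one in a million.
rarity = 1 / 16   # hypothetical per-element rarity
elements = 5
combined = rarity ** elements
print(f"1 in {1 / combined:,.0f}")  # 1 in 1,048,576
```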

      • nerdponx a day ago

        I don't work in ad tech, but I expect that there are strong diminishing returns on this kind of thing. Yes, it's theoretically possible to do all of this advanced stuff to correlate and identify the long tail of users who have obfuscated fingerprints. But if you are in the targeted advertising business, are those users really all that valuable to target? They are probably nerds who don't buy a lot of new stuff and block ads anyway. I expect that being unique, but distinct on each run of the fingerprint algorithm, is probably good enough, unless you are concerned about a very motivated attacker trying to stalk you.

        On the other hand, there is also the fact that things like spoofing web APIs only goes so far. There are other fingerprinting techniques, such as measuring properties of the GPU, which might still uniquely identify your machine (how many people have such-and-such GPU in your ZIP code?).

    • amelius a day ago

      Maybe your browser has fingerprint mitigation techniques?

  • dngit 2 days ago

    Are there any browser extensions or tools that effectively prevent fingerprinting including canvas fingerprinting? Or is this one of those privacy battles we just have to accept as unwinnable?

    • galad87 2 days ago

      Safari adds some noise to canvas. So the website above will say it's unique, but each time Safari swaps its web process (when you load a different website or a new window or a new tab) it will change to a different one.
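      A rough simulation of that noise strategy (not Safari's actual algorithm - just low-order-bit noise keyed to a per-session seed):

```python
import hashlib
import random

def noisy_fingerprint(pixels: bytes, session_seed: int) -> str:
    """Flip low-order bits with a per-session RNG before 'readback'."""
    rng = random.Random(session_seed)
    noisy = bytearray(pixels)
    for i in range(len(noisy)):
        if rng.random() < 0.01:  # perturb ~1% of bytes
            noisy[i] ^= 0x01
    return hashlib.sha256(bytes(noisy)).hexdigest()

canvas = bytes(range(256)) * 16
# Same machine, two web-process sessions: the fingerprints no longer
# match, so the site still sees a "unique" value, just a different one
# each time.
print(noisy_fingerprint(canvas, 1) == noisy_fingerprint(canvas, 2))
```

      Within one session (one seed), the value stays stable, so image-processing pages keep working; across sessions, the trail is broken.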

      • dngit a day ago

        Wish all browsers, at least the big ones, would do this by default. It would save regular users like us from fumbling around trying to figure out what works.

      • DeathArrow a day ago

        But tracking a browser does not rely solely on the canvas.

    • jszymborski 2 days ago

      Firefox's Resist Fingerprinting option will do that for you. It's also the default on LibreWolf.

      https://support.mozilla.org/en-US/kb/firefox-protection-agai...

      • uncharted9 2 days ago

        Unfortunately, it didn't pass the fingerprint test. You can see the results here: https://coveryourtracks.eff.org/. On the other hand, Brave does pass it. I'd like to use Brave for all my browsing, but for some reason the devs haven't been able to get hardware-accelerated video decoding to work in their latest builds. That's why I've been using Librewolf for a very long time.

        • IggleSniggle 2 days ago

          I don't buy that Cover Your Tracks applies to the Firefox strategy for privacy. The Firefox strategy is to make your browser incredibly unique every time: if you visit the same website twice, you look very unique, but like two totally different visitors. This is effective for real privacy, and Cover Your Tracks doesn't account for it well.

          • uncharted9 a day ago

            Cover Your Tracks reports this kind of obfuscation strategy as a "Randomized Fingerprint", but when I tested it only showed that for Brave, not for Librewolf. Brave's fingerprint is unique but randomized, while Firefox's isn't.

            • IggleSniggle 19 hours ago

              It's a setting in Firefox that is off by default because it can make some websites super annoying to deal with. But it's easy to get to, and allows you to be totally random per request or per tab-session, iirc

            • akimbostrawman a day ago

              having a unique fingerprint within a group (such as Tor Browser, or browsers with resist-fingerprinting enabled) can be better than an individually semi-random unique one

    • mike_d 2 days ago

      > Or is this one of those privacy battles we just have to accept as unwinnable?

      It depends on what you want to win. There are two types of fingerprinting:

      - Browser fingerprinting (what you see here): making sure your Chrome on Windows behaves like every other Chrome on Windows and isn't really a bot pretending to be Chrome. This results in you being treated like a real user and getting fewer CAPTCHAs.

      - User specific fingerprinting: Determining that your browser is unique among all the browsers the website has seen so that you can be tracked without cookies.

      The latter is obviously bad. Some people would argue the former is bad too, but it is a LOT of work to make every browser behave like every other browser across operating systems, for little privacy benefit.

      • TechDebtDevin 2 days ago

        Is it bad if I use fingerprinting to track anonymous users so that I can provide them with a great UX without requiring them to give me all their personal details? Or should I only use cookies, that the user might delete? I don't see an issue with either for this purpose.

        • swatcoder 2 days ago

          Imagine you sat one of your users down and explained the details of how your fingerprinting system worked.

          You explain that their browser has all kinds of little, subtle leaks of information about what software they're using, what operating system they're using, whether it's up to date, what hardware they're running, whether they're in a public space or an office or a home, which city they're in, what ISP they use, how they've configured their monitor and screen, what settings they set in their browser, what language they use at home, etc etc

          You explain that you can collect all this information without them knowing you were doing it, without them really being able to stop you if they wanted to, and that you can collate it into an identifier that lets you know every time they visit your site even if they don't tell you themselves in some way, and with no way to ask you to stop.

          And you explain that you do this for them, to make their experience of your site better for them, and harder for them to accidentally break.

          How do you think they'd respond?

          To be clear, I'm not asking this as some rhetorical trick. There absolutely are users who wouldn't care in the least, and who might even see you as really clever for doing it.

          But that's how you can know if it's bad or not. If you think your users would be creeped out or otherwise troubled by it, or might feel like you've invaded their privacy or their right to control their own experience in their own browser, then you already know it's bad. If you think they wouldn't mind, then -- and only then -- maybe it's not.

          • pests 2 days ago

            Your example sounds like what people do in person all the time.

            My local barber knows me when I walk in. He knows what I look like, what I wear, what I usually order.

            He uses this to make my experience better. He saves me from having to tell him what I want, he knows what seat I like to sit in, and so on.

            I don't have to tell him I'm coming in. He can figure it out by looking at me walking in the door.

            • DeathArrow a day ago

              You can even tell who you speak with by recognizing the caller's voice, without seeing him.

              You can recognize a writer by his style.

              What GP is trying to say is that it's ok for people to do pattern matching, but immoral if they use machines to do it.

            • tightbookkeeper a day ago

              > by looking at me

              Your presented persona is very different from an amalgamation of clues which were never meant to disclose public information and are not you.

              But this is easy to solve. Instead of rationalizing call up a customer and try it.

          • TechDebtDevin a day ago

            I think that's a solid model to use. However, I would argue that it's safe to assume that "There absolutely are users who wouldn't care in the least, and who might even see you as really clever for doing it" describes >= 95% of recurrent anonymous users by default.

          • DeathArrow a day ago

            How is this different from using cookies?

        • HeatrayEnjoyer 2 days ago

          Well for one you need explicit and freely given consent.

        • olliej 2 days ago

          You should be using a cookie for this purpose, you could in fact just store the ui settings directly in the cookie.

          It becomes tracking once you say “I have an ID in a cookie, and I’m going to look up the settings for that ID in my own giant DB”.

          What you’re suggesting - using fingerprinting - is the worst. It’s neither reliable nor robust, it implicitly requires tracking (you have to record the fingerprint<=>settings DB and look it up), and the user cannot opt out of it or trivially change state at will, etc.

          There is fundamentally no legitimate reason to ever use fingerprinting over the actual explicit mechanisms for persistent storage.

          • DeathArrow a day ago

            Facebook, Apple and Google use people's faces to track them. Governments use public cameras to track people. Google and Facebook use other kinds of tracking too.

            But somehow it's immoral for average Joe to track not people but browsers.

      • DeathArrow a day ago

        >- User specific fingerprinting: Determining that your browser is unique among all the browsers the website has seen so that you can be tracked without cookies.

        I worked briefly for an ad company that not only did their own fingerprinting but also bought a lot of fingerprinting data, along with other types of info: country, age category, sex, income category.

        • bostik a day ago

          Funny anecdote: back in 2004-2006 when I held the Infosec 101 course at the university, I raised an obvious point in the privacy section. If an individual harvests data on other people and then uses that to track their movements, actions and behaviours - we'd call it stalking. When a company does that, we call it data mining.

          The lecture used to shock the students from the economics department.

    • SoothingSorbet 2 days ago

      Yes, CanvasBlocker for Firefox does this: https://addons.mozilla.org/en-US/firefox/addon/canvasblocker

      e.g. For me it shows a new unique fingerprint each refresh.

      • xyzal a day ago

        Is it not better to set CanvasBlocker's RNG mode to 'persistent', so that you get the same fingerprint for the same domain?

    • begueradj 2 days ago
    • tech234a 2 days ago

      Another tool on the same site is able to fingerprint using installed browser extensions on Chromium-based browsers: https://browserleaks.com/chrome

    • akimbostrawman a day ago

      the only way to actually prevent fingerprinting is to never connect. your ip, os, tls cryptographic protocols (or lack thereof), screen resolution, mouse speed and movement, keystrokes and keyboard layout, and much much more can all be used to fingerprint a user. even the reduction of all these points can be a fingerprint.

      since most of those are unlikely to actually happen (yet) with the usual dragnet ad surveillance, just using hardened firefox (arkenfox/librewolf/mullvad browser) with a vpn or just tor browser is sufficient.

    • OptionOfT 2 days ago

      The problem with that is that you'll see a massive uptick of bot detection checks.

  • Beijinger 2 days ago

    I am not sure what it measures, but my privacy-enhanced Firefox seems to show random numbers every time I load this URL, and I always stay "unique". Another browser shows "signature stats". If I use my Firefox, that won't even show.

    My privacy plug-ins:

    Blend in and spoof most popular properties

    BP Privacy block all font and glyph detection

    Browser plugs fingerprint privacy randomizer

    Canvas Blocker

    ClearURLs

    Cookie AutoDelete

    Decentraleyes

    NoScript

    Privacy Badger

    Temporary Containers

    uBlock Origin

    • nerdponx 2 days ago

      That's because all of the content blocking actually results in a very distinct fingerprint. If you want to be un-fingerprintable, you need to look like a lot of other people.

      • jvanderbot 2 days ago

        Unless there's a way to randomize your fingerprint. If it's truly random, it's probably very hard to say whether it's one person, two, three, etc., given a huge DB full of unique signatures originating from a known VPN exit.

        • Jerrrry a day ago

          Serve sketchy uncorrelated users a subtle challenge that is unknowingly asymmetrically unique.

          Maybe like a captcha.

      • IggleSniggle 2 days ago

        It doesn't matter if you're fingerprintable if your fingerprint is distinct on every visit.

        • Jerrrry a day ago

          It is if you're the only one doing it.

          To hide a signal in noise, there must be noise.

          • IggleSniggle 19 hours ago

            If you're the only one doing it, sure. But if none of your information is reliable at all, then every single other "totally unique" visitor is in the same category as you. From the other end, it could be 10,000 visitors or 1; all you know is that they are unique, since they present a different, totally unique fingerprint on every visit.

    • ementally a day ago

      Most of these extensions/add-ons are useless and actually make your fingerprint more unique.

      https://github.com/arkenfox/user.js/wiki/4.1-Extensions#-don...

      • Beijinger a day ago

        I thought so too but they make me so "unique" that I stay unique with every visit. With 1000 visits I may have 1000 unique fingerprints.

    • a day ago
      [deleted]
    • wwalexander a day ago

      > Browser plugs fingerprint privacy randomizer

    • ruthmarx 2 days ago

      You're running too much stuff. Privacy badger and ublock together is a bad idea for example.

      • yjftsjthsd-h 2 days ago

        What problem does it cause?

        • ruthmarx 2 days ago

          Among other things, the two can clash over locks and apply conflicting rules on whether to block or allow something.

          • 2 days ago
            [deleted]
          • 2 days ago
            [deleted]
  • gnabgib 2 days ago

    Some discussion in 2015 (47 points, 12 comments) https://news.ycombinator.com/item?id=8887947

  • danielvaughn 2 days ago

    I’m aware this is a naive question, but I find myself wondering why I care about being fingerprinted.

    Seems like it can detect my browser, device, and OS. I kind of assumed it could do that anyways. What security concerns do I face if someone finds that information?

    • nerdponx 2 days ago

      It can uniquely identify you, personally, danielvaughn, among a sea of millions of other "anonymized" users. That information can be used to provide precisely-targeted advertising -- or, coming soon to a Kroger near you, precisely-targeted prices for products. And of course it's valuable to anyone who is interested in stalking or impersonating you and might be willing to pay for access to that data, whether by insider leaks or hacking. It's your choice whether you personally care, but a lot of people do care, and the more people who don't care, the harder it is for those who do.

      • danielvaughn 2 days ago

        Ah gotcha, that makes sense, thank you. Pretty wild that just a few parameters can uniquely identify individuals. That’s crazy.

        • nerdponx a day ago

          If you're interested in the topic, it's a pervasive and not-fully-solved problem even in well-intentioned research, where special care is needed in order to avoid accidentally re-identifying individuals in anonymized data simply because they are too unique.

          Another classic example of this is behavioral uniqueness. Maybe 10 people got coffee and a donut at the corner store at 8 AM, but how many of them also went to work at ABC Corp that day and also got a pepperoni roll at the pizzeria for dinner at night? Probably just one person did that.
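          That intersection attack is trivial to express in code (names and observations invented for the example):

```python
# Each observed behavior on its own is shared by several people...
coffee_at_8am = {"alice", "bob", "carol", "dave"}
works_at_abc = {"alice", "bob", "erin"}
pizza_dinner = {"alice", "frank"}

# ...but intersecting just three observations singles out one person.
suspects = coffee_at_8am & works_at_abc & pizza_dinner
print(suspects)  # {'alice'}
```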

        • dishsoap a day ago

          The thing is it's not just a few parameters. Modern browsers provide thousands.

      • cco 2 days ago

        Wouldn't we be better served by legislating the bad behavior, targeted pricing, stalking etc, versus developing an arms race, sometimes circular, amongst software engineers?

        • trod123 a day ago

          That would never work.

          Information is inherently more valuable when no one else knows it. Just like a Ponzi scheme, they need to collect ever more invasive information to reap the same benefit over time.

          • nerdponx a day ago

            That doesn't make sense because information is still costly to collect. It only loses value if everyone already knows it, but that's the opposite of what's going on here. Companies that collect and correlate this data keep their results guarded: it's literally the product that they are selling.

            • trod123 a day ago

              Bulk data collection is not costly; it's cents per person, if that.

              JavaScript running on the visitor's endpoint costs nothing at all (the customer pays for it). Bulk purchases of anonymized data are also quite common, and the data is easily correlated back to the original, pre-anonymization profile (person).

              A 1-2 month period in a metropolitan area (50k+) for a bulk sale would get you all the anonymized location data for every single person in the region, cost you about $1200, this gives you devices, travel, work, home, patterns (what restaurants you go to, what your likely demographic is, what you do every day). That is 2.4cents per person (at 50k, price going down the larger the metropolitan population).

              There's an entire data processing pipeline devoted to this in a sub-niche of IT called Master Data Management.

              The development of Chrome was motivated by last-mile click data. GiS collects way more than you think as well, and it's enabled by default on all Android devices. Even if you never connect a device, remote sensing networks may offer a connection on the unregulated bands in a mesh network like Amazon Sidewalk, and devices with radios often beacon semi-regularly.

              Large companies share signal data as well, and there are other sharing agreements where only a token effort is made at anonymization but the correlations remain, allowing deduction of the original profiles. All they need is enough points in common, and that bar is not high.

              The business is in selling the memberships involved for access to this data without a warrant ever being needed. You perform a lookup on the data, and can use that pretty much however you want no restrictions (within the law). That is literally the product that they are selling... people.

              Some day, try splurging and buying access to view your Accurint profile. I almost guarantee you'll be shocked. Also, they don't keep this data that well guarded, as evidenced by the continuous rolling announcements of data breaches. You think they don't import info posted from a data breach to back-check their existing records? This is big data we're talking about.

              Papers? Is this your normal way home from work comrade? How long does it normally take you to get home? Big brother is watching you.

        • threeseed a day ago

          We need to start by doing something about Google Chrome and Chromium based browsers.

          It is the main contributor to this mess because (a) it allows long-lived first-party cookies and (b) it carelessly adds every random API without any thought about the privacy implications.

        • nerdponx a day ago

          Both.

          Analogy: we need safer road design and better enforcement of antisocial driving behavior and stricter penalties for hurting people with your vehicle or putting them at risk of harm.

      • metabro 2 days ago

        It might be able to identify a unique user, but not the user personally. I think there's also a fairly broad margin of error in how accurately it can identify a unique user.

        • threeseed a day ago

          a) Techniques like this can identify a unique user to 99.5% accuracy e.g. https://fingerprint.com

          b) It can identify the user personally because many websites use pixels that link the fingerprinted user to an email address and then send both to Meta, Google, Reddit etc. And since browsers like Chrome allow long-lived first-party cookies, this works because users remain signed in for over a year.

    • theendisney4 2 days ago

      If we go full CONSPIRACY: with your browsing history and enough data, one can find people who are so much like you that they might regularly do things you don't even know you are capable of. Your friends, relatives and coworkers are also profiled by your data.

      More realistic: you won't know by what creepy process they chose to show you an advertisement. We can't imagine it.

    • 2 days ago
      [deleted]
    • trod123 a day ago

      It uniquely identifies you out of all the people on the planet, and through that identification it allows correlation of a number of highly personal and related data points, including related persons, roommates, their information, etc.

      By linking information together it gets increasingly more unique. They don't need to know your name; it uses a "building a bridge" strategy where related data gets backfilled, and dossiers get re-targeted to new devices on the fly based on these unique signatures, proximity, and too many other ways to count.

      Some SMART streetlights, for example, record and send voice data back to Qualcomm for processing. The advertised signature matching for this is ShotSpotter, but it can be done for any audio signature, server-side or pushed out to nodes in the dumb remote sensor networks, for potential realtime tracking (1984). Every Tesla that catches you out in public registers you in its data, which is sent to a centralized system capable of tracking your every move over time, just like ALPR cameras. Roving sensor networks track everything you do, everywhere you go, what your interests are, your history...

      This can include your related and semi-related device nodes, and equipment, phone, car, anything with a microprocessor and a connected network.

      Your device's overnight location (home, where you sleep), your location and travel data (behavioral pattern matching), the phone data needed to set up taps using SS7. All very illegal, but only punishable retroactively when they are caught in the act, just like decrypting certain radio bands.

      In conjunction with this metadata, it can be used to unmask and de-anonymize publicly purchasable location data: who you work for, what you are working on, etc.

      From there, it can glean extremely personal insights. If you visited an ER, an abortion clinic, a doctor. Based on the vendors it can further correlate the type of services, or the fact that you might have cancer, be pregnant, have a non-public health condition, often before you yourself know.

      It allows the creation of a dossier of you as a person: where you go, your habits, all the information needed to surveil you, blackmail you, or coerce you, sold to the highest bidder, which will be someone who took umbrage at something you did, or someone looking to vet you who reads the report, decides you didn't meet their expectations, and is biased against you without your ever finding out.

      This information can then be used to discriminate against you without your knowledge or perception; there is no opting out. The information available allows believable lies to be fabricated, where you are considered guilty without trial or basis, effectively bearing false witness.

      When you deviate from the patterns found, the deviation will be used to justify further discrimination or a heightened risk score, increasing harassment, loss of opportunities, etc., all unlawfully.

      Demotions at work, being passed up for promotions, firings based on unfounded accusations (cancel culture), or mere presence in the same location (proximity).

      Guilty until proven innocent for the wildest thing any crazy person might think up; the data is collected, and who is to say it's false when it is just data (neutral), even when it supports false narratives.

      These harms are what privacy protects you from. Without privacy you are effectively a slave who can never change from what's written; inherently, this thinking promotes the narrative that people never change.

      Coincidences in life happen, extremely unlikely things happen, but this information will always be taken as proof of something else, in the worst light. Guilt by association, proximity, etc.; in other words, violation of your fundamental human rights, with no agency to change it. What comes with it is mental coercion and torture, turtles all the way down until you break; all from making some 'inconsequential' decision somewhere about your privacy.

      You piss someone off, rub them the wrong way for calling out bad behavior, or they just fixate on you, and you don't give them a second thought until you unexpectedly find yourself dead in your living room at the hands of the police, because they SWATted you, or they leave other breadcrumbs that these systems treat as trusted, indicative truth (when they are fabricated). AFAIK, there is a presumption in law that electronic devices are operating correctly unless you can prove otherwise (which you most often never can, given limited specs and other issues).

      These types of security concerns are inherent in any data collection.

      Visibility of information is the first thing any adversary needs to have a successful attack on you. They can do so fast or slow. Slow involves increasing harassment, pruning your social network, making communications unreliable, torturing you and isolating you until you break; and everyone eventually breaks. Disadvantaging you, forever forward.

      Geico already used this information to justify higher rates for many members. If you own a hybrid car (and hybrids are being mandated in the future to slow climate change), you have regenerative braking. Geico's data source classified regenerative braking events as hard braking, which indicates reckless driving: if you "hard braked", you were a reckless driver and paid higher rates. They did this using your LexisNexis report, which was not public until a class action lawsuit against them, years in the making. Your car manufacturer, through the telematics data link, may have sent this information to these companies without your knowledge or agency.

      They charged higher rates to hybrid owners who avoided accidents, while simultaneously incentivizing those drivers to avoid hard braking, and thus risk the very accidents the rates were supposed to price in. It's circular.

      There are so many public examples of the collected information being used to harm you, and the collection not being properly disclosed or there being no agency to say no.

      An example of this is data brokers sharing data with their competitors: any removed records would return at the next sync, because records were deleted from each broker separately rather than all at once. The data repopulates in the shuffle of isolated database merges.

      Data breaches are effectively encouraged, because once the data is out there you can't punish them after a certain period of time. Strangers can insert themselves into your life without you knowing.

      There was an interesting recent project where a person used AI facial recognition with smart glasses to pull public dossiers and pretend to be someone the target had met in the past, demonstrated at a subway stop. A chance meeting... you give someone enough non-public information and they believe your plausible story. Can't seem to find the project now, but there was a YouTube video about it.

      Master Data Management is the area that touches these systems the most in IT.

      Privacy is the right not to be blackmailed, coerced, or, generally speaking, at the mercy of malevolent people seeking to harm you directly or indirectly.

      https://www.qualcomm.com/news/onq/2021/04/how-juganus-smart-...

      The Qualcomm smart streetlights have been around since 2016.

      Do you suppose you have an expectation of privacy if it's just two people on an empty public street? If you are tagged, just like whales, deer, and other wildlife, are you an animal or a human? Food for thought.

  • ementally a day ago

    There seems to be a lot of misconceptions about how fingerprinting and anti-fingerprinting techniques work.

    I highly recommend reading this article, it is still WIP btw: https://github.com/privacyguides/privacyguides.org/blob/e81b...

  • buro9 a day ago

    The safest way to browse the web in any browser is by disabling JavaScript or using a NoScript extension.

    A lot of the web works surprisingly well still, and you can turn on just what you need when you need it, placing your most visited sites on an allow list, but still denying a lot of third party things on those sites.

    The internet is a joy with JS disabled virtually everywhere. And all the canvas fingerprinting, WebRTC leaks, font fingerprinting, supercookies, etc. are all defeated by simply not running JavaScript.

    • sureIy a day ago

      I think the number of useful websites that work without JS is in steady decline. No one wants to invest in a 0.1% share, so if a site works, it's likely by chance. Even newspapers have a bunch of popups that require JS to dismiss, so you have to hope uBlock is already configured to remove them for you.

      • checkyoursudo a day ago

        But it is trivial to allow js per website using ublock. My default is no-js, and if there is some website I really want to use that requires js, then I enable it. If I find myself using that website regularly, then I make the permission permanent. It is literally zero burden.

        I am not some no-js evangelist or javascript hater or anything, but a huge amount of the web really does work fine (sometimes better even!) without js enabled by default. I don't think it has to be strictly either-or.
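        A minimal sketch of this setup using uBlock Origin's dynamic filtering rules (Dashboard → My rules); the hostnames here are placeholders, and the syntax is uBlock's switch-rule format as I understand it:

```
no-scripting: * true
no-scripting: example.com false
no-scripting: news.ycombinator.com false
```

        The first line disables JavaScript everywhere by default; each subsequent line re-enables it for a site you have decided to trust.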

      • buro9 a day ago

        Have you tried?

        • kvdveer a day ago

          I have. Most information-type websites (blogs, recipes, news) work fine. Many app-style websites (email, bug tracking, chat) don't work, or have very poor ux.

          Guess which of these two categories I spend most of my time on...

    • ninkendo a day ago

      Are there decent noscript extensions for iOS safari? The only things I see on the App Store are a few sketchy looking extensions I’ve never heard of.

    • amelius a day ago

      Can't a JS engine keep track of what data comes from the canvas (directly and indirectly), and refuse to send that back to the website?
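      Taint-tracking canvas reads through a JS engine is hard in practice; what anti-fingerprinting extensions such as CanvasBlocker (and PaleMoon's canvas.poisondata setting) do instead is poison the data: add imperceptible, session-specific noise to whatever leaves the canvas. A minimal sketch of the idea — the function names and the 50% flip rate are my own illustration, not any extension's actual code:

```javascript
// Tiny seeded PRNG (mulberry32) so the noise is stable within a session.
function mulberry32(seed) {
  return function () {
    seed |= 0; seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Flip the lowest bit of roughly half the RGB channel values,
// leaving alpha alone: an imperceptible +/-1 change per channel.
function poisonPixels(data, seed) {
  const rand = mulberry32(seed);
  const out = Uint8ClampedArray.from(data);
  for (let i = 0; i < out.length; i++) {
    if (i % 4 === 3) continue;     // skip the alpha channel
    if (rand() < 0.5) out[i] ^= 1; // flip the low bit
  }
  return out;
}

// In a real extension this would wrap the DOM APIs, roughly:
//   const orig = CanvasRenderingContext2D.prototype.getImageData;
//   CanvasRenderingContext2D.prototype.getImageData = function (...args) {
//     const img = orig.apply(this, args);
//     img.data.set(poisonPixels(img.data, sessionSeed));
//     return img;
//   };
```

      Because the noise is seeded per session, a site still gets a stable hash, but it is meaningless: a different session (or browser) produces a different one.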

  • frozenlettuce 2 days ago

    This is a pain point when making browser games, as custom fonts break way too easily in some environments.

    • BrutalCoding 2 days ago

      I’m not a game dev, but I have occasionally made small throwaway projects with UE/Unity/GD. When you make a game for the web with these, I’d assume it renders to a canvas.

      That’s why I’m thinking that custom fonts in a browser game are nothing more than pixels for the browser to draw.

      I’m doing pretty much everything with Flutter now, even trying to build 2D games with it, and unless I’m mistaken it all renders to a canvas, which has its pros and cons; but in your case that would have made fonts a non-issue, or did I misunderstand?

  • chvid a day ago

    I am on Safari, and if I open two private windows with this website it will give me two different signature values.

    Chrome, on the other hand, will give me two identical values.

    So I guess Apple is doing something ...

  • iforgotmysocks 2 days ago

    Currently use Safari and there's no protection against canvas fingerprinting :(

    https://webkit.org/tracking-prevention/#anti-fingerprinting

    • galad87 2 days ago

      There is, try to load the website again in a new window or tab, it will show a different fingerprint.

  • ruthmarx 2 days ago

    What I find more interesting is how some sites can detect your OS even if you block JS and change the user agent to indicate something different. I assume it's checking for known fonts or something similar.

    • trod123 a day ago

      Fonts are the low hanging fruit. More sophisticated servers run a whole battery of hardware fingerprinting tests. It runs deep.

      If the device has been powered on for a certain period (usually a few minutes), the voltage normalizes and you get a unique clock-skew signature based on the defects of the silicon. Each enumerable device reachable from various JS API calls (or potential zero-days) adds another data point for uniqueness.

      Passive listening to local network traffic headers provides a metadata topology of nearby devices that can often be cross-referenced (cable modems often collect this info, as do other embedded devices).

      It's a strategy called building a bridge. The profile starts off as nothing but a unique identifier; you start from the device on one end and the endpoint on the other, and meet somewhere in the middle, backfilling information as you go. No personal info needed upfront.

      CSS :visited link styling is another avenue for fingerprinting. It violates same-origin policy, but there was a PoC back in 2021 where you could generate picture squares identical to a captcha, asking for a specific picture or puzzle whose tiles were tied to the :visited styling (thus submitting beacons of your browser history, in its entirety, to that site). Think it was varun.ch?

    • Nadya 2 days ago

      That's a good assumption because that's exactly one way it is done. :)

  • bhouston 2 days ago

    I am confused.

    You can already ask the browser what OS, CPU, device and browser it is without fingerprinting the canvas. You can get the version of the browser, and OS as well.

    I use the popular UA-Parser-JS library, works like a charm: https://www.npmjs.com/package/ua-parser-js. (I use it for https://web3dsurvey.com.)

    (Also the WebGL and WebGPU APIs will also tell you the GPU hardware you have.)
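    For reference, the WebGL query looks roughly like this; `WEBGL_debug_renderer_info` is a real extension, though some browsers now return a genericized renderer string or refuse the extension entirely:

```javascript
// Read the unmasked GPU vendor/renderer strings from a WebGL context.
// Returns null when the browser withholds the debug extension.
function gpuInfo(gl) {
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  if (!ext) return null;
  return {
    vendor: gl.getParameter(ext.UNMASKED_VENDOR_WEBGL),
    renderer: gl.getParameter(ext.UNMASKED_RENDERER_WEBGL),
  };
}

// Browser usage (needs a DOM):
//   const gl = document.createElement("canvas").getContext("webgl");
//   console.log(gpuInfo(gl)); // e.g. { vendor: "...", renderer: "..." }
```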

    • 0points 2 days ago

      > I am confused.

      > You can already ask the browser what OS, CPU, device and browser it is without fingerprinting the canvas. You can get the version of the browser, and OS as well.

      Fingerprinting is not about detecting your user agent, it is about detecting YOU in a sea of otherwise identical user agent strings.

    • efilife 2 days ago

      Canvases can vary very slightly between users' setups. The inconsistencies in rendering are used to fingerprint.

      And to my knowledge you can't ask the browser about the CPU, only the number of cores. The regular fingerprinting you described would be just a user agent string that can be trivially spoofed.
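      A minimal sketch of the technique, assuming the usual shape of these scripts (draw text and shapes, serialize the pixels, hash the result); the drawing choices and the hash are illustrative, not any specific tracker's code:

```javascript
// 32-bit FNV-1a hash over a string; fingerprinting scripts typically
// hash the canvas's data URL rather than shipping the whole image.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16).padStart(8, "0");
}

// Browser-side part (needs a DOM). Tiny rendering differences in fonts,
// anti-aliasing, GPU, and driver change the resulting hash.
function canvasFingerprint() {
  const c = document.createElement("canvas");
  c.width = 200; c.height = 50;
  const ctx = c.getContext("2d");
  ctx.textBaseline = "top";
  ctx.font = "14px 'Arial'";
  ctx.fillStyle = "#f60";
  ctx.fillRect(10, 5, 100, 30);                 // colored rectangle
  ctx.fillStyle = "#069";
  ctx.fillText("canvas fingerprint test", 2, 15); // text exercises font rendering
  return fnv1a(c.toDataURL());                   // hash of the rendered pixels
}
```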

      • bhouston 2 days ago

        > Canvases can vary very slighly between users' setups. The inconsistencies in rendering are used to fingerprint

        I think it overlaps mostly with OS, CPU, browser, GPU type, and GPU driver version. These can already be queried via the UA string and WebGL/WebGPU. It is easier to query them explicitly for most users, but I guess canvas fingerprinting is a fallback for when those APIs do not return some of the data.

        > And to my knowledge you can't ask the browser about the cpu, only the number of cores.

          Generally you can. Not all browsers give up all the info, but most browsers give up most of it. Here is a demo of what you get from UA parsing: https://uaparser.dev/#demo

        • IggleSniggle 2 days ago

          Years ago you could get crazy-specific fingerprints from canvas. It became far less precise when Chrome introduced jitter into timers and treated performance timers as security-sensitive functions. The main difference between canvas and this other stuff is that canvas couldn't be spoofed, whereas everything else could (and can) be spoofed, or was so general as to be useless as an identifier.

          • trod123 a day ago

            That's not true.

            The jitter had very little of its intended effect, since accurate system timing can be derived from the instruction processing time of arbitrary JavaScript code in a mostly browser-agnostic way, specifically iteration (i = i + 1). This applied broadly to a large number of different instruction types.
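            The counter-technique being described can be sketched like this (illustrative, not any tracker's actual code): instead of trusting the coarse clock, count how many trivial iterations fit between two ticks of it; the count itself is a fine-grained, hardware-dependent measurement.

```javascript
// Even if performance.now()/Date.now() is coarsened or jittered, the
// number of (i = i + 1) iterations completed within one tick of the
// coarse clock acts as a higher-resolution, hardware-dependent timer.
function iterationsPerTick(now = Date.now) {
  const start = now();
  let t = start;
  while (t === start) t = now();    // busy-wait for the next clock edge
  const edge = t;
  let i = 0;
  while (now() === edge) i = i + 1; // count work done inside one tick
  return i;                          // proxy for CPU speed / clock skew
}
```

            Run repeatedly, the distribution of these counts is stable per machine, which is what makes it useful as a fingerprinting signal.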

          • Jerrrry a day ago

            The granularity of the performance timers was too fine, allowing the number of cores to be deduced from the amount of time it took to do certain math functions.

            Those math functions happen to be floating-point operations, ordered in a way that maximizes the inaccuracy inherent in floating-point types.

            These inaccuracies are very correlatable.

  • jagged-chisel a day ago

    “It's very likely that your web browser is Google Search App …”

    It’s not. Do I win?

  • krunck 2 days ago

    My Brave browser seems to be 100% unique.

    • Beijinger 2 days ago

      It should give a fingerprint. Try to reload in another tab and see if you stay unique and if the fingerprint changes. I am also unique, but I always stay unique ;-)

  • atum47 2 days ago

    Apple once disabled canvas for their devices for this reason and broke all my apps, haha.

    80% of my projects use canvas

  • DeathArrow a day ago

    >It's very likely that your web browser is Chrome and your operating system is Android.

    Android guess is correct. But the browser is Edge, not Chrome.

  • wwwtyro 2 days ago

    I know my priorities are questionable, but I'm more annoyed that I can't expect consistent canvas rendering.

    • a day ago
      [deleted]
  • a day ago
    [deleted]