35 comments

  • qup 2 hours ago

    Is this the correct use of "opt-in?"

    To me, having things "opt-in" means they're off and you can turn them on if you want.

    If it's "opt-out" it's automatically on, and you can turn it off.

    • elAhmo 2 hours ago

      Likewise, I think the title is literally the opposite of what is actually happening.

    • alt227 2 hours ago

      I think they mean 'Enabled by default'

      • mejutoco 2 hours ago

        Thus opt-out would be the correct term.

    • jyunwai 2 hours ago

      You are correct. The headline author likely meant "opted in by default" or "enabled by default."

  • Ukv an hour ago

    > Microsoft's Connected Experiences feature automatically gathers data from Word and Excel files to train the company's AI models. This feature is turned on by default, meaning user-generated content is included in AI training unless manually deactivated.

    Not to say that Microsoft products respect privacy, but I don't see evidence that user content is being used for training.

    The linked services agreement has had the same language (copy/transmit/etc. "to the extent necessary to provide the services") since at least 2015[0], and "connected experiences" seems to group a wide range of integrations; some like dictation/translation probably utilise ML, but that does not mean training on user content.

    [0]: https://web.archive.org/web/20150608000921/https://www.micro...

    • itishappy 15 minutes ago

      To play devil's advocate, I don't see any evidence they're NOT training on user content either. Compared to how explicitly they indicate they're not using user content for targeted advertising, this seems like a huge oversight. Given how carefully they've put together these documents, I'm doubtful it was an oversight.

    • ca_tech an hour ago

      Agreed. This was raised within our corp the other week, and we read through the privacy and security documentation as it relates to Connected Experiences. Microsoft has outlined specifically what Connected Experiences covers.[1][2] You could argue that predictive text is a product of machine learning, but there is no clause allowing the training of any generalized large language models on this data. The confusion may have arisen from an article about Copilot: if the user had a Microsoft 365 Copilot license, the data would be used as grounding for their personal interactions with Copilot, but still not to train any foundational LLMs. Even then, that data is managed in compliance with Microsoft's data security and privacy agreements.

      [1] https://learn.microsoft.com/en-us/microsoft-365-apps/privacy...

      [2] https://learn.microsoft.com/en-us/microsoft-365-apps/privacy...

  • alt227 2 hours ago

    This seems like a security shit show.

    Can we disable it by group policy across entire domains?

    Surely no business would ever allow Microsoft to 'reformat, display, and distribute' confidential company documents?

    Or am I missing something?
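
    For what it's worth, Microsoft does document Group Policy / registry controls for connected experiences in its privacy-controls docs for Microsoft 365 Apps. A sketch of the per-user registry values those policies map to is below; the value names are from my reading of the docs, so verify them against your current ADMX templates before rolling anything out domain-wide (2 = disabled, 1 = enabled):

    ```shell
    # Hedged sketch, not a verified deployment script: per-user policy hive
    # behind the Office "connected experiences" privacy settings.
    KEY='HKCU\Software\Policies\Microsoft\office\16.0\common\privacy'

    # Optional connected experiences (the toggle the article describes)
    reg add "$KEY" /v controllerconnectedservicesenabled /t REG_DWORD /d 2 /f

    # Connected experiences that analyze your content
    reg add "$KEY" /v usercontentdisabled /t REG_DWORD /d 2 /f

    # Connected experiences that download online content
    reg add "$KEY" /v downloadcontentdisabled /t REG_DWORD /d 2 /f

    # All connected experiences
    reg add "$KEY" /v disconnectedstate /t REG_DWORD /d 2 /f
    ```

    In a domain you'd push the equivalent settings through the Office ADMX policy templates (under Privacy / Trust Center) rather than raw reg adds.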

    • Thorrez an hour ago

      Well, if there's some sort of cloud feature allowing you to share documents you write with others, it would make sense that you would have to allow Microsoft to "reformat, display, and distribute" for the purpose of providing you that service.

      However, the terms of service say "To the extent necessary to provide the Services to you and others, [...] and to improve Microsoft products and services". So they're saying they can use your content not just to provide you service, but to provide other people service and to improve all Microsoft products.

      • HPsquared 10 minutes ago

        The word 'necessary' can do a lot of heavy lifting.

  • HelloUsername an hour ago

    "In the M365 apps, we do not use customer data to train LLMs. This setting only enables features requiring internet access like co-authoring a document." @Microsoft365 https://twitter.com/Microsoft365/status/1861160874993463648

  • tjqgG 2 hours ago

    A word processor stealing the user's IP by default should carry massive fines in the EU. This is pure deception. 20% of annual revenue should be appropriate.

    • jmclnx 33 minutes ago

      Hopefully full pretax revenue for Microsoft and all their subsidiaries.

  • orev an hour ago

    Title as of the time of this comment:

    > Microsoft Word and Excel AI data scraping slyly switched to enabled by default — the opt-out toggle is not that easy to find

    As a tech person, keeping up with disabling and avoiding all this is becoming exhausting. I can’t imagine any regular non-tech person having any chance at avoiding it.

    Is it time to just give up? At what point do you have to accept that the tsunami is here and there’s nothing you can do about it?

    • greentxt 25 minutes ago

      >At what point do you have to accept that the tsunami is here and there’s nothing you can do about it?

      Around the late 2000s, but maybe it was earlier. The best time to buy MSFT stock is always right now.

    • squigz 44 minutes ago

      The solution isn't to give up or attempt to avoid it - it's to make this sort of thing illegal.

  • jmclnx 34 minutes ago

    >To do so, users must actively opt out by finding and disabling the feature in settings

    Odd. So, let's say I wrote an article that is copyrighted and published on some newspaper web page. If I understand this correctly, in theory I need to find everyone who uses this version of Word and tell them to disable this feature?

    If so, looks to me the lawyers are going to have a great time with this and will clog the courts for centuries.

  • protoster an hour ago

    The linked "Services Agreement" doesn't appear to be specific to this "Connected Experiences" thing; it's the basic agreement required to use any MS software. Correct me if I'm wrong here, but opting out of this won't prevent MS from having a license to all Your Content?

  • daft_pink 2 hours ago

    Microsoft = Spyware

    • cheschire an hour ago

      Most tech these days seems to fall into that classification.

      There aren't many pieces of technology these days that intentionally avoid collecting your data so it can be sold to another company.

  • robin_reala 2 hours ago

    I just checked and this is turned off in my installation, but I’m not sure whether that’s from being EU-based or because my org has disabled it.

  • Aaargh20318 an hour ago

    This would certainly be the cause of lots of GDPR violations, considering the kinds of information processed in Word and Excel. I know our condo's owners association keeps contact information of their members in Excel sheets, that's considered PII. It can also contain sensitive information like who is behind on their monthly contributions and by how much.

    That's just the first thing I thought of. There must be tons of companies and organisations processing sensitive data in Word and Excel. What about doctors' offices and insurance companies handling medical information? What about banks, financial advisors, lawyers, etc.?

  • mschuster91 2 hours ago

    > "To the extent necessary to provide the Services to you and others, to protect you and the Services, and to improve Microsoft products and services, you grant to Microsoft a worldwide and royalty-free intellectual property license to use Your Content, for example, to make copies of, retain, transmit, reformat, display, and distribute via communication tools Your Content on the Services," the clause reads.

    Well, this does make sense in the context of Office 365, OneDrive and the Office web apps in general. (Still dodgy regarding the "worldwide" part but there's no way around that because people can and do expect to access their stuff even while on vacation)

    Silently enabling the training of remote AI however? That's not covered under any reasonable interpretation of the above legalese.

    • jagged-chisel 2 hours ago

      >… intellectual property license to use Your Content

      Seems clear to me. Use any way Microsoft wants. The “for example” list is not exhaustive nor limiting.

      • genrilz 2 hours ago

        IANAL again, but I don't think they get to do literally anything with your data. The phrase used is "to the extent necessary". For instance, I don't think they could scrape their user data for trade secrets and then sell those to the highest bidder.

        • jagged-chisel an hour ago

          Who defines “necessary?” Use of Your Content is Necessary to support Microsoft’s business activities, including, but not limited to, training their AI.

        • ada1981 2 hours ago

          Why not? Isn’t that the essential ethos Microsoft was founded on?

          • genrilz an hour ago

            Because they boxed themselves in with legalese. Companies would definitely switch away from Microsoft services if at all possible if their lawyers thought their trade secrets were being sold off. So I think the "as necessary" framing does probably prevent them from doing some things.

            As I laid out in my other comment, I think training AI in particular is covered under the "improving Microsoft products or services" bit of legalese. I do wonder how companies' lawyers will respond to this, though. They probably thought of that phrase as just allowing Microsoft employees to access documents to see how Word or other pieces of software were being used, or to fix crashes, etc.

          • cudgy 2 hours ago

            I thought it was founded on Bill Gates’s mommy having strong connections to IBM that allowed little Bill to keep the rights to the source code they paid him to write. And the privileged position of having access to a computer at his school when 99.9% of the population did not.

        • jasonjayr 2 hours ago

          "The funds from the bidder will be invested in to products in order to make a better user experience" /s

          • cudgy 2 hours ago

            Reminds me of “this call will be used for training and quality purposes.”

    • genrilz 2 hours ago

      IANAL, but I think the "to improve Microsoft products and services" bit does mean that they do legally get to train their AI (which is a Microsoft service) on your data. Still a bastard move though.

  • lousken an hour ago

    Servers are already on Debian; client PCs are all that's left.

  • formerly_proven 2 hours ago

    Does this circumvent Azure Information Protection policies as well? Would be fucking hilarious if it did.