"In the M365 apps, we do not use customer data to train LLMs. This setting only enables features requiring internet access like co-authoring a document." -@Microsoft365
They wouldn't have to combat misinformation if they actually told users what the switches turned on/off instead of vague descriptions that could plausibly allow them to do whatever the hell they want.
I'd imagine mostly to the productivity of the sloppers. Considering this looks like a Windows Office-wide default opt-in, I feel you'd have difficulty hitting 1% userbase.
Microsoft refuted this claim: https://x.com/Microsoft365/status/1861160874993463648
"In the M365 apps, we do not use customer data to train LLMs. This setting only enables features requiring internet access like co-authoring a document." -@Microsoft365
They wouldn't have to combat misinformation if they actually told users what the switches turned on/off instead of vague descriptions that could plausibly allow them to do whatever the hell they want.
Yeah, MS collecting your personal data and using it for whatever they want sounds exactly like a "feature requiring internet access" to me.
your document has gained the following co-authors:
- microsoft
- our 856 trusted third parties
File>Options>Trust Centre (left panel)>Trust Centre Settings (button)>Privacy Options (left panel)>Privacy Settings (button)>uncheck "Turn on optional connected experiences"
Wow
I don’t have access to any Windows machines. Otherwise, I’d be tempted to turn this on and pump it full of Markov-chain generated slop.
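For anyone who does want to try: a word-level Markov chain is only a few lines of Python. This is a toy sketch (corpus, order, and seed are all made up for illustration), but it produces exactly the kind of locally-plausible, globally-meaningless text I mean:

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each `order`-word tuple to the list of words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=50, seed=None):
    """Random-walk the chain to emit plausible-looking nonsense."""
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    while len(out) < length:
        followers = chain.get(state)
        if not followers:  # dead end: restart from a random state
            state = rng.choice(list(chain))
            followers = chain[state]
        out.append(rng.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)

if __name__ == "__main__":
    corpus = ("the quick brown fox jumps over the lazy dog and "
              "the lazy dog sleeps while the quick brown fox runs")
    print(generate(build_chain(corpus, order=2), length=20, seed=42))
```

Paste enough of that into documents with "connected experiences" on and the text is statistically ordinary word-for-word while being worthless as training signal.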
How big would the damage be if a few percent of their userbase did this?
This wouldn't have an impact.
There would inevitably be a classifier that acts as a filter pre-training to identify slop and ignore it.
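Nobody outside the lab knows what that filter actually looks like, but even a crude heuristic version catches naive Markov slop — repetitive vocabulary is a dead giveaway. A toy sketch (the thresholds are illustrative guesses, not values from any real pipeline):

```python
def looks_like_slop(text, min_unique_ratio=0.4, max_top_word_share=0.1):
    """Crude quality heuristics of the kind used to filter pretraining data:
    low vocabulary diversity, or a single word dominating the document, are
    both strong signals of generated or degenerate text."""
    words = text.lower().split()
    if len(words) < 20:
        return True  # too short to be a useful training document
    unique_ratio = len(set(words)) / len(words)
    top_share = max(words.count(w) for w in set(words)) / len(words)
    return unique_ratio < min_unique_ratio or top_share > max_top_word_share
```

A low-order Markov chain recycles the same few n-grams, so its output fails the diversity check almost immediately; you'd need much more effort to generate poison that survives filtering.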
I'd imagine the damage would mostly be to the productivity of the sloppers themselves. Considering this looks like an Office-wide default opt-in on Windows, I doubt you'd get even 1% of the userbase to do it.
https://xkcd.com/743/
Microsoft is a powerful company; they consider themselves above the law. So why would they stop?