Sahwa@reddthat.com to Technology@lemmy.world · English · 5 days ago
Microsoft 365's buggy Copilot 'Chat' has been summarizing confidential emails for a month — yet another AI privacy nightmare (www.windowscentral.com)
20 comments · 414 upvotes · 2 downvotes
dgdft@lemmy.world · 8 points · edited, 5 days ago
Your general understanding is entirely correct, but: Microsoft is almost certainly recording these summarization requests for QA and future training runs; that's where the leakage would happen.
TRBoom@lemmy.zip · 9 points · 5 days ago
100% agree. At this point I am assuming everything sent through their servers is actively being collected for LLM training.