Microsoft Rule — abbiistabbii@lemmy.blahaj.zone to 196@lemmy.blahaj.zone · 7 months ago
https://arstechnica.com/gadgets/2024/05/microsofts-new-recall-feature-will-record-everything-you-do-on-your-pc/
henfredemars@infosec.pub · 7 months ago
I think it’s a cool idea in principle. I just don’t trust the company with my data even if they claim it is stored locally, but then again, that’s why I don’t use their OS.
rumschlumpel@lemmy.blahaj.zone · 7 months ago (edited)
Yeah, I’d totally use whatever FOSS equivalent eventually makes it to the Linux desktop once it’s actually usable.
drem@lemmy.world · 7 months ago (edited)
GPT4All is available on Linux; it is open-source software that can run LLMs locally.
areyouevenreal@lemm.ee · 7 months ago
That’s not an alternative to Microsoft Recall. It doesn’t take screenshots to record your activity.
drem@lemmy.world · 7 months ago
I agree, but it’s better than nothing. I’m sorry I misled you.
henfredemars@infosec.pub · 7 months ago
It might not be long if the idea works. We already have good local LLMs.