wuphysics87@lemmy.ml to Privacy@lemmy.ml · 1 month ago
Can you trust locally run LLMs?
I’ve been playing around with Ollama. Given that you download the model, can you trust that it isn’t sending telemetry?
marcie (she/her)@lemmy.ml · 1 month ago
Yeah, you could. Though I don’t see any evidence that the large open-source LLM programs like jan.ai or Ollama are doing anything wrong with their programs or files. Chucking it in a sandbox would solve the problem for good, though.
SeekPie@lemm.ee · 1 month ago (edited)
You could use the “Alpaca” Flatpak and remove its internet access with Flatseal after having downloaded the model. (Linux)
Or deny the app’s internet access in the app settings. (Android)
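The Flatseal step above can also be done from the command line with `flatpak override`, which Flatseal wraps. A minimal sketch, assuming Alpaca’s Flatpak app ID is `com.jeffser.Alpaca` (confirm yours with `flatpak list` first):

```shell
# List installed Flatpak apps to confirm the app ID
# (com.jeffser.Alpaca is an assumption; use whatever the list shows)
flatpak list --app

# Cut the app off from the network, for your user only
flatpak override --user --unshare=network com.jeffser.Alpaca

# Verify the override took effect
flatpak override --user --show com.jeffser.Alpaca

# Restore network access later, e.g. to download another model
flatpak override --user --share=network com.jeffser.Alpaca
```

Since Flatpak enforces this at the sandbox level, the app itself cannot opt back into network access, which is stronger than any in-app telemetry toggle.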