Pro@programming.dev to Technology@lemmy.world · English · 5 days ago
Google quietly released an app that lets you download and run AI models locally (github.com)
Greg Clarke@lemmy.ca · 4 days ago
Has this actually been done? If so, I assume it would only be able to use the CPU.
Euphoma@lemmy.ml · 4 days ago
Yeah, I have it running in Termux. Ollama is in the Termux package repos. The generation speed feels like CPU speed, but I don't know for sure.
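A minimal sketch of the setup described above, assuming you have Termux installed on Android and that the `ollama` package is available in its repos (as the comment states); the model name is an illustrative placeholder, not something from the thread:

```shell
# Inside Termux on Android (sketch; assumes the ollama package exists in the repos)
pkg update
pkg install ollama

# Start the Ollama server in the background
ollama serve &

# Pull and chat with a small model (model name is a hypothetical example;
# pick something small enough for phone RAM)
ollama run gemma3:1b
```

Without GPU or NPU acceleration exposed to Termux, inference like this runs on the CPU, which matches the speed the commenter observed.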