Pro@programming.dev to Technology@lemmy.world · English · 1 month ago
Google quietly released an app that lets you download and run AI models locally (github.com)
You can use it in Termux.

Greg Clarke@lemmy.ca · English · 1 month ago
Has this actually been done? If so, I assume it would only be able to use the CPU.

Euphoma@lemmy.ml · English · 1 month ago
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates at does feel like CPU speed, but I'm not sure.
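For reference, the Termux route the commenters describe can also be scripted against. Below is a minimal Python sketch, assuming Ollama has been installed from the Termux package repos (e.g. `pkg install ollama`), is serving on its default port 11434 (e.g. via `ollama serve`), and that a small model has already been pulled; the model name `gemma:2b` is just a placeholder.

```python
import json
import urllib.request

# Assumes an Ollama server is already running locally (e.g. `ollama serve`
# inside Termux) on its default port. The model name is a placeholder and
# must already have been pulled with `ollama pull`.
OLLAMA_URL = "http://127.0.0.1:11434/api/generate"
MODEL = "gemma:2b"  # placeholder -- substitute whatever model you pulled


def generate(prompt: str) -> str:
    """Send one non-streaming generation request to the local Ollama API."""
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # return a single JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["response"]


if __name__ == "__main__":
    print(generate("In one sentence, what is Termux?"))
```

On most phones this will be CPU-bound, which lines up with the "feels like CPU speed" observation in the thread.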