# 02-general-community-chat
r
Anyone working on running models on the device itself (mobile mainly, but also laptops etc)? I'm trying to learn more about the space. TIA!
a
https://ollama.ai/ is a good starting point
b
Also https://github.com/oobabooga/text-generation-webui and https://github.com/huggingface/text-generation-inference, though those are really designed for servers. I think there are low-memory forks that work on laptops
a
https://lmstudio.ai is another option
s
@Radhika Malik is it LLMs or other kinds of models? For mobile and embedded devices, check out tinyml.org