# 02-general-community-chat
Radhika Malik
05/07/2024, 12:13 AM
Anyone working on running models on the device itself (mobile mainly, but also laptops etc)? I'm trying to learn more about the space. TIA!
Anusheel Bhushan
05/07/2024, 12:26 AM
https://ollama.ai/
is a good starting point
✅ 1
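For anyone trying Ollama after installing it, here is a minimal sketch of what a request to its local REST API looks like (this assumes the default server at localhost:11434 and a model you have already pulled, e.g. `llama3`; the model name is illustrative):

```python
import json

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's local /api/generate endpoint.

    "stream": False asks for one complete JSON response instead of a
    token-by-token stream. The server itself defaults to
    http://localhost:11434, so the equivalent shell call would be:
      curl http://localhost:11434/api/generate -d '<this JSON>'
    """
    return {"model": model, "prompt": prompt, "stream": False}

# Build and print the request body (no server needed to run this part).
body = build_generate_request("llama3", "Why is the sky blue?")
print(json.dumps(body))
```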
Barrett Williams
05/07/2024, 12:32 AM
Also
https://github.com/oobabooga/text-generation-webui
and
https://github.com/huggingface/text-generation-inference
though they're really designed for servers. I think there are forks for low-mem variants for laptops
Anusheel Bhushan
05/07/2024, 2:34 AM
https://lmstudio.ai
is another option
Sandeep
06/02/2024, 9:13 AM
@Radhika Malik
Is it LLMs or other kinds of models? For mobile and embedded devices, check out
tinyml.org