The latest developments in Small Language Models (Phi-3, Gemini Nano, etc.), the omni-modal GPT-4o update, and the specialized on-device chipsets from Apple, Google, and Microsoft all point to a trend toward hybrid model architectures: locally executed AI workloads interfacing with cloud-hosted models. Here's a market analysis where I make a bull and a bear case for on-device assistants.
https://sidstage.substack.com/p/beyond-the-cloud-distributed-ai-and
Feel free to DM me if you have comments or feedback, or just want to chat more about this topic. Always happy to exchange insights.