Phi-3
Microsoft's family of small language models (SLMs) delivering strong reasoning performance on local devices and edge hardware.
Phi-3-mini packs 3.8 billion parameters into a footprint small enough for local deployment on an iPhone 14. Trained on 3.3 trillion tokens of high-quality synthetic data and heavily filtered web content, it rivals much larger models such as Mixtral 8x7B on benchmarks for coding and logic. The family also includes Phi-3-small (7B) and Phi-3-medium (14B) variants, giving developers low-latency options for complex tasks without the massive compute requirements of traditional LLMs.
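A back-of-the-envelope sketch of why a 3.8B-parameter model fits on a phone. The figures below assume 16-bit weights for the unquantized case and 4-bit quantization for on-device deployment; the helper function and its name are illustrative, not part of any Phi-3 tooling.

```python
def weight_footprint_gb(n_params: float, bits_per_param: int) -> float:
    """Approximate size of the model weights in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

# Phi-3-mini: 3.8 billion parameters (assumed precisions, for illustration)
print(weight_footprint_gb(3.8e9, 16))  # 16-bit weights: ~7.6 GB, too large for most phones
print(weight_footprint_gb(3.8e9, 4))   # 4-bit quantized: ~1.9 GB, fits in phone memory
```

The roughly 4x reduction from quantization is what brings the weights down to a size a modern smartphone can hold in memory alongside the OS and other apps.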