WebLLM: A High-Performance In-Browser LLM Inference Engine
https://blog.mlc.ai/2024/06/13/webllm-a-high-performance-in-browser-llm-inference-engine
MLC
Project
https://scrapbox.io/files/666c716c0314be001d11546b.png
Looks like it has an OpenAI-compatible API
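A minimal sketch of what that OpenAI-style API looks like, based on the `@mlc-ai/web-llm` package; the model ID is one example from WebLLM's prebuilt list, and this only runs in a WebGPU-capable browser:

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Downloads and compiles the model in the browser (cached after first run).
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");

  // Same request shape as OpenAI's chat.completions endpoint.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(reply.choices[0].message.content);
}

main();
```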
#WebLLM