ModelBest: Strong in "Edge AI"

ModelBest (面壁智能), an AI startup founded in 2022 as a spin-off from Tsinghua University's Natural Language Processing (NLP) Laboratory, specializes in developing large language models (LLMs). Under the concept of "big performance from small models," the company's particular strength is "edge AI" (on-device AI), which runs on smartphones, PCs, automotive chips, and other devices without relying on the cloud. Guided by its proprietary "knowledge density law," ModelBest has developed the MiniCPM series, which delivers high performance despite its compact model size. Through strategic partnerships with China Telecom, automakers, and others, the company is rapidly advancing mass-production deployment in real-world sectors such as judicial services and autonomous driving.

I asked generative AI to research ModelBest. Please note that the results of the generative AI's research and analysis are based solely on publicly available information, may not necessarily reflect the actual situation, and may contain inaccuracies. Please keep this in mind when referring to the material.

Source: "China-born LLM 'ModelBest' Raises ¥23 Billion in Three Months: Making Its Presence Felt with Cloud-Free 'Edge AI' Running on Devices," May 4 (Mon.), 36Kr Japan via Yahoo! News Japan
https://news.yahoo.co.jp/articles/732f248eed796e0dbb0360a1e311de98163fde96#:~:text=%E3%83%97%E3%83%A9%E3%82%A4%E3%83%90%E3%82%B7%E3%83%BC-,%E4%B8%AD%E5%9B%BD%E7%99%BALLM%E3%80%8CModelBest%E3%80%8D%E3%80%813%E3%82%AB%E6%9C%88%E3%81%A7230%E5%84%84,36Kr%20Japan%E7%B7%A8%E9%9B%86%E9%83%A8%EF%BC%89
Author: 萬秀憲