Zuckerberg says Meta will need 10x more computing power to train Llama 4 than Llama 3

Meta, which develops one of the biggest foundational open-source large language models, Llama, believes it will need significantly more computing power to train models in the future. Mark Zuckerberg said on Meta’s second-quarter earnings call on Tuesday that to train Llama 4 the company will need 10x more compute than what was needed to train […]
© 2024 TechCrunch. All rights reserved. For personal use only.
Source: TechCrunch
