LLaMA
>LLaMA is available in four sizes: 7B, 13B, 33B, and 65B parameters.
>The 13B LLaMA outperforms GPT-3, and the 65B model is competitive with Google DeepMind's Chinchilla 70B and Google's PaLM 540B.

>LLaMA (Large Language Model Meta AI), a state-of-the-art foundational large language model designed to help researchers advance their work in this subfield of AI.
AI基素
>Smaller, more performant models such as LLaMA enable others in the research community who don't have access to large amounts of infrastructure to study these models, further democratizing access in this important, fast-changing field.
Small enough to run on a consumer PC 基素
>Only publicly available data (CCNet, C4, GitHub, Wikipedia, Books, ArXiv, Stack Exchange) was used to train LLaMA.
(Approximate VRAM needed to run each model; a rough back-of-envelope estimate follows the table)
model  VRAM  GPU
7B     10GB  3060 / 3080 / 3090
13B    20GB  3090 Ti / 4090
33B    30GB  GV100
65B    40GB  A6000 / A100
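These numbers are best read as rough, quantization-dependent estimates. As a sanity check (my own back-of-envelope calculation, not from this page's sources), the weights alone need roughly parameter count times bytes per parameter, plus headroom for activations and the KV cache:

```python
# Back-of-envelope estimate of the VRAM needed just to hold LLaMA weights.
# The precisions below are assumptions for illustration, not from this page.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_gib(params_billions: float, precision: str) -> float:
    """Approximate GiB required to store the weights alone."""
    return params_billions * 1e9 * BYTES_PER_PARAM[precision] / 1024**3

for size in (7, 13, 33, 65):
    row = ", ".join(f"{p}: ~{weight_gib(size, p):.0f} GiB" for p in BYTES_PER_PARAM)
    print(f"LLaMA-{size}B  {row}")
```

At fp16 even the 7B weights alone are about 13 GiB, so the 10GB figure in the table presumably assumes 8-bit (or lower) quantization.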

> Trained only on publicly available data (C4, Wikipedia, etc., covering 20 languages). Developed by META AI (FAIR).
> Architecture: a decoder-only Transformer. Like PaLM, it replaces ReLU with the SwiGLU activation; like GPT-NeoX, it uses RoPE (rotary position embeddings); attention is computed with xformers' memory_efficient_attention.
https://toukei-lab.com/llama (an explainer of Meta's LLMs LLaMA and LLaMA 2, and of Alpaca)
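A minimal PyTorch sketch (not Meta's code) of two of the pieces named in the quote: the SwiGLU feed-forward block and rotary position embeddings. The tensor layout and the pair-interleaving convention are my own assumptions; real implementations differ in such details.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwiGLUFeedForward(nn.Module):
    """Feed-forward block using SwiGLU instead of ReLU: w2(SiLU(x w1) * (x w3))."""
    def __init__(self, dim: int, hidden_dim: int):
        super().__init__()
        self.w1 = nn.Linear(dim, hidden_dim, bias=False)  # gate projection
        self.w3 = nn.Linear(dim, hidden_dim, bias=False)  # value projection
        self.w2 = nn.Linear(hidden_dim, dim, bias=False)  # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.w2(F.silu(self.w1(x)) * self.w3(x))

def apply_rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary position embeddings to queries or keys.

    x: (batch, seq_len, n_heads, head_dim), head_dim must be even.
    Adjacent channel pairs are rotated by a position-dependent angle.
    """
    b, t, h, d = x.shape
    freqs = 1.0 / (base ** (torch.arange(0, d, 2, dtype=torch.float32) / d))
    angles = torch.arange(t, dtype=torch.float32)[:, None] * freqs[None, :]  # (t, d/2)
    cos = angles.cos()[None, :, None, :]
    sin = angles.sin()[None, :, None, :]
    x1, x2 = x[..., 0::2], x[..., 1::2]
    rotated = torch.stack((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim=-1)
    return rotated.flatten(-2)  # back to (batch, seq_len, n_heads, head_dim)
```

After RoPE is applied to the queries and keys, the attention itself can be computed with xformers.ops.memory_efficient_attention, as the quoted description mentions.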

Obtaining the weights: Google (request form) or torrent
The weights are not openly published 基素
magnet:?xt=urn:btih:b8287ebfa04f879b048d4d4404108cf3e8014352&dn=LLaMA
Used Motrix to download the torrent (a scriptable alternative is sketched after the links).
magnet:?xt=urn:btih:ZXXDAUWYLRUXXBHUYEMS6Q5CE5WA3LVA&dn=LLaMA
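For reference, a scriptable alternative to a GUI client like Motrix: a minimal sketch using the python libtorrent bindings (assuming the libtorrent package is installed; any BitTorrent client does the same job).

```python
# Minimal magnet-link download loop with the python libtorrent bindings.
import time
import libtorrent as lt

MAGNET = "magnet:?xt=..."       # paste one of the magnet links above
SAVE_PATH = "./llama-weights"   # where to store the downloaded files

ses = lt.session()
params = lt.parse_magnet_uri(MAGNET)
params.save_path = SAVE_PATH
handle = ses.add_torrent(params)

# Poll until the torrent finishes downloading and starts seeding.
while not handle.status().is_seeding:
    s = handle.status()
    print(f"{s.progress * 100:5.1f}%  down {s.download_rate / 1000:.0f} kB/s  peers {s.num_peers}")
    time.sleep(5)
print("download complete")
```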