LLaMA: Open and Efficient Foundation Language Models (Paper Explained)
#ai #meta #languagemodel
LLaMA is a series of large language models ranging from 7B to 65B parameters, trained by Meta AI. By training for longer on more data, the authors show that significantly smaller models can outperform GPT-3. Meta also releases the trained models to the research community.
OUTLINE:
0:00 - Introduction & Paper Overview
4:30 - Rant on Open-Sourcing
8:05 - Training Data
12:40 - Training Hyperparameters
14:50 - Architecture Modifications
17:10 - Optimizer
19:40 - Efficient Implementation
26:15 - Main Results
38:00 - Some more completions
40:00 - Conclusion
Paper: https://arxiv.org/abs/2302.13971
Website: https://ai.facebook.com/blog/large-language-model-llama-meta-ai/
Abstract:
We introduce LLaMA, a collection of foundation language models ranging from 7B to 65B parameters. We train our models on trillions of tokens, and show that it is possible to train state-of-the-art models using publicly available datasets exclusively, without resorting to proprietary and inaccessible datasets. In particular, LLaMA-13B outperforms GPT-3 (175B) on most benchmarks, and LLaMA-65B is competitive with the best models, Chinchilla-70B and PaLM-540B. We release all our models to the research community.
Authors: Hugo Touvron, Thibaut Lavril, Gautier Izacard, Xavier Martinet, Marie-Anne Lachaux, Timothée Lacroix, Baptiste Rozière, Naman Goyal, Eric Hambro, Faisal Azhar, Aurelien Rodriguez, Armand Joulin, Edouard Grave, Guillaume Lample
Links:
Homepage: https://ykilcher.com
Merch: https://ykilcher.com/merch
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ykilcher.com/discord
LinkedIn: https://www.linkedin.com/in/ykilcher
If you want to support me, the best thing to do is to share out the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannickilcher
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n