Retentive Network: A Successor to Transformer for Large Language Models (Paper Explained)
#ai #retnet #transformers
Retention is an alternative to Attention in Transformers that can be written both in a parallel and in a recurrent fashion. This means the architecture achieves training parallelism while maintaining low-cost inference. Experiments in the paper look very promising.
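For a concrete sense of this duality, here is a minimal NumPy sketch (single head, ignoring the paper's xPos-style rotation and normalization; all names are illustrative, not from an official release) showing that the parallel and recurrent forms of retention give the same output:

import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 4    # sequence length, head dimension
gamma = 0.9    # per-head decay factor
Q = rng.standard_normal((T, d))
K = rng.standard_normal((T, d))
V = rng.standard_normal((T, d))

# Parallel (training) form: (Q K^T ⊙ D) V, where D[n, m] = gamma^(n-m)
# for n >= m and 0 otherwise.
n = np.arange(T)
D = np.where(n[:, None] >= n[None, :], gamma ** (n[:, None] - n[None, :]), 0.0)
parallel_out = (Q @ K.T * D) @ V

# Recurrent (inference) form: a single d-by-d state, O(1) per step.
# S_t = gamma * S_{t-1} + k_t^T v_t;  out_t = q_t S_t
S = np.zeros((d, d))
recurrent_out = np.zeros((T, d))
for t in range(T):
    S = gamma * S + np.outer(K[t], V[t])
    recurrent_out[t] = Q[t] @ S

assert np.allclose(parallel_out, recurrent_out)  # both forms agree

The training-time form is a dense matrix product like attention, fully parallel over the sequence, while the inference-time form only ever keeps the d-by-d state S, which is where the low-cost O(1) decoding comes from.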
OUTLINE:
0:00 - Intro
2:40 - The impossible triangle
6:55 - Parallel vs sequential
15:35 - Retention mechanism
21:00 - Chunkwise and multi-scale retention
24:10 - Comparison to other architectures
26:30 - Experimental evaluation
Paper: https://arxiv.org/abs/2307.08621
Abstract:
In this work, we propose Retentive Network (RetNet) as a foundation architecture for large language models, simultaneously achieving training parallelism, low-cost inference, and good performance. We theoretically derive the connection between recurrence and attention. Then we propose the retention mechanism for sequence modeling, which supports three computation paradigms, i.e., parallel, recurrent, and chunkwise recurrent. Specifically, the parallel representation allows for training parallelism. The recurrent representation enables low-cost O(1) inference, which improves decoding throughput, latency, and GPU memory without sacrificing performance. The chunkwise recurrent representation facilitates efficient long-sequence modeling with linear complexity, where each chunk is encoded parallelly while recurrently summarizing the chunks. Experimental results on language modeling show that RetNet achieves favorable scaling results, parallel training, low-cost deployment, and efficient inference. The intriguing properties make RetNet a strong successor to Transformer for large language models. Code will be available at this https URL.
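The chunkwise recurrent paradigm from the abstract can be illustrated in the same spirit. Below is a hedged NumPy sketch (same simplifications as the previous one: single head, no rotation or normalization; the chunk size B and all names are my own) that encodes each chunk in parallel while carrying a recurrent state across chunks, then checks the result against the fully parallel form:

import numpy as np

rng = np.random.default_rng(0)
T, d, B = 8, 4, 4    # sequence length, head dim, chunk size (T divisible by B)
gamma = 0.9
Q = rng.standard_normal((T, d))
K = rng.standard_normal((T, d))
V = rng.standard_normal((T, d))

# Reference: fully parallel retention over the whole sequence.
n = np.arange(T)
D_full = np.where(n[:, None] >= n[None, :], gamma ** (n[:, None] - n[None, :]), 0.0)
reference = (Q @ K.T * D_full) @ V

# Chunkwise: parallel inside each chunk, recurrent state R across chunks.
j = np.arange(B)
D = np.where(j[:, None] >= j[None, :], gamma ** (j[:, None] - j[None, :]), 0.0)
R = np.zeros((d, d))   # summary of all previous chunks
out = np.zeros((T, d))
for c in range(T // B):
    q, k, v = Q[c*B:(c+1)*B], K[c*B:(c+1)*B], V[c*B:(c+1)*B]
    inner = (q @ k.T * D) @ v                      # within-chunk, parallel
    cross = (gamma ** (j + 1))[:, None] * (q @ R)  # decayed past-chunk state
    out[c*B:(c+1)*B] = inner + cross
    # Decay the old state by gamma^B and fold in this chunk's keys/values,
    # each weighted by its distance to the chunk boundary.
    R = gamma ** B * R + k.T @ ((gamma ** (B - 1 - j))[:, None] * v)

assert np.allclose(out, reference)  # matches the fully parallel form

The cost per chunk is constant, so the whole pass is linear in sequence length, which is where the "efficient long-sequence modeling with linear complexity" claim comes from.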
Authors: Yutao Sun, Li Dong, Shaohan Huang, Shuming Ma, Yuqing Xia, Jilong Xue, Jianyong Wang, Furu Wei
Links:
Homepage: https://ykilcher.com
Merch: https://ykilcher.com/merch
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ykilcher.com/discord
LinkedIn: https://www.linkedin.com/in/ykilcher
If you want to support me, the best thing to do is to share out the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannickilcher
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n