
Retentive Network: A Successor to Transformer for Large Language Models (Paper Explained)
#ai #retnet #transformers
Retention is an alternative to Attention in Transformers that can be written in both a parallel and a recurrent fashion. This means the architecture achieves training parallelism while maintaining low-cost inference. The experiments in the paper look very promising.
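For intuition, here is a minimal NumPy sketch (not the authors' code) showing the same single retention head computed both ways. It assumes a scalar decay gamma and already-projected Q, K, V matrices, and it omits the paper's xPos-style rotation, group normalization, and gating:

import numpy as np

def retention_parallel(Q, K, V, gamma):
    # Parallel (training) form: (Q K^T * D) V, where D is a causal
    # decay mask with D[n, m] = gamma^(n - m) for n >= m, else 0.
    idx = np.arange(Q.shape[0])
    D = np.where(idx[:, None] >= idx[None, :],
                 gamma ** (idx[:, None] - idx[None, :]), 0.0)
    return (Q @ K.T * D) @ V

def retention_recurrent(Q, K, V, gamma):
    # Recurrent (inference) form: one state matrix S is updated per token,
    # so each decoding step is O(1) in sequence length:
    #   S_t = gamma * S_{t-1} + K_t^T V_t,   o_t = Q_t S_t
    S = np.zeros((Q.shape[1], V.shape[1]))
    out = np.zeros((Q.shape[0], V.shape[1]))
    for t in range(Q.shape[0]):
        S = gamma * S + np.outer(K[t], V[t])
        out[t] = Q[t] @ S
    return out

# Both forms produce identical outputs:
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 4)) for _ in range(3))
assert np.allclose(retention_parallel(Q, K, V, 0.9),
                   retention_recurrent(Q, K, V, 0.9))

The decay mask D is what replaces attention's softmax; because the mixing is purely linear, the whole causal sum can be rolled up into the single running state S.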
OUTLINE:
0:00 - Intro
2:40 - The impossible triangle
6:55 - Parallel vs sequential
15:35 - Retention mechanism
21:00 - Chunkwise and multi-scale retention
24:10 - Comparison to other architectures
26:30 - Experimental evaluation
Paper: https://arxiv.org/abs/2307.08621
Abstract:
In this work, we propose Retentive Network (RetNet) as a foundation architecture for large language models, simultaneously achieving training parallelism, low-cost inference, and good performance. We theoretically derive the connection between recurrence and attention. Then we propose the retention mechanism for sequence modeling, which supports three computation paradigms, i.e., parallel, recurrent, and chunkwise recurrent. Specifically, the parallel representation allows for training parallelism. The recurrent representation enables low-cost O(1) inference, which improves decoding throughput, latency, and GPU memory without sacrificing performance. The chunkwise recurrent representation facilitates efficient long-sequence modeling with linear complexity, where each chunk is encoded parallelly while recurrently summarizing the chunks. Experimental results on language modeling show that RetNet achieves favorable scaling results, parallel training, low-cost deployment, and efficient inference. The intriguing properties make RetNet a strong successor to Transformer for large language models. Code will be available at this https URL.
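To make the third paradigm from the abstract concrete, here is a hedged sketch of chunkwise retention in the same simplified single-head setting as above (scalar decay gamma, no rotation or normalization; it reuses np, retention_parallel, and the random Q, K, V from the previous snippet). Each chunk is computed with the parallel form, while a single recurrent state carries the summary of all earlier chunks across chunk boundaries:

def retention_chunkwise(Q, K, V, gamma, chunk=4):
    # Within-chunk: parallel form. Across chunks: recurrent state S.
    # Overall cost is linear in sequence length.
    S = np.zeros((Q.shape[1], V.shape[1]))   # cross-chunk summary state
    out = np.zeros((Q.shape[0], V.shape[1]))
    idx = np.arange(chunk)
    D = np.where(idx[:, None] >= idx[None, :],
                 gamma ** (idx[:, None] - idx[None, :]), 0.0)
    for s in range(0, Q.shape[0], chunk):
        q, k, v = Q[s:s+chunk], K[s:s+chunk], V[s:s+chunk]
        b = len(q)                            # last chunk may be shorter
        inner = (q @ k.T * D[:b, :b]) @ v     # within-chunk, parallel
        cross = (gamma ** (idx[:b] + 1))[:, None] * (q @ S)
        out[s:s+b] = inner + cross
        # Fold this chunk into the state with position-dependent decays.
        S = gamma**b * S + (k * (gamma ** (b - 1 - idx[:b]))[:, None]).T @ v
    return out

# Agrees with the fully parallel form on the same random inputs:
assert np.allclose(retention_chunkwise(Q, K, V, 0.9, chunk=3),
                   retention_parallel(Q, K, V, 0.9))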
Authors: Yutao Sun, Li Dong, Shaohan Huang, Shuming Ma, Yuqing Xia, Jilong Xue, Jianyong Wang, Furu Wei
Links:
Homepage: https://ykilcher.com
Merch: https://ykilcher.com/merch
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ykilcher.com/discord
LinkedIn: https://www.linkedin.com/in/ykilcher
If you want to support me, the best thing to do is to share the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannickilcher
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n