Retentive Network: A Successor to Transformer for Large Language Models (Paper Explained)
#ai #retnet #transformers
Retention is an alternative to Attention in Transformers that can be written both in a parallel and in a recurrent form. This means the architecture achieves training parallelism while maintaining low-cost inference. The experiments in the paper look very promising.
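For intuition, here is a minimal NumPy sketch of that equivalence (my own illustration, not code from the paper): it drops the xPos-style rotation, GroupNorm, and multi-scale heads, and only shows that the decay-masked parallel form and the O(1)-state recurrent form compute the same outputs. The function names and the scalar decay gamma are illustrative.

```python
import numpy as np

def retention_parallel(Q, K, V, gamma):
    # Parallel form: O = (Q K^T * D) V, with causal decay mask
    # D[n, m] = gamma^(n - m) for n >= m, else 0.
    T = Q.shape[0]
    n, m = np.arange(T)[:, None], np.arange(T)[None, :]
    D = np.where(n >= m, gamma ** (n - m), 0.0)
    return (Q @ K.T * D) @ V

def retention_recurrent(Q, K, V, gamma):
    # Recurrent form: S_n = gamma * S_{n-1} + k_n v_n^T,  o_n = q_n S_n.
    # Only the (d x d_v) state matrix S is carried between steps.
    S = np.zeros((K.shape[1], V.shape[1]))
    out = []
    for q, k, v in zip(Q, K, V):
        S = gamma * S + np.outer(k, v)
        out.append(q @ S)
    return np.stack(out)

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 6, 4))  # three (T=6, d=4) arrays
assert np.allclose(retention_parallel(Q, K, V, 0.9),
                   retention_recurrent(Q, K, V, 0.9))
```

The recurrent form carries only a fixed-size state across steps, which is why per-token decoding cost stays constant regardless of sequence length, while the parallel form is what gets used during training.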
OUTLINE:
0:00 - Intro
2:40 - The impossible triangle
6:55 - Parallel vs sequential
15:35 - Retention mechanism
21:00 - Chunkwise and multi-scale retention
24:10 - Comparison to other architectures
26:30 - Experimental evaluation
Paper: https://arxiv.org/abs/2307.08621
Abstract:
In this work, we propose Retentive Network (RetNet) as a foundation architecture for large language models, simultaneously achieving training parallelism, low-cost inference, and good performance. We theoretically derive the connection between recurrence and attention. Then we propose the retention mechanism for sequence modeling, which supports three computation paradigms, i.e., parallel, recurrent, and chunkwise recurrent. Specifically, the parallel representation allows for training parallelism. The recurrent representation enables low-cost O(1) inference, which improves decoding throughput, latency, and GPU memory without sacrificing performance. The chunkwise recurrent representation facilitates efficient long-sequence modeling with linear complexity, where each chunk is encoded in parallel while recurrently summarizing the chunks. Experimental results on language modeling show that RetNet achieves favorable scaling results, parallel training, low-cost deployment, and efficient inference. The intriguing properties make RetNet a strong successor to Transformer for large language models. Code will be available at this https URL.
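As a rough sketch of the chunkwise paradigm the abstract mentions (again my own illustration, under the same simplifications as the sketch above): each length-B chunk is processed with the parallel form, while a single state matrix recurrently summarizes all previous chunks. For simplicity this assumes the sequence length is a multiple of B.

```python
import numpy as np

def retention_chunkwise(Q, K, V, gamma, B):
    # Chunkwise recurrent form: parallel inside each length-B chunk,
    # recurrent across chunk boundaries. Assumes len(Q) % B == 0.
    T, d = Q.shape
    j = np.arange(B)
    # Within-chunk causal decay mask D[a, b] = gamma^(a - b) for a >= b.
    D = np.where(j[:, None] >= j[None, :], gamma ** (j[:, None] - j[None, :]), 0.0)
    S = np.zeros((d, V.shape[1]))  # summary state of all previous chunks
    out = []
    for s in range(0, T, B):
        q, k, v = Q[s:s+B], K[s:s+B], V[s:s+B]
        inner = (q @ k.T * D) @ v                      # parallel within the chunk
        cross = (gamma ** (j + 1))[:, None] * (q @ S)  # read the cross-chunk state
        out.append(inner + cross)
        # Fold this chunk into the state with position-dependent decay.
        S = gamma ** B * S + (k * (gamma ** (B - 1 - j))[:, None]).T @ v
    return np.concatenate(out)

def retention_recurrent(Q, K, V, gamma):
    # Fully recurrent reference, step by step.
    S, out = np.zeros((Q.shape[1], V.shape[1])), []
    for q, k, v in zip(Q, K, V):
        S = gamma * S + np.outer(k, v)
        out.append(q @ S)
    return np.stack(out)

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8, 4))
assert np.allclose(retention_chunkwise(Q, K, V, 0.9, B=4),
                   retention_recurrent(Q, K, V, 0.9))
```

This is what gives the linear-complexity long-sequence behavior: within a chunk the work is the cheap parallel form, and across chunks only the fixed-size state is passed along.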
Authors: Yutao Sun, Li Dong, Shaohan Huang, Shuming Ma, Yuqing Xia, Jilong Xue, Jianyong Wang, Furu Wei
Links:
Homepage: https://ykilcher.com
Merch: https://ykilcher.com/merch
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ykilcher.com/discord
LinkedIn: https://www.linkedin.com/in/ykilcher
If you want to support me, the best thing to do is to share out the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannickilcher
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n