Next-Gen AI: RecurrentGemma (Long Context Length)

3,994 views

Discover AI

1 day ago

Comments: 6
@miikalewandowski7765 8 months ago
Finally! It’s happening. The combination of all your beautiful findings.
@BradleyKieser 8 months ago
Absolutely brilliant, thank you. Exciting. Well explained.
@po-yupaulchen166 8 months ago
Thank you. In the RG-LRU, h_{t-1} should not be inside the gates (inside the sigmoid function) in the original paper, right? Putting it there would slow down training. I am so surprised that finite memory can match the performance of transformers with their crazy infinite memory. Also, it seems traditional RNNs like the LSTM will soon be replaced by the RG-LRU. So curious whether someone could compare those RNNs and show what is wrong with the old design.
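For context on the RG-LRU question above: a minimal NumPy sketch of an RG-LRU-style recurrence, following the update equations of the Griffin/RecurrentGemma paper as I read them. The point relevant to the comment is that both gates are computed from the current input only, so h_{t-1} never passes through a sigmoid. The weight names (W_r, W_i, lam), the constant c = 8, and the toy shapes are illustrative assumptions, not Google's released implementation.

```python
# Sketch of an RG-LRU-style recurrence (per the Griffin/RecurrentGemma paper).
# Both gates depend only on the current input x_t, never on h_{t-1}, so the
# gates for all timesteps could in principle be computed in parallel.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rg_lru(x, W_r, W_i, lam, c=8.0):
    """x: (T, D) input sequence; returns hidden states of shape (T, D)."""
    T, D = x.shape
    a_base = sigmoid(lam)              # learnable per-channel decay in (0, 1)
    h = np.zeros(D)
    out = np.zeros((T, D))
    for t in range(T):
        r_t = sigmoid(x[t] @ W_r)      # recurrence gate: function of x_t only
        i_t = sigmoid(x[t] @ W_i)      # input gate: function of x_t only
        a_t = a_base ** (c * r_t)      # effective decay for this step
        # normalized update keeps the hidden state's scale roughly constant
        h = a_t * h + np.sqrt(1.0 - a_t**2) * (i_t * x[t])
        out[t] = h
    return out

# toy usage with hypothetical dimensions and random weights
rng = np.random.default_rng(0)
D = 4
x = rng.normal(size=(10, D))
W_r = rng.normal(size=(D, D)) * 0.1
W_i = rng.normal(size=(D, D)) * 0.1
lam = rng.normal(size=D)
print(rg_lru(x, W_r, W_i, lam).shape)  # (10, 4)
```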
@codylane2104 8 months ago
How can we use it locally? Can we at all? LM Studio can't download it. 😞
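Regarding local use: RecurrentGemma checkpoints are published on the Hugging Face Hub, and recent transformers releases include support for the architecture, so a sketch along these lines should work. The checkpoint name google/recurrentgemma-2b-it and the generation settings here are assumptions to verify against the model card.

```python
# Minimal sketch: running RecurrentGemma locally via Hugging Face transformers.
# Assumes a recent transformers release with RecurrentGemma support and that
# you have accepted the model license on the Hub.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "google/recurrentgemma-2b-it"  # instruction-tuned 2B checkpoint (assumed name)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # add device_map="auto" if accelerate is installed

inputs = tokenizer("Explain the RG-LRU in one paragraph.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```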
@MattJonesYT 8 months ago
It's made by Google, which means it will have all the comical corporate biases that the rest of their models have. It will produce useless output really fast. When someone makes a de-biased version, this tech will be much more interesting.
@Charles-Darwin 8 months ago
Surely this provides, or could provide, massive efficiency gains. If I touch a hot plate and feel the heat, the state is sent to the relevant limbs to retract... but shortly thereafter this state fades and I can then focus on other states. What are neurons if not a response network to environmental factors? Google will probably be the first to build an organic/chemical computer.