BERT vs GPT

  62,632 views

CodeEmporium

1 year ago

#machinelearning #shorts #deeplearning #chatgpt #neuralnetwork #datascience

Comments: 23
@darshantank554 · 1 year ago
One is for natural language understanding and the other is for natural language generation.
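The understanding/generation split above comes down to the attention mask each architecture uses. As a minimal sketch (my own illustration, not from the video): BERT's encoder lets every token attend to every other token, while GPT's decoder hides future tokens.

```python
def bidirectional_mask(n):
    # BERT-style encoder: token i may attend to every position j
    # (full context in both directions -> suited to understanding tasks).
    return [[True] * n for _ in range(n)]

def causal_mask(n):
    # GPT-style decoder: token i may only attend to positions j <= i
    # (left-to-right context -> suited to generating the next token).
    return [[j <= i for j in range(n)] for i in range(n)]
```

For a 3-token input, `causal_mask(3)` yields a lower-triangular pattern, whereas `bidirectional_mask(3)` is all `True`.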
@Ram_jagat · 1 month ago
true
@borntodoit8744 · 17 hours ago
Remember it as INPUT > MODEL > OUTPUT. Model input (NLU): text recognition, vision recognition (image/video), sound recognition (voice). Model output (NLG): text generation, image/video generation, sound/voice generation, plus tool integration. Model processing: basic (classification, summarization, extraction); advanced (reasoning, planning, orchestration).
@VarunTulsian · 1 year ago
This is very useful. Just wanted to add that the GPT decoder doesn't have the cross-attention sublayer in its transformer block.
@Tech_kenya · 9 months ago
What is cross attention?
@methylphosphatePOET · 9 months ago
@@Tech_kenya It's when word vectors attend to vectors from a different sequence (e.g., the decoder attending to the encoder's output), as opposed to just attending to each other within one sequence.
@imran7TW · 7 days ago
@@methylphosphatePOET So kinda the opposite of self-attention?
@cs101-qm2ud · 4 months ago
Wonderfully put.
@CodeEmporium · 4 months ago
Thanks a lot :)
@maninzn · 1 year ago
Great explanation. For example, if I have to read all the client emails, understand their requirements, and auto-create tasks based on that prediction, which model should I go for: BERT or GPT?
@JillRhoads · 6 months ago
I hadn't known that BERT was an acronym and had been wondering why the Swedish LLM was called Bert. I wonder if this is why. Thanks for the info!
@contactdi8426 · 3 months ago
Can you please explain their training process?
@vladislavkorecky618 · 11 months ago
What if I stack both encoders and decoders? Do I get some BERT-GPT hybrid?
@davronsherbaev9133 · 10 months ago
There is also the Whisper model, which has a text decoder similar to Facebook's BART, but with an audio encoder in front of it.
@Dr_Larken · 1 year ago
Bert also drives a Trans Am! Jokes aside, I do appreciate your videos!
@nicholaszustak6299 · 10 months ago
So BERT doesn't have a decoder? Or did I misunderstand?
@saimadhaviyalla5682 · 9 months ago
Transformer models are usually run in parallel, right?
@eugeneku3239 · 8 months ago
Not when it's decoding, no.
@hubgit9556 · 11 months ago
good
@usama57926 · 1 year ago
I love you ❤
@obieda_ananbeh · 1 year ago
Awesome 👏
@CodeEmporium · 1 year ago
Thanks so much!