you are using your "stay at home" time for the best
@bugs.82 · 4 years ago
Nice content guys, keep it going...
@offthepathworks9171 · a year ago
My issue with grokking embeddings in general, as presented here: we set out to have a single token predict the next token, so how come, for example, "a e i o u" (the vowels) end up grouped together when they generally don't come one after the other? Wonderful example btw, I love that you start from letters and build on it.
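One way to see why this happens: the vowels don't need to follow each other to end up close together. Characters that appear in similar contexts and predict similar next-character distributions get pushed toward similar embedding rows during training. Below is a minimal sketch of that idea, assuming a Keras-style next-character model (not necessarily the exact code from the video); the text, variable names, and hyperparameters are illustrative only.

```python
# Minimal sketch (assumed setup): a character-level next-character model
# with a 2-D embedding, to show how characters used in similar contexts
# can drift toward similar embedding vectors.
import numpy as np
import tensorflow as tf

text = "the quick brown fox jumps over the lazy dog " * 200  # toy corpus
chars = sorted(set(text))
char_to_id = {c: i for i, c in enumerate(chars)}

# Training pairs: current character -> next character.
ids = np.array([char_to_id[c] for c in text])
x, y = ids[:-1].reshape(-1, 1), ids[1:]

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=len(chars), output_dim=2),  # 2-D for plotting
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x, y, epochs=10, verbose=0)

# Characters with similar "what tends to come next" behaviour end up with
# similar embedding rows, which is why vowels often cluster.
embeddings = model.layers[0].get_weights()[0]
for c in "aeiou":
    print(c, embeddings[char_to_id[c]])
```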
@GauravSharma-ui4yd · 4 years ago
Nice
@socioDemo · 4 years ago
In the two-character case, each character is transformed before the concatenation, giving EMB1 and EMB2. My question: do both characters go through the same matrix? If yes, can you explain how backpropagation is done? If not, we end up with two different representations for each character.
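For reference, here is a small sketch of the usual way this is wired up, assuming a Keras-style functional model rather than the video's exact code (variable names are hypothetical): both characters are looked up in the *same* embedding matrix, so there is only one representation per character, and during backpropagation the gradients flowing back through EMB1 and EMB2 both land on (and accumulate in) rows of that shared matrix.

```python
# Sketch of the two-character case with a shared embedding matrix
# (assumed names: char_a, char_b, shared_embedding).
import tensorflow as tf

vocab_size, emb_dim = 30, 2

char_a = tf.keras.Input(shape=(1,), dtype="int32")
char_b = tf.keras.Input(shape=(1,), dtype="int32")

# One Embedding layer reused for both inputs => one matrix, shared weights.
shared_embedding = tf.keras.layers.Embedding(vocab_size, emb_dim)

emb1 = tf.keras.layers.Flatten()(shared_embedding(char_a))  # EMB1
emb2 = tf.keras.layers.Flatten()(shared_embedding(char_b))  # EMB2

# Concatenate the two lookups and predict the next character.
concat = tf.keras.layers.Concatenate()([emb1, emb2])
output = tf.keras.layers.Dense(vocab_size, activation="softmax")(concat)

model = tf.keras.Model(inputs=[char_a, char_b], outputs=output)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```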
@MrDanituga · 4 years ago
Nice video! Thanks. One question though: how did you determine that the output of that embedding layer should be of size 2?
@RasaHQ · 4 years ago
Two dimensions are easier to plot, that's the only reason.
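To make that concrete: with `output_dim=2`, every token's embedding is directly an (x, y) point, so the learned space can be scattered as-is. A small sketch, assuming the `model` and `chars` from the earlier next-character sketch (hypothetical names); with a larger embedding dimension you would typically reduce with PCA or t-SNE before plotting.

```python
# Plot the learned 2-D character embeddings directly (no dimensionality
# reduction needed because output_dim was chosen as 2).
import matplotlib.pyplot as plt

embeddings = model.layers[0].get_weights()[0]  # shape: (vocab_size, 2)
plt.scatter(embeddings[:, 0], embeddings[:, 1])
for i, c in enumerate(chars):
    plt.annotate(c, (embeddings[i, 0], embeddings[i, 1]))
plt.show()
```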