I enjoyed and learned so much from this video that I'm watching it a second time and studying it. At 13:15, the top two graphs under "Hierarchical feature examples" are labeled "GloVe PC2" and "GloVe PC1". What do those quantities mean? Postscript: Okay, Dr. Gwilliams explained that they are "proxies for semantic representations using GloVe word embeddings, which are an unsupervised method of embedding word vector representations." Oh well. --
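(For anyone else puzzled by those axis labels: "PC1" and "PC2" presumably denote the first two principal components of the GloVe embedding space, i.e. the two directions of greatest variance across the word vectors. A minimal sketch of that projection, using random stand-in data since the actual GloVe vectors aren't part of this thread:)

```python
# Sketch: project word embeddings onto their first two principal
# components, yielding the "GloVe PC1" / "GloVe PC2" coordinates.
# The embeddings here are random stand-ins, not real GloVe vectors.
import numpy as np

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(10, 50))  # 10 "words" x 50 dims

# Center the data, then get principal directions via SVD.
centered = embeddings - embeddings.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc_scores = centered @ vt[:2].T  # column 0 = PC1, column 1 = PC2

print(pc_scores.shape)  # one (PC1, PC2) pair per word
```

Plotting `pc_scores[:, 0]` against `pc_scores[:, 1]` gives exactly the kind of 2-D scatter shown in the talk's slide.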
@musicemail2831 · 5 days ago
🤦🏼♂️
@tigranishkhanov9521 · 7 months ago
I always thought of ML as statistics + geometry done on a computer in high-dimensional spaces. Geometry comes in to help with the high dimensionality: instead of learning distributions exactly, which is very hard in high dimensions, we learn separating surfaces of relatively simple geometry (like hyperplanes) as approximations.
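(The "simple separating geometry" idea in the comment above can be sketched with a plain perceptron, which learns a hyperplane w·x + b = 0 directly without modeling either class distribution. The two-cluster data here is synthetic, chosen only so the classes are linearly separable:)

```python
# Sketch: learn a separating hyperplane (perceptron rule) rather than
# the class distributions themselves. Synthetic 2-D data for illustration.
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated Gaussian clusters, labeled -1 and +1.
x = np.vstack([rng.normal(-2.0, 0.5, size=(50, 2)),
               rng.normal(+2.0, 0.5, size=(50, 2))])
y = np.array([-1] * 50 + [+1] * 50)

w, b = np.zeros(2), 0.0
for _ in range(20):                    # a few perceptron epochs
    for xi, yi in zip(x, y):
        if yi * (xi @ w + b) <= 0:     # misclassified point -> update
            w += yi * xi
            b += yi

accuracy = np.mean(np.sign(x @ w + b) == y)
print(accuracy)
```

On linearly separable data the perceptron update rule is guaranteed to converge, which is the appeal of the geometric shortcut: one hyperplane summarizes everything the decision needs, however high-dimensional the space.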
@moteherogame · 9 months ago
Thanks for sharing. It was a fantastic talk.
@juanaguilera3483 · 3 years ago
We are indeed following the recordings, thank you for posting them.