Driving with Attention

1,181 views

Andreas Geiger

1 day ago

Comments: 2
@ahmetfurkanaknc8959 · 2 years ago
Thank you, excellent explanation.
@Peaceful_Chill · 2 years ago
Thank you for sharing your great presentation. I have a question. At 25:51, on the right side of the figure, I cannot understand the meaning of the black and white dots on the vector flow and semantic BEV. I think the black dots are points (x, y) placed at pseudo-random positions on the BEV, used to generate a vector from each position to the target waypoint. Could you clarify this?
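
A minimal sketch of the interpretation this comment proposes: sample pseudo-random (x, y) query points on the BEV plane and draw a vector from each point toward the target waypoint. The array shapes, the BEV extent, and the waypoint location below are illustrative assumptions, not code from the talk.

```python
import numpy as np

# Illustrative sketch of the comment's hypothesis, not the talk's actual code.
rng = np.random.default_rng(0)

bev_extent = 32.0                         # assumed BEV size: [0, 32) m in x and y
num_points = 64                           # assumed number of sampled query points
target_waypoint = np.array([16.0, 30.0])  # assumed goal location in BEV coordinates

# Pseudo-random query positions on the BEV plane (the "black dots").
query_points = rng.uniform(0.0, bev_extent, size=(num_points, 2))

# One flow vector per query point, pointing from the point to the waypoint.
flow_vectors = target_waypoint - query_points

print(query_points[:3])
print(flow_vectors[:3])
```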

Related videos:
Constraining 3D Fields for Reconstruction and View Synthesis · 27:06 · Andreas Geiger · 3K views
On the Frequency Bias of Generative Models · 10:36 · Andreas Geiger · 1.4K views
Sister outsmarted them! · 00:17 · Victoria Portfolio · 958K views
Cheerleader Transformation That Left Everyone Speechless! #shorts · 00:27 · Fabiosa Best Lifehacks · 16M views
When you have a very capricious child 😂😘👍 · 00:16 · Like Asiya · 18M views
Reinforcement Learning - My Algorithm vs State of the Art · 19:32 · Pezzza's Work · 156K views
Visualizing transformers and attention | Talk for TNG Big Tech Day '24 · 57:45
ATISS: Autoregressive Transformers for Indoor Scene Synthesis · 14:51 · Andreas Geiger · 3.7K views
What do tech pioneers think about the AI revolution? - BBC World Service · 25:48 · BBC World Service · 1.2M views
Hough Transform | Boundary Detection · 21:40 · First Principles of Computer Vision · 183K views
Learning Robust Policies for Self-Driving · 29:44 · Andreas Geiger · 1.1K views
Shape As Points: A Differentiable Poisson Solver · 12:38 · Andreas Geiger · 6K views
Transformers (how LLMs work) explained visually | DL5 · 27:14 · 3Blue1Brown · 4.7M views
Attention in transformers, step-by-step | DL6 · 26:10 · 3Blue1Brown · 2.1M views