It’s Getting Harder to Spot a Deep Fake Video

  8,428,344 views

Bloomberg Originals

6 years ago

Fake videos and audio keep getting better, faster and easier to make, increasing the mind-blowing technology's potential for harm if put in the wrong hands. Bloomberg QuickTake explains how good deep fakes have gotten in the last few months, and what's being done to counter them.
Video by Henry Baker, Christian Capestany
Like this video? Subscribe: kzbin.info
Become a Quicktake Member for exclusive perks: kzbin.infojoin
QuickTake Originals is Bloomberg's official premium video channel. We bring you insights and analysis from business, science, and technology experts who are shaping our future. We’re home to Hello World, Giant Leap, Storylines, and the series powering CityLab, Bloomberg Businessweek, Bloomberg Green, and much more.
Subscribe for business news, but not as you've known it: exclusive interviews, fascinating profiles, data-driven analysis, and the latest in tech innovation from around the world.
Visit our partner channel QuickTake News for breaking global news and insight in an instant.

Comments: 5,107
@business
@business 3 жыл бұрын
We have some exciting news! We’re launching channel Memberships for just $0.99 a month. You’ll get access to members-only posts and videos, live Q&As with Bloomberg reporters, business trivia, badges, emojis and more. Join us: kzbin.infojoin
@52sees
@52sees 3 жыл бұрын
Epic
@Amu_LEGEND
@Amu_LEGEND 3 жыл бұрын
Okay 👁️ 👁️ 👄
@lightningfun6486
@lightningfun6486 3 жыл бұрын
What
@TheProfessor66
@TheProfessor66 2 жыл бұрын
"Warfare campaign" that aged poorly with the Ukraine president deepfake.
@deleted9388
@deleted9388 2 жыл бұрын
Anyone with 3 million subs is a COINTELPRO shill
@MrPatpuc
@MrPatpuc 6 жыл бұрын
This is terrifying. Imagine when deepfake videos can frame innocent people as guilty.
@22z83
@22z83 6 жыл бұрын
Well soon videos can't be used as evidence because of this
@JustAshUwU
@JustAshUwU 6 жыл бұрын
@@22z83 but it can still ruin lives
@treerexaudi
@treerexaudi 6 жыл бұрын
unless it is 16K res it isn't trustworthy xD even a simple mask in a robbery I saw made it look like someone else, just because of the low-quality camera. It is silly and scary.
@MrSirFluffy
@MrSirFluffy 6 жыл бұрын
You can fake it at 4K and then lower the resolution to make it impossible to know if it's fake.
@deitrickorullian505
@deitrickorullian505 6 жыл бұрын
False allegations can be made against people without any real evidence to support them, and people believe it. I can't imagine what this will do
@brianmchaney7473
@brianmchaney7473 6 жыл бұрын
2008: Pics or it didn't happen. 2018: Pics are a lie.
@AFCA-vn9bl
@AFCA-vn9bl 6 жыл бұрын
Brian McHaney pics have been a lie since Photoshop, but now even videos are a lie
@wolfenstien13
@wolfenstien13 6 жыл бұрын
Remember it or it didn't happen. Write it or it didn't happen. Paint it or it didn't happen. Print it or it didn't happen. Photograph it or it didn't happen. Record it or it didn't happen. Take a picture of it or it didn't happen. Videotape it or it didn't happen. What now?
@efloof9314
@efloof9314 6 жыл бұрын
White Coyote pull out your brain, hook it up to a system, and show the memory of it happening
@とふこ
@とふこ 6 жыл бұрын
@@wolfenstien13 now? nothing happening.
@Dig_Duke_SFM
@Dig_Duke_SFM 6 жыл бұрын
@@DrumToTheBassWoop A literal human sacrifice to Satan to reveal the truth. Or it didn't happen.
@520lun
@520lun 4 жыл бұрын
2018: Deep fake is dangerous 2020: DAME DA NE
@mrpizza_5139
@mrpizza_5139 4 жыл бұрын
DAME YO
@PopAwesomeJesusislife
@PopAwesomeJesusislife 4 жыл бұрын
DAME DA MO YOOO
@lassipls
@lassipls 4 жыл бұрын
ANTA GA
@aldorahman4755
@aldorahman4755 4 жыл бұрын
SUKIDE, SUKISUGITE
@520lun
@520lun 4 жыл бұрын
Dore dake
@aguyonasiteontheinternet
@aguyonasiteontheinternet Жыл бұрын
The real terrifying thing about this video is that it was uploaded 4 years ago.
@audiobyamp4459
@audiobyamp4459 Жыл бұрын
I'm starting to notice that all the videos with startling information are usually old
@whizzerbrown1349
@whizzerbrown1349 Жыл бұрын
So far the only deepfakes I've seen popping up have been videos of members of parliament playing sweaty League of Legends matches, so personally the doom and gloom of this video has started eroding lol
@963freeme
@963freeme Жыл бұрын
The newer videos of Trump look like deep fakes. In the Kanye West & Piers Morgan interview from last year, Kanye looked like a deep fake.
@jocogorenc7354
@jocogorenc7354 4 ай бұрын
Five :o
@richardt6980
@richardt6980 3 ай бұрын
That was right before the craze. At that time Nvidia needed something to sell their GPUs, since mining crypto was no longer processing-intensive and their stock was in the toilet. Look it up.
@nufizfunslower3438
@nufizfunslower3438 4 жыл бұрын
Imagine developing advanced technology and people using it to make memes
@joonatanlepind3124
@joonatanlepind3124 4 жыл бұрын
lmao I love that it's happening.
@diwakardayal954
@diwakardayal954 4 жыл бұрын
that's the internet
@JA-yz8eq
@JA-yz8eq 4 жыл бұрын
this is just movie technology released, I'm sure, for the pure purpose of high-level plausible-deniability tampering
@joonatanlepind3124
@joonatanlepind3124 4 жыл бұрын
@@JA-yz8eq no it's for making callmecarson sing big time rush
@leleled6467
@leleled6467 4 жыл бұрын
I guess people are doing this in an attempt to corrupt it before it's used for the worst
@ChrisSche
@ChrisSche 5 жыл бұрын
It won’t be long until video, photographs, or audio recordings are no longer considered evidence in a court of law.
@JP-sm4cs
@JP-sm4cs 5 жыл бұрын
Make public broadcasts carry a near-invisible encrypted watermark that gets distorted by any modification? But yeah, phone-based evidence is screwed
@eddyavailable
@eddyavailable 5 жыл бұрын
audio is very easily edited and manipulated nowadays.
@dennydarkko
@dennydarkko 5 жыл бұрын
Far Altright how do you know they haven’t already? 😂
@01_SPACE_C0WB0Y
@01_SPACE_C0WB0Y 5 жыл бұрын
Yes they will... Deepfakes are pointless on security cams; they are not at the right angle and the video is usually too low-res.
@ev.c6
@ev.c6 5 жыл бұрын
@@JP-sm4cs SURE. Like you can't fake the watermark either? You seem not to understand how complex the AI behind these deepfakes is. If they can fake someone's facial expressions in a video like that, just imagine how easy it is to put some stupid watermark in a few frames.
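The exchange above touches a real distinction that is easy to miss: a visual watermark can be imitated by a good generator, but a cryptographic signature computed over the pixel data cannot be forged without the signer's secret key, because tampering simply makes the signature stop verifying. Below is a minimal Python sketch of that idea using only the standard library. The frame bytes, the key handling, and the function names are illustrative assumptions, not any broadcaster's actual scheme; real provenance systems (C2PA-style signed manifests, for example) use asymmetric keys and store signatures in the file's metadata.

```python
import hashlib
import hmac
import os

def sign_frame(frame_bytes: bytes, key: bytes) -> str:
    """Return an HMAC-SHA256 tag over one frame's raw pixel data."""
    return hmac.new(key, frame_bytes, hashlib.sha256).hexdigest()

def verify_frame(frame_bytes: bytes, tag: str, key: bytes) -> bool:
    """Recompute the tag and compare in constant time; any pixel change breaks it."""
    return hmac.compare_digest(sign_frame(frame_bytes, key), tag)

if __name__ == "__main__":
    key = os.urandom(32)                 # hypothetical broadcaster signing key
    frame = os.urandom(1920 * 1080 * 3)  # stand-in for one decoded video frame

    tag = sign_frame(frame, key)
    print("original frame verifies:", verify_frame(frame, tag, key))   # True

    tampered = bytearray(frame)
    tampered[0] ^= 0xFF                  # a face swap would change far more than one byte
    print("tampered frame verifies:", verify_frame(bytes(tampered), tag, key))  # False
```

The objection in the reply above still applies in a different form: an attacker can strip the tags and re-encode the video, so signing only proves which clips are authentic. It does not by itself expose every fake, and detection still matters.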
@cjezinne
@cjezinne 6 жыл бұрын
At first, I thought this was going to be bad... but then I saw the Nicolas Cage renders and then life made sense again
@hirokatsuvictor8755
@hirokatsuvictor8755 6 жыл бұрын
This has too much meme potential
@Matthew_Fog
@Matthew_Fog 6 жыл бұрын
Lol
@urface640
@urface640 5 жыл бұрын
2k likes yet only two(technically three now)replies lol
@llllIIIlllIIllllII
@llllIIIlllIIllllII 3 жыл бұрын
@@hirokatsuvictor8755 yeah...
@CodepageNet
@CodepageNet Жыл бұрын
this is all Nicolas Cages fault!
@myusernameis_pasword6860
@myusernameis_pasword6860 3 жыл бұрын
I think rules and laws about deepfakes should be put in place before this gets any worse, because real harm can be done to people's reputations, and people can even claim innocence for crap they actually did say!
@river_acheron
@river_acheron 2 жыл бұрын
How? Once something CAN be technologically done, there cannot be rules and laws to unlearn it. lol. Those who want to use deepfakes to scam are of course not going to listen to rules and laws against creating them! The ONLY solution here is to find a way to tell a deepfake from the real thing.
@tj_trout9855
@tj_trout9855 Жыл бұрын
Surely no one will break the law!
@myusernameis_pasword6860
@myusernameis_pasword6860 Жыл бұрын
@@tj_trout9855 of course people will but at least with rules in place people have the ability to take action in a court
@yoursleepparalysisdemon1828
@yoursleepparalysisdemon1828 Жыл бұрын
it’s just technology at this point. you don’t seem to understand why banning it would be hindering tech. don’t hate what you don’t understand.
@myusernameis_pasword6860
@myusernameis_pasword6860 Жыл бұрын
@@yoursleepparalysisdemon1828 I'm not hating on it; I think it has cool applications. You misunderstand my point of view. What I'm saying is that there should be laws in place to defend those whose images are being used to say things they never said. I don't want to ban it, I just want to make sure that there is protection in place in case people misuse this technology.
@3p1cand3rs0n
@3p1cand3rs0n 6 жыл бұрын
I seriously thought they were going to reveal that the guy at 0:56 was, himself, a deep fake.
@thoyo
@thoyo 6 жыл бұрын
same here! his lips didn't seem to match his speech and his eyes looked a bit dead
@andrelee7081
@andrelee7081 6 жыл бұрын
I think that's just the power of a potato cam.
@TitorEPK
@TitorEPK 6 жыл бұрын
You're ready for the future.
@tacubaeulalio
@tacubaeulalio 6 жыл бұрын
Does anyone know what they are talking about when they mention the weather patterns or flowers? That part honestly confused me. I take it as they can make fake videos of weather changing or flowers blooming but not sure why that would be useful ?
@mishaj2647
@mishaj2647 6 жыл бұрын
ElyssaAnderson b
@OsaZain
@OsaZain 6 жыл бұрын
Imagine the potential for blackmail :/
@Dylan-hy2zj
@Dylan-hy2zj 6 жыл бұрын
OsaZain if anything the reverse is true, any video posted of you doing bad things you can just say it was deep fake blackmail
@madman2u
@madman2u 6 жыл бұрын
+Dylan Adams *Except* for the fact that it won't matter. Anything remotely real-looking is going to work against your interests. It wouldn't matter if it's a fake, because people will still believe you did or said X. People's reputations and lives have been ruined for less, and without evidence. Say someone accuses X of being a liar and a cheat. X says the video is a deepfake. The accusation, while not necessarily true, conflicts with X's statement. You can either take the video as evidence or the word of a person who has a vested interest and therefore lies to protect themselves. It's a lose-lose scenario for the accused; it's just a matter of how much you'll lose. Even if the video is then proven to be fake, the damage would've already been done. Unfortunately, bad news is so much easier to believe. It's not ideal at all... What we can do to combat this is to be more wary of the so-called evidence people come up with. Being objective is important, and if there is any doubt then one should always err on the side of innocence rather than guilt.
@OsaZain
@OsaZain 6 жыл бұрын
madman2u People tend to believe maligning things much much more easily than the positive ones as well :(
@holyn8
@holyn8 6 жыл бұрын
yea the potential for blackmail is going down to 0% because of this technology. you can't use videos as evidence anymore. everything you see on a screen could be faked
@elias_xp95
@elias_xp95 6 жыл бұрын
What blackmail? It's now easier to claim it as fake. It's the opposite effect of blackmail.
@henrytherobot
@henrytherobot 6 жыл бұрын
*This would never have happened if Nicholas Cage didn't exist* 😜
@justahuman2121
@justahuman2121 6 жыл бұрын
Just 2 likes? 0 comments? Pretty sure this comment will blow up one day. Edit: ok it's now 1.2k Edit: 4k now Edit: I bet it will hit 7k
@testname2635
@testname2635 6 жыл бұрын
@@justahuman2121 Agreed
@leonthethird7494
@leonthethird7494 6 жыл бұрын
HENRY THE RC CAR it's spelled Nicolas Cage
@tharv120
@tharv120 6 жыл бұрын
You everywhere
@mohammedraqib6418
@mohammedraqib6418 6 жыл бұрын
This is that day
@jadkleb2788
@jadkleb2788 3 жыл бұрын
Other than the funny comments and memes this is actually extremely terrifying...
@LAkadian
@LAkadian 3 жыл бұрын
Actually, those are terrifying too, for their unabashed idiocy.
@Thefootqueen
@Thefootqueen 4 жыл бұрын
Everyone before: Deepfake is so dangerous... Everyone now: *DAME DA NE*
@stinkygorilla2058
@stinkygorilla2058 4 жыл бұрын
Is the app still up?
@beruta1733
@beruta1733 4 жыл бұрын
relateble
@ohword9541
@ohword9541 4 жыл бұрын
dame yo
@grigoriyefimovichrasputin7897
@grigoriyefimovichrasputin7897 4 жыл бұрын
DAME DA NE
@generically.watched
@generically.watched 4 жыл бұрын
DAME YO DAME NA NO YO
@ResoundGuy5
@ResoundGuy5 6 жыл бұрын
This is going to end badly...
@glynemartin
@glynemartin 6 жыл бұрын
it's not gonna end...that's the problem...
@wutsit2yuhhuh246
@wutsit2yuhhuh246 6 жыл бұрын
@benzo I think you trust your government a little bit too much.
@wutsit2yuhhuh246
@wutsit2yuhhuh246 6 жыл бұрын
@benzo "We'll know our disinformation program is complete when everything the American public believes is false." -Former CIA Director William Casey
@joeljarnefelt1269
@joeljarnefelt1269 6 жыл бұрын
@benzo He said, "This is going to end badly." Point out where he stated that the entire development of these programs should be terminated.
@joeljarnefelt1269
@joeljarnefelt1269 6 жыл бұрын
@benzo Maybe you wouldn't want to develop it, or maybe you are just expressing your concerns about the possible misuses of the emerging technology.
@Nismoronic
@Nismoronic 6 жыл бұрын
Can I use it for memes tho
@lLl-fl7rv
@lLl-fl7rv 6 жыл бұрын
You're THE man.
@edd868
@edd868 6 жыл бұрын
Yes. Prepare for the oncoming deep fake meme war between 4chan and Reddit
@mariopokemon955
@mariopokemon955 5 жыл бұрын
Tesco Stig people already have; it can also be used to start a war, but it's no biggie
@VOLAIRE
@VOLAIRE 5 жыл бұрын
Yeah memes aren’t a big deal ha
@jayg6tk
@jayg6tk 5 жыл бұрын
Obama: *ATTENTION ALL FORTNITE GAMERS!*
@willdwyer6782
@willdwyer6782 Жыл бұрын
Putting Tom Hanks as Forrest Gump into archival TV footage could be considered early deepfake video. They digitally manipulated the lips of the other people in the scenes to move in sync with an impersonator's voice.
@yoursleepparalysisdemon1828
@yoursleepparalysisdemon1828 Жыл бұрын
isn’t a deepfake using ai or something? iirc it was done differently.
@sarah69420
@sarah69420 Жыл бұрын
@@yoursleepparalysisdemon1828 a deepfake is the general idea of creating a fake video/audio/picture of or including someone not originally there, or altering those who are; AI is just a tool to get that done
@yoursleepparalysisdemon1828
@yoursleepparalysisdemon1828 Жыл бұрын
@@sarah69420 the definition is that it was digitally altered.
@Jaylio
@Jaylio 5 жыл бұрын
0:56 dude looks more CGI than the fakes
@fortheloveofnoise
@fortheloveofnoise 5 жыл бұрын
Those oddly fluttering lips.....wtf
@jcesplanada528
@jcesplanada528 5 жыл бұрын
I know, right. I really thought it was fake too
@hectorhector3819
@hectorhector3819 5 жыл бұрын
.
@SlatDogg
@SlatDogg 5 жыл бұрын
I seriously thought that someone deep faked that video just to prove a point.
@Atombender
@Atombender 5 жыл бұрын
Until the end of the video I thought that it was fake. Damnit...
@EnzoDraws
@EnzoDraws 6 жыл бұрын
1:47 why tf does the source look faker than the deep fake?
@aaronmicalowe
@aaronmicalowe 5 жыл бұрын
There is no such thing as a deep fake. This video is fake news.
@Unknown-dq2cj
@Unknown-dq2cj 5 жыл бұрын
🤣
@justanotheryoutubechannel
@justanotheryoutubechannel 5 жыл бұрын
Kalazakan Or more likely just a troll.
@kyoshinronin
@kyoshinronin 5 жыл бұрын
Overfitting
@TheVibes101
@TheVibes101 5 жыл бұрын
@@aaronmicalowe umm... I hope you are not serious.
@straightbusta2609
@straightbusta2609 6 жыл бұрын
This is probably the secret behind the $1000 emoji machine from apple.
@Dominicn123
@Dominicn123 6 жыл бұрын
Face tracking has been around for years brah
@r32fandom89
@r32fandom89 6 жыл бұрын
ay u got 69 subscribers
@amfm4087
@amfm4087 6 жыл бұрын
No, because this requires hours of time and a decent graphics card for just a short clip. The iPhone uses a different technology, as the emojis are 3D models. This technology uses 2D pictures like JPEGs and PNGs.
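That distinction matches public descriptions of the original face-swap tools: instead of animating a 3D model the way Animoji does, they train an autoencoder on thousands of 2D face crops, which is why a capable GPU and hours of training are needed for even a short clip. The sketch below is a minimal, illustrative PyTorch version of the architecture usually described for those early tools: one shared encoder and a separate decoder per identity, so encoding a frame of person A and decoding it with person B's decoder produces the swap. The layer sizes, the random-tensor "training data", and the three-step loop are placeholders, not a working face-swapper.

```python
import torch
import torch.nn as nn

def make_encoder() -> nn.Sequential:
    # 3x64x64 face crop -> 128x8x8 latent feature map (shared between identities)
    return nn.Sequential(
        nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # -> 32x32x32
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # -> 64x16x16
        nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # -> 128x8x8
    )

def make_decoder() -> nn.Sequential:
    # latent feature map -> reconstructed 3x64x64 face (one decoder per identity)
    return nn.Sequential(
        nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
        nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
        nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
    )

encoder, decoder_a, decoder_b = make_encoder(), make_decoder(), make_decoder()
params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

faces_a = torch.rand(8, 3, 64, 64)  # placeholder batches; real training uses thousands
faces_b = torch.rand(8, 3, 64, 64)  # of aligned face crops per person

for step in range(3):               # real training runs for many GPU-hours, not 3 steps
    optimizer.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    optimizer.step()

# The swap itself: encode a frame of A, decode it with B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a[:1]))
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```

The trick that makes the swap possible is forcing both faces through the same encoder: the shared latent space captures pose and expression, while each decoder learns the appearance of its own identity.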
@jerrell1169
@jerrell1169 6 жыл бұрын
Yeah they straight busta!
@Lou-C
@Lou-C 6 жыл бұрын
Me me big boy
@arcosprey4811
@arcosprey4811 Жыл бұрын
Imagine how much worse it is now.
@Ceshua
@Ceshua 5 жыл бұрын
Back in the day everyone said: "Video evidence can't lie." 2018: (Edit) 2020: Baka Mitai
@akkafietje137
@akkafietje137 5 жыл бұрын
I saw it with my own eyes
@KnightBokura
@KnightBokura 5 жыл бұрын
Well, back then it couldn't. So they were still right.
@agentsmith9858
@agentsmith9858 5 жыл бұрын
@@KnightBokura you missed the point
@TheAnonyy
@TheAnonyy 5 жыл бұрын
It could not then. Now it can lie. This is the problem with people embracing new technologies: you can't trust what you see, hear, feel, or smell; too many artificial things out there.
@rukna3775
@rukna3775 4 жыл бұрын
Ok boomer
@Cloudeusz
@Cloudeusz 5 жыл бұрын
Technology is a double edged sword
@fellowcitizen
@fellowcitizen 5 жыл бұрын
...that looks like a cup of tea.
@rvke5639
@rvke5639 5 жыл бұрын
with no handles
@ХареКришна-т7г
@ХареКришна-т7г 5 жыл бұрын
What is double edged sword then
@subzero5055
@subzero5055 5 жыл бұрын
@@ХареКришна-т7г you kill with it or get killed by it
@yungwhiticus8757
@yungwhiticus8757 5 жыл бұрын
Ad Victorium, Brother!
@TheAstronomyDude
@TheAstronomyDude 6 жыл бұрын
Nick Cage SHOULD be every actor in every movie.
@milanistaminetti
@milanistaminetti 6 жыл бұрын
TheAstronomyDude plot twist, he is
@Nilvolentibusje
@Nilvolentibusje 6 жыл бұрын
I support this
@YoungBlaze
@YoungBlaze 6 жыл бұрын
Hes actually everyone in real life
@antoniob4458
@antoniob4458 6 жыл бұрын
Where do I sign?
@muzgnasicianie
@muzgnasicianie 6 жыл бұрын
He is one of my favourite actors!
@spetzy1921
@spetzy1921 Жыл бұрын
This was 4 years ago. Let that sink in.
@sanmartinella4933
@sanmartinella4933 Жыл бұрын
And this tool is offered to the public, imagine what our governments have.
@ceece3817
@ceece3817 6 жыл бұрын
Black mirror do ya thing
@InnovAce
@InnovAce 6 жыл бұрын
ceec e black mirror and Altered Carbon
@ShufflingManu
@ShufflingManu 6 жыл бұрын
I am more concerned about influential people labelling real videos of them as deep fakes in order to avoid consequences than I am about someone trying to harm said people with deep fakes.
@OneEyeShadow
@OneEyeShadow 6 жыл бұрын
+Captain Caterpillar Like what? The entire point of the programme is to make it as seamless as possible - so when the technology is actually "there", that's not the case anymore.
@Fiufsciak
@Fiufsciak 6 жыл бұрын
@@OneEyeShadow Lol, nope. They may look seamless to a human eye but not to a software designed to expose fakes.
@swandive46
@swandive46 5 жыл бұрын
Like Trump?
@PureVikingPowers
@PureVikingPowers 5 жыл бұрын
@@swandive46 Is Trump even a real person? 🙄
@allenkennedy99
@allenkennedy99 5 жыл бұрын
That's actually very poor logic.
@yongamer
@yongamer 6 жыл бұрын
This can become so scary.
@YoungBlaze
@YoungBlaze 6 жыл бұрын
Like my exs mother!
@PasscodeAdvance
@PasscodeAdvance 6 жыл бұрын
I agree with the Internet person
@AthosRespecter
@AthosRespecter 6 жыл бұрын
@Throngdorr Mighty lol
@someonesomewhere6289
@someonesomewhere6289 6 жыл бұрын
@Throngdorr Mighty once this technology is developed further (and it will be), it doesn't matter how gullible you are or if you're wise to all the tricks. We be fucked.
@yongamer
@yongamer 6 жыл бұрын
@Throngdorr Mighty The thing is that a significant proportion of people are dumb. And this technology is going to improve. I don't see why a fake video using this technology could not go viral.
@ghdhfgh6125
@ghdhfgh6125 Жыл бұрын
This video was posted 4 years ago. Imagine how hard it is now.
@GIRru11
@GIRru11 4 жыл бұрын
Everyone: This stuff is dangerous and scary! Me: DAME DA NE DAME YO!
@haxxruz6284
@haxxruz6284 4 жыл бұрын
Baka mitai best meme
@denniscuesta7009
@denniscuesta7009 4 жыл бұрын
Putin singing baka mitai is the funniest
@aPandesalboi
@aPandesalboi 4 жыл бұрын
You mean every memer
@bianca.611
@bianca.611 4 жыл бұрын
i can hear and see these videos damn it.
@madjaster9620
@madjaster9620 4 жыл бұрын
I came here from the yanderedev and others singing deepfake lmao
@WholesomeLad
@WholesomeLad 6 жыл бұрын
It's also getting harder to spot a fake deep comment
@everyone9500
@everyone9500 5 жыл бұрын
oof you're here
@npc304
@npc304 5 жыл бұрын
It's also harder to not be a nazi in my book. And just remember, the NPC meme is dehumanizing. We are all unique and special
@JJ-te2pi
@JJ-te2pi 5 жыл бұрын
@@npc304 Youre boring. Dead meme.
@rixille
@rixille 5 жыл бұрын
How do we know who is real and who isn't? Mass confusion is a powerful way to separate society.
@misterrogerroger5537
@misterrogerroger5537 5 жыл бұрын
How can mirrors be real if our eyes aren't real
@syrus1233
@syrus1233 4 жыл бұрын
"Deep fakes gained popularity through adding famous celeberties to porn scenes" Ahh porn, always innovating.
@soda_crackerr
@soda_crackerr 4 жыл бұрын
True *cri* ✊😌
@TheMaster4534
@TheMaster4534 3 жыл бұрын
The Russians have a word for that: компромети́рующий материа́л ("compromising material"), or компромат (kompromat) for short.
@denierdev9723
@denierdev9723 3 жыл бұрын
Always defiling and immoral, too.
@jordanmendoza812
@jordanmendoza812 3 жыл бұрын
@@denierdev9723 your name checks out
@denierdev9723
@denierdev9723 3 жыл бұрын
@@jordanmendoza812 ?
@bignickofficial
@bignickofficial Жыл бұрын
I love how there are millions of issues in the world that need solutions, and instead, we figured out how to be more manipulative. 🤦‍♂️
@spicychipgaming2080
@spicychipgaming2080 4 жыл бұрын
2018: deepfakes are dangerous and could harm other people 2020: hamster sings Japanese game OST
@Starry_Wave
@Starry_Wave 4 жыл бұрын
And Yandere Dev singing to the Big Time Rush opening theme.
@mangovibes2525
@mangovibes2525 4 жыл бұрын
Doom Changer lol I just saw that vid
@user-on8vk5gb6x
@user-on8vk5gb6x 4 жыл бұрын
DAME DAME
@Accidentalreef
@Accidentalreef 4 жыл бұрын
Charles! Man i thought u died! Im happy your back!
4 жыл бұрын
ً Dame da ne, dame yo, dame na no yo. Anotana, sukide sukiiii sukiteee
@aguywithsubs8956
@aguywithsubs8956 6 жыл бұрын
The porn industry is evolving get ready for VR porn
@andatop
@andatop 6 жыл бұрын
Vr porn has been a thing for a decade
@brandonontama2415
@brandonontama2415 6 жыл бұрын
It will get worse; soon it will be anime and video game characters. And then it will be a virtual reality where you can actually... things are really getting weird.
@florianp4627
@florianp4627 6 жыл бұрын
It has existed for a few years now, ever since the Oculus Rift dev kit initially came out
@brandonontama2415
@brandonontama2415 6 жыл бұрын
@@moogreal Crap...
@dirtiestharry6551
@dirtiestharry6551 6 жыл бұрын
I want subs ready player porn
@crispsandchats
@crispsandchats 6 жыл бұрын
remember everybody: just because you can, doesn’t mean you should
@HullsColby
@HullsColby 5 жыл бұрын
I can deep throat a banana. But since you said so maybe I shouldn't.
@erazure.
@erazure. 5 жыл бұрын
Hulls Colby just a single banana? Step your game up, 3 bananas at once or a large cucumber minimum
@missionpupa
@missionpupa 5 жыл бұрын
Cool comment, but do you know how tragic it would be if humans actually followed that and denied everything that makes us human, our curiosity and desire to progress? Scientists and engineers throughout history haven't done things because they should, but because they can. We do things because they are possible, and you can't stop that.
@fruitygarlic3601
@fruitygarlic3601 5 жыл бұрын
@@missionpupa Stop being so pedantic. If something should be done, do it. If not, then don't. Imagine reaching so far you find something to argue about in something that doesn't necessarily disagree with you.
@lol-fh3oq
@lol-fh3oq 5 жыл бұрын
@@missionpupa I mean obviously there's still gonna be people who do it but that doesn't mean it's right.. Lmao, why're you reaching?
@timber8507
@timber8507 2 жыл бұрын
I wonder if this technology has been or will be used in the war going on right now? It's absolutely something to consider when watching media today.
@nocaptainmatt3771
@nocaptainmatt3771 2 жыл бұрын
Of course it is
@anetkasbzk98
@anetkasbzk98 2 жыл бұрын
Bingo. Deep fake raW
@shelby1246
@shelby1246 2 жыл бұрын
This comment aged well… I went watching deepfake videos after hearing about the recent one of Putin.
@AbelMaganaAvalos
@AbelMaganaAvalos 2 жыл бұрын
Zelensky got deepfaked
@timber8507
@timber8507 2 жыл бұрын
@@AbelMaganaAvalos Yeah, I saw that on the news.
@nicoh.1082
@nicoh.1082 4 жыл бұрын
This is terrifying. Imagine Nicolas Cage playing in every movie..
@TheWanderer_99
@TheWanderer_99 4 жыл бұрын
No, its perfection
@Raccon_Detective.
@Raccon_Detective. 4 жыл бұрын
No, it's perfection
@justaneditygangstar
@justaneditygangstar 4 жыл бұрын
No, it's perfection
@Traventity
@Traventity 4 жыл бұрын
No, It's perfection
@migocasty5515
@migocasty5515 4 жыл бұрын
No, its perfection.
@altaica3522
@altaica3522 4 жыл бұрын
People are making fun of this video, but it's only a matter of time till someone uses this for malicious purposes.
@ljoxleyofficial8119
@ljoxleyofficial8119 4 жыл бұрын
They already are
@ljoxleyofficial8119
@ljoxleyofficial8119 4 жыл бұрын
Vincent DiPaolo what does this all mean?
@yimmy7160
@yimmy7160 3 жыл бұрын
You act like this is new and hasn't been done. This "tech" has been around for a while actually
@mirjanapucarevic2105
@mirjanapucarevic2105 3 жыл бұрын
It is very scary how many lives will be destroyed?!
@raoulfr
@raoulfr 6 жыл бұрын
This technology comes from porn...what is humanity evolving into 😂!?
@jaliborc
@jaliborc 6 жыл бұрын
It doesn't come from porn. It comes from academia. The industry is using it after it was developed in academia for general purposes.
@beamboy14526
@beamboy14526 6 жыл бұрын
evolving to create a direct brain-to-computer porn indistinguishable from reality
@WaitingForStorm
@WaitingForStorm 6 жыл бұрын
porn is one of the biggest industries on the planet
@goforit7774
@goforit7774 6 жыл бұрын
porn will be banned like prostitution
@Lucky8s
@Lucky8s 6 жыл бұрын
@@goforit7774 Banned by who exactly?
@sifutophmasterofeyerolling2513
@sifutophmasterofeyerolling2513 Жыл бұрын
It's insane how much the technology has improved in just 4 years; we now have almost perfect TTS voices as well.
@Truthorfib
@Truthorfib Жыл бұрын
It's insane though that the focus was this instead of other things that truly benefit us as a whole. Goes to show where innovation is heading, and it's not toward our collective success, but toward things like misinformation and espionage.
@cardorichard4148
@cardorichard4148 6 жыл бұрын
Trump's new favorite phrase, "Deep fake news." 😂
@ggsay1687
@ggsay1687 6 жыл бұрын
It would be hard to deny if someone put his fase on insane person shouting insults on squer.
@leonscottkennedy3143
@leonscottkennedy3143 6 жыл бұрын
GG SAY *face
@ggsay1687
@ggsay1687 6 жыл бұрын
you missed the "squer", I think I was drunk
@L7vanmatre
@L7vanmatre 6 жыл бұрын
TRUMP LOL HAHA
@ジョジョさま
@ジョジョさま 6 жыл бұрын
You're pathetic.
@zuko1569
@zuko1569 6 жыл бұрын
Shapeshifting reptilians want to know your location
@joshuakoh1291
@joshuakoh1291 6 жыл бұрын
Zuzu "That's fucking rough buddy for me"
@kat_867
@kat_867 6 жыл бұрын
Ones in the whitehouse
@PASTELXENON
@PASTELXENON 6 жыл бұрын
@@kat_867 if youre an idiot
@kat_867
@kat_867 6 жыл бұрын
Pastel Xenon if? Lmao what 😂 just go away.
@sofialaya596
@sofialaya596 6 жыл бұрын
lmao
@keelo-byte
@keelo-byte 5 жыл бұрын
Forget the fake celebrity porn and political tapes, this technology should be used for only one thing... *remixing old school kung-fu movies.*
@caralho5237
@caralho5237 5 жыл бұрын
I imagine Bruce Lee dabbing and doing fortnite dances. Scary.
@xouslic742
@xouslic742 5 жыл бұрын
you mean remaster
@keelo-byte
@keelo-byte 5 жыл бұрын
@@xouslic742 no I meant remix. Sort of like "Kung Pow: Enter the Fist"
@Motorata661
@Motorata661 5 жыл бұрын
Bruce Lee, Jackie Chan, Jet Li, Donnie Yen. Kung-fu battle royale: the movie
@pieterdejager7805
@pieterdejager7805 5 жыл бұрын
Bwahahaha...now ure talking!....
@venmis137
@venmis137 2 жыл бұрын
2018: Deepfake is a terrifying, dangerous technology. 2022: GENGHIS KHAN SINGS SUPER IDOL
@shawnli4746
@shawnli4746 5 жыл бұрын
If this technology evolves, get ready for the dystopia that Orwell predicted, and be ruled by faceless individuals...
@CriticalRoleHighlights
@CriticalRoleHighlights 5 жыл бұрын
This could be something a dystopian government uses when the masses wouldn't know any better _after_ a dystopia has occurred by other means, but dystopia will never occur because of it.
@lilahdog568
@lilahdog568 5 жыл бұрын
CRH our government could begin going after individuals simply by creating evidence in the form of deep fake videos
@nefelibata4190
@nefelibata4190 5 жыл бұрын
what is the point of the videos if you can't tell what is fake and what's not? you would need an expert on the case who is somehow being monitored by another expert and several other people, who have the best or worst intentions for humankind.
@DeeZv1
@DeeZv1 5 жыл бұрын
@@nefelibata4190 we already have people faking their gender
@mutanazublond4391
@mutanazublond4391 4 жыл бұрын
It has evolved, are you stupid, 90 percent of all actors used are non existant with fake backgrounds ... all of the documentaries are fake people ... etc etc
@samswich1493
@samswich1493 4 жыл бұрын
2018: deep fakes are very realistic and dangerous 2020: truck sings dame da ne
@LuxAeterna22878
@LuxAeterna22878 4 жыл бұрын
This is terrifying. One can only hope that equally ingenious methods of security will protect humanity against such powerful tools of deception.
@lil_weasel219
@lil_weasel219 3 жыл бұрын
yes that security is certainly uhm "impartial" eh and would never itself propagate similar things?
@braindavidgilbert3147
@braindavidgilbert3147 Жыл бұрын
I mean we talked the same way about editing at first. Look how it is now😊
@shaunluckham1418
@shaunluckham1418 Жыл бұрын
Simple rule: don't believe anything on television or in online videos. If you don't see it in person, it may or may not be compromised.
@jose-gr7jg
@jose-gr7jg 5 жыл бұрын
So Stan Lee will be able to do all the cameos??
@siyacer
@siyacer 4 жыл бұрын
In endgame
@ssssSTopmotion
@ssssSTopmotion 4 жыл бұрын
What's the point if it's not even him? That's what made it special
@sauceamericano4279
@sauceamericano4279 4 жыл бұрын
Yup
@sauceamericano4279
@sauceamericano4279 4 жыл бұрын
Yep
@luciferexperiment8553
@luciferexperiment8553 5 жыл бұрын
if this stuff has become public, they've been using it for years ...
@honkhonk8009
@honkhonk8009 4 жыл бұрын
The porn industry made it and Google helped expand it with TensorFlow. It's nothing new lmao
@afterthought4627
@afterthought4627 4 жыл бұрын
What you think they been doing?
@Phantogram2
@Phantogram2 4 жыл бұрын
What? You forgot your tinfoil hat.
@Besomar_Surahat
@Besomar_Surahat 4 жыл бұрын
DAME DANE
@carpetchair5778
@carpetchair5778 4 жыл бұрын
Bruh
@AngryGoose
@AngryGoose 4 жыл бұрын
DeepFakes:possibly dangerous Everyone: HAHA yanderedev go damedane
@wengdummy7687
@wengdummy7687 Жыл бұрын
OMG, this thing is not helping. It can be used to destroy someone's reputation.
@HeavymetalHylian
@HeavymetalHylian 6 жыл бұрын
Spread the word. This needs to be on trending.
@yourneighbour5738
@yourneighbour5738 6 жыл бұрын
Yes we need more Nicholas Cage movies
@plsdontreplytomewitharmy5926
@plsdontreplytomewitharmy5926 6 жыл бұрын
that would inspire more people to do it then :/
@Stoned2daBone-r4g
@Stoned2daBone-r4g 6 жыл бұрын
The word's been spread and we've all stayed asleep
@justsomeguys1121
@justsomeguys1121 6 жыл бұрын
HoneyedHylian trending videos are hand picked by KZbin staff
@littleme3597
@littleme3597 2 жыл бұрын
They could keep dead people alive. LIKE biden and that old hag. S.C. person. Make it appear, she is still alive and speaking.
@ramza675
@ramza675 6 жыл бұрын
CNN will make good use of this in the future.
@AndroidsDontDance
@AndroidsDontDance 6 жыл бұрын
As well as fox News
@bigsnugga
@bigsnugga 6 жыл бұрын
Hi I’m moving to America, what news source can I trust?
@ramza675
@ramza675 6 жыл бұрын
@@bigsnugga read from multiple sources, forget the headlines and read into the details. American news is a race to report the story first
@polentusmax6100
@polentusmax6100 6 жыл бұрын
onion news is pretty reliable
@Runslik3Wind
@Runslik3Wind 6 жыл бұрын
Yeah the onion is the only one i trust these days
@user-uj4ip2pt6h
@user-uj4ip2pt6h 6 жыл бұрын
we humans put technology development first and common sense second.
@muhdelyas-abgyas562
@muhdelyas-abgyas562 6 жыл бұрын
Realistic porn first and common sense second
@PasscodeAdvance
@PasscodeAdvance 6 жыл бұрын
Aliens are butter than us (or salter)
@realdeal5712
@realdeal5712 6 жыл бұрын
myownname myownlastname it is common sense to have porn video with your crush face on it
@SuperDanielHUN
@SuperDanielHUN 6 жыл бұрын
Even if 99.9% of the planet prefers sense, there is always that one guy who opposes it and creates a breakthrough as a result (sometimes). Galileo was considered completely insane for stating the earth is round at the time, and any person with "common sense" would have said not to do it, because he'd be killed by the church and because the Bible already says it's flat. Technology both helps and punishes humans, often in unexpected ways; Alfred Nobel wanted to help miners with dynamite, and he created a weapon of mass murder by accident. Common sense is neither universal nor definite; it's rather technology and social changes that twist what's considered "common sense".
@reyxus9454
@reyxus9454 6 жыл бұрын
@@SuperDanielHUN "the bible already says it's flat" wtf are you on about
@double_lightsaber
@double_lightsaber Жыл бұрын
To think this was 4 years ago....
@HarshRajAlwaysfree
@HarshRajAlwaysfree 6 жыл бұрын
*Oh hii mark* Bloomberg trying to be dank
@sKanteii
@sKanteii 6 жыл бұрын
they did naaaaat
@TheMasterOfCornedy
@TheMasterOfCornedy 6 жыл бұрын
*bloomberg being dank
@samonterolanjayp.8229
@samonterolanjayp.8229 4 жыл бұрын
Plot twist: The expert they interviewed is a deepfake edit
@alessandrocarbotti9241
@alessandrocarbotti9241 4 жыл бұрын
I was thinking the same hahahahahah
@bobshmyder9749
@bobshmyder9749 3 жыл бұрын
Deepfake crisis
@jameschiwang
@jameschiwang 3 жыл бұрын
WoW 100Th LikE NOBoDy CareS
@vao5399
@vao5399 5 жыл бұрын
Honestly, I feel like this is saying this is some problem that's going to be hard to control, but it won't be. The scariest part about this is that the bigger news sources are getting desperate and lazy, so they won't fact-check this when it pops up.
@miamarie5426
@miamarie5426 5 жыл бұрын
Simon WoodburyForget how do we authenticate videos when people lie, audio can be faked/cut/manipulated, and pictures can obviously be photoshopped
@ChadDidNothingWrong
@ChadDidNothingWrong 4 жыл бұрын
@@miamarie5426 a deepfake leaves specific, pixel-level traces that Bloomberg here "forgot" to mention... but the audio is absolutely full of clues that no deepfake can come close to fixing.
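The "pixel-level traces" claim lines up with published detection work: pipelines that generate or upsample faces tend to leave statistical fingerprints, and one family of detectors looks for them in the image's frequency spectrum. The snippet below is a small NumPy illustration of that idea (an azimuthally averaged power spectrum), not a reliable detector; the synthetic "real" and "fake" images and the frequency cutoff are stand-ins made up for the demo.

```python
import numpy as np

def radial_power_spectrum(gray: np.ndarray, n_bins: int = 64) -> np.ndarray:
    """Azimuthally averaged power spectrum: energy per spatial-frequency band."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
    h, w = gray.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h / 2, x - w / 2)
    bins = np.linspace(0, r.max(), n_bins + 1)
    idx = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
    energy = np.bincount(idx, weights=power.ravel(), minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    return energy / np.maximum(counts, 1)

rng = np.random.default_rng(0)

# Stand-in "camera" image: smooth, low-frequency-dominated content.
real = np.cumsum(np.cumsum(rng.normal(size=(128, 128)), axis=0), axis=1)
real = (real - real.min()) / (real.max() - real.min())

# Stand-in "generated" image: the same content crudely upsampled 2x, which
# injects the kind of regular high-frequency artifact such detectors look for.
fake = np.kron(real[::2, ::2], np.ones((2, 2)))

real_spec = radial_power_spectrum(real)
fake_spec = radial_power_spectrum(fake)

hi = slice(3 * len(real_spec) // 4, None)  # top quarter of spatial frequencies
print("high-frequency energy, real stand-in:", real_spec[hi].sum())
print("high-frequency energy, fake stand-in:", fake_spec[hi].sum())
```

In practice, detectors of this kind are trained classifiers rather than a single threshold, and they tend to degrade once videos are re-compressed, which is part of why the "it's easy to spot" claim should be taken with some salt.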
@tony_5156
@tony_5156 4 жыл бұрын
We should have a big UN meeting and outright ban it, with harsh punishment and tough jail time. Yup, no bail for you, buddy, you're going straight to jail.
@honkhonk8009
@honkhonk8009 4 жыл бұрын
True. In courts this won't even be an issue, but with our already lazy and retarded media, they're not even going to fact-check and they're going to treat it like proof. Won't be the first time.
@shannonjaensch3705
@shannonjaensch3705 Жыл бұрын
Even sadder is that most people never fact-check the news they hear or anything they are told by anyone. Lazy brains that are just consumed with trying to stay alive and stay in a state of low-consciousness physical comfort.
@brianorozco1074
@brianorozco1074 Жыл бұрын
Honestly, this is terrifying
@geometrikselfelsefesi
@geometrikselfelsefesi Жыл бұрын
And that was 4 years ago
@confusedwhale
@confusedwhale 6 жыл бұрын
It's true that it's getting harder to tell, but there is still something wrong with the robot face images. Long live the uncanny valley.
@themelonn6313
@themelonn6313 6 жыл бұрын
confusedwhale wow this will actually help us combat this lol. imagine an expert.
@David-gp3fd
@David-gp3fd Жыл бұрын
nah this is a foolish, short-sighted perspective... the human body has its limits and would have to evolve to keep up. Unfortunately tech is evolving way faster than humans
@awesome117unsc
@awesome117unsc 6 жыл бұрын
Making memes danker than ever.
@derekg5006
@derekg5006 4 жыл бұрын
2018: Deepfakes are dangerous! 2020: JFK talks about Rick and Morty
@Lichwizard
@Lichwizard Жыл бұрын
This was 4 years ago.
@haleyanne447
@haleyanne447 6 жыл бұрын
Who else didn’t read the title and thought the thumbnail was a spot the difference lmao
@rainbowsixsiege3040
@rainbowsixsiege3040 5 жыл бұрын
Haley Anne ur mom
@cazulon1122
@cazulon1122 5 жыл бұрын
rainbow six siege ur dad
@ComradeHellas
@ComradeHellas 5 жыл бұрын
Would it matter?
@JeremyBX
@JeremyBX 4 жыл бұрын
“Fake news on steroids” I really like that summarization
@tablespoon1277
@tablespoon1277 5 жыл бұрын
Classic representation of humans outsmarting themselves, ultimately leading to their own demise :)
@hasanbassari7364
@hasanbassari7364 4 жыл бұрын
ok boomer
@keyboardcorrector2340
@keyboardcorrector2340 4 жыл бұрын
Yet we're still here...
@erik6473
@erik6473 4 жыл бұрын
@@hasanbassari7364 😂😂👌
@indianapaullyj
@indianapaullyj 4 жыл бұрын
Yes indeed
@jaconova
@jaconova 4 жыл бұрын
@Johnny Moris What can you expect of people that go to concerts to deny the real experience by recording everything with their phone.
@xtechn9cianx
@xtechn9cianx 2 жыл бұрын
Imagine if the world leaders are using this on Putin right now
@anetkasbzk98
@anetkasbzk98 2 жыл бұрын
Bingo
@funnypeoplefail19
@funnypeoplefail19 2 жыл бұрын
Maybe Putin is already dead?
@mrmomokar
@mrmomokar 5 жыл бұрын
This is scary. Somehow, I really feel uneasy and disgusted when I see one because it looks really off, artificial yet it’s really convincing.
@DJSbros
@DJSbros 5 жыл бұрын
This may be the downfall of our particular civilization.
@skeetermcswagger0U812
@skeetermcswagger0U812 2 жыл бұрын
Ever since I became aware of this technology, I've had a really uncomfortable feeling about it. I knew it could be one of those technological 'superpowers' that wouldn't be safe unless there was a clear and adequate way to penalize how it could be used. It is, in a way, an ability to pirate and clone some forms of reality. Although there still seem to be some perceivable tells during the beginning stages of its development that may be obvious to many and not just those with a 'trained eye', who knows at what point it's going to be capable of making things indistinguishable to most if not all viewers? Do they 'have to' disclose this information? Who knows if some of the examples with flaws that are more readily obvious to be fake aren't just being used to distract viewers from the more capable versions of this technology already? Great... now I sound like a crazy person even to myself!🤦‍♂️
@aliceslab
@aliceslab Жыл бұрын
its not crazy, the future will get more complex, and it is harder to control complexity than the simplicity of our origins.
@ES11777
@ES11777 11 ай бұрын
No, you are just smart and looking at it from all angles.
@DjHardstyler
@DjHardstyler Жыл бұрын
This aged well...already
@kylecollins5463
@kylecollins5463 4 жыл бұрын
Just imagine a world where a deepfake showing a leader saying he's pushed the nuclear button can be shared millions of times
@hwlz9028
@hwlz9028 4 жыл бұрын
Yep
@sorrymyenglishbad2535
@sorrymyenglishbad2535 4 жыл бұрын
Gotta be more subtle for more chaos.
@youtubespy9473
@youtubespy9473 4 жыл бұрын
Lol, why would anybody blow up their own world unless they were suicidal.
@pit2992
@pit2992 4 жыл бұрын
@@youtubespy9473 There are people who have nothing to lose.
@youtubespy9473
@youtubespy9473 4 жыл бұрын
@@pit2992 I said that "suicidal"
@kimcarrier9834
@kimcarrier9834 5 жыл бұрын
It's not even harder. People would just believe what they want to believe, no matter how fake it is.
@muna2454
@muna2454 4 жыл бұрын
2018: THIS IS DANGEROUS 2020: It’s M to the B, it’s M to the B, it’s MMMM to the B
@coralevy-yo8dh
@coralevy-yo8dh 11 ай бұрын
So many warnings in the forms of movies, books, tv series, video games etc showing us why this is dangerous. We never listen.
@CarterLundy10
@CarterLundy10 Жыл бұрын
It’s crazy that this was 5 years ago. It’s just an every day thing to see these deep fake videos now.
@viktorthevictor6240
@viktorthevictor6240 6 жыл бұрын
Is it bad that this actually concerns me?
@boa9557
@boa9557 5 жыл бұрын
no
@obligatoryusername7239
@obligatoryusername7239 5 жыл бұрын
Viktor the victor, anyone who isn't concerned has lasting brain damage.
@kirkclarke7396
@kirkclarke7396 11 ай бұрын
Just imagine how many people would believe the world is flat if someone made loads of deep fake videos of famous people claiming earth is flat
@Jordyesscoto
@Jordyesscoto 5 жыл бұрын
This popped out in my recommendations right after seeing Shane’s new video lol
@say12033
@say12033 Жыл бұрын
It's 2023 and now everyone can make deep fakes on their phone
@cutiemalu3602
@cutiemalu3602 5 жыл бұрын
It's crazy how some people will literally put someone's face onto a porn star's; that can affect the person's life and even ruin it
@nnggghhaa3709
@nnggghhaa3709 5 жыл бұрын
you need to be able to recognise fake from real.
@cutiemalu3602
@cutiemalu3602 5 жыл бұрын
@@nnggghhaa3709 yeah but some people believe in everything they see on social media
@cutiemalu3602
@cutiemalu3602 5 жыл бұрын
@jon kon some people would even commit suicide cause of that, and you think it's aWesOme... People like you have a special place, hell
@PhillipAmthor
@PhillipAmthor 5 жыл бұрын
Imagine putin and nicolas cage in a bbc porn
@naiknaik8812
@naiknaik8812 5 жыл бұрын
@@PhillipAmthor uh
@vogahl34
@vogahl34 10 ай бұрын
Soon the justice system won't be able to use CCTV to incriminate offenders; soon any evidence will be questioned and we'll be left with an utterly defenceless society.
@rachelt4792
@rachelt4792 5 жыл бұрын
I’ve always lowkey worried about the day we wouldn’t be able to differentiate real videos and fake ones
@Booooogie_
@Booooogie_ 5 жыл бұрын
Imagine all the fuckin ghost videos
@TommyDavidVerbal
@TommyDavidVerbal Жыл бұрын
Damar Hamlin's people just used it today
@9nikola
@9nikola 6 жыл бұрын
The best part about this is that if someone really regrets a video they made, they can just pretend it was a fake.
@Gurci28
@Gurci28 Жыл бұрын
It is very easy to make an AI-generated image, but it is also easy to spot such images through careful observation and critical analysis. Recently, an image of US President Joe Biden and former president Barack Obama dressed in a Barbie-inspired outfit was widely shared on the internet. August 13, 2023 Written by Ankita Deshkar 0:40 [Indian Express]
@Gurci28
@Gurci28 Жыл бұрын
Bing Image Creator is the best overall AI image generator due to it being powered by OpenAI's latest DALL-E technology. Like DALL-E 2, Bing Image Creator combines accuracy, speed, and cost-effectiveness and can generate high-quality images in just a matter of seconds. Aug 3, 2023 [ZDNet] 1:31
@cubycube9924
@cubycube9924 Жыл бұрын
It’s been 4 years... I wonder what’s going on now...
@stephanetheard6043
@stephanetheard6043 5 жыл бұрын
This technology could be way too dangerous.
@denniscuesta7009
@denniscuesta7009 4 жыл бұрын
Ahhh yes, a so called "dangerous invention" being widely used for harmless memes
@veryoriginalname7932
@veryoriginalname7932 4 жыл бұрын
Media...
@iconic410
@iconic410 4 жыл бұрын
When it was in development it certainly seemed dangerous, and it still is, but the internet, having such a high IQ, uses it for memes. *clears throat* DAME DANE
@studiousboy644
@studiousboy644 4 жыл бұрын
It's just overhyped cuz the fat asses of internet got nothing to do.
@Modernhabitus
@Modernhabitus 3 жыл бұрын
It can be
@leigh5689
@leigh5689 3 жыл бұрын
He forgot to mention the part about how older people are being scammed using these deep fakes
@intreoo
@intreoo Жыл бұрын
This is making me more and more paranoid about showing my face online. Not that I ever did though.
@SomeTrippyCanadian
@SomeTrippyCanadian Жыл бұрын
Sheesh this was 4 years ago! Just popped up on my feed. 2023 and it’s just getting crazier
@creatchure
@creatchure 4 жыл бұрын
I was shown this deepfake thing by a video called, "YandereDev killed the radio star"
@cllncl
@cllncl 4 жыл бұрын
"Fake videos are also difficult to detect" Everyone with more than 5 IQ and at least a 144p eye setting: *laughs in human*
@adorenu1338
@adorenu1338 4 жыл бұрын
2019 : deepfake is terrifying, people can use it for bad thing 2020 : dame dame dameyo dame da no yo
@mach1553
@mach1553 2 жыл бұрын
I think I'll copyright my face, so no one can steal it.
@Pituzer
@Pituzer 5 жыл бұрын
I feel like during this whole video I was waiting for the introduction to end and the actual content of the video to start, and then the video was over all of a sudden.
@thefirebeanie5481
@thefirebeanie5481 Жыл бұрын
Well, this was always inevitable. This aged like fine wine
@kennymavanga4539
@kennymavanga4539 2 жыл бұрын
who's here after Kendrick's video? The ❤️
@HTMLpopper
@HTMLpopper Жыл бұрын
They warned us about this 4 YEARS AGO