What are deepfakes and are they dangerous? | Start Here

  315,812 views

Al Jazeera English

1 day ago

Comments: 465
@excelsior31107 3 years ago
This robust technology is a great way to destroy people's reputations.
@amerlad 3 years ago
Among other things, such as fabricating evidence, disproving evidence, and so many more extremely dangerous uses. You don't like a new government policy? Oh look! We have a video of you having intercourse.
@RiversBliss 3 years ago
It's not new.
@dansierrasam79 3 years ago
@amerlad Well said! Not only government policy, but particularly disagreements over religion and race, or people who want to discredit you in some way to suit their agenda. It's very easy. Still, people close to you usually know you better, which is why these deepfake videos fail to hold up over time.
@aminbinsalim1995 3 years ago
/s
@aminbinsalim1995 3 years ago
@amerlad :(
@fahdjamy 3 years ago
Appreciate the fact that "Sandra looks and sounds better than J-Lo."
@danielwilson9342 3 years ago
Unimportant and unnecessary take
@SubliminalMessagesTV 3 years ago
Ayyyyeeee
@SubliminalMessagesTV 3 years ago
@danielwilson9342 u right but shut up
@klarag7059 3 years ago
Totally agree.
@klarag7059 3 years ago
@jaep struiksma Not from my perspective. The presenter looks more beautiful because she looks more natural than the overly made-up, "fictitious" image of beauty. The reporter is more real and relatable because of her more natural look.
@hoboryan3455 3 years ago
LMAO! The beginning actually had me. I was like "WTF," and then I remembered what the topic was xD
@TheSenzerx 3 years ago
Same here 🤪
@amenjamal8454 3 years ago
@TheSenzerx I thought it was a YouTube ad for some Jennifer Lopez beauty product.
@TheSenzerx 3 years ago
@amenjamal8454 lol
@frfarahrahman 3 years ago
Same... 😂😂😂
@salmanramzan2032 3 years ago
Welcome to the Age of Deceptions.
@samdacosta4676 3 years ago
I read that first as DECEPTICONS
@rogeramezquita5685 3 years ago
Facts
@filhanislamictv8712 3 years ago
@samdacosta4676 You need to unload movies from your mind.
@samdacosta4676 3 years ago
@filhanislamictv8712 ha ....ikr
@furrycheetah 3 years ago
@filhanislamictv8712 it is related
@florence8532 3 years ago
Informative, yet chilling enough to make people think twice before posting pictures of themselves.
@PainfulGrowth 2 years ago
Celebs are gonna be in danger lol, so easy to make a scandal
@Scarshadow666 1 year ago
@PainfulGrowth Considering how often people put images of themselves online (and even if they don't, there are lots of people who will intentionally look for pictures of them to post online), I don't think it's just celebrities who are going to be in danger... 0_0
@Scarshadow666 1 year ago
Considering how well social media like TikTok, YouTube, and Instagram take off because people post images of themselves, I doubt it'll hinder people unless they educate themselves about the dangers of deepfakes. 0_0
@xja85mac 3 years ago
Wonder why she took her earrings off? It turns out that earrings or eyeglasses make it harder for the algorithm to isolate your face.
@itistrueitisafact5432 3 years ago
Technology has advantages and disadvantages, and this is one of them. May Allah save us from all evil people, amen.
@y.r5155 3 years ago
Amin. But deepfakes can be recognized; there are apps people use to verify whether a video was generated.
@KatariaGujjar 3 years ago
@y.r5155 So what, are you gonna scan and check every video for a possible deepfake?
@y.r5155 3 years ago
@KatariaGujjar No, I'm talking about a celebrity or government official or someone known. I'm a software engineer; I know how to create one and how to tell it's a deepfake.
@KatariaGujjar 3 years ago
@y.r5155 Celebrities and officials make thousands of videos daily. Who is going to check every single clip?
@ADeeSHUPA 3 years ago
@KatariaGujjar Are you a Jew or Zoroastrian
@tauriqabdullah6130 3 years ago
I thought the J-Lo intro was an ad.
@Mazzie2022 3 years ago
I actually don't think she looked like Jennifer Lopez. In fact, when I first watched it the sound was muted and I just thought she had the same name as Jennifer Lopez. But I will agree this is very dangerous, and quite sick actually.
@andikoazri 3 years ago
She only did it as an example... There's someone out there who can make a deepfake looking 100% like the real celebrity!
@nusaibahibraheem8183 3 years ago
They probably did it very quickly, just as an example.
@andikoazri 3 years ago
@nusaibahibraheem8183 Exactly...
@ousman997 3 years ago
This lady is amazing; your scripts are just on point.
@rajinrashid2455 3 years ago
This series is actually pretty good
@zx7siovia213 3 years ago
Fun fact: Sandra looks better than J-Lo 😂
@blaze4158 3 years ago
So what? J-Lo can sing, dance, act, choreograph, and she's got a better body. A woman's physical appearance shouldn't matter that much to you or anyone else. Stop comparing us like objects. It creates a competitive sense between females, and we should no longer allow males to do this to us.
@loveshell007 3 years ago
Not a fact, just an opinion
@blaze4158 3 years ago
@loveshell007 Who is your comment directed to? You should know enough to be specific about whom you are addressing.
@MuhammadShahAlamSaqibi 3 years ago
That was the best session, in my opinion, post-pandemic. Especially that face-changing Sandra.
@sindhujasai1345 3 years ago
Before anything horrible happens, I hope a global policy is created to protect those who are used in deepfake material, which could involve cyber police, maybe. Edit: Honestly, it's already starting to get out of hand, but the sooner the better.
@KatyYoder-cq1kc 7 months ago
Too late, it has and is
@nickdupreez1843 3 years ago
Thank you! This is a very good overview of deepfake technology. The only thing you didn't mention is that realistic deepfakes are trained on huge datasets of images (10,000s+), like the Tom Cruise deepfakes, where they have hours of footage with a huge range of facial expressions, and you need to map the face onto someone with a similar facial structure to achieve realistic results.
@DarkPesco 2 years ago
@B A I agree. The commenter seemed to be suggesting it's not much of a threat to anyone other than celebrities with vast amounts of footage, while ignoring the fact that modern phones and social media have driven large segments of the population to create a comparable amount of footage of themselves and post it all online. The commenter doesn't use personal social media like FB? Doesn't have friends on SM, so he doesn't know?
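The training setup this thread describes (a shared encoder that learns pose and expression, plus one decoder per identity, with the swap done by crossing them) can be sketched as a toy. All dimensions, weights, and values below are made up for illustration; real trainers fit these maps to tens of thousands of aligned face crops:

```python
import random

random.seed(0)

DIM = 4      # toy "face" vector; real inputs are e.g. 256x256x3 images
LATENT = 2   # toy latent size shared by both identities

def rand_mat(rows, cols):
    # Stand-in for learned weights; real weights come from training.
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def apply(mat, vec):
    # Plain matrix-vector product.
    return [sum(m * v for m, v in zip(row, vec)) for row in mat]

encoder = rand_mat(LATENT, DIM)    # shared: captures pose/expression
decoder_a = rand_mat(DIM, LATENT)  # renders identity A
decoder_b = rand_mat(DIM, LATENT)  # renders identity B

def swap(face_a):
    # The deepfake trick: encode a frame of A, decode with B's decoder,
    # so B's face comes out wearing A's expression.
    return apply(decoder_b, apply(encoder, face_a))

frame_of_a = [0.5, -0.2, 0.1, 0.9]
fake_frame = swap(frame_of_a)
print(len(fake_frame))  # -> 4, same shape as the input frame
```

The comment's point about needing a similar facial structure shows up here as the requirement that both decoders share one latent space: the more alike the two faces, the better a single shared encoding can serve both.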
@dorcasnjeri2858 3 years ago
It's very dangerous because it's destroying people's reputations, dignity, and careers, and creating depression.
@gokulpayyanur1839 3 years ago
When you also consider the fact that we all carry a small camera to record things, the world is getting creepier by the minute.
@EmpressTouch 3 years ago
Yes. And this video only highlights visuals. Sound and acoustics are developing very fast too.
@Head_of_the_Table2.0 3 years ago
Al Jazeera is definitely a great example of it
@badripaudel77 3 years ago
It's people's moral character. There are always people ready to misuse something 🙄
@cinto1394 3 years ago
Wow, thanks for raising these concerns!
@NINJANOOB777 2 years ago
Yo, her last words about rocks sounded like the hood. I love it lol
@izzatfauzimustafa6535 3 years ago
Marvel is creating deepfakes of Tom Hiddleston using Loki.
@mahmudulhaidersiyam3186 3 years ago
The opening was just 🤣🤣🤣🤣
@paklah245 3 years ago
Fitnah coming soon
@hareemshk9904 3 years ago
Dajjal coming soon
@baekhyunsbambi6978 3 years ago
It's already here... we have to be more careful.
@filhanislamictv8712 3 years ago
Too late, it is coming soon...
@leylayetmez 3 years ago
Allah is coming
@jumambugah3946 3 years ago
He's here
@sultanrayder 3 years ago
Sandra! Thank you so much 🧡
@saajidalikhan 3 years ago
I was trying to find the skip-ad button at the beginning, thinking it was an Omaze ad or something.
@MuhammadAbdullah-dy5dn 3 years ago
That is really interesting to know. Technology has changed everything. I hope a law will be made to identify the culprits behind it.
@azzyfreeman 3 years ago
The same algorithms used to detect deepfakes can be used to train better deepfake networks
@DarkPesco 2 years ago
A sick cycle...
@andym6603 3 years ago
Top-notch investigative journalism
@IbrahimAli-vv3df 3 years ago
I was so annoyed that Sandra was substituted. Lol.
@OctavioBecerril1 7 months ago
Anchorwoman ❤
@nawazsharif4634 3 years ago
Sandra is my favorite journalist
@mirygalas6508 3 years ago
Education, education, education. People who can think critically and exercise a healthy level of skepticism are difficult to deceive. New technology, old solutions.
@thisismyloooveeeyy8014 2 years ago
All we need is knowledge, and to stay informed about what technology can do.
@slipknotj2581 3 years ago
The only video of Al Jazeera's I've respected
@MagicMattHawkins 1 year ago
What if we found a way to, like, "watermark" a video 💦 The future will eventually depend on markings to prove the ❤legitimacy❤ of media.
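One concrete (and hypothetical) version of the watermark idea above: the publisher signs the exact video bytes with a secret key, so any later edit fails verification. This is an HMAC integrity check, not a robust perceptual watermark; merely re-encoding the file would break the signature. The key and byte strings below are placeholders.

```python
import hashlib
import hmac

SECRET_KEY = b"publisher-signing-key"  # hypothetical shared secret

def sign(video_bytes):
    # Produce a tag bound to both the key and the exact file contents.
    return hmac.new(SECRET_KEY, video_bytes, hashlib.sha256).hexdigest()

def verify(video_bytes, signature):
    # Constant-time comparison; True only if the bytes are unmodified.
    return hmac.compare_digest(sign(video_bytes), signature)

original = b"...raw video stream..."
tag = sign(original)
print(verify(original, tag))            # -> True
print(verify(b"tampered stream", tag))  # -> False
```

Schemes being explored in practice (e.g. signed provenance metadata attached at the camera or editing tool) follow the same principle, but have to survive transcoding and cropping, which a raw byte signature does not.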
@gandhi1945 3 years ago
This tech will help protect the elites
@birdsarecool6448 3 years ago
This is extremely alarming.
@LivesInReality 3 years ago
*Very*
@sunny-pe6yt 1 year ago
Very informative videos, good work 👍
@rimshakhan9751 3 years ago
The intro really got me! 😂
@shabeebkaringappara2917 2 years ago
The opening... Nailed it 🤣
@emadabuhagag222 2 years ago
Thanks
@Gustoking37 2 years ago
Correction
@arvailankara 3 years ago
Sandra has extraordinary grace and gravitas
@haideralisuterwala9403 3 years ago
Great informative video, thanks. Keep going.
@moatazgamal34 3 years ago
Let's just appreciate all the hard work that went into making such a video. It seems simple, but it surely took a lot of effort.
@williamsjones4139 3 years ago
Investing in crypto is a more lucrative way of making money
@greywoods3412 3 years ago
Absolutely right. I've got 70% of my total portfolio in crypto, and I have been making good profits.
@brownjackie9105 3 years ago
I wanted to trade crypto but got confused by the fluctuations in price
@amandawilson7329 3 years ago
I heard that his strategies are really good
@andrewmasscot2955 3 years ago
Yeah, my first investment with Mr Tony Gallippi made me profits of over $24,320 US dollars, and ever since then he has been delivering
@andrewpeterson7726 3 years ago
Wow, you know Tony Gallippi
@havetrustissue8975 3 years ago
This is the technology the CIA uses, I guess.
@jamdindali 3 years ago
Pro-deepfake and deepfake apologists are a problem. Mark my words.
@ishaqueahmed6362 2 years ago
Thanks a lot for the informative videos, and please upload your videos on time.
@letarvisjohnson5337 2 years ago
They are absolutely dangerous, because people always believe most of what they see.
@michelefortner1190 3 years ago
Very interesting but terrifying at the same time. I wouldn't want that to ever happen to me or my friends and family. This isn't good; there will be so many problems with this.
@maverickbourne2.0rph. 1 year ago
💯😩💯
@joycejeong-x4b 10 months ago
Engaging in open discussions about deepfakes is essential for raising awareness and building resilience. By fostering a culture of transparency and accountability, we can collectively navigate the challenges posed by deepfake technology, mitigating its negative impact on individuals and society.
@shahidchoudhary9795 2 years ago
Welcome 💝💖🌹💞
@SubliminalMessagesTV 3 years ago
Yet the funny thing about this segment is that this information has been widely known for the last several years, the technology has only improved, and there's a slight possibility it has already been used, with actual results, in modern media settings.
@wonderfacts7782 3 years ago
You are rocking it, Sandra ❤️
@empmachine 1 year ago
She's totally got Lopez beat on beauty (especially class)
@ShahbazAli-ji3jq 3 years ago
There is a woman in Canada who has a YouTube channel; her name is Jasmine, and she is a doppelganger of Sandra. The name of the channel is Jasmine and Dawoud.
@dezzelmoney 3 years ago
Wtf, I tripped out when she said "I'm Jennifer Lopez" 😆 She is a pretty reporter though...
@oasis5683 3 years ago
I thought it was J-Lo for a second at the beginning of the video; then I realized it was a deepfake. Hahahaha
@JAREDPLY1 3 years ago
Everything on Instagram is a deepfake, but we still click like anyway. BTW, Sandra over J-Lo any day.
@LivesInReality 3 years ago
*Read the Qur'an Translation*
@davidalao5336 2 years ago
Informative
@webdecodedwithfahad4414 3 years ago
Genius editing 👌
@Recuper8 3 years ago
The audio on this is brutal.
@kristakaufman3593 2 years ago
So what APPS did you use?
@VAUIENLET 2 years ago
I suggest these things should be used for video game entertainment purposes, not for harm or conflict. And we can make deepfakes in such a way that they look real while we still know they're deepfakes.
@Salman-qd2wl 3 years ago
Very informative video! BTW, you look absolutely sober and beautiful!
@Farah_Gojali07 3 years ago
Until 3:40 it was fun, but the whole scenario changed after that... It is actually terrifying!!
@krateproductions4872 3 years ago
Rule 34 of the internet: if something exists on the internet, its NSFW version already exists.
@fahmidafaiza8207 3 years ago
Jin😀
@Farah_Gojali07 3 years ago
@fahmidafaiza8207 yess!😍😍💜💜
@SunnySJamil 3 years ago
At 2:16, the genius of the software user trumps that of its developer.
@Hypocrisy.Allergic 10 months ago
facts
@knowledgeispowerchannel7734 3 years ago
Matthew 24: "Many will come in my name and deceive many" - sounds like deepfakes
@VicpaCS2 3 years ago
Check this out, bro: the Antichrist, or as we call him in Islam, the Dajjal, is known as the liar and deceiver, and the Qur'an and Hadith say that in the times when he comes, people will not be able to tell truth from falsehood and vice versa... Looks like the world is getting ready for his coming...
@Amaaaaan1 3 years ago
Make laws to watermark deepfake videos or face prosecution!
@fivetimesyo 3 years ago
Sandra looks good and she knows it
@osiasnocum7239 3 years ago
Yes, very dangerous stuff..!!
@halalpolice23 3 years ago
Audhubillah, at first I was confused 🤷‍♀️😂
@deepmindt2811 3 years ago
I'm going to hack my national bank boss's face
@Aslaan1 3 years ago
It's very hard to spot the fake ones, so be vigilant and prudent.
@aperson2730 3 years ago
Yes is the answer to the video title
@ahiyanali7231 3 years ago
At 4:14, that music made me think my stomach was rumbling
@BillyHau 3 years ago
I want to say... there is a reason why DeepFaceLab doesn't keep updating the preview image every second: that slows down the training process!
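The trade-off this comment points at (rendering a preview steals time that could go into optimization steps) is why trainers typically refresh the preview only every N iterations. A minimal sketch of that pattern, with an arbitrary interval; DeepFaceLab's actual loop differs:

```python
PREVIEW_EVERY = 100  # iterations between preview renders (arbitrary choice)

def training_loop(total_iters):
    # Counts how many previews a run of `total_iters` steps would render.
    previews = 0
    for step in range(1, total_iters + 1):
        # ... one optimization step would happen here ...
        if step % PREVIEW_EVERY == 0:
            previews += 1  # render/save the preview image only now
    return previews

print(training_loop(1000))  # -> 10
```

With this pattern, a 1,000-iteration run pays the preview cost 10 times instead of 1,000, which is the speedup the commenter is describing.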
@Farah_Gojali07 3 years ago
I literally believed she was Jennifer Lopez!!
@MrBlank-rs7xq 3 years ago
My god... it's many times more harmful than a nuclear bomb...
@CSSWITHRIDAEZAINAB 3 years ago
Kindly make a video on digital currency.
@ikkmic451 3 years ago
I love the show..... very informative
@TvGunslingeRvT 3 years ago
Nope
@infamousoneout9327 2 years ago
In my opinion, the Nancy Pelosi video wasn't much different 🤣 She still looked and acted drunk lol.
@SAWS 3 years ago
Oh no - you should have picked Jennifer Aniston. She has the same jaw and face structure, which would have made the deepfake perfect!
@user-uc4iv3jx3v 3 years ago
Wow, this show is awesome
@shriragreddy7193 3 years ago
Not going to lie, she had me in the first half.
@dumanimjo609 3 years ago
Funny enough, the only people I'm afraid of regarding this technology are the government. Who knows what kind of devious plans they'll be able to pull off because of this tech.
@tjmarx 3 years ago
lol @ "if it makes you feel a strong emotion, either really, really good or very mad, take an extra second to check if it's real". Yeah, because people experiencing strong emotions are definitely using logic in that moment and will think to check frame by frame for artifacts and ghosting before they join a bandwagon. lol. The rule in the '90s was: don't believe anything you see on the internet; you don't know what is and isn't fake. It continued to be the rule in the early 2000s too. Then suddenly, somewhere around 2010, people lost their minds and forgot that the internet is full of misinformation and fakery. If we just went back to the original rule, deepfakes online wouldn't pose a problem to anyone. Hopefully deepfakes might also encourage people to care about their data: who has their voiceprint, and who has access to their photos. Maybe they'll think twice about using that Russian novelty face-swap app, or letting a major company or the government just have their voiceprint for "security purposes".
@fredkerfwappie8380 2 years ago
Best comment yet.
@mattangelodolorzo8761 3 years ago
New good, new evil.
@juankitchen1008 1 year ago
I subscribed to your YouTube channel after watching this video. Very informative and timely!
@01arthi 3 years ago
The last bit was a killer 😀
@OceanWorrier 1 year ago
So AI can create these realistic videos, but you need a dice and a weight to press the letter P 😂
@axethrowing1801 2 years ago
Nothing good can come from this.
@Counselor23_Nov 2 years ago
Hahaha... I was amazed and dejected that the host got changed.
@mustafa8988 3 years ago
Sandra is so beautiful, MashAllah. I always keep her in my prayers.
@FarzTurk 3 years ago
Absolutely frightening