Why aren't there more channels like this? Really enjoying the script/structure, and the research is always on point. Neutral and decorated with nice lil animations :)
@lmaolmfao3611 5 years ago
Check out America Uncovered for another similar great channel
@annikameyer7574 5 years ago
It ain't neutral, but then neither is almost any other media source.
@WowTrevor10 5 years ago
One thing I'd like to weigh in on: the idea that facial recognition is poor at identifying black people's faces because of racism is misguided. Image processing depends on the information captured in the camera image that comes in. Computers are very bad at discerning patterns in low contrast, and having darker skin naturally lowers the contrast of the image of your face. Shadows show up less, and ridge lines are obscured by the skin color. The only way to get around this issue is to shine a very bright light onto the face of someone with darker skin, which both doesn't work for security cameras and is obnoxious for the photo subject. Is this a good or bad thing? I think neither; it's just the nature of how the system works and isn't the product of human intent.
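For anyone who'd rather see the contrast point than take it on faith, here's a minimal sketch (assuming NumPy and OpenCV are installed; the shaded blob is just a stand-in for a face, and the reflectance values are arbitrary, not real skin measurements) showing that the image gradients an edge-based matcher feeds on shrink as surface reflectance drops:

```python
# Toy demo: same geometry and lighting, lower surface reflectance (albedo)
# -> proportionally weaker gradients in the 2-D camera image.
import numpy as np
import cv2

yy, xx = np.mgrid[0:200, 0:200]
# A smooth bright blob standing in for a lit face against a dark background.
shading = np.clip(1.0 - ((xx - 100) ** 2 + (yy - 100) ** 2) / 100.0 ** 2, 0, 1)

for albedo in (1.0, 0.5, 0.25):  # lighter vs. darker surface reflectance
    img = (255 * albedo * shading).astype(np.uint8)
    gx = cv2.Sobel(img, cv2.CV_64F, 1, 0)   # horizontal intensity gradients
    gy = cv2.Sobel(img, cv2.CV_64F, 0, 1)   # vertical intensity gradients
    contrast = np.mean(np.hypot(gx, gy))    # mean gradient magnitude
    print(f"albedo {albedo:.2f} -> mean gradient magnitude {contrast:.2f}")
```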
@Q_QQ_Q 5 years ago
As technology advances, it will overcome this. He gave Indian examples of orphan children, and Indians are not white.
@ArthurLarin 5 years ago
What you're saying is true, but it's not the point of Lou's argument. It shouldn't matter whether improving FRT's performance on people of color is more complicated technologically or not. The idea is that groups of mostly white men design and release products that will undoubtedly be used by the masses without going to the trouble of improving them to the point that they work for everyone.
@Q_QQ_Q 5 years ago
@Arthur Larin I don't think it's about colour; it's more that FRT has been trained mostly on white people, and as it gets trained on more people it will identify everyone. It's ever improving.
@iuscoandrei17 5 years ago
So what you're trying to say is that we should wear black masks in order to protect ourselves. Fair enough. As somebody said in the comments above: "it's time to make wearing masks fashionable".
@Baconomics101 5 years ago
Time to start making wearing masks in public fashionable
@sophieszobonya3175 5 years ago
I wonder if the medical masks people wear in some parts of Asia would work.
@OVXX666 5 years ago
Woah, that's what I was thinking.
@antoniocarniero5138 5 years ago
@@sophieszobonya3175 They don't, I'm sorry to say. Facebook has already been perfecting algorithms to detect people wearing scarves over their faces or blocking their faces with their hands. It was in its infancy two years ago, and I don't know where it is now. It works by predicting someone's face based on natural features and what it knows about the trends of our faces.
@sophieszobonya3175 5 years ago
@@antoniocarniero5138 Or it works from pictures where one isn't wearing a mask. Yeah. Since then I've read about that and... we're done for, hehe.
@UnknownGunslinger 5 years ago
Oh Lou, you never fail to depress me. I hope this channel grows to the extent it deserves. Casey should give you guys a shout-out! You’ve done an amazing job!
@Q_QQ_Q 5 years ago
But he is telling the truth.
@ActuallyCirce 5 years ago
Lou's vids just keep getting better and I want more of them
@StevenFarnell 5 years ago
The Washington State Senate just passed a bill that, if it passes in the House, would limit government use of facial recognition software by requiring a warrant or a reasonable belief of immediate harm. The actual language in the bill might be different, but it helps define a narrow range of situations where this can be used and explicitly bans its use during public protests. They also set limitations on advertising that uses biometric data, and GDPR-like terms for the storage of personal data.
@Jarvik1234 5 years ago
Ha... this reminds me so much of "Person of Interest"... what an underrated show!
@brummii 5 years ago
It became too formulaic for me after a few seasons.
@Jarvik1234 5 years ago
@@brummii I guess it does get a little formulaic, but if you set that aside, it did an amazing job highlighting the complex themes and philosophy of AI and surveillance with a solid story.
@ir0n2541 5 years ago
In the EU, the GDPR is very strict on storing and processing Biometric info; this includes FRT data.
@theblinkstykrab3106 5 years ago
Hey maybe the GDPR is actually good for something
@Walzounet 5 years ago
GDPR does not apply to governments...
@TheBoagboy 5 years ago
It is way too OP, it needs to be nerfed tbh.
@adamkarlb6329 5 years ago
TheBoagboy How can it be nerfed in fair play?
@wwbaker3 5 years ago
The tech industry isn't just based in the US. There are plenty of highly skilled and capable programmers who aren't white working abroad, in, say, India, Japan, and Korea. Even Silicon Valley has a disproportionate number of East Asian and Indian tech workers working to solve some of these FRT biases. Remember the Nikon camera that was supposedly "racist" because it couldn't detect whether certain Asians blinked during an image capture due to the shape of their eyes? Well, Nikon is a Japanese company that employed Asian programmers but still ran into that problem.
@Q_QQ_Q 5 years ago
As technology advances, it will overcome this. He gave Indian examples of orphan children, and Indians are not white.
@Q_QQ_Q 5 years ago
Btw, Silicon Valley is 84% white. It's just fake propaganda that Asians and Indians are doing everything in Silicon Valley.
@EdLrandom 5 years ago
It's funny how the tech industry often advertises that if you have nothing to hide you have nothing to fear, but at the same time they often build whole businesses on closed-source software and hardware.
@minktronics 5 years ago
I mean, closed-source development is (generally) about protecting your work from other businesses.
@8draco8 5 years ago
As a tech guy I just want to explain what happened to Joy Buolamwini, mentioned at 7:27. Her computer and software were not racist, as is implied. For facial recognition she used a normal laptop camera without depth sensors. With only a flat image, the software has to recognize all the details of the face (placement of the eyes, shape and placement of the nose, lips, ears, etc.). Without depth sensors, the software relies on shadows and how light behaves on the surface of the face, and it just happens that shadows and highlights are less distinguishable on dark skin. So yeah, it's not a racist algorithm; it's a property of her skin tone. That's why, in order to make Face ID work on iPhones, Apple had to put so many sensors on the front of the phone: they use IR sensors to scan the exact shape of the face rather than only guessing it from the picture.
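As a toy illustration of the depth-sensor point (pure NumPy; the hemisphere geometry and the albedo numbers are invented for the example, not Apple's actual Face ID pipeline): under a simple Lambertian shading model the 2-D image a camera sees scales with skin reflectance, while a depth map of the same surface doesn't change at all, which is roughly why IR depth sensing sidesteps the contrast problem.

```python
# Toy model: image brightness = albedo * (surface normal . light direction),
# so darker skin (lower albedo) lowers 2-D image contrast, but the geometry
# a depth sensor measures is unaffected.
import numpy as np

yy, xx = np.mgrid[-1:1:200j, -1:1:200j]
depth = np.sqrt(np.clip(1.0 - xx**2 - yy**2, 0.0, None))  # hemisphere "face"
n_dot_l = depth  # frontal light: n.l equals the normal's z-component on a unit sphere

for albedo in (0.8, 0.3):  # lighter vs. darker skin reflectance (made-up values)
    image = albedo * n_dot_l
    print(f"albedo {albedo}: 2-D image contrast (std) = {image.std():.3f}, "
          f"depth-map contrast (std) = {depth.std():.3f}")
```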
@africanotomotiv 5 years ago
@loufoglia you are the king! This was YUGELY enlightening. Thank you.
@marcelloascani 5 years ago
Great work!
@3rkid 5 years ago
Yeah I don't trust cops with this tech at all.
@ZFlyingVLover 5 years ago
Cops have been more accountable and law-abiding since they started wearing body cams on a day-to-day basis. A lot of fake reports of racism have been debunked too, when the accusers didn't realize a body cam was recording the incident. Personally, I despise cops because it seems that a lot of them are power-hungry nobodies, but I feel for the honest law enforcement officials and I want them to be protected and able to do a better job too. Too many liars in the world, especially in the U.S., because people love to file lawsuits against other people or organizations with means. It's not about what's true anymore but rather how much money the target has available to settle. smh.
@grezz247 5 years ago
That was absolutely perfect: Well paced, well informed, and balanced. Thank you.
@tiannajohnson1752 5 years ago
Why did this video feel longer than it really was?
@wsaj41221 4 years ago
Glad I discovered this channel! This was insightful coverage of FRT :)
@blokehund 5 years ago
I don’t think I’ve ever watched a video with no views until today
@jokubas3391 5 years ago
Great
@TheMichpoulin 5 years ago
A well put together and well delivered argument, sir. Well done.
@SusanDianeHowell 5 years ago
“And now behold, I ask of you, my brethren of the church, have ye spiritually been born of God? Have ye received his image in your countenances? Have ye experienced this mighty change in your hearts? Do ye exercise faith in the redemption of him who created you? Do you look forward with an eye of faith, and view this mortal body raised in immortality, and this corruption raised in incorruption, to stand before God to be judged according to the deeds which have been done in the mortal body? I say unto you, can you imagine to yourselves that ye hear the voice of the Lord, saying unto you, in that day: Come unto me ye blessed, for behold, your works have been the works of righteousness upon the face of the earth? Or do ye imagine to yourselves that ye can lie unto the Lord in that day, and say-Lord, our works have been righteous works upon the face of the earth-and that he will save you? Or otherwise, can ye imagine yourselves brought before the tribunal of God with your souls filled with guilt and remorse, having a remembrance of all your guilt, yea, a perfect remembrance of all your wickedness, yea, a remembrance that ye have set at defiance the commandments of God? I say unto you, can ye look up to God at that day with a pure heart and clean hands? I say unto you, can you look up, having the image of God engraven upon your countenances?” - Alma 5:14-19, The Book of Mormon
@nataleo9508 5 years ago
This channel takes concepts and issues that are hard to grasp from the usual emotion-led, biased reporting and finally makes them clear and easy to understand, with plenty of opposing viewpoints to navigate between! I've been binging Beme's videos all week lmao, keep up the great work.
@DunnickFayuro 5 years ago
Let's not forget about gait recognition technologies too. A face is "somewhat" easy to hide/dissimulate. Gait is a bit trickier.
@ZFlyingVLover 5 years ago
You must've seen that on FBI or NCIS: New Orleans recently, lmao. Yes, that could work well when the target's face can't be seen.
@pwnjitsu 5 years ago
I am so fart, i am so fart F R T I mean F A r t.
@barackobama6067 5 years ago
The government and some corporations have the ability to read your mind in real time. They've had it for decades. People know this but don't care; literally everyone who knows about it stays quiet. Wonder why? I mean, you reading this already know why, but you act like you don't, because they, like you, fear that they'll get their minds read. Then thought crimes become real and everyone around you becomes the thought police.
@derrickwillis171 5 years ago
Yep it's so powerful it punched me in the mouth.
@tOmzz4video 5 years ago
Fantastic report, as always!
@davidgoodwin4148 5 years ago
The answer is to be a Juggalo all day, every day.
@r6k8n99 5 years ago
This is the best news show.
@alexandarmakxmov 5 years ago
When Lou talks, I listen, simple as that...
@beatrizmedeirosnoleto9391 5 years ago
There is a problem you didn't mention. If the prosecution is allowed to use only images to prove guilt, then there is nothing to stop innocent people from being convicted with deepfake videos, which are indistinguishable from real ones.
@kwii22789 5 years ago
IS FACIAL RECOGNITION TECHNOLOGY ASSUMING MY GENDER?? **TRIGGERED**
@APVHD 5 years ago
Another great video
@darkangle2000now 5 years ago
Woah, you guys are still alive! Good one!
@Gilotopia 5 years ago
My main field of research is artificial intelligence, and there are a few misconceptions about how facial recognition works. You almost touched upon them at minute 5, but it goes a bit deeper than that. I see these kinds of errors around the media, especially when it comes to the Chinese systems. I know you guys are a bit more interested in being right about the tech than regular outlets, so I'd be glad to explain how facial recognition does and doesn't work. What's concerning is that these misunderstandings about how FRT works may even lead to ineffective regulation. Facial recognition is usually only the spear tip of these tracking systems, but everyone is focusing on that while ignoring all the other tracking going on behind the scenes. If you're interested in my explanation of how things work, let me know.
@maximel7568 5 years ago
YESS
@MrAdhs11 5 years ago
Very good channel, good for you.
@DanielZorroF 5 years ago
Beme should do a collaboration with Mozilla's IRL
@Q_QQ_Q 5 years ago
What's that?
@kurtn17 5 years ago
These videos are always great.
@blackkissi 5 years ago
I was waiting for the phrase "shit is going down in..."
@citizen4843 5 years ago
It's good there are so few cameras. Now I can steal your laptop while you're in the bathroom.
@drunkcat1713 5 years ago
So it's in India too? .....shiEt
@sunnyhaladker 5 years ago
Yeah, I was like dammnn
@Q_QQ_Q 5 years ago
@Sunny Haladker Mate, in India the BJP party is running on data science and AI. It built a Rs 1,200 crore head office in New Delhi. Go look at it.
@xWood4000 5 years ago
Why does the NSA seem to think that wholesale internet surveillance is the way to go? It's common sense not to do that, because not everyone is a suspect. The NSA's servers cost a lot too.
@death-disco 5 years ago
FRT... I couldn’t unhear “fart”
@MyAheer 5 years ago
liked before even watching.
@yonatanofek4424 5 years ago
+BEME news Hey BEME, I sent my DNA sample (cheek swab) to what eventually turned out to be a scam. Any content planned (or already out) about this sort of risk?
@lateblossom 7 months ago
Late to the party, but if you read this, look up the TV show Person of Interest. Amazing show, deals with all of this.
@XavierZara 5 years ago
Finding missing people with this technology is not an excuse to use it. Some people just don't want to be found and that should be okay
@olegpetelevitch4443 5 years ago
SPOT ON MATE!
@izzywtheflix 5 years ago
Hey Lou here’s the thing
@HeroGambit 5 years ago
They have to implement GDPR for people from Europe :))))) in stores or anywhere they're going to use FRT.
@CybershamanX 5 years ago
I have predicted that in the future people, likely those pesky and rebellious kids, will figure out how to use makeup to screw with facial recognition software. I'm talking crazy designs on the face which will attempt to confuse the technology. Mark my words. Keep your eyes peeled for crazy face patterns becoming fashionable. ;)
@CybershamanX 5 years ago
I can then easily see police starting to harass kids with such makeup patterns. ;)
@leviroberts1884 5 years ago
Machine learning algorithms don't adopt the biases of the computer programmers; they adopt the biases in the training datasets. If your facial recognition training set is disproportionately white, then the learned algorithm will disproportionately favor higher performance on white faces, which is what you'd expect if the dataset was collected in America (where white people are the majority population). But that's not to say that racism can't play a role. If we were to train an algorithm to identify known felons and most felons happen to be black (due to generations of racist policies), then you could expect the false-positive rate for black males to be much higher than the false-positive rate for white males. From this perspective, as we uncover these issues of bias in our learned models, we're really just proving the existence of a bias in the world that our dataset came from. This is the biggest problem with machine learning today. Our world is filled with biases, and it's very difficult to identify and condition on all of them. If we're not careful in both adopting and designing these technologies, we could end up inadvertently making these problems much worse.
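To make the training-data point concrete, here's a minimal sketch (assuming NumPy and scikit-learn are installed; the 90/10 split, the feature shift between groups, and the logistic-regression classifier are made-up stand-ins, not any real facial recognition pipeline) in which a model fit mostly on one group ends up with a higher error rate on the under-represented group:

```python
# Toy demo of dataset imbalance: one classifier, two groups with slightly
# different feature distributions, 90% of the training data from group A.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n_per_class, shift):
    # Two classes per group; `shift` moves that group's decision boundary.
    X0 = rng.normal(loc=0.0 + shift, scale=1.0, size=(n_per_class, 2))
    X1 = rng.normal(loc=2.0 + shift, scale=1.0, size=(n_per_class, 2))
    return np.vstack([X0, X1]), np.array([0] * n_per_class + [1] * n_per_class)

Xa, ya = make_group(900, shift=0.0)   # group A: heavily over-represented
Xb, yb = make_group(100, shift=1.5)   # group B: its boundary sits elsewhere
clf = LogisticRegression(max_iter=1000).fit(np.vstack([Xa, Xb]),
                                            np.concatenate([ya, yb]))

# Balanced test sets expose the gap the skewed training data created.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    Xt, yt = make_group(1000, shift)
    print(f"{name} error rate: {1 - clf.score(Xt, yt):.3f}")
```

Re-balancing or re-weighting the training data typically narrows that gap, which is the sense in which the fix lives in the data rather than in anyone's intent.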
@itscharliangel 5 years ago
Hey it's Lou
@fpbrazz 5 years ago
I'm curious why facial recognition doesn't work well on people of color. The video suggests developer bias as a possible explanation, but knowing a bit about how the technology works, I believe it might be due to the way face recognition works. These algorithms rely heavily on points of high contrast in the person's face: dark eyebrows on white skin are easily distinguishable, while on darker skin they are a challenge. Although the statistics on the predominance of a certain type of person in the software engineering world could explain the issue, we should be careful about jumping to conclusions...
@S2Tubes 5 years ago
Two reasons: first, the data that has been fed into them is mostly white people; second, as you said, the color range. The darker the skin, the less contrast. It has nothing to do with bias or the people "teaching" the AI; it is 100% data. If dark-skinned people fed it the same data, it would come back with the same results.
@LaMarqueLP 5 years ago
1984
@nielswitberg 5 years ago
9/10 times the innovators/tech department just make some shit. It is always up to the customer to figure out how to use it.
@theopuszkar 5 years ago
But that's exactly what's problematic about this technology: who will decide if and how it will be implemented? Oftentimes the customer doesn't get to decide, don't you think?
@ToriKo_ 5 years ago
Good video
@bettytureaud 5 years ago
Try mixing facial recognition with 5G and digital price labels.
@OVXX666 5 years ago
just wear an edgy mask and call it fashion
@gnothseed8135 5 years ago
Hey Lou! What's up?
@baconninja4481 4 years ago
We all know how George Orwell warned us about this. Are we sure we want Big Brother to become reality?
@yux.tn.3641 5 years ago
Though it exists in China, it's really only in the major cities, and even then it's mostly limited to the important parts of the city... The West goes on about the social credit system in China, but in truth it's not even centralized, and it only works if you use Alipay.
@Arab_Jew 5 years ago
Why are you asking these questions?
@fluxnfiction5559 5 years ago
Lincon Dug. 2010 TOC, lol, but that was a DNA database, not a face database.
@robertfalbe3054 5 years ago
I'm Robert Reed Falbe III and I am being harassed in Lake Forest, California. Please help. I was born in Centralia, Ill. 45 years of age.
@froozynoobfan 5 years ago
AI is not made racist or sexist by design, but if your training dataset consists of a mostly white and/or male population, it will become biased. Privacy is very important and data is the new oil. People assume large organizations can't do much with your data, but that is not true: organizations like Amazon and Google aim to get as many sales and ad views as possible, so they (ab)use your data to make you buy more without you even knowing.
@thijmenstar7832 5 years ago
This show is underrated, Lou and his team are doing a great job.
@donhalley5622 5 years ago
Here's my comment: I'm giving this one a 5 out of 10. All over the place. It works too well. It doesn't work. It's prejudiced against minorities. It doesn't work well on minorities. If you want to give me legitimate reasons to want it regulated, you're going to have to cite some different sources than the ACLU, San Francisco, or EFF. Find missing children, terrorists, rapists, bank robbers, etc.? I'm OK with that. Find out what I like, where I go, and who I hang out with? Guess what - I'm OK with that as well. It facilitates false arrests - and MURDER? (Oh yeah, no bias there.) Couldn't it more often prevent a false arrest or help reduce the tension in a police confrontation? You want your privacy? Guess what - you're too late.
@RoloTomase 5 years ago
Well, the same arguments he makes against the cameras and software apply very much to red flag laws. It's a slippery slope; once they go down those roads there's a real danger of misuse.
@ConorDrew 5 years ago
"If we build it, they will surveil, so we won't build it": I feel that's wrong. If you are the person who can see the pitfalls and the dangers, you should build it; if not, someone else with fewer morals will come and build it.
@LordOfDays 5 years ago
The blurriness at the beginning makes this an Oscar worthy film.
@TylerVanAllen 5 years ago
Go Cuse
@patrik5123 5 years ago
10:10 It's interesting that a country like the US is behind certain countries in Africa when it comes to discrimination...
@jobbvrolijk 5 years ago
Don’t say FRT too quick! 💨
@MrGERiarza 5 years ago
If the accuracy is high enough, it's kind of stupid for the police not to use it on their live cams. How many times have people been stopped by the police and let go, and years later it turns out the police could have caught the perpetrator all along?
@NicholsMax 5 years ago
Couldn't get past FRT being close to FART, ha.
@gamer620496 5 years ago
Lou!
@fredcastaneda3267 5 years ago
Guatemala flag!!
@drunkcat1713 5 years ago
Hey, it's Lou and some wild shit is going on. FaRT
@Xafpunk 5 years ago
Thumbs up if you kept thinking fart
@khalidmohamud845 5 years ago
Unbox Therapy?
@phynx2006 5 years ago
Hey Lou .... we see you, hahaha
@doodlexenosinfopinion6107 5 years ago
Facial racial, not recognition. This is not creative or innovative.
@ParadoxdesignsOrg 5 years ago
F.A.R.T
@octaviano7360 5 years ago
There is an article in the New Yorker about the man who never forgets a face. They explained that FRT is a sham in comparison to people with that special gift...
@DanNguyen-fu9hn 5 years ago
Fart... heh... Frt...
@repker 5 years ago
I feel like people who claim algorithms are racist or whatever have never seen/implemented/used an algo. It's like saying a car that mowed down a bunch of minorities is racist. It's the input, the driver, the human, that is, not the machinery or the math.
@ab-eu9po 5 years ago
F.(A).R.T
@kennefvi 5 years ago
Lou, it's "facial automatic recognition technology".
@MrFlexNC 5 years ago
We invented fire and came out just fine; we'll survive some code.
@theblinkstykrab3106 5 years ago
Bad comparison
@MrFlexNC 5 years ago
@@theblinkstykrab3106 That's the point.
@Q_QQ_Q 5 years ago
This is worse.
@Q_QQ_Q 5 years ago
Btw, there are people who still worship fire in fire temples.
@sani9238 5 years ago
FRT stinks
@Kakazumba99 5 years ago
You have to be joking with the AI bias because of white men. I'm pretty sure you knew a bit about AI; did you drink some mad juice before this video?