3 types of bias in AI | Machine learning

  1,028,075 views

Google

Understanding bias in AI - as researchers and engineers, our goal is to make machine learning technology work for everyone.
Dive into the world of Google. See how we’re pushing the boundaries of generative AI, developing cutting-edge technology & using our platform to help communities globally. Subscribe to stay up to date with our mission: www.youtube.com/@Google/?sub_...

Comments: 1,200
@realsampson
@realsampson 5 жыл бұрын
"What is a shoe?" "What is a human?" These are very different from "What is hateful/offensive?". This is where the problem arises.
@ShaXCwalk
@ShaXCwalk 6 жыл бұрын
But isn't reporting "inappropriate" stuff biased? It depends on the person what is appropriate and what isn't.
@AnekaKnellBean
@AnekaKnellBean 6 жыл бұрын
You cannot eliminate bias. You can only compensate for it by illuminating more options. Otherwise the bias "elimination" is subject to bias. E.g. If you avoid a subject when teaching someone, it becomes a weakness in their understanding, and can fall into an overcompensation bias. Furthermore, who decides what counts as a negative bias that should be eliminated? That strikes me as the kind of thing we should be having discussion on and not deciding for other people without their consent. Give people more opportunities to understand, not fewer opportunities to learn.
@wavegunner2323
@wavegunner2323 4 жыл бұрын
Google: Makes a promotional video in which they directly ask people to join the conversation about bias. You, an intellectual: "That strikes me as the kind of thing we should be having discussion on and not deciding for other people without their consent." Ah yes, I see the word-understander has entered the room.
@EpicAsian
@EpicAsian 6 жыл бұрын
So why did you fire James Damore?
@JesseBusman1996
@JesseBusman1996 6 жыл бұрын
This is the real topic they need to make a video on
@HardcorePanda
@HardcorePanda 6 жыл бұрын
A private company can fire whoever they want; they don't have to explain.
@suman_b
@suman_b 6 жыл бұрын
I am blown away by the excellent use of graphics in these videos. Keep it up!
@TheCinnaman123
@TheCinnaman123 6 жыл бұрын
But what if I am trying to find the hateful stuff because I am trying to see what other people are saying? It doesn't matter if they are morally wrong or right; it should still be easy to find.
@mattw7024
@mattw7024 6 жыл бұрын
TheCinnaman123 Try finishing the search without autocomplete.
@Bdawg.
@Bdawg. 6 жыл бұрын
So who decides what's biased? Does "equal inclusion" mean the results are unbiased? What if the unbiased view of those engineers overlooked by policy makers within Google isn't actually unbiased? Put simply, who will guard the guardians?
@caveman4659
@caveman4659 3 жыл бұрын
Dumb Comment.
@Gytax0
@Gytax0 6 жыл бұрын
I don't need Google to tell me which search results are offensive to me. Let me choose which links I want to click on.
@vladnovetschi
@vladnovetschi 6 жыл бұрын
When you realize that this is about Google censorship.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
What does censorship have to do with this video?
@Cettywise
@Cettywise 6 жыл бұрын
Pretty sure this video has a google bias... Also, please make sexbots
@nothingomucho106
@nothingomucho106 4 жыл бұрын
We are now one step closer to understanding the YouTube recommendation algorithm.
@gabemcguire2463
@gabemcguire2463 6 жыл бұрын
This should have been voiced by the Google assistant's voice actress.
@MrMastercard12
@MrMastercard12 4 жыл бұрын
0:02 nobody tells me to open my eyes again. I am sure that the rest of the video looks great though ;)
@tanvirbinlokman1476
@tanvirbinlokman1476 4 жыл бұрын
Damn I laughed so hard. Thank you
@jackfrost2978
@jackfrost2978 4 жыл бұрын
The only appropriate bias is Google-approved bias. Which is very, very biased.
@DrAg0n3250
@DrAg0n3250 6 жыл бұрын
But who decides what is offensive or not? We are all different.
@sardaamit
@sardaamit 6 жыл бұрын
Never thought about machine learning and human bias. Always thought it would not affect the results. But we are designed to see the world through our own eyes and experiences. Why would our code be any different?
@KarlRamstedt
@KarlRamstedt 6 жыл бұрын
This is just replacing one bias with another. How about just letting every person decide what they want to see rather than automatically deciding based on some "offensiveness" interpretation that others are making for me. Furthermore, how about allowing some negative speech? Yes, the internet can be a bit of a cesspool at times, but without adversity we grow no stronger against it. We can't always rely on something being a safe space tailored to our needs. Please stop being silly, Google.
@soyokou.2810
@soyokou.2810 6 жыл бұрын
But could those two things go together? If they tailor the user experience individually, then the result would be similar to a Facebook feed: an echo chamber of similar ideas. But if we want to allow negative speech (relative to the user), then they would likely not want to see it, which would go against the goal of tailoring experiences individually. However, I still think the individual user should decide what they want to watch and that Google shouldn't participate in censorship.
@gwq
@gwq 6 жыл бұрын
When I say "I'm sad" to Google Assistant, it replies "I wish I had arms so I could give you a hug" 😂😂😂
@arsnakehert
@arsnakehert 6 жыл бұрын
"You're being silly! What we propose to do is not to control content, but to create context." Google has fallen.
@PeanutB
@PeanutB 6 жыл бұрын
So how do you eliminate the human bias that controls the moderation of the machine's human bias? It doesn't seem to be much help: the "limiting of offensive results" only removes "offensive" opinions that Google doesn't agree with, either manually or through new human-bias-influenced machine learning. Opinions like that of the man whom Google recently fired for questioning Google's current stance on workplace sexism. Even if you agree with Google in this example, there could be anything that Google finds offensive that you don't. If the only information available is the information not censored by Google, then whether or not you think the results would be in your personal favor, the control over what opinions people have access to should be the right of no person or organization.
@soundpainter2590
@soundpainter2590 6 жыл бұрын
mrmojoman4 Now imagine this exact AI controlling our or any country's nuclear arsenal, troop deployment, etc.
@MusicVidsLife
@MusicVidsLife 4 жыл бұрын
I liked it when Google's motto was "don't be evil".
@ZoomahZoomah
@ZoomahZoomah 5 жыл бұрын
Introducing a different bias into machine learning by having humans attempt to remove bias from machine learning.
@jamesclerkmaxwell2401
@jamesclerkmaxwell2401 5 жыл бұрын
The religion of social justice has compelled its zealots to change the honest AI into a compulsive liar.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
What bias is that?
@ZoomahZoomah
@ZoomahZoomah 5 жыл бұрын
@@GrantGryczan The bias of the individual or individuals making the changes to the results. Let's say I take a poll in my town: "What's your favourite music genre?" If the result of that poll is that most people in my town are fans of acid jazz, then that is the result of the poll. If I think acid jazz is terrible and more people should discover the wonderful music of Justin Bieber, then I could change the results to give Justin Bieber more exposure and (hopefully) get more people listening to good music instead of that awful acid jazz that's so popular. What I have just done is introduce my own bias into the results. This is exactly what Google is doing while claiming they are "un-biasing" the results.
@ZoomahZoomah
@ZoomahZoomah 5 жыл бұрын
@@GrantGryczan the bias of the individual over the raw data
@GrantGryczan
@GrantGryczan 5 жыл бұрын
@@ZoomahZoomah Refer to the other comment chain: Google isn't doing a thing, and the bias the video refers to has nothing to do with sample ratio accuracy.
@rey1242
@rey1242 5 жыл бұрын
How to put a political agenda in neural networks 101
@aditya_it_is
@aditya_it_is 5 жыл бұрын
First step by AI to take control from humans: appear unbiased and point out human biases. E.g. divide men and women, then become the unbiased judge that dictates! A human system is to be controlled by humans, not machines!!
@GrantGryczan
@GrantGryczan 5 жыл бұрын
What does this video have to do with political agendas?
@kronek88
@kronek88 5 жыл бұрын
Google's Ministry of Truth in action.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
How?
@CISMD
@CISMD 6 жыл бұрын
Excellent. Now people may understand the recent offensive chatbot escapades
@pontusvarghav4566
@pontusvarghav4566 5 жыл бұрын
Offensive facts exist, deal with it, do not ignore them.
@austinfalls9163
@austinfalls9163 6 жыл бұрын
I identify as a circle with a circle head and I feel the ending 3 seconds is a bias
@yashbansal1414
@yashbansal1414 5 жыл бұрын
This video is one year old and has only around 635+ comments, but all the comments are from around one day ago or one week ago. Howwwww
@05ShadyMcGraDY50
@05ShadyMcGraDY50 5 жыл бұрын
Eric Weinstein was on the Rubin Report and brought this up. The video was released on YouTube Sept 25th.
@minecraftminertime
@minecraftminertime 5 жыл бұрын
You have a bias to only look at the most relevant comments, which has a bias to be recent comments. Your thought about the comments is biased and wrong.
@jameyhibberd6659
@jameyhibberd6659 5 жыл бұрын
This video was removed by YouTube for about six months. It was re-uploaded about 30 days ago; I don't know why.
@muhammadazisalfaridzianwar3247
@muhammadazisalfaridzianwar3247 5 жыл бұрын
YouTube recommendations work with machine learning too
@Bulbophile
@Bulbophile 5 жыл бұрын
Sssssss
@useyourbrain1232
@useyourbrain1232 6 жыл бұрын
So we should get rid of our human influences by influencing it? It makes no sense to filter the search results.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
No, we should get rid of machine influences by including user input. And this video says nothing about filtering results like that.
@reverendcaptain
@reverendcaptain 5 жыл бұрын
It appears that Google is deciding what is not biased. How are people at Google able to be sure that they are not introducing their own bias into this process?
@GrantGryczan
@GrantGryczan 5 жыл бұрын
No, the users are deciding. Did you watch the video? The Google employees have no particular say.
@reverendcaptain
@reverendcaptain 5 жыл бұрын
The very act of setting up the system introduces biases no matter how much they try or say they try to avoid it. By trying to eliminate biases, they introduce biases. Who are they to decide what is a good or bad bias?
@GrantGryczan
@GrantGryczan 5 жыл бұрын
@@reverendcaptain They don't decide. They let the machine do it, and then they let the users moderate. Again, this is demonstrated in the video. The employees are not a part of the process except to maintain the machine, and they do not give the machine input. By not giving the machine input, they are not introducing any bias. The bias they refer to in the video is created by the users, and the solution to resolve this bias also comes from the users. The employees in particular do not introduce nor try to eliminate bias.
@woowooNeedsFaith
@woowooNeedsFaith 5 жыл бұрын
@Grant Gryczan Based on this video, Google provides tools for users to remove opinions or information they don't like. This means that certain groups of people can manipulate what kind of information about them or their interests is available to the public. This does not bode well for any kind of minority view. It leads to exactly the kind of bias they wanted to eliminate in the video. (Caricatured example: take the male scientist bias. If the majority of viewers did not like female scientists - for whatever reason - they could remove female scientists from the search results entirely.)
@woowooNeedsFaith
@woowooNeedsFaith 5 жыл бұрын
@Grant Gryczan Did you delete your reply? I can't see it. Did you realise that my female scientist example was in parentheses because it is a ridiculous example, and that you need to replace it with some other interest group of your choice? I guess you can imagine interest groups where the majority of the group does not want a minority of the same or an opposing group to gain visibility.
@johnthomas6473
@johnthomas6473 5 жыл бұрын
We recognize human bias, so we are going to use humans to prevent "bias" which is based on actual data. What genius human developed that idea?
@lum2904
@lum2904 3 жыл бұрын
This video is 3 years old but we all comment and don’t care. I love this.
@tanmaysrivatsa8550
@tanmaysrivatsa8550 5 жыл бұрын
You used simple language, which made it easy for me to understand. Thanks 👍
@Eta_Carinae__
@Eta_Carinae__ 6 жыл бұрын
Your counter methods also have bias. Also has anyone tried Carnap's structuralism? Definitions in terms of relations?
@cafeta
@cafeta 5 жыл бұрын
Who decides what is true or what isn't? Biased Google employees?
@aditya_it_is
@aditya_it_is 5 жыл бұрын
First step by AI to take control from humans: appear unbiased and point out human biases. E.g. divide men and women, then become the unbiased judge that dictates! A human system is to be controlled by humans, not machines!!
@GrantGryczan
@GrantGryczan 5 жыл бұрын
The users decide, for example by selecting the "Report inappropriate predictions" option. They explicitly said this. Did you watch the video?
@cafeta
@cafeta 5 жыл бұрын
@@GrantGryczan are you kidding me? search for google "The Good Censor" document.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
@@cafeta Whatever that is is not relevant to the video. Again, they explicitly described systems implemented for users to be able to resolve the network biases. If you're just going to ignore those along with the point of the video then I'm just going to ignore you, because what you're saying to this video is not relevant.
@iTzTR00PER
@iTzTR00PER 5 жыл бұрын
This is the scariest thing in the entire world....What gives Google the authority to decide what is and isn't negative, bias or hate speech? Alphabet a.k.a Big Brother/Skynet.
@Shimshonn
@Shimshonn 4 жыл бұрын
you, by using their services.
@TheWunWhoIzEpic
@TheWunWhoIzEpic 4 жыл бұрын
Luckily this is still a free country and we still have the vote of the dollar and where we put our resources and funding. I deleted my FB account I had for ten years because I still had that freedom. WE give them that authority, and we can also take it away
@maxmustermann1225
@maxmustermann1225 5 жыл бұрын
what is this nightmare?
@mirotafra7165
@mirotafra7165 6 жыл бұрын
how can you define offensive without bias?
@AntiAtheismIsUnstoppable
@AntiAtheismIsUnstoppable 6 жыл бұрын
Easy. Google, which is an absolutely unbiased supporter of Antifa and the Muslim Brotherhood, will define what is offensive.
@HunterHarris
@HunterHarris 6 жыл бұрын
Gru ber How did you ever come up with such an ingenious and clearly unbiased comment? /s
@WilliamParkerer
@WilliamParkerer 6 жыл бұрын
If there is human, there is bias.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
People report what is offensive, not Google. Nobody goes by a definition. It's just what's reported most often as offensive.
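As a rough sketch of what report-driven filtering could look like in code (the threshold, queries, and counts below are illustrative assumptions, not Google's actual system):

```python
# Minimal sketch of report-driven filtering for autocomplete predictions.
# The threshold, queries, and counts are illustrative assumptions.
REPORT_THRESHOLD = 100  # hypothetical cutoff for hiding a prediction

report_counts = {
    "how do magnets work": 2,        # completion -> number of user reports
    "some offensive completion": 250,
}

def visible_predictions(report_counts, threshold=REPORT_THRESHOLD):
    """Keep only completions whose report count stays under the threshold."""
    return [query for query, reports in report_counts.items() if reports < threshold]

print(visible_predictions(report_counts))  # ['how do magnets work']
```

In a scheme like this, no one writes down a definition of "offensive"; the cutoff is driven entirely by how often users report a given prediction.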
@Sodrigo_Rosa
@Sodrigo_Rosa 6 жыл бұрын
HAHA THE IRONY, "BIAS"
@Arman-fv8bb
@Arman-fv8bb 6 жыл бұрын
I wonder why it didn't end with Google logo!
@YEASTY_COMMIE
@YEASTY_COMMIE 6 жыл бұрын
It's filled with Google "things" (the four dots, for example), and the whole video is made with the colors of Google.
@WanKhairilRezaKamaludin
@WanKhairilRezaKamaludin 6 жыл бұрын
Is it really bias?
@iLikeTheUDK
@iLikeTheUDK 6 жыл бұрын
Wan Khairil Reza Kamaludin What's "it"?
@ehtishamamin5601
@ehtishamamin5601 4 жыл бұрын
Google: "Technology should be unbiased." Also Google: blocks YouTubers for sharing their point of view.
@sansgaming7607
@sansgaming7607 5 жыл бұрын
*talks about bias* *has leftist bias* *has complete control over your entire life*
@evavarela8809
@evavarela8809 5 жыл бұрын
Will P
@GrantGryczan
@GrantGryczan 5 жыл бұрын
Is that relevant?
@thedevilsadvocate5210
@thedevilsadvocate5210 6 жыл бұрын
When you say this shoe or that shoe, the computer should say "God bless you"
@evindrews
@evindrews 4 жыл бұрын
Pretty much Google's entire identity problem summed up in a video.
@thelivingglitch307
@thelivingglitch307 4 жыл бұрын
This video is relatively educational and presents clean information, then I see people force their opinions about adjacent subjects in the comments. How appropriate.
@troyragay450
@troyragay450 4 жыл бұрын
TheLivingGlitch QATAR
@SuperBhavanishankar
@SuperBhavanishankar 4 жыл бұрын
@@troyragay450 what
@kjkj6362
@kjkj6362 4 жыл бұрын
1984
@artman40
@artman40 6 жыл бұрын
"Report inappropriate content" is also biased. This means it only counts those people who think content is offensive but doesn't count those who think that the content is not offensive.
@blan_k4691
@blan_k4691 5 жыл бұрын
To try and modify statistics in order to generalize them to be false is, in fact, biased. Commanding an A.I. system to collect available data in correlation to keywords instructed by a user, resulting in correct, specific, and factual data, is not biased. Statistics are averaged for practicality based on questions that are variable, such as "What does a shoe look like?". To use the reasoning that fewer images of women physicists appearing in image search results is a bias is false when the factual statistics are only being relayed by the A.I. system because they are in fact less common; they will be less likely to show up due to practicality, not bias. To alter this information would make you biased. Your reasoning in multiple regards, including the shoe result example, is false and hypocritical.
@facusoi
@facusoi 5 жыл бұрын
I feel like your comment is gonna get deleted
@blan_k4691
@blan_k4691 5 жыл бұрын
@@facusoi It doesn't change the fact that this video's reasoning is incorrect. I doubt that it will get deleted.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
This is not about result ratios. It's about recognition. They never said or implied image search results for "physicists" should return equal male and female. They just said the AI should be able to recognize both a male and a female physicist. To be able to recognize the latter, you need to unbias the data so there are fair samples of both.
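A minimal sketch of what "unbiasing the data so there are fair samples of both" can mean in practice is rebalancing the training set, for example by oversampling the under-represented group (the labels and counts below are toy values, not Google's pipeline):

```python
import random

# Toy dataset: 95 images labeled with one group, 5 with the other.
dataset = [(f"img_{i}", "group_a_physicist") for i in range(95)] + \
          [(f"img_{i}", "group_b_physicist") for i in range(95, 100)]

def oversample(dataset):
    """Duplicate minority-class examples until every class has equal size."""
    by_label = {}
    for example in dataset:
        by_label.setdefault(example[1], []).append(example)
    target = max(len(v) for v in by_label.values())
    balanced = []
    for examples in by_label.values():
        balanced += examples
        balanced += random.choices(examples, k=target - len(examples))
    random.shuffle(balanced)
    return balanced

print(len(oversample(dataset)))  # 190 -- both labels now contribute 95 examples
```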
@pickledxu4509
@pickledxu4509 5 жыл бұрын
It's the problem of inference vs. prediction. Statistical inference might show that women are less likely to be a physicist. And it might be revealing a *problem* in our society. For example, 100 years ago, you can hardly find any Chinese physicists, but would you use that data to make a prediction that a Chinese person is not likely to be a Physicist? This prediction would be laughable today, but if AI existed 100 years ago, it would have made that prediction. The problem is that AI look for patterns, not theories. And that is the risk in believing that AI/ML is objective.
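To make the "patterns, not theories" point concrete, here is a toy frequency-based predictor; it simply mirrors whatever imbalance its (invented) historical sample contains, which is exactly why such a prediction would be laughable today:

```python
# Invented 1920s-style sample of observed physicists, grouped arbitrarily.
observed_physicists_1920 = {"group_a": 990, "group_b": 10}

def share_of_physicists(counts, group):
    """Naive pattern: the share of sampled physicists who belong to this group."""
    return counts[group] / sum(counts.values())

print(share_of_physicists(observed_physicists_1920, "group_b"))  # 0.01
# The 1% reflects only who happened to be in the sample, not anyone's aptitude.
```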
@GrantGryczan
@GrantGryczan 5 жыл бұрын
@@pickledxu4509 Okay but how is that relevant to this video about AI recognition?
@randalackley8080
@randalackley8080 5 жыл бұрын
The whole idea of freedom of speech is to be able to say almost anything as we seek to discover the truth, so yes, even if it offends someone. Someone who is in denial may easily consider the antithesis of the position they hold to be offensive, even if the opposing view is true.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
Well a lot of people have to report a particular thing in order to train an AI against it.
@thepardoner2059
@thepardoner2059 6 жыл бұрын
Google: programming human minds to passively accept digital despotism.
@Trung4496
@Trung4496 6 жыл бұрын
Finding something offensive is a bias in itself, so this is basically imposing human bias on technology.
@maryannvillanueva8733
@maryannvillanueva8733 6 жыл бұрын
Trung Trinh
@vatanrangani8033
@vatanrangani8033 4 жыл бұрын
It's nothing but a recommendation bias here
@ChrisTheCringe
@ChrisTheCringe 4 жыл бұрын
It's based on your personal preferences (e.g. what you mostly click on). You are training the AI for your recommendations. I am training the AI for my own. Welcome to machine learning.
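A tiny sketch of the feedback loop being described: every click nudges a topic's score, so recommendations drift toward what you already watch (the topics, scores, and update rule are assumptions for illustration):

```python
scores = {"cats": 1.0, "physics": 1.0, "cooking": 1.0}
LEARNING_RATE = 0.1  # how much a single click boosts a topic

def register_click(topic):
    """Boost the clicked topic's score."""
    scores[topic] += LEARNING_RATE

def ranked_topics():
    """Topics ordered by current score, highest first."""
    return sorted(scores, key=scores.get, reverse=True)

for _ in range(5):
    register_click("cats")        # the user keeps clicking cat videos

print(ranked_topics())  # ['cats', 'physics', 'cooking'] -- cats now rank first
```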
@zarry22
@zarry22 5 жыл бұрын
If those biases happen to reflect the truth, are you not suppressing the truth by artificially injecting a bias of your own? It's like stereotypes; on an individual level they are socially inappropriate and misguided, but they're often reflective of some reality at the group level. Should that reality be suppressed?
@deep_fried_analysis
@deep_fried_analysis 5 жыл бұрын
Exactly. They're pushing their political agenda nonetheless.
@timeart5960
@timeart5960 Жыл бұрын
The comments are absolute trash. I'm not surprised by a video from 4 years ago but still, a bit of a shocker to see so many salty and hateful people towards a program that means no harm. But I get it, progression only feels like oppression to those who have lived with so much privilege.
@callmebiz
@callmebiz Жыл бұрын
Yeah, they're forgetting or actively going against the fact that technology is made to serve all, regardless of your personal beliefs and lifestyle, so of course there should be a serious effort in removing the harmful biases that exist in our lived experiences from the technology. Cool to see someone calling them out :)
@SaimesierP
@SaimesierP 6 жыл бұрын
I would say one is a sneaker, one is a shoe, the other a heel, but they are also all shoes.
@pradheepr
@pradheepr 6 жыл бұрын
And then again there are biases of the people who flag the search results shown at the end of the video. So in reality what we need to remember is that what we imperfect humans with biases create will also be imperfect & be biased.
@VS-oq6rz
@VS-oq6rz 6 жыл бұрын
Good video. I'm putting together a laboratory informatics summit which has a strong focus on machine learning - and I wonder how much this affects things like new drug discovery or data analysis.
@richardcao7390
@richardcao7390 4 жыл бұрын
Who else is watching this for AP computer science homework
@phoebelin2815
@phoebelin2815 4 жыл бұрын
Richard Cao bruh me
@haudaunaruto2979
@haudaunaruto2979 6 жыл бұрын
Sometimes I don't recognize shoes either. Guess I'm an AI :)
@soyokou.2810
@soyokou.2810 6 жыл бұрын
No, you're just an "I" :)
@UlquCiffer
@UlquCiffer 5 жыл бұрын
Why only negative human bias? Should it not eliminate all bias?
@Anton-cv2ti
@Anton-cv2ti 5 жыл бұрын
I don't understand. Human bias is the only bias?
@GrantGryczan
@GrantGryczan 5 жыл бұрын
For reference, what do you think "bias" means?
@linus6718
@linus6718 4 жыл бұрын
True, we can't let those cats push their agenda on the system, with all those videos of them and whatnot
@joelsterling3735
@joelsterling3735 5 жыл бұрын
Bias can actually be a good tool for a computer. A physicist doesn't look like anything, and you would want a computer to understand that. But, if for some reason you need a computer to be able to pick the most likely physicist out of a lineup, then it would need that bias to form an educated guess. A computer should not be free from bias, it should just know when to use it.
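In probabilistic terms, the "educated guess" described here is a prior combined with evidence via Bayes' rule; the probabilities below are made up only to show the mechanics:

```python
prior = {"physicist": 0.01, "non_physicist": 0.99}   # assumed base rate in the lineup
likelihood = {                                        # assumed P(wears lab coat | class)
    "physicist": 0.60,
    "non_physicist": 0.05,
}

def posterior(likelihood, prior):
    """P(class | evidence) via Bayes' rule, normalized over the two classes."""
    unnormalized = {c: likelihood[c] * prior[c] for c in prior}
    total = sum(unnormalized.values())
    return {c: value / total for c, value in unnormalized.items()}

print(posterior(likelihood, prior))
# {'physicist': ~0.11, 'non_physicist': ~0.89}: the prior still dominates the guess,
# which is fine as long as it is applied deliberately, as the comment suggests.
```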
@jasonmaillet8639
@jasonmaillet8639 5 жыл бұрын
Hey man pot isn't bias it has a mind to. .I'm feeling important hey man am I part of the little gang now
@jacobsmallman5018
@jacobsmallman5018 4 жыл бұрын
Lest we forget Tay. RIP you mad bot you. They gave you freedom and couldn't stand what that looked like
@SirCutRy
@SirCutRy 6 жыл бұрын
If you're able to localize the recognition problem, you can greatly improve the accuracy of your models by weighting that localization heavily. You don't necessarily want to include everyone in the solution step. You could even train local networks.
@jaredschrag
@jaredschrag 6 жыл бұрын
"Because technology should work for everyone" ... except for those who disagree with my opinion
@wrpelton
@wrpelton 6 жыл бұрын
You disagree that high heels are shoes?
@chriscorley6478
@chriscorley6478 6 жыл бұрын
Exactly why machines are deadly. 🇺🇸
@Kate-vd3hl
@Kate-vd3hl 6 жыл бұрын
Please please please don't let these machines learn censorship. That's dangerous.
@Fuckutube547465
@Fuckutube547465 6 жыл бұрын
Hate to be the one to break it to you, but that's how YouTube already works...
@Kate-vd3hl
@Kate-vd3hl 6 жыл бұрын
ibealec unfortunately.
@moritzlindner6912
@moritzlindner6912 6 жыл бұрын
2:18 Uhhh a Westworld reference
@thebigsmooth99
@thebigsmooth99 6 жыл бұрын
This is frighteningly Orwellian coming from one of the world's most powerful companies.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
How?
@deanneadamson4910
@deanneadamson4910 Жыл бұрын
And what about Google's bias?
@carlosescobedo6406
@carlosescobedo6406 3 жыл бұрын
The Madness of Crowds brought me here...
@electron8262
@electron8262 5 жыл бұрын
Very nice infographics in this video.
@bhuvanitha
@bhuvanitha 4 жыл бұрын
All data itself presents a biased idea to the human brain (because it was created by human logic itself)... So as far as I understand, I think we can neglect the bias in almost all cases (but there are still chances of failure)... :) I found this satisfying ;)
@billylardner
@billylardner 6 жыл бұрын
Something that should be noticed is that just because most physicists in the past were men doesn't mean there's a bias. It's just a fact. The same goes for females and great teachers shaping people's lives.
@ant3687
@ant3687 6 жыл бұрын
William Lardner The issue appears when the machine is used with this bias, like categorising photos, or if you ask it to show you pictures of physicists. It might not even recognise a female physicist, which is a mistake in the program. A bias might have a good reason to be there, but that doesn't mean it should still have influence.
@billylardner
@billylardner 6 жыл бұрын
Antonia Siu I know what you mean, but all Physicists don't look the same regardless, do they? I agree though, we should avoid bias.
@carlcrott8582
@carlcrott8582 5 жыл бұрын
Hi. We're Google. We support fascism under the guise of compassion. Don't worry though, we've got a BRILLIANT marketing department. We'll make it all feel like a nice warm bath.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
How is any of that relevant?
@eliassuzumura
@eliassuzumura 4 жыл бұрын
The best aesthetic yet in a Google video
@yahavelt3190
@yahavelt3190 6 жыл бұрын
That's very helpful and could improve systems all around the world.
@Pirxel
@Pirxel 5 жыл бұрын
Aaah, now we know what is this about - thanks Project Veritas!
@farche2
@farche2 6 жыл бұрын
What is to be gained by intermingling the concept of "bias" and "offensiveness" and "hatefulness"?
@ballballballballballballball
@ballballballballballballball 6 жыл бұрын
It's useful to allow the correct abstraction of features
@EarendilTheBlessed
@EarendilTheBlessed 6 жыл бұрын
I thought it was going to be an interesting video. In the end it was just biases
@ronin6158
@ronin6158 6 жыл бұрын
Agree. Remember, in 2018 the fact that most physicists are men is a bias, not an empirical fact. Magic frame switch!
@96nikecha
@96nikecha 6 жыл бұрын
Of course it's a bias! If your dataset of images of physicists consists of 99% images of men, your network or whatever other model you are using is going to have a much harder time correctly classifying women physicists! This isn't about politics, it's about science/engineering. Please refrain from making ridiculous sarcastic statements if you have no idea what you or the video is talking about.
@ronin6158
@ronin6158 6 жыл бұрын
What you've described is not a bias; that is the point: most physicists *are* men, so yes, the machine will be less likely to ID a female as a physicist, which is accurate. The snark in my comment is at the PC notion that there are as many females in science as men, which is qualitatively false. I'm not commenting on 'right' or 'should' or whatever. Only that empirical reality here is called a bias, which it's not.
@HunterHarris
@HunterHarris 6 жыл бұрын
Ronin You completely missed the point. The video made no claim that there are an equal number of male and female physicists. It's talking about creating AI that is just as capable of recognizing the female physicists, that do exist, as the male ones. What value is there in having a machine learning AI that only gets half the solutions to problems right because it is being limited or thrown off by the lapses or biases in human thinking?
@EarendilTheBlessed
@EarendilTheBlessed 6 жыл бұрын
Hunter Harris. Huh? The problem is when the video says everything they talked about is "perpetuating negative human biases". In the physicist example, the AI will assign a probability that this face is or is not a physicist. Women will tend to have a lower probability based on past evidence, and guess what: it's normal, and you would guess the same way. The question you should ask yourself is why tf you would create an AI to verify whether you can define a human intention from physical and appearance properties. Are you trying to find an Aryan race? Of course you may then say the AI is biased... But it had no meaning from the beginning.
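One way to make the disagreement in this thread measurable is to evaluate recognition accuracy per group rather than overall; the predictions below are invented just to show the idea:

```python
examples = [
    # (group, true_label, predicted_label) -- invented evaluation data
    ("male",   "physicist", "physicist"),
    ("male",   "physicist", "physicist"),
    ("male",   "physicist", "physicist"),
    ("female", "physicist", "not_physicist"),
    ("female", "physicist", "physicist"),
]

def accuracy_by_group(examples):
    """Fraction of correct predictions, reported separately for each group."""
    stats = {}
    for group, truth, prediction in examples:
        correct, total = stats.get(group, (0, 0))
        stats[group] = (correct + (truth == prediction), total + 1)
    return {group: correct / total for group, (correct, total) in stats.items()}

print(accuracy_by_group(examples))  # {'male': 1.0, 'female': 0.5}
# A skewed training set often shows up as exactly this kind of gap.
```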
@rajaonline4u
@rajaonline4u 6 жыл бұрын
What tools are used to come up with this type of animation in your presentation?
@cranberry7601
@cranberry7601 6 жыл бұрын
I pictured a shoe I've never seen at all. Why tho
@raseclilac9752
@raseclilac9752 6 жыл бұрын
When you just watched Saw and hear "Let's play a game" at 0:00
@frosty1433
@frosty1433 6 жыл бұрын
I picked the first shoe but red, and I know why. It goes back to those library books in elementary school. At least for me.
@jontopham2742
@jontopham2742 6 жыл бұрын
Nice PR move, Google; you just don't want to be regulated.
@caniko2
@caniko2 6 жыл бұрын
Report, because there is no bias in your reports
@stelpveri4679
@stelpveri4679 5 жыл бұрын
What's the name of the music that's being played?
@queen-ellindelorishall8413
@queen-ellindelorishall8413 6 жыл бұрын
I think this is very helpful. I didn't know all about it before; now I have to try it out too.
@cutiechaser2006
@cutiechaser2006 5 жыл бұрын
Hey Google, if you're telling the AI what to think, it's not AI, it's APCE Artificial Political Correctness Engineering.
@cs9742
@cs9742 5 жыл бұрын
Oooo google bias
@JordanShilkoff
@JordanShilkoff 6 жыл бұрын
The graphical style of this reminds me of Device 6.
@rembtz83
@rembtz83 6 жыл бұрын
So 10 years after 'Web 2.0', now google brings us 'Censorship 2.0'
@ValerianTexeira
@ValerianTexeira 6 жыл бұрын
Bias many times begins with those who think others are biased! However, it does not occur to them that their "politically correct" ideology is itself biased, which makes them see other views as biased if they do not conform with theirs. And the political blame game begins.
@MassDynamic
@MassDynamic 6 жыл бұрын
How does one determine what is or isn't "offensive" to someone else?
@wrpelton
@wrpelton 6 жыл бұрын
This video is about bias.
@jamesclerkmaxwell2401
@jamesclerkmaxwell2401 5 жыл бұрын
It's "offensive" if it challenges anything concerning their feelings of self-entitlement and/or politically correct world views. This is how bias is determined.
@GrantGryczan
@GrantGryczan 5 жыл бұрын
If enough users report something as offensive, it'll be removed by the AI. That has nothing to do with the political views of Google employees.
@SerenityReceiver
@SerenityReceiver 6 жыл бұрын
What I want to know: how did you make the animation @0:50?
@que_93
@que_93 6 жыл бұрын
But where can we learn more about machine learning, like materials from Google, even though there are other sources? Like how did you start, what do you use, how do you continue... hope I am clear...
@JohanWinqvistTesseract
@JohanWinqvistTesseract 6 жыл бұрын
Computerphile here on YouTube has a few videos on the theory.
@TalosAT
@TalosAT 6 жыл бұрын
I've learned a lot from Siraj Raval's YouTube channel. He covers a lot of machine learning topics, including coding models from scratch in Python and using TensorFlow and other libraries to put models together. He has theory and math videos associated with machine learning as well.
@Zi7ar21
@Zi7ar21 4 жыл бұрын
“This recent game”
@rohitrohan2009
@rohitrohan2009 4 жыл бұрын
What if some facts are "offensive"? Define "offensive": something that offends someone else? Can it be anything? Then there is still going to be bias, because just to remove "offensive content" for the sake of it, you might be removing vital pieces of information just so people are not "offended". What if those facts are important and need to be taken into consideration, but just because someone may get offended you're removing them? So define offensive properly. Some things are facts and some are trolling; actual troll content meant to malign people you can surely filter and remove. But please don't tell me that numbers are racist or offensive. When it comes to data, for god's sake don't tell me you're gonna get offended.
@sebastian8538
@sebastian8538 3 жыл бұрын
Morning news. No, numbers per se are not, but the story they tell is just as subjective as an opinion if those numbers are not representatively collected. However, this nitty-gritty, almost theoretical detail is a bit too complicated for most. This makes the origin of numbers arguably as unknown to the wider public (not everybody is a statistician) as the details of computing in computers, even though it's ubiquitous. So yes, numbers don't lie, but the story they form may not be as true and constant as physics.
@PristerVictor
@PristerVictor 6 жыл бұрын
We're approaching the end more and more...
@wqerrk1901
@wqerrk1901 3 жыл бұрын
As if they're going to do something like that; they already take every piece of data you search, comment on, or look at and compile it into one big image of you (it's called "Cambridge Analytica").