Calling Bullshit 5.6: Algorithmic Ethics

11,151 views

UW iSchool

A day ago

Comments: 13
@havenbastion · 3 years ago
It's not bias to accurately represent the bias of reality or to do exactly as you're programmed to do.
@dr.mariophd4296 · 6 years ago
Can I "call bullshit" on the part about sexist bias in machine translation at 04:29? In English (and French, and other languages) you need an explicit subject for your sentence. If the machine cannot infer the gender from context, what should it do? It seems logical that it would "guess" from the data. Are there more male or female doctors in English-speaking countries? I have no idea, but if the answer is "more male doctors", the behavior we see seems correct.
@sashamalone852 · 6 years ago
For a closer look, I recommend the paper cited in the slide, or this one: Bolukbasi, Tolga, et al. "Man Is to Computer Programmer as Woman Is to Homemaker? Debiasing Word Embeddings." Advances in Neural Information Processing Systems, 2016.
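The paper's headline analogy comes from vector arithmetic on word embeddings. A minimal sketch of that analogy test, using tiny made-up vectors (real models like word2vec use ~300-dimensional vectors trained on large corpora; these numbers are invented purely for illustration):

```python
import numpy as np

# Toy 3-d "embeddings" (invented values, not real trained vectors).
# Dimension 0 loosely encodes "gender"; dimensions 1-2 encode "occupation".
# The occupation words are gender-skewed, mimicking a skewed training corpus.
vecs = {
    "man":        np.array([ 1.0, 0.0, 0.0]),
    "woman":      np.array([-1.0, 0.0, 0.0]),
    "programmer": np.array([ 1.0, 0.9, 0.1]),
    "homemaker":  np.array([-1.0, 0.9, 0.1]),
    "doctor":     np.array([ 1.0, 0.8, 0.3]),
    "nurse":      np.array([-1.0, 0.8, 0.3]),
}

def analogy(a, b, c):
    """Answer 'a is to b as c is to ?' by finding the word whose vector is
    closest (cosine similarity) to vec(b) - vec(a) + vec(c)."""
    target = vecs[b] - vecs[a] + vecs[c]
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    # Exclude the query words themselves, as the standard analogy test does.
    candidates = {w: v for w, v in vecs.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cos(candidates[w], target))

print(analogy("man", "programmer", "woman"))  # -> homemaker
print(analogy("man", "doctor", "woman"))      # -> nurse
```

The point of the paper is that embeddings trained on real text exhibit exactly this pattern, because the offsets along the "gender" direction get baked into occupation words by the corpus.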
@omgitsflying · 7 years ago
The subtitles are broken on this one; I get the subtitles from the previous episode. They help non-native English speakers. It's an amazing course, though.
@UWiSchool · 7 years ago
Thank you! We've updated the captions.
@rabreu08 · 7 years ago
I think a nice study would be to compare the algorithm's bias with the real data. For example: see whether women named LATANYA actually have more criminal records than women named JILL.
@willschab9414 · 5 years ago
gotta hate CIS 415
@ezradlionel711 · 2 years ago
If you told me that 27% of nurses are male, as opposed to the 11% previously thought, it wouldn't change the fact that the majority of nurses are female. Then to use a Google image search as definitive proof of some kind of inherent algorithmic bias is just problematic. Search algorithms in general suffer from position bias when trying to display millions of results. Humans are full of biases, but AI ethics seems to be all about the curation of data rather than any real ethical issues inherent in AI itself. This video is 5 years old, and AI ethics is on the rise, particularly due to the prevalence of language models. Yet AI ethicists continue to gloss over the fact that AI can only regurgitate what it's trained on. Unless you can literally clean up humanity or ban free speech, language models will continue to be a digital mirror, parroting whatever groupthink they've assimilated.
@MS-il3ht · 5 years ago
Yeah. But this kind of bias isn't that systematic...
@AlwaysTalkingAboutMyDog · 6 years ago
Who's here from an ethics course?
@EconaelGaming · 5 years ago
I think this whole episode is bullshit. E.g. at 3:14: there might just be a cluster of CEOs who account for most of the images online, and that cluster may have a different gender distribution. Don't assume that every CEO has the same number of public photos! E.g. at 4:26: if you look at the actual data, nurses are predominantly female (and have been for centuries). Modern neural machine translation learns from examples. There are more examples of female nurses than male nurses in texts because that reflects the dataset, i.e. reality. Where is the bias here? Otherwise, I like the course a lot!
@Scarecrow0041 · 4 years ago
I think you might be missing the point. For the CEOs, the publicly available data doesn’t match reality, so an algorithm trained on that data would be biased. The difference in distribution is the exact problem. Secondly, the nurse example is the same issue. What you’ve observed is the problem. That is historically the reality, which is why the results are biased. Since the sentence was genderless in the original language, the translation is not really accurate (due to the embedded historical bias). This could certainly cause problems, or at least miscommunications, if the translations were trusted.
@EconaelGaming · 4 years ago
@Scarecrow0041 I think I was missing the point. Data which does not reflect reality creates a biased model.
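The translation behavior debated in this thread can be sketched in a few lines: with a genderless source sentence and no disambiguating context, a data-driven translator's "best guess" is the majority pronoun from its training corpus. A minimal sketch, with co-occurrence counts invented purely for illustration:

```python
from collections import Counter

# Hypothetical pronoun/occupation co-occurrence counts a statistical MT
# system might learn from its training corpus (numbers are made up).
pronoun_counts = {
    "nurse":  Counter({"she": 900, "he": 100}),
    "doctor": Counter({"he": 700, "she": 300}),
}

def translate_pronoun(occupation):
    """Pick the pronoun most frequently seen with this occupation in training.
    With no context, the model's 'best guess' is the corpus majority - which
    is exactly how a historical skew becomes a systematic translation bias."""
    return pronoun_counts[occupation].most_common(1)[0][0]

print(translate_pronoun("nurse"))   # -> she
print(translate_pronoun("doctor"))  # -> he
```

This is why both commenters can be right about the mechanism while disagreeing on the framing: the guess is statistically optimal given the data, yet it deterministically genders every genderless sentence the same way.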