What a picture of Alexandria Ocasio-Cortez in a bikini tells us about the disturbing future of AI

From www.norsemathology.org

* [https://www.theguardian.com/commentisfree/2021/feb/03/what-a-picture-of-alexandria-ocasio-cortez-in-a-bikini-tells-us-about-the-disturbing-future-of-ai What a picture of Alexandria Ocasio-Cortez in a bikini tells us about the disturbing future of AI]: New research on image-generating algorithms has raised alarming evidence of bias. It’s time to tackle the problem of discrimination being baked into tech, before it is too late
* [https://arxiv.org/pdf/2010.15052.pdf Image Representations Learned With Unsupervised Pre-Training Contain Human-like Biases]: When compared with statistical patterns in online image datasets, our findings suggest that machine learning models can automatically learn bias from the way people are stereotypically portrayed on the web.

Revision as of 03:51, 10 February 2021

There are two readings associated with this week's topic:


  • The first reading is very accessible: Arwa Mahdawi makes the case that "discrimination [is] being baked into tech". The people creating that tech? Mathematicians and computer scientists.

    Arwa's article has links to several articles which you might check out, such as

    Let me give you a very personal example. My wife (who is Togolese) is very dark-skinned; I am very light-skinned. When we Zoom (as we have been doing a lot during this pandemic), we sometimes use a virtual background; when we do, my wife is far more likely to "disappear" than I am. We jokingly accuse Zoom of racial bias, but IT'S NO JOKE. Similarly, when taking pictures in Africa with cameras that have an "auto focus" feature, many will zoom right in on a white person in a sea of black folks. Now why is that? It's not chance...:)

  • The second reading is technical. I noticed that some of you were saying in the discussion that you wished that you knew more about the source material (for Global Warming, say). Well, be careful what you wish for!