AI is increasingly a feature of everyday life. But with its models trained on often outdated data, and the field still dominated by male researchers, AI's growing influence on society is also perpetuating sexist stereotypes.

  • magnetosphere@fedia.io · 13 hours ago

    Well, most of humanity has been sexist since time out of mind. Even if AIs are trained on nothing but objective data, they can’t help but mirror our sexist society. They must be carefully programmed to show society as it could be, instead of how it is. At the same time, they have to keep the facts right, like the number of fingers on a hand, for example.

    Even then, I’m sure we’ll encounter problems. AIs do not “think” in ways we understand.

    • Llewellyn@lemm.ee · 5 hours ago

      They must be carefully programmed to show society as it could be, instead of how it is.

      Must they? Which particular flavour of utopia should they choose?

  • catloaf@lemm.ee · 15 hours ago

    As you might expect, the answer is no, but the data it’s trained on probably is.

    Relatedly, remember last year when Google told Gemini to make generated people racially diverse, so it started making Black Nazis, Black popes, and Asian Vikings?

    • brsrklf@jlai.lu · 5 hours ago

      Not Google, but my favorite was “ethnaically anabigaus” appearing randomly on generated pictures.

    • NaibofTabr@infosec.pub · 15 hours ago

      And this is why anyone who suggests that machine learning systems can be used to make social decisions of any kind should be laughed out of the room. Even when the system programs itself, its goals are set by people and it is trained on data generated and selected by people.

  • Ilovethebomb@lemm.ee · 16 hours ago

    Aren’t these things trained quite heavily on stock photos? Because that would explain a lot of the gender bias.