32 points | by rbanffy 4 days ago

8 comments

  • noufalibrahim 4 days ago

    Looks like Google shows a fair number of women. I'm using this as a proxy for the kind of data that would be useful for training an image generation model: https://www.google.com/search?q=Scientist&tbs=imgo:1&udm=2

  • n00b101 4 days ago

    Is anyone surprised by this? Do any guys remember women getting higher grades in college?

    • kelipso 4 days ago

      They get this from the media they watch. What child has seen actual scientists in lab coats? I might have seen them a few times at university in the clean lab, and I'm not even sure about that.

  • SubiculumCode 4 days ago

    My child drew me. I count that as a win.

  • hu3 4 days ago

    ChatGPT/DALL-E drew 2 pictures, both men, for "draw a scientist":

    https://i.imgur.com/3jxLWf7.jpeg

    Bias is hard.

    edit: there's a woman scientist in the background of one of the pictures. There's hope!

    • foxglacier 4 days ago

      You're right, bias is hard. Since most scientists are men, I guess those pictures get the bias correct. But if you asked it 100 times, would the unbiased result be 100 men, because each picture shows a typical scientist, or should the results match the real-life ratio, because the model is supposed to act as if it's taking a random sample each time?
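      The two readings of "unbiased" in that question can be sketched as a simulation. This is a minimal illustration only: the 70% figure, function names, and sample size are assumptions for the sketch, not real statistics about scientists or any model.

```python
import random

def sample_mode(p_men=0.7, n=100):
    # "Typical scientist" reading: always emit the single most
    # likely category, so every one of the n pictures is the mode.
    return n if p_men > 0.5 else 0

def sample_proportional(p_men=0.7, n=100, seed=0):
    # "Random sample" reading: each picture is an independent draw
    # matching the assumed real-world ratio.
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.random() < p_men)

men_mode = sample_mode()          # all 100 pictures are men
men_prop = sample_proportional()  # roughly 70 of 100 are men
```

      Both generators are "faithful" to the same underlying ratio; they just answer different questions, which is why judging a handful of generations as biased or unbiased is tricky.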

      All pictures of people that these models produce are deeply biased. He's also posing like a stock-photo model instead of picking his nose or doing whatever random activity a real scientist might be caught doing in a truly unbiased photo. Poses are something the models are pretty biased on in general, as are dress style, facial expression, attractiveness, etc. And they have to be, because that's all part of integrating broad common-sense knowledge, and it's what we want when we ask for pictures. We're not usually looking for a random or average example but for something that matches our cultural and artistic expectations.