Comments on “Flawed future? Robots adopt racist, sexist behaviors learned from popular AI program”

  1. KGB says:
    06/23/2022 at 12:17 PM

    AI learns to draw the same correct conclusions as humans about race and sex.

  2. Jay Elbee says:
    06/23/2022 at 3:00 PM

    Most likely an example of supervised learning, although the article doesn’t bother to indicate as much. In any case, who thought it would be a good idea to perform this type of segmentation based on superficial data such as pictures of faces in the first place? Why not train a model on attributes such as academic history, juvenile history, substance use, faith, family unit, location, etc., and then let the AI model guess who the doctors, criminals, and homemakers are?

  3. rotorhead1871 says:
    06/25/2022 at 12:26 AM

    AI is no improvement over man, because man built it. Sorry, but the fantasy of neutrality cannot be obtained; there will always be selective bias if man has anything to do with it.

  4. Observer says:
    06/27/2022 at 4:09 PM

    The AI responses are NOT based solely on appearance. By definition, the AI processes data and ‘knowledge’ (presumably accurate) to formulate a choice/selection for the task.
    As for the AI placing black faces in the ‘criminal’ box 10% more frequently than the white or Asian options: that actually SUBSTANTIALLY UNDERREPRESENTS real-world data.

