The Proud Asian
Sony researchers uncover bias against skin with yellow hues in AI algorithms

By NextShark.com
Oct 4, 2023 7:04 pm EDT

Sony AI researchers have uncovered hidden layers of skin tone bias in AI algorithms, challenging existing skin tone scales used by tech giants like Google and Meta.

About the study: For their study published on Sept. 10, Sony AI’s research team employed colorimetry to derive quantitative metrics and assess standardized skin tone scales that AI tech companies have adopted in their tools, such as the Monk Skin Tone Scale and the Fitzpatrick scale. 

The researchers quantified skin color bias in face datasets and generative models, and broke down the results of saliency-based image cropping and face verification algorithms by skin color. For fairness benchmarking, Sony used multidimensional skin color scores instead of unidimensional ones. To build a robust dataset, they generated approximately 10,000 images using generative adversarial networks and diffusion models.
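A multidimensional score of the kind described above can be illustrated with a short colorimetry sketch. The snippet below is a hypothetical illustration, not Sony's actual code: it converts an sRGB color to CIELAB and reports two dimensions often used in colorimetry, perceptual lightness L* (light vs. dark tone) and hue angle h* (redder vs. yellower hue), rather than collapsing skin color to a single light-to-dark value.

```python
import math

def srgb_to_linear(c):
    """Invert the sRGB gamma curve for a channel value in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def rgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIELAB under the D65 white point."""
    rl, gl, bl = (srgb_to_linear(v / 255.0) for v in (r, g, b))
    # Linear sRGB -> CIE XYZ (standard sRGB/D65 matrix)
    x = 0.4124564 * rl + 0.3575761 * gl + 0.1804375 * bl
    y = 0.2126729 * rl + 0.7151522 * gl + 0.0721750 * bl
    z = 0.0193339 * rl + 0.1191920 * gl + 0.9503041 * bl
    # Normalize by the D65 reference white
    xn, yn, zn = x / 0.95047, y / 1.0, z / 1.08883
    f = lambda t: t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(xn), f(yn), f(zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def skin_color_score(r, g, b):
    """Two-dimensional skin color score: (L*, hue angle in degrees).

    L* captures tone (dark vs. light); the hue angle captures
    whether a color leans red (near 0 deg) or yellow (near 90 deg).
    """
    L, a, b_lab = rgb_to_lab(r, g, b)
    return L, math.degrees(math.atan2(b_lab, a))

# A warm, skin-like swatch: light in tone and yellow-leaning in hue.
L, hue = skin_color_score(224, 172, 105)
print(f"L* = {L:.1f}, hue angle = {hue:.1f} deg")
```

A unidimensional scale would keep only L* (or an index derived from it); reporting the hue angle as a second dimension is what lets a metric distinguish, for example, a redder complexion from a yellower one at the same lightness.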

Unmasking hidden bias: According to the paper, the standardized skin tone scales fail to capture the full spectrum of human skin diversity as they ignore the nuanced contribution of yellow and red hues to human skin color. The paper further highlights that AI datasheets and model cards still leave ample room for discrimination, particularly against under-represented groups. 

Hue nuances: Underscoring that the problem extends beyond mere skin tone, the study noted how existing scales also fail to account for the nuances of skin hue, which significantly impacts how AI classifies people and emotions.

For example, editing skin color to achieve a lighter or redder hue increases the chances of AI misclassifying non-smiling individuals as smiling and vice versa. Additionally, AI classifiers tend to inaccurately predict gender, labeling people with lighter skin tones as more feminine and those with redder skin hues as happier. These findings suggest that bias in AI models is not limited to skin tone but extends to skin hue as well.

Tipping the scales: To rectify this oversight, Sony proposes the…

NextShark.com

The leading source for Asian American news covering culture, issues, entertainment, politics and more.

© 2023 The Proud Asian - All Rights Reserved.