Ethics of AI: Errors and Bias


Last updated 5 years ago



AI Can Read Your Emotions. Should it?

...do we really want our emotions to be machine-readable? How can we know that this data will be used in a way that will benefit citizens? Would we be happy for our employers to profile us at work, and perhaps make judgments on our stress management and overall competence? What about insurance companies using data accrued on our bodies and emotional state?

AI systems claiming to 'read' emotions pose discrimination risks

...such technologies appear to disregard a growing body of evidence undermining the notion that basic facial expressions are universal across cultures. As a result, such technologies – some of which are already being deployed in real-world settings – run the risk of being unreliable or discriminatory. (Lisa Feldman Barrett)

Welfare surveillance system violates human rights, Dutch court rules

“This is one of the first times a court anywhere has stopped the use of digital technologies and abundant digital information by welfare authorities on human rights grounds.”

Guardian Article