Sara Wachter-Boettcher
Rare Union
April 26, 2019
We all want design to be a force for good—we want to make things welcoming, seamless, and maybe even fun to use. But without a process for ensuring our design decisions are also inclusive, equitable, and fair, our products can end up with all sorts of biases embedded in them. At best, these biases leave people out or let them down, like a "smart scale" that assumes everyone wants to lose weight, or a celebratory notification sent out to a grieving user. At worst, they can cause lasting harm: résumé-review algorithms trained to weed out women candidates, facial-recognition services that fail for people of color, "recommended content" features that put increasingly disturbing and violent content in front of kids.
In this talk, we'll look at the ways tech and design have gone wrong—and then talk about what needs to happen to prevent these problems in our own work, and to push tech companies to adopt a more ethical and inclusive way forward.
Learn more about Stanford's Human-Computer Interaction Group: [ Link ]
Learn about Stanford's Graduate Certificate in HCI: [ Link ]
View the full playlist: [ Link ]