Antiracism and the Digital Standard

Assimilationist Biases in Product Design


How are racialized ideas encoded into technical systems?

Racial bias in the 21st century is often not explicit, for both legal and cultural reasons. However, it is culturally acceptable on both the left and the right of the political spectrum to hold assimilationist views: that is, the belief that certain cultural and behavioral attributes of other racial groups are inferior, and that racial equity is best achieved by correcting behavior and culture to match those of the supposedly superior group.

Assimilationist biases include ideas such as: black parents should name their children Kate or Danielle instead of Precious or Nevaeh; black women should straighten or cover their hair in the workplace because it is otherwise unprofessional; Americans of any race or ethnicity should speak “proper,” unaccented English, because dialectal English is a sign of stupidity. Assimilationist ideas are found across all racial groups and class levels, and as such they are instrumental in upholding racial inequity.

The antidote to both segregationist and assimilationist ideas is antiracist ones. Antiracist ideas operate from a framework in which all races are equal: biologically, innately, behaviorally, and culturally. An antiracist framework sees a different name as just that: a name, not a judgment of intelligence or value. An antiracist mindset holds that cultural hairstyles such as locs and braids are just that: different hairstyles, no better or worse than any other. Antiracist, assimilationist, and segregationist mindsets intersect with other sociological axes: gender, sexuality, class, health, and so on. As such, they of course intersect with technology.

This project aims to catalog assimilationist biases and ideas and to map how they show up in, and feed into, complex technical systems. Examples of biases will be paired with real-life examples of how they appear in technology and how these inputs produce racialized outcomes. Prescriptively, the project will use the principles developed by Dr. Ibram Kendi to build a map of intersectional antiracist ideas and frameworks for product developers to consider, and it will examine how outcomes may change when antiracist design principles are applied. The project aims to map these biases so that they can feed into parts of the Digital Standard such as Data Use & Sharing, Data Overreach, and Human Rights & Social Responsibility, and to offer metrics and examples for creating antiracist standards for technology and products.

Assimilationist biases about our appearance, behavior, and culture make their way into technology and product design. These biases yield implicitly coded outputs that produce and sustain racial inequity.

Assimilationist ideas and standards are coded implicitly into complex systems. Whether it’s an algorithm that infers your race from your name and assigns a value judgment accordingly, or a company statement saying it would love to increase diversity but that cultural fit is important, these are the sorts of judgments that power modern-day racial inequity, even, and especially, on the “liberal” side of the spectrum. When products are developed, algorithms are written, and data is collected, they are often designed with an assimilationist standard of a person or end user in mind.
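As a purely illustrative sketch (the data, names, and function below are hypothetical and not drawn from any real product), consider how an assimilationist default can hide inside a “neutral” feature: a screening score that rewards name “familiarity” built from a majority-culture baseline never mentions race, yet it quietly downranks applicants whose names fall outside that baseline.

```python
# Hypothetical illustration only: how an assimilationist assumption can be
# encoded implicitly. No real system, dataset, or company is referenced.

# A "familiarity" list built from a majority-culture naming baseline.
# This set is an assumption for the sketch, not data from any product.
FAMILIAR_NAMES = {"kate", "danielle", "michael", "emily"}

def screening_score(applicant: dict) -> float:
    """Score an applicant for an interview shortlist.

    The scoring never mentions race, but the 'familiar name' bonus
    rewards conformity to one cultural naming norm, so equally
    qualified applicants are ranked differently.
    """
    score = applicant["years_experience"] * 1.0

    # Implicitly coded bias: names outside the baseline list get no bonus.
    if applicant["first_name"].lower() in FAMILIAR_NAMES:
        score += 2.0

    return score

applicants = [
    {"first_name": "Danielle", "years_experience": 4},
    {"first_name": "Nevaeh", "years_experience": 4},
]

# Identical experience, different rank: the disparity is produced
# entirely by the "familiarity" feature.
for a in sorted(applicants, key=screening_score, reverse=True):
    print(a["first_name"], screening_score(a))
```

An antiracist review of a feature like this would ask what the “familiarity” bonus is actually a proxy for, and whether the outcome it produces can be justified on any grounds other than conformity to one cultural norm.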
