Is Amazon's same-day delivery service racist?
Amazon doesn't offer same-day delivery to some poorer, black neighborhoods. Is this a result of relying too much on big data to make same-day shipping area decisions?
For many black consumers, especially those who have experienced discrimination in retail shops, Amazon presents a good option for shopping without being followed around or singled out.
Or is it?
A few years ago, the Seattle-based e-commerce company launched a same-day delivery service, eliminating one of the advantages that retail stores held over Amazon. The service is now available in 27 US metro areas, covering about 1,000 cities and towns across the country. It is part of the $99-a-year Amazon Prime membership, which also includes free two-day shipping and benefits such as Prime Video and Prime Music. In eligible areas, Prime members get free same-day delivery on orders over $35.
But a Bloomberg analysis of the service reveals a racial bias: Amazon's same-day delivery often excludes predominantly black neighborhoods in six major cities.
In New York, for example, the service excludes the Bronx, which is largely home to black and Hispanic residents. In Atlanta, same-day delivery covers the predominantly white northern half of the city but leaves out the predominantly black southern part. In Boston, the neighborhood of Roxbury has no service but is bordered on all sides by same-day-delivery zones. Roxbury's population is 62.29 percent African American, 15.83 percent Caucasian, and 1.92 percent Asian, according to Areavibes.com.
Unlike some organizations that have made delivery decisions based on the composition of neighborhoods, Amazon's delivery decisions aren't inherently biased, note Bloomberg reporters David Ingold and Spencer Soper. Amazon officials say the company's delivery decisions aren't based on the ethnic composition of a neighborhood but on several factors, including the concentration of Prime members and the proximity of an area to Amazon's warehouses.
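Bloomberg's description suggests a decision rule of roughly this shape. The sketch below is hypothetical, not Amazon's actual system; the variable names, thresholds, and figures are invented for illustration. The point is that a rule can be demographic-blind on its face while its inputs still track segregated geography.

```python
from dataclasses import dataclass

@dataclass
class ServiceArea:
    name: str
    prime_density: float    # Prime members per 1,000 households (hypothetical)
    warehouse_miles: float  # distance to nearest fulfillment center (hypothetical)

def same_day_eligible(area: ServiceArea) -> bool:
    """Hypothetical eligibility rule: race never appears as an input,
    yet both variables can correlate with historically segregated geography."""
    return area.prime_density >= 50 and area.warehouse_miles <= 20

# Illustrative records only; the numbers are made up.
neighborhoods = [
    ServiceArea("Back Bay", prime_density=80, warehouse_miles=12),
    ServiceArea("Roxbury",  prime_density=35, warehouse_miles=13),
]

for area in neighborhoods:
    # Roxbury is excluded despite a similar warehouse distance,
    # because the membership-density input carries the disparity.
    print(area.name, same_day_eligible(area))
```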
But as Tech Insider writes, the data-and-algorithm-driven system Amazon uses to decide where to offer same-day delivery, based on figures such as Prime membership density, may be reinforcing existing bias.
“The problem with this thinking is it ignores the realities that there are biases involved in building any data-driven analysis, biases involved in what data gets included in the analysis, and biases inherent to a world scarred by centuries of ongoing racism and other bias,” writes Rafi Letzter, a technology reporter for Tech Insider. “The algorithms don't self-assemble. People make them.”
Sorelle Friedler, a computer science professor at Haverford College, who studies data bias, told the Wall Street Journal: “As soon as you try to represent something as complex as a neighborhood with a spreadsheet based on a few variables, you’ve made some generalizations and assumptions that may not be true, and they may not affect all people equally. If you aren’t purposefully trying to identify it and correct it, this bias is likely to creep into your outcomes.”
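Ms. Friedler's point implies a concrete check: before acting on an algorithm's output, compare its outcomes across demographic groups. Below is a minimal sketch of such a disparate-impact audit, assuming census demographics have already been joined to each service area; the records, field names, and the 0.8 benchmark (the "four-fifths rule" used in US employment law) are illustrative assumptions, not anything Amazon is known to use.

```python
# Minimal disparate-impact audit: compare eligibility rates between
# majority-black and majority-white service areas.

def eligibility_rate(areas):
    """Fraction of areas in the group that the rule marks eligible."""
    return sum(a["eligible"] for a in areas) / len(areas)

# Hypothetical records: service areas joined with census demographics.
areas = [
    {"name": "A", "majority": "white", "eligible": True},
    {"name": "B", "majority": "white", "eligible": True},
    {"name": "C", "majority": "black", "eligible": True},
    {"name": "D", "majority": "black", "eligible": False},
]

white = [a for a in areas if a["majority"] == "white"]
black = [a for a in areas if a["majority"] == "black"]

ratio = eligibility_rate(black) / eligibility_rate(white)
print(f"selection-rate ratio: {ratio:.2f}")  # below 0.8 would flag disparate impact
```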
Big data has become a powerful predictive tool that companies and organizations rely on to make better, more efficient decisions. But the approach is coming under increasing scrutiny, as several cases have shown it can magnify bias.
An often-cited 2013 study by Harvard professor Latanya Sweeney found that searches for 2,000 "racially associated names" produced Google ad results – generated by an algorithm – suggesting the person might have a criminal record. Sweeney writes:
A Google search for a person's name, such as “Trevon Jones”, may yield a personalized ad for public records about Trevon that may be neutral, such as “Looking for Trevon Jones? …”, or may be suggestive of an arrest record, such as “Trevon Jones, Arrested?...”
While it may be good business for Amazon to maximize profits and roll out services in parts of a city where there are more customers with more income, the strictly algorithm-driven approach also makes for poor public relations, observes Tech Insider:
In this instance, it appears that Amazon's rollout reflects ongoing economic disparities and segregation between white and black communities created by decades of redlining. In cities where black people are in the majority and Amazon offers same-day delivery just about everywhere, like Los Angeles, Amazon happily offers services to more black customers than white. But in cities like Boston where the rollout has been slower, white neighborhoods get served first.
...What can companies do to avoid problems like this? Having a diverse staff with the background to notice "Hey, maybe leaving out the Bronx, South Side, and Roxbury isn't the best branding idea" seems like a good start.