Monday, December 28, 2015

How Technology Designers Will Dictate Our Civic Future

Frances Kai-Hwa Wang

Reporting on the #SciWri15 #AppsRule session.

"How technology designers will dictate our civic future" was a fascinating discussion by Latanya Sweeney, Harvard University professor of government and technology in residence and former Federal Trade Commission chief technology officer, at the ScienceWriters 2015 conference.

Sweeney argued that technology designers are the new policy makers -- without being elected and without being seen. The design decisions they make when creating the latest tech gadgets and online innovations affect how people live their lives and how the country is governed. Their designs can challenge the privacy and security of personal data, as well as every democratic value and law. The danger is that no one is thinking about how it all fits together or falls apart.

Sweeney once Googled her name and was surprised to see an ad pop up about her possible arrest record. Curious, she signed up for the service and paid the fee, and she was able to confirm that she did not, in fact, have an arrest record. However, she noticed that similar ads popped up whenever she searched for anyone with a Black-sounding name. She also saw the ads appear alongside the website of Omega Psi Phi, a prestigious African American fraternity, many of whose members are leaders in the African American community. Testing a variety of names, she found that Black-sounding names generated ads suggestive of an arrest 81 to 86 percent of the time on one website and 92 to 95 percent of the time on the other, far more often than white-sounding names did.

However, without knowing how the algorithm works, she was unable to determine how those results came about -- whether the algorithm itself played into people's stereotypes of African Americans, or whether people's stereotypes shaped the way they interacted with the algorithm.

Sweeney also discussed the case of the Facebook Messenger geolocation function, which identified user location by default. It was added in 2011 to no great outcry, even after the function was reported on in 2012. In 2015, however, a student created a program to draw attention to the way this geolocation function compromised people's privacy without their knowledge. Massive media attention and user outcry followed -- the student's program was downloaded 85,000 times and linked to 170 global news articles. After nine days, Facebook released an update that made location sharing opt-in rather than opt-out.

Sweeney questioned the nature of the social contract between technology companies and individuals. Why does it always take outside intervention to create change?

Sweeney created a course at Harvard called "Data Science to Save the World," a hands-on lab course examining how technology impacts people and its unforeseen consequences for scientific facts, civil society, and governmental discourse. She also brought the students who produced the best case studies -- 26 out of 60 students -- to Washington, D.C. to speak with government regulators about civic issues in technology.

Sweeney also launched a refereed academic online journal, "Technology Science", to publish case studies that examine the ways that technology can have unforeseen consequences for the individual.

She highlighted some of the case studies published in just two months of 2015.

Princeton Review's possible race-based price discrimination: Students chose to examine the online tutoring service because its online nature meant that geography should not affect the cost of service, yet users were required to enter a zip code before receiving a price quote. Students tested all 33,000 US zip codes. At first glance, higher prices appeared to track more expensive cost-of-living locations, but upon closer examination the students found that adjacent neighborhoods with different racial demographics were often quoted different prices: Asian Americans were 1.8 times more likely to be quoted a higher price.
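The kind of disparate-impact check the students ran can be sketched in a few lines: collect a quote for every zip code, then compare how often zips from each demographic group receive the higher quote. This is only an illustration of the method -- all zip codes, prices, and demographic shares below are made up, and the function names are mine, not the students'.

```python
# Toy sketch of a disparate-impact test: quote a price for every zip code,
# then compare how often majority-Asian zips receive the highest quote.
# All values here are hypothetical, not the study's actual data.
quotes = {
    # zip code: (price quoted in dollars, Asian share of residents)
    "94601": (8400, 0.55),
    "94602": (8400, 0.70),
    "94603": (8400, 0.60),
    "94604": (6600, 0.52),
    "94605": (8400, 0.10),
    "94606": (6600, 0.20),
    "94607": (6600, 0.30),
    "94608": (6600, 0.15),
}
HIGH = max(price for price, _ in quotes.values())

def high_quote_rate(in_group):
    """Share of zip codes in the group that received the highest quote."""
    group = [(p, s) for p, s in quotes.values() if in_group(s)]
    return sum(1 for p, _ in group if p == HIGH) / len(group)

asian_rate = high_quote_rate(lambda share: share >= 0.5)  # 3 of 4 zips
other_rate = high_quote_rate(lambda share: share < 0.5)   # 1 of 4 zips
relative_risk = asian_rate / other_rate  # analogous to the study's 1.8x figure
```

In this toy data the majority-Asian zips are three times as likely to get the higher quote; the study's 1.8x figure came from the same style of comparison run over real quotes and census demographics.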

AirBnB income and race: In New York City, African American hosts earned 12 percent less than Caucasian hosts. In Berkeley and Oakland, California, Asian American hosts earned $90 less per week or 20 percent less than Caucasian hosts.

South Korean Resident Registration Numbers (RRNs) versus prescription data: South Korea's national identifier encodes demographic information and a checksum with a publicly known pattern. In two de-anonymization experiments on 23,163 encrypted RRNs drawn from prescription data, students recovered all 23,163 unencrypted RRNs both times.
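The "publicly known pattern" is what makes the identifier so vulnerable: the first six digits encode the birthdate, the seventh encodes sex and century of birth, and the final digit is a checksum computed from the first twelve with fixed weights. A minimal sketch of that structure, based on the publicly documented format (the function names are illustrative, and this is not the students' de-anonymization code):

```python
def rrn_check_digit(first12: str) -> int:
    """Compute the RRN check digit from the first 12 digits,
    using the publicly documented weight pattern."""
    weights = [2, 3, 4, 5, 6, 7, 8, 9, 2, 3, 4, 5]
    total = sum(int(d) * w for d, w in zip(first12, weights))
    return (11 - total % 11) % 10

def decode_demographics(rrn: str) -> dict:
    """Read the demographic fields encoded in an RRN (hyphen optional)."""
    digits = rrn.replace("-", "")
    s = int(digits[6])  # sex/century digit: 1-2 for 1900s, 3-4 for 2000s
    century = 1900 if s in (1, 2) else 2000
    return {
        "birth_year": century + int(digits[:2]),
        "birth_month": int(digits[2:4]),
        "birth_day": int(digits[4:6]),
        "sex": "male" if s % 2 == 1 else "female",
        "valid": rrn_check_digit(digits[:12]) == int(digits[12]),
    }
```

Because so much of the number is determined by public facts about a person plus a deterministic checksum, the space of plausible RRNs for any given patient is small -- which is why the encrypted prescription records could be unmasked completely.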

Defeating ISIS on Twitter: Students evaluated 1.5 million tweets from 1,500 ISIS-affiliated Twitter accounts to determine whether they were run by humans or bots. Comparing the ISIS tweets to a control group of 700,000 non-ISIS Arabic tweets, students found that the ISIS accounts exhibited distinctive, non-uniform tweet, retweet, and favoriting patterns, suggesting that the accounts were controlled by humans.
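One common heuristic behind this kind of human-versus-bot analysis -- offered here only as an illustration, not as the students' actual method -- is that automated accounts tend to post at suspiciously regular intervals, while human posting is bursty and irregular:

```python
# Illustrative bot-detection heuristic: flag accounts whose inter-tweet
# intervals are unusually regular (low coefficient of variation).
from statistics import mean, pstdev

def looks_automated(timestamps, cv_threshold=0.2):
    """Return True when posting times (seconds, sorted ascending)
    are spaced with machine-like regularity."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if not gaps or mean(gaps) == 0:
        return False
    cv = pstdev(gaps) / mean(gaps)  # spread of gaps relative to their mean
    return cv < cv_threshold

bot_like = [0, 600, 1200, 1800, 2400]    # one tweet exactly every 10 minutes
human_like = [0, 40, 3600, 3700, 90000]  # bursts, then long silences
```

Run over a real account's timeline, a low coefficient of variation would point toward scheduling software; the irregular, varied patterns the students observed pointed the other way, toward human operators.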

Sweeney's message is that technology has many unforeseen consequences, so its designs and processes need to be made visible in order to restore balance. Discrimination laws do not require proof of intent; they only need to show effect. Algorithms reflect societal values, and it is too easy for selection biases to be built into them. If computer code needs to be regulated, then regulations need to be "baked into the sauce," especially as the Internet of Things continues to expand. The value proposition of technology is that there is simply so much data; yet technology designers are not accountable to any regulations. Designers should be more proactive in making sure that design decisions do not end up having unintentionally racist or otherwise unfortunate results.

The citation for this article is:
F.K. Wang (2015) How Technology Designers Will Dictate Our Civic Future. DiverseScholar 6:12

Frances Kai-Hwa Wang writes primarily about Asian, Asian American, Pacific Islander, race, diversity, civil rights, and cultural issues -- including the intersection between science and culture. She is a freelance contributor for NBC News Asian America, AAPIVoices.com, and NewAmericaMedia.org. She also teaches Asian/Pacific Islander American Studies at the University of Michigan; and, she speaks nationally on Asian American issues. Wang received a DiverseScholar NASW Diversity Travel Fellowship to attend the 2015 ScienceWriters conference. Any opinions expressed in this article are solely those of the author.

Originally published 28-Dec-2015

Editor's Note

DiverseScholar is now publishing original written works. Submit article ideas by contacting us at info@DiverseScholar.org. This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 Unported License.
