Design thinking

Do smart cities discriminate?

Lukas Flynn

Joy Buolamwini, a graduate student at the MIT Media Lab, is working on an interactive installation in which users control colorful shapes and images by moving their heads. She calls it Upbeat Walls. There’s just one problem: the commercial facial analysis software she has chosen to use does not recognize her African American face.

Alarmed by this discovery, she began an in-depth research project to look into it. Was it just an error, or did the software have baked-in bias? What she found was a recognition error rate of 0.8 percent for light-skinned men and 34.7 percent for dark-skinned women.
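To make the scale of that gap concrete, the sketch below shows the kind of disaggregated evaluation behind such numbers: instead of reporting one overall error rate, a classifier’s errors are counted separately for each demographic group. This is a minimal illustration with made-up placeholder records, not the methodology or data of any real benchmark.

from collections import defaultdict

def error_rates_by_group(records):
    # records: iterable of (group, true_label, predicted_label) tuples
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        if prediction != truth:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical placeholder records, for illustration only
sample = [
    ("lighter-skinned man", "face", "face"),
    ("darker-skinned woman", "face", "no face"),  # face not detected
    ("darker-skinned woman", "face", "face"),
]
print(error_rates_by_group(sample))
# {'lighter-skinned man': 0.0, 'darker-skinned woman': 0.5}

An aggregate error rate over all three records would look tolerable; only the per-group breakdown exposes the skew.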

What Joy encountered is what can happen when a design solution is created in a vacuum and its designers do not practice the belief, held by Veryday’s own Diana Africano Clark, that as designers “we must include many voices…and proactively seek out marginalized voices and force ourselves to listen.” The designers of the facial analysis software did not seek out those voices and were not cognizant of the bias their system would contain.

It’s easy for designers to fall into the trap of viewing the world predominantly through the lens of the solution they’re working to create. Designers see urban spaces through ideas like, “Wouldn’t it be great if public transit were dynamic, based on the time of day and traffic?” or “What if energy usage were tracked to solve problems like transporting renewable energy to areas of need at the right time?”

There are undeniable positives in viewing the world this way: we are driven to see problems as opportunities and to shape those opportunities into a preferable future. What is absolutely crucial, though, especially in the context of smart cities, is to never forget what and who actually make smart city solutions possible. The answer is the data produced by citizens, and lots of it.

“The privileged, we’ll see time and again, are processed more by people, the masses by machines.”

Cathy O’Neil, data scientist and writer

Data in the city

Traffic flow into and out of a city during rush hour, energy use in buildings and homes, and mobile phone usage are all examples of the ways citizens produce the data necessary for a smarter city. To collect this data, cities are being transformed into increasingly surveilled and tracked spaces. But what happens when a large segment of a city’s population, dark-skinned women like Joy, for instance, is not recognized by smart city algorithms? How can a system meant to improve public transit be inclusive when the system itself has blind spots?

As designers in the service sector, we have a responsibility to ensure that the human element and inclusivity are retained in a smart city revolution that is increasingly dominated and dictated by opaque algorithms. There are countless other examples of how the algorithms that form the backbone of this revolution reinforce, and often amplify, inequality and human bias.

Algorithms in criminal justice

The U.S. criminal justice system’s use of algorithms is one example of how tools intended to remove bias and reduce costs can end up reinforcing prejudice. In many states, a risk assessment score is used when sentencing defendants. The score is derived from a test that asks defendants questions like, “In your neighborhood, have some of your friends or family been crime victims?” According to a ProPublica study, such scores are skewed because they reflect the defendant’s socioeconomic context and where they come from rather than whether the individual is actually at higher risk of committing another crime.
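To see why such questions skew the score, consider the toy sketch below. It is not the actual model used by any court; the questionnaire items and weights are invented for illustration. The point is simply that a score built from proxy questions about a defendant’s surroundings can separate two people with identical records purely on the basis of where they live.

# A toy weighted-sum risk score. Items and weights are hypothetical;
# real risk-assessment tools are more complex, but the proxy problem is the same.
def toy_risk_score(answers):
    weights = {
        "friends_or_family_were_crime_victims": 2,  # proxy for neighborhood, not conduct
        "parent_ever_arrested": 2,                  # proxy for family circumstances
        "prior_conviction": 3,                      # the only item about the defendant's own record
    }
    return sum(weight for item, weight in weights.items() if answers.get(item))

# Two defendants with identical (empty) criminal records, different neighborhoods
defendant_a = {"friends_or_family_were_crime_victims": True, "parent_ever_arrested": True}
defendant_b = {"friends_or_family_were_crime_victims": False, "parent_ever_arrested": False}
print(toy_risk_score(defendant_a), toy_risk_score(defendant_b))  # 4 0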

For a more in-depth look at this topic, I encourage you to read the ProPublica article “Machine Bias.” As I encounter more and more questions about machine learning and artificial intelligence, it is important that, as a designer, I am aware of how these technologies are put into practice. How will designers take the challenge of optimizing functions and driving economic growth and direct it towards a future that is inclusive and functioning rather than economically driven and prejudiced? How can we avoid further marginalizing citizens like Joy in the smart cities of the future?

Algorithms and predictive tools are only as good as the data that’s fed into them.

Ezekiel Edwards, director of the ACLU’s Criminal Law Reform Project

How do we design for inclusivity at Veryday?

One of the first steps we’ve taken as designers is to partner with both cities and data analytics companies to ensure that algorithms are created with a focus on the citizen. When redesigning public buses in Skåne, Sweden, for example, we worked with our client to engage current and future passengers in shaping the service with us. We practiced what Diana and other Veryday leaders advocate and sought out a multitude of voices within the community to ensure that the buses would truly accommodate those who use them. We take the same view when designing smart city services: the citizen takes priority, always. At Veryday, we aim to “challenge the standard views of the market, our clients and those in power.” Our mission has always been to place the user at the center of every design solution, and that’s truer now than ever.

As we move into an increasingly urban future, designers at Veryday will continue to answer the challenge of creating more inclusive smart cities. With trends like anticipatory design becoming more popular and data analytics becoming more prevalent in our field, applying inclusive design practices becomes all the more vital.

As designers become increasingly involved in the dialog and planning of smart cities, it’s important that we acknowledge the sociocultural impact algorithms have in that context.

In order to remain cognizant of (and to avoid) the pitfalls that arise when citizens are not at the center of attention in the design of smart city services, Veryday keeps a few helpful mantras in mind:

1. We will design workshops that engage a broad range of stakeholders and citizens in their everyday lives and, through doing so, highlight the need for a more inclusive design process. We are the pioneers in this field and we will continue to play that role.
2. We will utilize and experiment with immersive research tools enabled by design ethnography and powered by emerging technology. Only by prototyping in the immersive space can we fully understand the ramifications of immersive ecosystems at play in smart cities.
3. We will push clients to understand that there is more to a smart city than a more profitable and efficient service. We will be the champions of the citizens inhabiting the smart city of the future.