Mutale Nkonde: Hi, good afternoon, good morning, whenever you're viewing this. My name is Mutale Nkonde and it is my pleasure to be one of the keynote speakers for this conference. I'm here in Brooklyn, New York, one of the places most impacted by COVID-19, so I was hoping that we'd be able to engage in a discussion that really looks at some of the design questions around potential solutions to that problem. Before I get into my slides, just a little bit about me. I am a technologist. I have been working at the intersection of race and technology since 2013. I started my career with Google and Black Girls Code, a national nonprofit promoting coding for girls aged 7 to 17 across the country. I moved into research, where I started to look at questions of equity, race, and technology. I'm currently a fellow at the Berkman Klein Center for Internet & Society at Harvard University, as well as the Digital Civil Society Lab at Stanford, and I have recently started a nonprofit called AI for the People, where I'm looking at ways of creating a culture that normalizes conversations around race and technology. So without further ado, I am going to share my screen, hopefully it will work, and let's get into this talk. Thank you for inviting me.

Mutale Nkonde: One of the things that has been happening in my personal life, and in the personal lives of us all as we negotiate shelter in place, is the question of how we are going to make our way out of this pandemic and towards what will become our new normal, our new lives. And one of the things I really want to discuss with you all today is the need for good UX design in that process. I work at the intersection of race and technology, what some people would call tech ethics, even though I'm not sure I'm particularly comfortable with that term. But various terms come up in my day-to-day work that I'd like to define before we move forward, to demystify them for those who don't know them, or at least to frame thinking for those who are more familiar. Most of my work is currently situated in a field called artificial intelligence. It was developed in the 1950s, and at that time what computer scientists were really looking to do was to create technical systems that had human-like capabilities, like speech, like sight, like hearing. We see many of these in the marketplace now. We'll see things like facial recognition, which isn't a capability listed here, but which uses computer vision. Or we'll have devices like Siri that we can speak to, so the system is recognizing our natural language, processing it, and giving us a response. And then there are hearing technologies that are widely used as adaptive technologies for communities that may need assistance with hearing.

Mutale Nkonde: Around the early 1980s we start to move from artificial intelligence, these static machines, to machine learning. Machine learning is actually the more accurate term for many of the technologies that we think of as AI, but AI has really caught on in the marketing world. In machine learning, we use a set of systems to train machines to think. The way we train a computer process to think is that we provide what's called training data: lots and lots of raw data that enables the system to perform its tasks.
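As a toy illustration of what "training data" means in practice, here is a minimal, hypothetical sketch, not any production face-recognition system: many labeled examples go in, and the system fits a statistical model it can then apply to inputs it has never seen.

```python
# Toy illustration of "training data" (hypothetical example, not a real system).
from sklearn.linear_model import LogisticRegression

# Each row is a made-up measurement vector (think: distances between facial
# landmarks); each label says which face the measurements came from.
training_data = [
    [0.42, 0.31, 0.77],  # face A
    [0.40, 0.33, 0.75],  # face A
    [0.91, 0.12, 0.25],  # face B
    [0.89, 0.15, 0.27],  # face B
]
labels = ["face_A", "face_A", "face_B", "face_B"]

model = LogisticRegression()
model.fit(training_data, labels)  # "training" = fitting the model to the examples

# The trained model generalizes to a new, unseen measurement vector.
print(model.predict([[0.41, 0.32, 0.76]]))
```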
Mutale Nkonde: So going back to facial recognition: if you're building a system that's supposed to recognize the human face, you will feed that system millions upon millions of digital images of faces, and the system will create statistical models using measurements between the eyes, the cheekbones and the ear, the lip and the chin, and many other elements of the facial landscape that help identify what a human face is. That's pretty standard, and we've seen lots and lots of commentary around this in the press when we think about which faces are recognized and which faces are not. Then there's a field that we've been looking at since 2010, which is called deep learning. Now, the key to understanding the capabilities of artificial intelligence and machine learning is that they still require a human operator to assist those systems. In deep learning, however, machines look at what they've learned in the past and generate new data. There was a famous example of a Facebook algorithm that was being used to improve internal systems on the platform. They used deep learning technologies to see whether the system could improve itself, and it created a computer language, a code, that the practitioners themselves, Facebook employees, couldn't understand. They had to shut down the process because they literally did not know what the machine would do next. And those are really the types of computer systems that we see glamorized in Hollywood and in other narratives where robots are taking over the world.

Mutale Nkonde: So how are these systems used in the fight against COVID? Well, one of the systems I want to take a really deep dive into is the Google-Apple COVID tracker, which I know we're hearing lots about. What those trackers do is use Bluetooth-assisted technologies. In this case I'm using the example of a phone, and they repurpose Bluetooth technology, which is really a short-range radio technology that allows technical systems to talk to each other by exchanging code. So for example, I love to listen to music in the shower. I will turn on my speaker, which has a Bluetooth-enabled system embedded within it, and I will play music from my phone, which has another Bluetooth capability. The two products then exchange what are called keys, specialized pieces of code that allow one to recognize the other, so that the sound coming from my phone plays through my speaker. It's this same idea with the COVID tracker. In this case, these pieces of specialized code are sent when systems are next to each other. So for example, if my phone was next to your phone, and we both had that tracking system enabled, then we would be sending codes to each other. The same with iPads, the same with televisions, the same with speakers.
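As a rough sketch of the idea (with invented names, not the actual Apple-Google Exposure Notification API): each device broadcasts a random, rotating code, and any nearby device stores what it hears along with a timestamp.

```python
# Simplified sketch of proximity-code exchange; invented names, not the
# real Apple-Google Exposure Notification protocol.
import os
import time

class Phone:
    def __init__(self, owner):
        self.owner = owner
        self.heard = []  # codes received from nearby devices, with timestamps

    def broadcast_code(self):
        # A random, rotating identifier that says nothing about who I am.
        return os.urandom(16).hex()

    def receive(self, code):
        # Store what we heard and when we heard it.
        self.heard.append((code, time.time()))

# Two phones near each other exchange codes over Bluetooth (simulated here).
mine, yours = Phone("me"), Phone("you")
yours.receive(mine.broadcast_code())
mine.receive(yours.broadcast_code())

print(len(yours.heard), "time-stamped code(s) stored on the other phone")
```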
Mutale Nkonde: And once those codes are shared, they become embedded in the software of the products. So, going back to the example of two phones, assume my phone is next to your phone. The codes are shared, that code is then saved, and it's time-stamped. And that becomes really, really important if I come up with a COVID diagnosis. So if, God forbid, I do get a positive COVID diagnosis, this is the result: I go into the hospital, they treat me, everybody has PPE, everybody has gloves, everybody has masks, and we're all giving a thumbs up because I've recovered. And what the Google and Apple team are really banking on is that I will then report this to the app. I'm assuming I'll report it to the app after I've recovered, because if I had just been diagnosed, the last thing I'd be thinking about is reporting my health information out.

Mutale Nkonde: And this is where some of the design questions come in. The first question I was really interested in is: who is this Apple-Google tracker designed for? In terms of usefulness, it's incredibly useful for reducing COVID infections, because the theory that we're hearing from public health is that if we don't have a vaccine, then we need to quarantine. And instead of quarantining the whole nation, with the impact that has on our economy, if we know who's infected, when they were infected, and who they've been in contact with, then we can enforce targeted quarantine. So what actually happens with the app is that once I report that I have a positive infection, it will go to the software system where those codes have been stored and send messages out to those devices to say they have been in contact with somebody who has a positive diagnosis.

Mutale Nkonde: In terms of desirability, though, we've got to ask ourselves: do we want apps that track our movements in an environment where we don't have comprehensive privacy legislation? And that becomes really, really pivotal when we think about our history as a nation of extending surveillance in times of crisis. One of the things that people in my community are really worried about, those of us who work with legal practitioners, is what happened with the Patriot Act after 9/11. The Patriot Act gave law enforcement extended powers to conduct surveillance in the interest of national security. And we all found out through the Snowden leaks that that power was being used by the FBI to download millions of phone records a day from Verizon, which was a violation of our constitutional right to privacy as enshrined in the Fourth Amendment. But it also created this culture of distrust of being tracked by technology. So when we look to people like Cathy O'Neil and other data scientists who are on the leading edge of thinking about how technological systems impact our rights to privacy, this app is going to really raise red flags.

Mutale Nkonde: And then the third point that I really want us to consider is usability. Apps like this were actually used in South Korea while they were flattening their curve, and one of the findings that came out of the South Korea studies is that it actually takes 75% market penetration for the app to be successful, because if too few people use the app, then we're still not going to get a comprehensive map of COVID infections and of how the people who are infected are impacting other people. And as a group and as a community of designers, one of the things that I would offer is the critical questioning of how surveillance technologies, in this case technologies surveilling the COVID-19 crisis, are going to penetrate our market, given the reality that many, many people do not trust surveillance technologies.
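To put that 75% figure in perspective, here is a rough back-of-the-envelope illustration, under the simplifying assumption that an encounter is only captured when both people involved are running the app:

```python
# Back-of-the-envelope illustration (simplifying assumption: an encounter is
# only captured when BOTH people involved are running the app).
for adoption in (0.25, 0.50, 0.75):
    contacts_captured = adoption * adoption
    print(f"{adoption:.0%} adoption -> roughly {contacts_captured:.0%} of encounters captured")
```

Under that simplification, distrust that holds adoption down translates directly into large blind spots in the resulting map.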
Mutale Nkonde: So the community I'm really interested in, and the community that I've dedicated my professional life to, is the study of how black people interact with technologies. And as we think about COVID, I think that we should also think about how the history of black people in medicine impacts their response to a particular app. So if you wouldn't mind, I would just like us to watch the following video.

Video Narrator: At the edge of Central Park in Manhattan, there's a bronze statue of a doctor named James Marion Sims, whose brilliant achievement carried the fame of American surgery throughout the entire world. He's the guy who created the vaginal speculum, an instrument gynecologists use for examination. He pioneered the surgical repair of fistula, a complication from childbirth, and became known as the father of modern gynecology. But that brilliant achievement was the result of a series of excruciating experimental surgeries that he conducted on enslaved women. In a lot of ways, Sims epitomizes the story of American medicine for black women. It's a system that's failing them to this day. From infant mortality to life expectancy, the racial disparities in healthcare are staggering. The gulf between black and white might be widest when we look at maternal mortality, with black women three to four times more likely to die in connection with pregnancy or birth than white women. And that divide can be traced back to doctors like Sims, who contributed to a long, largely overlooked history of institutional racism in medicine.

Harriet Washington: "Trying to understand the historical problem without knowing its history is like trying to treat a patient without eliciting a thorough medical history. You're doomed to failure."

Video Narrator: That's Harriet Washington, a medical ethicist and author who chronicled the intersection of race and medicine in her book Medical Apartheid. While many of the stark racial disparities in healthcare can be attributed to environmental and economic factors like access to good healthcare, studies show that minority patients tend to receive a lower quality of care than non-minorities, even when they have the same types of health insurance or the same ability to pay for care.

Harriet Washington: "As African Americans, we've been abused for so long, consistently, by the system. Why should we trust it? Why should we go to it when ill? And that's iatrophobia. That's a fear of the healer, inculcated by the behavior of those healers, unfortunately."

Video Narrator: It starts with slavery. Doctors relied on slave owners for financial stability. They accompanied plantation masters to auctions to verify the fitness of slaves and were called in to treat sick slaves to protect their owners' investments. In 1807, Congress abolished the importation of slaves, which in turn pushed black women to have more children, to essentially breed slaves. Founding Father Thomas Jefferson later wrote, "I consider a woman who brings a child every two years as more profitable than the best man on the farm."

Video Narrator: Around the 1830s, the abolitionist movement led to the rise of what was called Negro medicine, or efforts to identify black inferiority to justify slavery. And there were polygenists who tried to use both science and the Bible to find proof that the races evolved from different origins. The 1830s also marked the beginning of recorded experimentation on black women's bodies. One doctor performed experimental c-sections on slaves. Another perfected the dangerous ovariotomy, or removal of an ovary, by testing the procedure on slave women. In fact, half the original articles in the 1836 Southern Medical and Surgical Journal dealt with experiments on black people.
Video Narrator: And then, of course, there was James Marion Sims, whose reputation is etched in history and on that statue in Central Park. Between 1845 and 1849, Sims began performing experimental surgeries on a 17-year-old slave named Anarcha. He eventually performed 30 operations on Anarcha and more surgeries on about 11 other female slaves. When his male colleagues could no longer bear to assist him in inflicting pain on the women, the slaves took turns restraining one another. Yet paintings depicting Sims, Anarcha, and other slave women presented a subdued version of his experiments. Even though anesthesia was introduced in 1846, Sims chose not to use it for his experimentation on slaves. His practices echoed one of the most prevalent and dangerous beliefs in medicine at the time: that black people did not feel pain or anxiety. A book from 1851, titled The Natural History of the Human Species, claimed the American dark races "bear with indifference tortures insupportable to a white man."

Video Narrator: Studies released as recently as last year demonstrate that black people are less likely to be treated for pain, particularly in the ER. There's even one from a children's hospital that found the same to be true for kids. And just this year, Pearson Education, a leading educational publisher, issued an apology and recalled nursing textbooks that included racist stereotypes, like a section that said black people often report higher pain intensity than other cultures.

Harriet Washington: "Well, what does it mean when you say that someone doesn't feel pain? Among other things, you're speaking about their humanity. These are all part of that suite of beliefs emanating from the 19th century that we still have not shaken off, despite all our knowledge and sophistication. They're deeply ingrained."

Video Narrator: Doctors like Sims might fit the Dr. Frankenstein stereotype, but they weren't outliers. Historically, Southern doctors who used black bodies for troubling experiments were the norm.

Harriet Washington: "It's a very common question: how can we judge our forebears? You know, those guys in the 18th century who practiced medicine in a way that appalls us today. And we think, how could you do that? I don't judge the practitioners based on our own ethics. I judge them based on the ethics of their time. It was not acceptable back then. We just did not hear from the people who protested against it."

Video Narrator: After the Civil War ended, the early 1900s brought a wave of immigrants to the US. It sparked a race panic and coincided with the birth of the American eugenics movement. One of the movement's key objectives was to reduce the childbearing potential of the poor and disabled. Leaders included birth control pioneer and Planned Parenthood founder Margaret Sanger, who eventually devised the controversial Negro Project, for family planning centers that pushed birth control in the black South. It was a project that even garnered support from W.E.B. Du Bois, a founder of the NAACP, who wrote that black people bred "carelessly and disastrously." By the mid-1930s, more than half the states had passed pro-sterilization laws, and sterilization was often forced. In 1961, future civil rights leader Fannie Lou Hamer went to the hospital to have a tumor removed but was subjected to a hysterectomy without consent. The procedure, which rendered women infertile without their knowledge, was so common in the South that Hamer was said to have dubbed it the Mississippi appendectomy.
Harriet Washington: "African American babies were no longer economically valuable. And African Americans themselves had gone from being a resource to a nuisance."

Video Narrator: In June of 1973, the SPLC uncovered 100,000 to 150,000 cases of women who had been sterilized with federal funds in Alabama. Half the women were black. In recent decades, women of color have continued to be exposed to dubious reproductive health programs. In December 1990, the FDA approved a contraceptive called NORPLANT that was selectively marketed to black teenagers in Baltimore schools. "One of the current birth control methods now in the United States is NORPLANT." NORPLANT had fans like David Duke, the former KKK Grand Wizard, who even introduced legislation to give women on welfare an annual reward of $100 if they agreed to get NORPLANT. "It's time we started to encourage welfare mothers to be responsible." That bill never passed. But the implant ignited a debate over whether long-term contraception like NORPLANT, which lasted five years, could be used as a form of social engineering when pushed on specific communities. Today, as we continue to lose black mothers at alarming rates, a deeper look at the past may be a good step towards creating a more equitable healthcare system.

Mutale Nkonde: And that equitable design in healthcare is something that's really now becoming the realm of technologists. I know there was a lot in that video, and I'm hoping that as people watch and rewatch it, they can really think about how this long history of racism within medicine is in many ways coming to bear at this moment, when we're seeing health disparities in COVID-19 cases and in COVID-19 detection rates. There is a need not just for medical professionals to understand that history so they can deliver better care, but for those of us who work in technology to have that understanding as well.

Mutale Nkonde: So last year, it was my honor to be a co-author of the report that we have up, Advancing Racial Literacy in Tech. At that time, what we were seeking to do was our own user experience survey, but the people in our sample were product designers and engineers at tech companies, as well as diversity and equity professionals. And one of the things that we found was that racial literacy has three basic components. So in order to be racially literate, what we were telling our interviewees is that you have to have a cognitive understanding of how race impacts technological design today.

Mutale Nkonde: In terms of the tracker, one of the problems I was thinking about is the health disparities in COVID-19 deaths and in COVID-19 cases. We were all supposed to be in New Orleans; unfortunately, we cannot be. But this is something that's really coming up in the New Orleans data and the New York City data. And the biggest statistic for me is the one coming out of Minnesota, where 81% of COVID-19 cases are black but black people make up only 22% of the population.

Mutale Nkonde: However, to have this conversation about race and technology, we have to admit that there are emotional barriers to talking about race in America. People often report being uncomfortable, whether they're white or whether they're black. I know for me, I'm a practitioner in this area, but I still feel a certain level of discomfort when speaking to white audiences, because I'm concerned that they're going to react negatively to what I have to say. And then that makes my heart race.
So emotionally, not only do we have to be comfortable speaking about race, but in this particular case of the tracker, knowing that we need 75% market penetration for it to work and to create this effective map, we also have to acknowledge that there is a deep history of racism in medicine that has given African American communities high levels of distrust towards the medical and public health establishment. It's not that these things happened to slaves in the 1860s and, now that we are in 2020, those stories have been forgotten. Those stories become part of what's passed down from generation to generation, as mothers like myself teach our children how to navigate and stay safe in white communities. One of the stories we may pass down is: do not trust the medical establishment, because it has a consistent history of disregard for black life.

Mutale Nkonde: And then the third part of racial literacy as a framework is this idea that you need to have an action plan, right? How do we take the cognitive awareness that race and racism in our country, whether historical or contemporary, are impacting us in this moment, specifically as we look at the rates of infection and death from COVID-19, and balance that with the emotional reality that we are uncomfortable with this truth? This truth tells a story of America that white Americans, specifically those who consider themselves liberal and progressive, do not want to be aligned with, and it is a reality that black Americans, who live with it every day, don't want to spend their lives in anguish over. So how do we design for trust? That's some of what I'm hoping we'll be able to speak about more, and this is really where I think that UX designers, and designers generally, can have such an impact on how we use technology in this fight.

Mutale Nkonde: So if we have all three of those elements in place, and if we're really thinking about those questions, then I would offer that we are truly taking a racially literate approach to this question. And I'm expressing to you and everybody at this conference: you are so important, because we need equitable design.

Mutale Nkonde: So going back to the three questions that I posed earlier, when we're thinking about the Google and Apple tracker, we have to ask: who is this designed for? In the first instance, we could argue that it's a public health intervention. We are simply tracking who has been impacted by the virus, and then who they've been in contact with, so that we can quarantine. But the truth of it is that Google AI has a bio-health department, and one of the subjects they're looking at, if you go to the website today, is this idea of how we can use AI to create smart health charts. The idea is that you go in, say, to the emergency room, and there is access to a health chart that pulls up your whole medical history. That health chart can then help the doctors predict what your illness actually is, without having to go through the battery of tests that we're used to. That seems really, really useful. But in order to build those health charts, Google, one of the companies building this tracking system, also needs to have a much deeper and greater understanding of the nation's health. And because of HIPAA laws, they're prohibited from just going and seizing our health records. So they're becoming reliant on us giving them our health information, and one of the ways they could get it is through this app.
Mutale Nkonde: So who would this be desirable for, right? Who would this idea of having all of this health data that we're uploading to a cloud to combat COVID-19 be desirable for? Well, not just Google, but the tech industry as a whole. For many of us who watched the Mark Zuckerberg hearings last year, one of the things that we realized is that not only do we not have comprehensive legislation to protect our privacy against apps like this, but the tech industry itself has a whole business model based on the farming of our personal data. And it's not data that they are seizing; they're not breaking any laws by acquiring that data. This is data that we are freely giving, specifically in an app like this. And we currently do not have the level of understanding or buy-in in Congress to mount a unified, strong pushback against this.

Mutale Nkonde: I actually worked in Congress on AI policy for a number of years, and during that time I was part of a team that introduced the Algorithmic Accountability Act, the Deep Fakes Accountability Act, and the No Biometric Barriers to Housing Act, which looked at facial recognition. And one of the things that we found was that, aside from Senator Brian Schatz and Senator Ron Wyden, there were very, very few congressional advocates who really had that level of understanding of how technical systems impact social life. So there definitely wouldn't be enough people to bring to a vote the type of legislation that would be needed to say: okay, Google and Apple, you can track this COVID data, but the data belongs to the state, or to some other public interest entity, and afterwards you have to destroy that data set and prove to us that you are not going to use it. One of the things that both companies have said, which I think is remarkable, is that they will disable these COVID trackers after the pandemic subsides. But we are still relying on the goodwill of private companies, who are not necessarily invested in the public good; their primary responsibility is to shareholders. So when we think about this question of desirability in design, we have to ask: is an app that's going to track the movements of these devices, and those devices are obviously connected to people, desirable for black and other communities that are over-surveilled by the state?

Mutale Nkonde: And that brings me to my third question, which is: how will this app impact black users in the long term? And if it's not going to be a net positive, is this really something that we want to proceed with? This is where I'll make a massive plug for my sector: I'm part of a growing field in technology called public interest technology, and one of the things I'd like to invite you all to think about is becoming public interest designers of technology. Public interest tech is an umbrella term for practitioners who center civil and human rights in their practice. That could be anybody from lawyers at the ACLU who are advocating for bans or moratoriums on facial recognition technology, through to researchers like myself, who are saying this technology has incredible potential, as the Google-Apple app does, but here are some of the potential drawbacks, and here are some of the deep social histories that are going to impact adoption of the app, through to movement makers.
So the incredible Data for Black Lives is a movement of people like me, journalists, technologists, and others, who are really making sure that the tech industry we are creating is one that is going to be just and fair. And I would argue that designers, UX designers, have to be part of that.

Mutale Nkonde: I think for UX designers who really believe in equity and justice, not just in our technological future but in our post-COVID reality, this is an opportunity to come in and survey the groups that may be negatively impacted by these technologies, and to work with professionals like myself and others who are looking at how technology impacts life at the margins. That could be my colleagues working in the disability and access space. That could be my colleagues who look at how gender-nonconforming and trans identities are erased and marginalized by many technologies. That could be my colleagues looking at other racial groups. That could be my colleagues looking at the way poor people, and their lack of access to technology generally, are pushed out of an economy that is demanding that we work from home. And this is really pressing. This isn't just an idle invitation. This isn't just a keynote. This is really an appeal to the design community, particularly those of you who believe in justice and believe that the Google and Apple tracker is an amazing innovation. What it would take is some additional questions and, really, federal legislation to make sure that our privacy is protected.

Mutale Nkonde: And the reason that's so important is, as I said earlier, Cathy O'Neil and other data scientists have argued that Bluetooth data is not secure and can be easily de-anonymized. You could actually trace these movements back to the user, which brings us back to the issues around the constitutionality of the product. Today is Monday, April 20, 2020. I sit in New York City, where over 13,000 people have died of COVID or of COVID-related complications. Most of those people look like me; they live in the neighborhood that I'm from. This is very personal to me. So I'm appealing to those of us who can design, and those of us who want to design for good, to really think about how we can work together to solve these problems.

Mutale Nkonde: Thank you so much for listening to my talk. I know that talking about race can be really uncomfortable, and I'm hoping that I have opened up a space. I'd like us all to just take a couple of minutes to think about who I'm dedicating this work and this talk to, and that's Deacon Ruth Corbett. She was a deacon of my church, a church that I have belonged to for the last 13 years, and she passed of COVID complications on April 7. I'll miss her, and I'm absolutely devastated. But I'm not just devastated about the death of Deacon Corbett; I'm devastated about all of the victims of COVID-19, particularly those in nursing homes, who were potentially already isolated and died alone. Thank you so much for listening.

Transcribed by https://otter.ai