Five considerations for responsible innovation
What if voice technology could improve the experience of a person recently diagnosed with diabetes?
Can the blockchain bring transparency to human trafficking in the supply chains of goods and services?
How might open and non-traditional datasets help us identify biothreats in real time?
Can AR/VR simulations prepare students for the jobs of the future?
These are a few of the thorny problems we have helped our clients solve over the past few years. As we consider these questions, we increasingly recognize that with new technologies comes the responsibility of imagining unintended consequences. And in 2018, we are not at a loss for examples.
Passive data collection has become the holy grail for activity trackers. Social functionality makes it fun for fitness enthusiasts. And opening up Strava’s data set makes for really fun data visualization, until someone points out that the data set has just revealed secret military bases around the globe, putting individuals and nations at risk.
Data science can do the math so much faster than humans can. So we welcome our algorithmic overlords. But what happens when an algorithm cuts off a human’s health care?
Conversational interfaces have the potential to reduce friction. But when the combination of voice tech, artificial intelligence, and the internet of things doesn’t work as intended, it can make for unfortunate edge cases and awkward social situations.
Earlier this week, Luminary Labs CEO Sara Holoubek delivered a keynote at the Health Experience Design (HxD) conference, where she outlined practical steps designers, developers, and innovators can take to build a greater capacity for ethics. If you’re designing next-generation technology solutions — in health or any other industry — here are five things you can do now:
1. Read more fiction.
Mark Zuckerberg said he could never have imagined a future in which the software he developed in his college dorm room would be used to interfere with elections. But popular books and movies — even stories written for children — illustrate the unintended consequences of technology. From 1984 and Westworld to The Incredibles and The Jetsons, there’s no shortage of stories that can help us understand ethics. We recently listed 40 works of sci-fi and speculative fiction and gathered for an informal book club to discuss “Sourdough,” a book about robots, food, and the future of work.
2. Seek out the tech ethicists.
As with any other topic, it helps to connect with experts. In recent months, we have shared the stage and had conversations with dozens of tech ethicists who share a love for both technology and humanity. Luckily, many of them are on Twitter. You can follow thought leaders like Cornell professor Ifeoma Ajunwa, data journalist Meredith Broussard, and digital citizenship pundit David Ryan Polgar via Sara’s own Twitter list.
3. Take an oath, or write your own.
The Center for Health Experience Design is coordinating the Designer’s Oath project. Data for Democracy is partnering with Bloomberg and BrightHive to develop a code of ethics for data scientists, a move supported by leaders like former U.S. Chief Data Scientist DJ Patil. If your industry or discipline hasn’t developed a code of ethics, consider writing one closer to home: can your team or your organization adopt guiding principles for ethical decision-making?
4. Spur your ethical imagination.
This phrase has been championed by both Laura Norén and Natalie Evans Harris; think of it as an individual exercise that could also be easily adapted for any team, in any organization. At this year’s Personal Democracy Forum, Sara helped Natalie facilitate a workshop where small groups discussed different use cases. Here is the framework Natalie presented:
- List out activities for each stage of the data life cycle in the case study.
- What are the ethical challenges in this case study, and what questions can we ask to improve its ethical practice?
- Do you see alignment or misalignment between your own work and your relationship to data?
- In thinking of your own work, what is one practice you can put in place tomorrow to make your approach to data more ethical?
- Who are two to three people you can share this practice with at your organization? How might you create an ethical culture?
5. Join a community.
If you care about balancing speed, innovation, and ethics, you’re not alone. Connected and Open Research Ethics (CORE) brings researchers, ethics board members, technologists, and other stakeholders together to share their expertise and experiences. They’re working to shape best practices for Mobile Imaging, Pervasive Sensing, Social Media and Location Tracking (MISST) use in research. And the Center for Humane Technology’s Time Well Spent movement has a discussion group, Facebook group, and tips for individuals who want to engage with technology in a more mindful way. Slack channels are also popping up to convene practitioners across a number of disciplines, including content development.
It’s possible to be pro-tech and pro-ethics — start by asking questions, making connections, and expanding your imagination.
Subscribe to the Lab Report, our weekly newsletter, for updates and insights.
Photo by Debby Hudson on Unsplash.