
Wear a Mask, Wash Your Hands, Protect Your Data: How Responsible Innovation Builds Trusted Technology

How’s it going? Are you grabbing gloves, wearing a mask, ordering online, and staying six feet apart? Does it feel awkward and bizarre? Are you trapped in your home getting your food delivered, maybe even by a robot? Disoriented from the lack of boundaries between working, parenting, relaxing, exercising, cooking gourmet meals, and self-improving? Or is your city opening up, and you are trying to figure out the “new” new normal of herd immunity decked out in your PPE? Are you hitting the panic button, catching a breath, or hitting your stride?

We are all forming new habits these days. Whatever you find yourself doing, it probably includes more digital activity than before: apps or services you may not have used before, or not as often. Depending on where you live, you may be required or encouraged to use controversial contact tracing apps that collect your location and contacts for your own good. Or you may rely on grocery delivery, video, or online education apps that have been hurriedly built or scaled. Crisis brings uncertainty and panic. We also need caution, and the habit of demanding more discipline and thought behind security, privacy, and trust from our technology.

Data, privacy, and autonomy, once lost, are difficult if not impossible to recover. But when we are worrying about survival, do we have the luxury to think about what is or will be done with all our data? Under pressure and in a hurry, we might agree to things we wouldn’t otherwise agree to. We might not ask the questions we should, or demand accountability and clarification. Most likely, if we were not already in the habit of thinking critically about these issues, we don’t even know what questions to ask. Wouldn’t it be nice if we could actually trust someone?

What’s the Big Deal?

The slow but mammoth rise of our digital footprint has accelerated the use of machine learning and AI to process, infer, and make decisions based on that data. But even before AI entered the picture, our data, big data as some call it, was cartwheeling around the digital landscape, from companies to brokers to other companies. When we click “yes” to allow cookies on one of the billion websites we visit, in any of our browsers, do we know what we are saying yes to? Or, in this crowded online world, to whom? With the pandemic, the threat to our health, security, and safety is forcing us to change our day-to-day habits on a collective and unprecedented scale, without a chance to consider the implications or a way to understand them. What is new is that the vulnerable group is getting younger and younger. What is new is that we are now forced to engage digitally and rely more and more on automation, and on the invisible AI tech that powers it, for our health and safety.

Even before the pandemic, at the beginning of 2020, the digital universe was estimated to consist of 44 zettabytes of data. This includes the roughly 300 billion emails and 500 million tweets generated every day. By 2025, approximately 463 exabytes will be created every twenty-four hours (if you still know what DVDs are, you would need 213 million DVDs per day to store that). There will also be seventy-five billion internet-of-things (IoT) devices in the world. That is a lot of data.

What happens with our data is a big deal. Whoever controls, manages, and processes our data has a lot of power. “They” (or the many disparate “they”) can control what we see, what we buy, what we read, and how we see ourselves and the world. They can sell information about us or share it. There are issues of discrimination and harmful bias, either derived from or built into the data and systems. With power comes responsibility. For the past several years, as the news became littered with the consequences of data, AI, and technology, government leaders, companies, and public institutions started collaborating and asking the difficult questions. What does it mean to make responsible, ethical, trusted AI? What protections should we extend to public and private data?

What does consent mean when people don’t understand the invisible power of their devices, the seemingly harmless games and interactive applications, the engines driving those applications, the complexity of the different clouds storing their information, and the even more powerful and intelligent systems processing all of it? At least they were asking.

Now, some initiatives and companies are wondering if they have enough funding to continue their programs during the current slowdown. Everything around us has changed, yet our imagination and expectations about building responsibility and trust into our technology have not. Technologists still harbor the desire to keep the same values, get back to business, and leave the debate about right and wrong to policy makers and regulators. But policy makers and regulators have their own challenges and often play catch-up to innovation.

Who Decides The Right Thing?

We do. Among technologists, researchers, and business leaders, there is still some reluctance and debate about whether technology and innovation should take a stand on good or bad, right or wrong. There is no agreed-upon universal definition. What does “good” or “right” mean, anyway? My right thing may not be someone else’s, especially not in a globalized economy. The point is valid, but we can make the same point about building AI solutions that predict someone’s future behavior based on their past actions, or the past actions of people similar to them. If you enter healthcare, predictive analytics, or transportation, areas that affect people’s lives, health, and safety, with technology as a solution and possibly a decision-maker, then you have already taken up the challenge of interfering in an area with ambiguous solutions.

We are being forced to do exactly that — navigate the challenge and complexity of an unfamiliar set of problems, including privacy, security, and trust in a global pandemic. What we can do is the next thing that seems right for us, from our “here.”

Global Agreement, Local Implementation

In Pune, India, there are green zone neighborhoods where no cases have been reported. People are moving about, conducting business, and socializing next to yellow and red zones in complete lockdown. Every morning, as data is updated, the zone boundaries change. There are outcries about the integrity of Aarogya Setu, the Indian government-mandated contact tracing app that requires every user to register their name, gender, travel history, telephone number, and location. Countries like Singapore, Australia, and Norway have contact tracing apps with low adoption rates. The TraceTogether app in Singapore was built by the Ministry of Health and the Government Technology Agency and provides transparency on its protocol and methodology.

South Korea is using surveillance technologies like electronic transaction data, mobile phone location logs, and surveillance camera footage to track its citizens. Google and Apple are collaborating on contact tracing technology with less centralized location tracking than some government-built apps, but health care workers are still debating whether the apps will cast a narrow enough net to be practical. And you still have to navigate adoption: another set of opt-ins that you are not sure will give visible or invisible companies and institutions access to your data. The question becomes: do we trust our government, our businesses, or the way the technology is architected in the first place?
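What does “less centralized” mean in practice? Here is a minimal sketch, with invented names, of a decentralized exposure notification scheme in the spirit of the Google/Apple design (not their actual specification): phones broadcast short-lived identifiers derived from an on-device secret key, and any matching of contacts happens locally on the phone.

```python
import os
import hashlib
import hmac

# Simplified sketch of decentralized exposure notification.
# Each phone keeps a secret daily key and broadcasts short-lived rolling
# identifiers derived from it. No location or identity leaves the device
# unless the user tests positive and consents to publish the key.

def daily_key() -> bytes:
    """A fresh random secret, generated on-device each day."""
    return os.urandom(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier for a ~15-minute interval.
    The real protocol uses HKDF/AES; HMAC stands in for the derivation here."""
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Phone A broadcasts rolling IDs over Bluetooth; phone B records what it hears.
alice_key = daily_key()
heard_by_bob = {rolling_id(alice_key, i) for i in range(96)}  # one day of intervals

# If Alice tests positive, she publishes only her daily key. Bob's phone
# re-derives her rolling IDs and checks for overlap locally; the match
# happens on Bob's device, not on a server that learns who met whom.
for key in [alice_key]:
    if any(rolling_id(key, i) in heard_by_bob for i in range(96)):
        print("Possible exposure detected locally")
```

The design choice worth noticing is where the matching happens: on the phone, rather than on a central server that could reconstruct locations or social graphs.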

Countries and companies are learning from each other and making decisions that fit their culture and their citizens’ appetite for how to contain the infection and the infected. The reaction and adoption depend on how much people trust their governments or these companies based on past experience. It depends on how scared they (we) are, what we are willing to barter, and where we feel like we have a choice. Sweden has refrained from using surveillance tech or apps and is experimenting with social distancing and herd immunity, which have their critics. The level of policing and rules varies from community to community, by culture, need, and trust, as well as tolerance for invasions of security and privacy.

What is common is the call to use PPE, wear masks and gloves, wipe surfaces, and maintain physical distancing until we have a vaccine. There are public service campaigns to help people get used to staying home and to educate them on their personal responsibility for their own safety and for that of their community and nation. When treatments, vaccines, or new health-related strategies are released, they are shared across communities and countries. Globally, governments, companies, public interest groups, and neighbors are coming together to help with funding and assess the economic impact.

Can We Take the Same Approach with Technology?

As Tech Users…

While we are grateful for the technology that allows us to stay connected, if we are on a video call, we could also give ourselves permission to turn off the video and to ask what will be done with that video. Is someone recording it? By the way, recording can happen even without the red button flashing. If you are broadcasting, streaming, messaging, or recording to someone else’s device, they can record your content without your consent. That means your kids, your private conversations, your careless comments or body positions.

Parents are pulled in every direction. It’s hard to keep an eye on our kids’ digital use. But we can take the time to configure privacy controls, delete a kid’s app, or refuse a service if it raises concerns. It’s ok if we are not on top of it; we can get there by paying attention, asking questions, and acting whenever feasible. Because privacy is a fluid thing. We feel the need for it differently at different times. We might trust the person we are interacting with, but if they publish our private conversation on YouTube or Facebook, we are now exposed to data collection companies and digital crawlers we don’t know and can’t trust. The key is to manage risk versus opportunity. Effort versus benefit. But what about the things we don’t know or understand?

As Tech Makers…

Individuals can only do so much. Our kids are getting online at unprecedented rates. Our first grader plays games on her iPad that capture her images, or she has figured out how to take pictures and share them with her friends on her online playdates. She can’t control or comprehend whose device those images will really be stored on, which cloud they will automatically upload to, or which app, identity, authentication, email ID, machine learning tracker, or crawler might get access to that data. Even we grownups get overwhelmed thinking about it.

But it’s not a new problem. We have been designing things with minimum and maximum limits, tolerances, default settings with finer tuning, and well-architected solutions or quick hacks for years. What is different is the piecemealing: the crowded market of apps, middle layers, cloud options, and routers. We can pick and choose and create countless combinations of products and services and settings.
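One familiar pattern from that list, protective defaults with finer tuning inside hard limits, translates directly to privacy. A hypothetical sketch, with every name invented for the example:

```python
from dataclasses import dataclass

# Hypothetical sketch: "safe defaults with finer tuning" applied to privacy.
# The product ships with the most protective settings; users opt in to
# sharing, and hard maximums bound how far the tuning can go.

@dataclass
class PrivacySettings:
    share_location: bool = False   # off unless explicitly enabled
    share_contacts: bool = False
    retention_days: int = 14       # the minimum needed, not the maximum possible
    analytics_opt_in: bool = False

    def enable_location(self, days: int = 14) -> None:
        """Finer tuning: allow sharing, but cap how long data is kept."""
        self.share_location = True
        self.retention_days = min(days, 30)  # hard maximum limit

settings = PrivacySettings()       # defaults protect by default
settings.enable_location(days=90)  # the user tunes; the limit still holds
assert settings.retention_days == 30
```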

Governments were building their own centralized contact tracing apps when Google and Apple came forward with their localized contact tracing technology, and public interest groups pressed the case for the necessity and effectiveness of this approach. Just to be clear, anonymous doesn’t stay anonymous in technology; it just makes the data harder and harder to trace back to you. If you want to guarantee privacy, you don’t share (a sketch below shows why).

The question we, the tech designers and tech makers, ought to ask ourselves is: can we build with more trust and transparency? Technology companies have the know-how, the money, and some of the smartest people on earth working for them. They already have the practice of building in compliance in the areas where they can be fined, where their brand image is at stake, or where they might be found out. They can leverage open forums and industry groups, or build some basic standards and building blocks, to provide alternatives and options to the public. They can bring considerations around privacy, security, and trust to the very people and customers who will use and buy these products.
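To make the “anonymous doesn’t stay anonymous” point concrete, here is a minimal sketch, with entirely made-up data, of how a dataset keyed by hashed phone numbers can be re-identified simply by enumerating the possible inputs:

```python
import hashlib

# Sketch: hashing an identifier is pseudonymization, not anonymity.
# A "de-identified" dataset keys records by the SHA-256 of a phone number,
# but the space of valid numbers is small enough to enumerate.

def pseudonym(phone: str) -> str:
    return hashlib.sha256(phone.encode()).hexdigest()

# A leaked "anonymous" record, keyed by hash (made-up data).
leaked = {pseudonym("555-0137"): {"zone": "red", "tested": "positive"}}

# An attacker hashes every plausible number and looks for a match.
for n in range(10_000):
    guess = f"555-{n:04d}"
    if pseudonym(guess) in leaked:
        print(f"Re-identified {guess}: {leaked[pseudonym(guess)]}")
        break
```

Whenever the input space is small enough to enumerate, the pseudonym can be reversed; truly guaranteeing anonymity means not collecting or not sharing in the first place.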

Technology has become integral to basic human exchanges. Communication with each other and with ourselves — how we tell our stories, save our memories, and access them through photos, recordings, and notes. When our safety, privacy, political fate, and identity itself have become digitized, do we have a choice but to draw a line in the sand? Ask for common basic standards and guardrails? Some way of level-setting what is and is not acceptable? And change our habits so that we start expecting and demanding this from our technology?

The challenge is coming up with models for basic common approaches. We do it today with standards that allow technology to scale and work seamlessly across the globe. We have good and best practices for design and engineering. What we need is the motivation and leadership to integrate trust into the way we do business and how we measure success. And we need the practice and structure for accountability to become the norm. We must then infuse this approach into our organizations and culture, with attention and funding, to move towards real solutions that work rather than ones that provide convenient cover. This takes leadership.

Making the Habit of Doing the Next Right Thing For Us

Why? If we are going to mess with social systems, we need to be willing to build in the protections and considerations of social systems, and preferably even make them better.

Having a common set of definitions and a minimum threshold will spur innovation that leverages trust and builds better-performing technology.

Having fewer crappily designed products that are bad for us will help build trust. And trust drives adoption.

This pandemic has reminded us again that we can change. And change at scale. And globally.

It’s a matter of building a new habit. A new set of values. A new culture.

When we put our attention to building this new culture, we innovate. Better practices and an acceptable standard will give everyone who is already stepping up a level playing field: a motivation to build awesome products and technology with a common set of acceptable guardrails, and to keep each other in check. It will give the public and users of these systems a more consistent and less confusing experience. It will give the conscientious leaders, engineers, designers, and product managers building these products peace of mind and common criteria and goals. All these factors will not only help drive the adoption of AI and innovation, but will also bring in a diverse group of innovators.

Yes, we need policy and regulations around the obvious and harmful things like weaponized drones and headline-grabbing killer robots. We need leadership and accountability. But in the meantime, and for the more subtle and pervasive day-to-day technologies, we can also agree to do the next right thing: the reasonable, common-sense, responsible thing. Lead by example. Protect our data and build privacy, security, and trust into our technology. Keep technology delightful, open, inclusive, and accessible to as many people as possible. Innovate responsibly.

It’ll be good for business and good for us. Where do we start? How about with the next thing we know we can do that fits our product and our values? We have enough advocates within our industry and our organizations if we make this a priority. We have frameworks that have been debated and proposed at global levels. We have industry organizations and forums that are disciplined. We have product managers and engineers who can challenge each other, justify doing things one way, or find a creative solution that is better. We can be thoughtful, experiment, innovate, disrupt, and reimagine compliance and trust. Instead of reviews that jeopardize roll-outs at the end of the process, we can integrate these considerations into our design variables, into our agile and iterative processes. Integrate them into our culture, our goals, our tools, our metrics, our processes, and our habit of how we build our technology. Fund and prioritize it. Pick the next challenge for us. Exercise this muscle and make a habit of it.

Let’s make this our new normal.