Moving nearly all business to digital channels in 2020 prompted a new examination of the role identity plays online. Few companies had business continuity plans for sending 100% of their employees home and shifting 100% of their revenue and business to digital channels, but they were forced to do so seemingly overnight. And suddenly, identity took on even greater importance because of its role in helping businesses provide customers, employees and partners access to the resources they need, wherever they are and on whatever device they’re using.
A digital identity standard would help ensure access in a world that increasingly relies on digital interactions. But the reality is that in creating this standard, we need to be conscious of the opportunity to introduce bias, including the many concerns related to diversity and inclusion.
In this week’s Hello User podcast, my guest is the Identity Jedi, David Lee. David is an enterprise architect and thought leader with nearly 20 years of experience in identity and access management, business development and solution architecture. In “Examining the Biases Within Commercialized Identity,” we discuss the biases identity management decision makers have when it comes to diversity, the consequences that can come with fast-paced software delivery, the future of digital anonymity and the importance of having difficult conversations around diversity and inclusion.
Before delving into this key topic, I think it’s important to acknowledge that it can be uncomfortable talking about issues of culture, race and ethnicity. David and I are not intending this as a political conversation, but rather a pragmatic one between technologists. But I also believe it’s vital to share how our perspectives are tuned to this subject of diversity so that you know where we’re coming from.
To give some context: as a Black man who grew up in Los Angeles, David openly talks about his personal experiences with access opportunities. And as I have mentioned in other writing, my experience as a white man dealing with the identity theft of my late wife, who was Black, has influenced my thinking around identity. Both David and I approach the topic of identity equity by looking at how outcomes could yield a benefit for everyone and not just a subset of individuals.
With that in mind, let’s get started on today’s exploration of the biases within commercialized identity.
“Decision makers often don’t consider the technological challenges others face because they have always had access.”
One primary theme of our conversation revolved around the notion of access. Whatever form commercialized identity takes, it needs to be something that everybody, no matter their socioeconomic background, has the ability to access. The tricky part is that, to paraphrase the documentary The Social Dilemma, we’re living in a world where a room of 20- to 35-year-old white men in Silicon Valley is controlling the narrative for billions of people.
And when you think about that from a digital identity standpoint, the folks who are building identity constructs are often those from relatively well-to-do families in the United States where everyone around them has mobile phones. We in the industry must be sure to examine assumptions such as the idea that everyone has phones and access to a certain amount of bandwidth.
And it’s an important point to consider not just in terms of race and culture in this country but also globally. As Americans, we need to be conscious that other countries have tremendous amounts of diversity. In my work in Australia last year, I saw how the organization Baidam Solutions is working to build technology, knowledge and career opportunities for Indigenous peoples. That’s a whole different category of connecting people and providing opportunities for access. The Arctic Circle in Canada faces challenges quite distinct from those in the cities of Montreal and Vancouver. It’s very important we ask ourselves, “Is digital identity going to be built with access for everybody in mind?”
“There is no discipline for software engineers when it comes to identity and privacy due to the pace at which they are expected to build, but this will likely change because of liabilities and regulation.”
Back when I was running identity for large companies, I cared about the home team. I wore the jersey of my company, and ideas like citizen enablement and empowerment weren’t top of mind. But after several years of being out of the corporate fold, I’m concerned that in the cybersecurity trades, as well as the businesses associated with funding and making those decisions about cybersecurity, we don’t talk enough about things like ethics. We don’t talk enough about obligations and responsibilities. We take up discussions on data privacy regulations without giving a nod to the notion of protecting the user.
David puts it another way—and he admits he expects to hear a lot of “yelling and screaming” from engineers when he does—by saying that software engineering doesn’t have discipline. What he means is that we’ve been spoiled as software engineers because we can literally build the plane while it’s in the air. We find a bug and fix it tomorrow. We’ve moved into this era of cloud and rapid development, where apps push out dozens or hundreds of updates a week. But we move so fast that we’ve lost the discipline of being held to actual standards and engineering practices. What’s needed now is a focus on our duty to keep people safe, and he expects that regulations and liabilities will force us to look at our responsibilities in this area.
“This exposes that gray area around allowing free speech while maintaining the right to privacy, and who should have access to authentication and verification.”
When we think about digital identity, it surfaces an interesting side effect of the online universe: the ability to interact with a tremendous amount of anonymity. We took everything digital last year except identities, which means you can be anybody you want to be in cyberspace. But the possibility of a personal authenticator means that, in theory, we could have a world with no more anonymity online. And that could lead to problematic social dynamics, particularly within social media.
When I asked David about this, he remarked that it’s a slippery slope. On the one hand, as security technologists, it’s important that we be able to authenticate and know who someone is when they take an action or make a transaction online. On the other hand, anonymity may be needed to protect individuals from malicious actions such as cyberbullying. For example, women can go through downright frightening experiences online, and anonymity can help foster safety and freedom of expression. We live in this gray area where freedom of speech is very important, and we have to learn how to respect and not silence one another.
“The challenge is having uncomfortable conversations to address the issues surrounding diversity.”
Both David and I have come to an understanding that when you look at the mechanics of identity in this world, whether it be Internet- or corporate/enterprise-facing, it has become less and less of a technology problem. The technology already exists to create online accountability models from an identity standpoint that reduce disinformation and related issues.
But the challenge is that addressing this requires uncomfortable conversations, like those about women, minorities and underrepresented populations in the digital world. The mechanics that address these true issues of diversity need to be acknowledged and put in place independent of technology; technology is an enabler. We also need to look at what we’re doing to address these issues online and, perhaps more importantly, whether we’re doing things in technology that exacerbate existing issues. With all of the challenges related to diversity, inclusion and enfranchisement, we have to be careful not to reproduce in our code, systems and solutions the same problems we’re trying to solve in the analog world today.
The concerns of diversity, inclusion and disenfranchisement as they relate to digital identity are gaining a tremendous amount of energy across countries and individual states. It’s important to keep in mind that the demand for faster delivery comes with a set of consequences. The same diversity issues we encounter in the analog world will present themselves digitally if we do not address them.
So I want to extend my thanks to you for joining me in this crucial discussion. I invite you to listen to the full broadcast and explore our other episodes at the Hello User podcast page.