Let’s talk about digital identity with John Wunderlich, Information Privacy and Security Expert.

Join Oscar and John Wunderlich in this week’s podcast episode, 71, as they discuss mobile credentials – the challenges and solutions surrounding mobile credentials, IAM’s role in this, and how systems need to be developed around trust.

[Transcript below]

“So, you have different levels of assurance in the physical world, just as you do in the digital world. So, anybody can issue a credential, the question is what level of authority you give to the credential.”

John Wunderlich

John Wunderlich is an information privacy & security expert with extensive experience in information privacy, identity management and data security. He has designed, built, operated and assessed systems for operations and compliance in the private and public sectors for over 25 years. This has included working or consulting for Fortune 500 corporations, government ministries, small companies, volunteer organisations, regulators and health system organisations of all sizes.

Connect with John on LinkedIn and Twitter or email him at [email protected].

This is the Report on mobile Driving License Privacy: kantarainitiative.org/download/pimdl-v1-final/

We’ll be continuing this conversation on Twitter using #LTADI – join us @ubisecure!

Go to our YouTube to watch the video transcript for this episode.

Let's Talk About Digital Identity
Ubisecure

The podcast connecting identity and business. Each episode features an in-depth conversation with an identity management leader, focusing on industry hot topics and stories. Join Oscar Santolalla and his special guests as they discuss what’s current and what’s next for digital identity. Produced by Ubisecure.

 

Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Oscar Santolalla: In recent years, there have been organisations across the world creating mobile credentials – and specifically mobile driving licenses. So, we’re going to discuss this topic, and also the privacy side of this super interesting system. For that, we have an expert joining us today. My guest today is John Wunderlich. He is an information privacy and security expert with extensive experience in information privacy, identity management, and data security.

He has designed, built, operated, and assessed systems for operations and compliance in the private and public sectors for over 25 years. This includes working or consulting for Fortune 500 corporations, government ministries, small companies, volunteer organisations, regulators, and health system organisations of all sizes.

Hello, John.

John Wunderlich: Hi, Oscar, how are you doing?

Oscar: Very good. It’s a pleasure talking with you.

John: Likewise.

Oscar: Fantastic. It’s a super interesting topic we’re going to discuss today – mobile credentials – so yeah, let’s talk about digital identity. But first, of course, we want to hear a bit more about you as our guest. So please tell us about your journey to the world of digital identity.

John: Long story short, I used to be a corporate systems administrator, network administrator and operations manager. When the federal privacy law in Canada was introduced, I took that on as a project at my company, and it turned into a career. When I moved from the corporate side to working for a regulator, I first met Kim Cameron – a name that most of your listeners will know – while working with the Privacy Commissioner of Ontario, shortly after he introduced the Seven Laws of Identity. And around the same time, my former boss introduced the idea of Privacy by Design.

So, for me, going back 15, 16 years, privacy and identity have been in lockstep. There’s a very large Venn diagram overlap between the two. And I’ve been consulting, working on standards and volunteering in that joint area since then.

Oscar: Excellent. Yes, just a few years ago – a bit more than two years ago – we met in the Kantara Initiative, in one of the working groups, and you are super involved there. And I know that recently, you and other authors have released a document called Privacy and Identity Protection in Mobile Driving License Ecosystem. So first of all, kudos for that very good report.

John: Gracias!

Oscar: I read at least part of it, and I definitely want to hear more about it today. So, going into those specific mobile credentials, mobile driving licenses, I would like to start with a simple definition. What are mobile credentials?

John: Well, at the very highest level, a mobile credential is a bundle of attributes about an entity that sits on a mobile phone, or a mobile device. That’s at a very high level, and it then sort of branches down from there. You can look at wallets, or you can look at flash passes, or you can look at mobile driving licenses, which have a specific ISO standard. You can talk about the W3C verifiable credentials data model. You can talk about decentralised IDs, the W3C DIDs. So, it branches out into a whole grouping of things, but it ends up being bundles of attributes on a mobile phone.
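To make the “bundle of attributes” idea concrete, here is a minimal illustrative sketch in Python of a credential as an attribute bundle vouched for by an issuer. The field names and the HMAC-based stand-in signature are invented for illustration only; real mobile credentials such as ISO 18013-5 mDLs or W3C verifiable credentials use their own data formats and asymmetric issuer signatures.

```python
# Minimal, illustrative sketch of "a bundle of attributes" held on a device.
# Field names and the HMAC-based signature are stand-ins; real mDLs and
# verifiable credentials use their own formats and asymmetric issuer keys.
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-key"  # stand-in for a real issuer key pair


def issue_credential(attributes: dict) -> dict:
    """Issuer bundles attributes and attaches proof that it vouches for them."""
    payload = json.dumps(attributes, sort_keys=True).encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"attributes": attributes, "issuer_signature": signature}


def verify_credential(credential: dict) -> bool:
    """Verifier checks the bundle has not been altered since issuance."""
    payload = json.dumps(credential["attributes"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["issuer_signature"])


# A "mobile driving license" held in a wallet app is, at heart, such a bundle.
mdl = issue_credential({
    "family_name": "Doe",
    "birth_date": "1990-01-01",
    "driving_privileges": ["B"],
})
assert verify_credential(mdl)
```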

Oscar: In practice, does that replace holding a driving license – the physical driving license?

John: I don’t think it replaces it in any substantive way yet. I mean, we’re talking about in-person presentation on the phone. In the workgroup that I lead that’s working on a recommendation report at Kantara, we’ve kind of matched the scope of ISO 18013-5 – sorry for the ISO number reference – which is mobile driving licenses on the phone. It’s not the online presentation, which helps reduce the problem space significantly. But I think it is the case that most credentials like that, issued on the phone, still have physical analogues.

Oscar: Yes.

John: So that people have a choice of presenting. For example, in Canada, health is a provincial matter, so depending on the province, you would be issued a QR code for your COVID vaccination status, and you could print that out on paper, or there were apps so you could present it on the phone – people did both. So, I think in the space of the mobile credential, mass adoption is still sort of dual track: not replacing but supplementing.

Oscar: Exactly. So yeah, whichever is available or convenient for the individual. And tell me, because I know that in some countries – you mentioned Canada – this type of mobile credential, more specifically mobile driving licenses, is already being issued, though not everywhere in the world. Normally, who are these mobile credentials issued by, and how do they work in practice?

John: Well, anybody can issue a credential. The analogue is weak, but I still think of a bundle of credentials, a bunch of attributes, as the digital equivalent of a business card. I can print my own business card. But the level of assurance and level of authority is pretty weak on a business card, as opposed to laminated plastic with a hologram issued by a government for a driving license or a proof of citizenship or a medical card – or something in between, something that might be issued by a university or a company. So, you have different levels of assurance in the physical world, just as you do in the digital world. So, anybody can issue a credential; the question is what level of authority you give to the credential.

Oscar: And we can say that right now, the mobile credentials with the highest level of assurance are issued by governments.

John: Yeah, I don’t think that’s going to change. Governments, possibly financial institutions, possibly health institutions – but they are issued by institutions that have the capability of doing identity verification on initial issuance, to provide a very high level of assurance that this credential is associated with this entity.

Oscar: Sure. Tell me a bit about the practicalities, because I have never used a mobile driving license, at least. How does it work in practice when people who drive want to present the credential?

John: Well, let’s talk about two use cases for a mobile driving license. One is proof of authority – that you’re, in fact, a licensed driver. So, you’re pulled over by the police and asked to present your driver’s license. Right now, you present a plastic laminate card; they look at it, look at the picture, may go back to their squad car, and do a check to make sure that there are no problems with that license, no warrants issued against your name, all that good stuff. We’re presuming here that this is a legitimate traffic stop – I have a lead foot, they pull me over for going 80 in a 60, or whatever. And all of that is authorised by law. So, they get a full set of the credentials that are on the card and are able to access their backend system to get further information about me. So that’s fairly intrusive. But that’s what you would expect when you are a potential violator of the law.

As opposed to the more common use case for a driving license, which is age verification. I’m long past the age where I get checked when I go into an establishment where I could buy alcohol or cannabis. But that’s the joke: driving licenses get used much more for drinking than they do for driving, because people need them to buy alcohol. So right now, if I’m a just barely legal-age young woman – because this is the edge case that causes real difficulties – and I’m trying to enter a club, I have to show the doorman or the bouncer at the front of the club my entire driving license. Now, what he should be doing is looking at the driving license, looking at the picture to make sure that I’m actually the person I purport to be, and then looking at the date of birth and making sure that I’m old enough in that jurisdiction to enter the club. That’s it, that’s all – and no retention of information.

The problem – the dual-edged nature of the digital driving license – is this. If you do it right, here is all that shows up on the device that the doorman is using to verify your digital license. You present a QR code, or there’s NFC – you wave your phone near their receiver, the same way you do a tap to pay – or Bluetooth, all in the MDL specification. All of those are allowable; the Bluetooth and the NFC are preferable. And what shows up on the doorman’s verifying device is a picture of the person whose license it is, so that they can do the proof of presence – yes, this is indeed the person I’m presented with – and a green checkmark that says this person is old enough.

So, as opposed to the analogue physical driving license, he has no opportunity, should he be amorously struck by the person, to grab her address or follow her or grab her phone number or do any of that stuff. And the system operated by the bar also cannot record anything other than that there was an age verification for a person. Or they can – on the other hand, depending on how the presentation goes, the backend system for John’s Bar and Grill could collect all the credentials that are on that driving license, even if it only shows the doorman the picture and the checkmark, and send me marketing email and all that other kind of stuff. So, you can do it in a really privacy-protective manner, or you can take the bad edge case of presenting a card and automate that for negative consequences.

So obviously, the Kantara workgroup that I lead for reports and recommendations on privacy-enhancing mobile credentials is trying to come up with a set of recommendations for providers of verifier devices, issuers, providers of wallets and designers of mobile credentials on how to protect the privacy of the individual above and beyond the transaction.
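To illustrate the selective disclosure described in the age-check example above, here is a minimal sketch assuming a hypothetical wallet that releases only the data elements the verifier asks for and the holder consents to. The element names loosely echo mDL data elements such as a portrait and an age-over attestation, but this is not a conformant ISO 18013-5 request/response – just the shape of the idea.

```python
# Illustrative sketch of selective disclosure for the bar's age check.
# Element names loosely echo mDL data elements; this is not a conformant
# ISO 18013-5 exchange, only an approximation of the concept.

FULL_MDL = {
    "family_name": "Doe",
    "given_name": "Jane",
    "birth_date": "2004-05-01",
    "resident_address": "123 Main St",
    "portrait": "<jpeg bytes>",
    "age_over_18": True,
}


def wallet_respond(requested_elements: list[str], consent_given: bool) -> dict:
    """The holder's wallet releases only the elements the user consented to share."""
    if not consent_given:
        return {}
    return {name: FULL_MDL[name] for name in requested_elements if name in FULL_MDL}


# The doorman's verifier asks for proof of presence and an age attestation only.
response = wallet_respond(["portrait", "age_over_18"], consent_given=True)
print(sorted(response))  # ['age_over_18', 'portrait'] - no address, no birth date
```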

Oscar: Yeah, definitely. In this example of going to a bar, you illustrated how it works, but also the privacy implications.

John: Personally, I’d prefer some of the European countries that don’t have age distinctions – if you want to send your six-year-old to the bar to pick up some wine, then… But I think there are cultural norms in those countries where, if you’re publicly intoxicated, you’re subject to social ridicule. There’s no social benefit to being inebriated. Anyway, that’s an aside.

Oscar: Yeah. And you definitely started explaining some of the challenges around privacy there. What would you say are the main ones – if you can just summarise the main challenges around privacy?

John: Well, the business culture – surveillance capitalism, if you will; I know that’s a fraught phrase. When I was running networks and doing systems administration in the ’90s, I was issuing X.509 certificates and I had all my user data, and I had access to it. Which kind of made sense in mid-to-late ’90s systems. The company I was working for was a B2B company, but we were processing the personal information of our customers’ clients. So, we also had a proprietary interest in that.

So, coming into 2000, there’s this sort of culture that data about people is an asset owned by the companies and can be used to those companies’ advantage. Because systems were centralised and few, privacy issues didn’t really raise their head. There was good confidentiality in most companies, but not privacy.

And then, with the explosion of systems, the introduction of a very interconnected backend, and especially the introduction of monetising data through behavioural advertising and tracking, the entire thing got out of control, and we need to wrest back control of our own information. The challenge is cultural as much as anything – the business culture around personal data, and the business models, which are, I think, not sustainable.

Oscar: Yeah, definitely. And could you now tell us about the solutions? What are the solutions to those challenges you just mentioned?

John: Well, I think the solutions are two- or three-fold. Legislation like the GDPR in Europe – or, a day or two ago, I don’t know when this is going out, but just today I heard the news that in the US there was an agreement that might lead towards a federal privacy law. But all of those laws have the same flaw, which is that they’re built on the idea of Fair Information Practices, which came out of a 1970s vision of the way computer networks and systems run. That is what we now know to be a completely flawed notice-and-consent model, where the organisation is a trusted organisation: it provides you notice of what it’s going to do with your data, it gets your consent, and then it takes possession of your data and acts as a trusted custodian of your data. That was fine when there were a couple of hundred mainframes scattered around the world and IBM [0:13:31] [unsure] on most of them; it doesn’t make any sense now.

So, there’s a new type of regulation that’s needed, but there’s a new type of standard that’s needed as well. The new type of regulation is what I’ve taken to calling a “digital building code”. Just as you shouldn’t build a house with electrical connections without making sure that your electrical infrastructure meets the building code for your jurisdiction, I think there need to be digital building codes, so that you can’t build a system that’s going to process personal information without meeting a certain minimum set of standards for protecting it – so that people don’t have to try and read privacy policies that give notice and consent. That should be matched by standards and a business culture to meet those floor standards.

So, it’s a tripod, if you will, the complaint mechanisms in the current regulations, safety regulations to make sure everybody is operating off the same floor, and standards to enable developers and companies to meet those. And I’m working on the standards side.

Oscar: OK. You mentioned that companies tend to treat individuals’ data as their property. When we were discussing before this interview, you mentioned the role of IAM in this – what is your view on that?

John: Well, I remember talking at IIW a few years ago – it wasn’t about SSI, as this was pre-SSI – about a product that enabled companies to give control over some portion of their customers’ data to the customer. Financial institutions, for example, are highly regulated and oftentimes have a requirement to send paper to their customers’ homes on a yearly or quarterly basis.

You have this address in your database for the customer, and if you have the wrong address, it’s very expensive – you eat the cost of reprinting, resending and retrieving the material that you sent to the wrong address. The customer neglected to update you on their address, and so forth and so on. It costs some of these companies millions of dollars every year just in data around customer addresses.

So, this was a system where, instead of the customer requesting an address change, the company gave control of the address information to the customer, so that if the customer didn’t update the data – with power comes responsibility – then the customer would be charged for the reprinting and delivery to the wrong place. So that de-risks the company on that.

That particular system also enabled the customer to update their address with multiple companies at once if they were participating in the protocol. So, you can de-risk your data, especially against data rot, by actually giving control back to the person who knows it best. Now, you’re not going to do that for everything – I’ve worked in payroll and HR, so theoretically customers could also update their financial information, like, “Oh, I changed accounts, let me put in a new account number.”

But anybody who’s worked in payroll and HR knows that you want to verify that, so you have to balance it. Still, giving customers control of their data can save you money and improve the quality of your data. And I think one of the arguments that SSI people don’t make often enough is just the simple reduced cost and business risk of putting people in control of aspects of their own data.
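As a rough sketch of the kind of customer-controlled address system described above, here is a hypothetical flow in which one update from the customer is pushed to every participating company. All class and field names are invented for illustration and are not taken from the actual product John mentions.

```python
# Hypothetical sketch: the customer owns the address record, and participating
# companies subscribe to updates instead of each holding their own stale copy.
# All names are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class Company:
    name: str
    customer_addresses: dict = field(default_factory=dict)

    def receive_address_update(self, customer_id: str, address: str) -> None:
        self.customer_addresses[customer_id] = address


@dataclass
class CustomerAddressService:
    subscribers: list = field(default_factory=list)

    def update_address(self, customer_id: str, new_address: str) -> None:
        # One customer action propagates to every participating company,
        # reducing data rot and misdirected mail.
        for company in self.subscribers:
            company.receive_address_update(customer_id, new_address)


bank = Company("ExampleBank")
insurer = Company("ExampleInsurer")
service = CustomerAddressService(subscribers=[bank, insurer])
service.update_address("cust-42", "10 New Street, Toronto")
print(bank.customer_addresses, insurer.customer_addresses)
```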

Oscar: Yeah, exactly. Regarding the report that you built in Kantara, tell me more about that. How was the work done, and can you tell us the main findings?

John: The way Kantara works, there are two kinds of groups: discussion groups, which produce reports, and then workgroups, which produce recommendations that have requirements for conformance that can be tested. So, your audience may well know the identity assurance trust mark that Kantara can issue if you want to make claims about your firm’s level of identity assurance.

So, the discussion group produced a report on protecting privacy and identity in mobile driving licenses, and that report is available on the kantarainitiative.org website for download – you can take a look at it. The workgroup is trying to do something a little bit more ambitious. It’s trying to talk about how you meet the reasonable expectations of privacy in the issuer, holder, verifier triangle, where Alice is the holder. How can she trust that her reasonable expectations of privacy are maintained in the transactions after she’s been issued a mobile credential – that that credential won’t be abused in some way? Because most of the standards are transactional, which is to say that if you use the ISO MDL standard, for example, it talks about the interfaces. It has good requirements around data minimisation, and notice and consent, but it’s all around the transaction. It doesn’t speak to what happens beyond that – and it’s not supposed to, it’s scoped out of it.

It doesn’t speak about, if John’s Bar and Grill is the verifier and they’re using a system issued by “insert your mobile payment system verifier here”, how both that system and John’s Bar and Grill, once they’ve got the identity attributes from the transaction, use those in a privacy-protective manner. If you’re building a wallet or building a mobile credential for a phone, how do you do that so that Alice can trust the wallet not to share the information for advertising on the backend, and so forth, and so on? So, building human trust between entities is how I like to summarise it, rather than technical trust between endpoints, which is a lot of what most of us work with.

Oscar: That’s a very interesting distinction you make, right? Trust is something technical for most of us.

John: Well, sure. But if you think about it, zk-SNARKs or zero-trust solutions make perfect sense, right? If I went back to being a network administrator, I’d build a zero-trust network. I would assume that there was an APT inside my network, and there would be zero trust between the endpoints, and everything would be cryptographically signed and verified and yadda yadda yadda.

But in the real world – and I steal this example from Bruce Schneier – if there was zero trust, nobody would ever cruise through a green light. If you were driving today, how many times did you sail through a green light without looking to see if some idiot was ignoring the red light and ploughing through? I mean, it happens every once in a while, for a variety of reasons. But by and large, if you’ve got a green light, you trust that all the other drivers on the road are following the rules and you just go through.

That’s trust. That’s human trust. It’s built on systems that have all kinds of safeguards to make sure that you don’t have a green light going four ways. But once the systems are working, people can trust each other using those systems. And we do not have that on the internet. There is essentially zero people trust, and sadly, that’s been earned by the behaviour of a number of ecosystems that handle personal information.

Oscar: Indeed. So, you have seen this difference in the understanding of trust between, let’s say, the identity professionals who are building the systems and the majority of people. Have you noticed that this makes things complicated when developing the systems?

John: Oh, yeah. Yeah. I mean, the story from surveillance capitalism – companies that depend on advertising – is “we never sell your information” or “we never share your information”. Which was true in a sense, in that if you went to site X, or site F, or site G – insert whatever company – and behind the scenes real-time bidding occurred, they didn’t share your information with any of the advertisers. The advertisers said, “I want to put my ad on profiles that meet these parameters”, and there would be real-time bidding.

So, at one level, that held until you clicked on the ad, at which point your positive action of clicking on the ad created a relationship between you and the advertiser, and it’s out of system X or system F or system G’s hands – you’ve now done something, and that advertiser or publisher has access to your information because you clicked on their ad. So, in a very narrow, untrustworthy sense, yeah, they weren’t selling your data. Although a lot has come out now about the way real-time bidding works, and how much information is shared with the 100 bidders who get to see some bit of your data to bid on that page, when only one of them – the one that actually wins the bid – actually gets the data. It’s a snake pit behind the scenes.

Oscar: Indeed. Well, thank you – very interesting, what you are sharing with us about mobile credentials, privacy and how things are progressing. Could you tell us now, for all the business leaders who are listening to us, what is the one actionable idea that they should write on their agendas today?

John: The actionable item is, do you know who you share your customer’s data with? And why you shared it? And would you be comfortable sharing that information with your customer? So, the answer to all three of those questions should be yes. Yes, I know with whom I shared it. Yes, I know what I shared. And yes, I’m comfortable letting the customer know that I’ve shared it. If any one of those answers is no, then there’s going to be a reckoning at some point with your customers, or with the regulator, or with your business model.

Oscar: Yup. Three questions that, as you said, should all be answered yes, absolutely.

John: Those three yeses de-risk and future-proof your organisation.

Oscar: Exactly, exactly. Well, thanks a lot for putting it in a very, very clear way for everybody who is listening to this. And thanks a lot for sharing all this super interesting information about mobile driving licenses. Of course, I recommend the report – we’ll put the link to the Kantara Initiative report that John has co-authored, which is super interesting.

John: And a link to the workgroup, hopefully, because I invite anybody who is interested in this to join the workgroup – you can start to help shape the requirements for the standard to come.

Oscar: So, what is the name of the workgroup? What is the exact name?

John: The Privacy Enhancing Mobile Credentials workgroup.

Oscar: OK, perfect. We will add the link as well. Fantastic. So, John, it was a pleasure talking with you. Is there something else that you would like to tell us, or how can people get in touch with you?

John: Probably the simplest way is I’m @PrivacyCDN on both Twitter and LinkedIn – P-R-I-V-A-C-Y-C-D-N. That’s a play on words because in Canada, CDN is sometimes used for Canadian. But in the rest of the world, it means Content Delivery Network, so I thought that was an interesting pun. Anyway, @PrivacyCDN on LinkedIn or Twitter.

Oscar: OK, I didn’t know that about CDN, so it’s good to know, thank you. Again, it was a pleasure talking with you, John, and all the best.

John: Take care, Oscar.

Thanks for listening to this episode of Let’s Talk About Digital Identity, produced by Ubisecure. Stay up to date with episodes at ubisecure.com/podcast or join us on Twitter @ubisecure and use the #LTADI. Until next time.

[End of transcript]