Let’s talk about digital identity with Susana Lopes, Director of Product at Onfido.

In episode 23, Oscar talks to Susana about what biometrics enable that other identifiers can’t; the importance of anti-spoofing (liveness); privacy concerns around biometrics and regulatory impact; algorithmic bias in biometrics (including race, age, gender and other demographic differentials) and Onfido’s work with the ICO in this regard.

[Scroll down for transcript]

“Biometrics protect users against themselves in situations where they might not realise they’re under attack”

Susana has a varied background in product management in the B2B space. She has a breadth of platform experience, from web front end and back end to iOS, Android and machine learning infrastructure. Her current role is Director of Product at Onfido, specifically focusing on their biometric product offering.

Connect with Susana on Twitter @susanavlopes and on LinkedIn.

Onfido is building the new identity standard for the internet. Its AI-based technology assesses whether a user’s government-issued ID is genuine or fraudulent, and then compares it against their facial biometrics. Its mission is to create a more open world, where identity is the key to access.

For more information, visit: onfido.com or follow Onfido on social media: Facebook, Twitter @Onfido and LinkedIn. As referenced in the episode, you can also find Onfido’s tech blog on Medium here: https://medium.com/onfido-tech.

Onfido is a Ubisecure partner. Find out more about the partnership here – https://www.ubisecure.com/partner-directory/onfido/.

Susana also refers to a NIST study on demographic differentials of biometric facial recognition accuracy, which can be found here: https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf. Mei Ngan, Scientist at the National Institute of Standards and Technology (NIST), discusses evaluating face recognition biometrics in episode 42 of the podcast: https://www.ubisecure.com/podcast/face-recognition-biometrics-nist-mei-ngan/.

We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @ubisecure!

Go to our YouTube channel to watch the video version of this episode.

Let's Talk About Digital Identity
Ubisecure

The podcast connecting identity and business. Each episode features an in-depth conversation with an identity management leader, focusing on industry hot topics and stories. Join Oscar Santolalla and his special guests as they discuss what’s current and what’s next for digital identity. Produced by Ubisecure.

 

Podcast transcript

Oscar Santolalla: Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Hello and thanks for joining today. Let me ask you: have you already used biometrics for authentication? Do you like it? Do you use it often? Well, today we have a guest who will discuss with us the world of biometrics and what else is happening in that space today. So let me introduce our guest: Susana Lopes.

Susana has a varied background in product management in the B2B space. She has a breadth of platform experience, from web front end and back end to iOS, Android and machine learning infrastructure. Her current role is Director of Product at Onfido, specifically focusing on their biometric product offering. Hi Susana.

Susana Lopes: Hi Oscar. How are you?

Oscar: Oh, very good. I’m really happy to talk with you and learn more about biometrics and what you are doing in Onfido. So first of all, I would like to hear more about your journey to the world of digital identity.

Susana: Sure. So about three years ago I joined Onfido, and we are an identity verification business. We want to help people prove that they are who they claim to be when they’re trying to rent a car or open a bank account, so that they can do that without having to go to a store or a bank branch – particularly useful in a pandemic.

So when I joined Onfido, I originally looked after our identity databases product – making sure that your name, your date of birth and your address match what’s known in, say, credit rating agencies or government databases. That was my first introduction to the world of identity. Later on, I was one of the founding members of the biometrics team, and we started looking at the fact that when you use a government-issued identity document, such as a driver’s license or a passport, to prove your identity, we also need to prove that that document belongs to you – so we don’t have situations like your kids trying to pretend to be you because they want to rent a car and they’re not 18 yet.

So that’s where biometrics comes in, in the context of digital identity at Onfido. Essentially we use facial biometrics to compare the face of the person against the face on the document to make sure the document belongs to them.

So that’s what I’ve been up to for the last three years, making that work really well.

Oscar: OK, excellent. Sounds good. I guess most people hearing this have an idea of what biometrics are. So let’s jump into it – what do biometrics in particular enable that other identifiers cannot?

Susana: So a biometric is any characteristic of your body that doesn’t change over time (or not significantly over time) and that can be used to identify you. And because it is something that is of your body, it means that every single person in the world has one.

So you have irises, you have fingerprints, you have a face. And that is not the case for other identifiers, right? You might not necessarily have a passport. You might not have a driver’s license. You might not be in the government database. You might not have a bank account and therefore not be in a credit rating agency. So just by design, biometrics are much more open and inclusive.

The other thing is that they can be used for multiple different things. So biometrics can be used to anonymously check that you are still you without necessarily knowing who you are and a great example of this would be like Face ID or any sort of biometrics that are used to unlock your phone.

All they’re trying to check is that your face is still the same as the face that was used to register that account or maybe your fingerprint. But no one really knows who that face belongs to. So biometrics are quite versatile in that they can be used to identify people in a completely anonymous way or re-verifying that they’re still themselves.

But then from the point of view of identity that is tied to a real-world human – and by that I mean identity that can be tied to a name and a date of birth – biometrics are really awesome, because there is almost such a thing as a universal biometric, which is facial biometrics, because faces are present on identity documents.

So identity documents around the world have sort of standardised on always including a face on them – or at least the vast, vast majority of them have them. So that’s a really cool thing where biometrics can be used to anonymously re-check that you are still you without necessarily knowing who you are. But they can also be used to tie you to a government-issued identity and that makes for a very powerful model.

Oscar: Oh, yeah, definitely. As you explained, there are different advantages to using biometrics over other ways of authenticating someone or verifying their identity, and the benefits are for both consumers and businesses. So what are the concrete benefits?

Susana: So there’s a benefit that comes to mind which is biometrics can’t really be stolen, at least not stolen at scale. So no one can really steal your face or your iris or things like that. The other benefits are privacy, particularly in the authentication scenarios. So it’s very simple for you to unlock your phone and Apple never really knows who you are. They don’t need to know. It’s a very secure way of doing so while using a technology that is very private.

Then convenience – obviously you don’t need to remember your password. You don’t need to go and open a bank account in person. You don’t need to go and pick up the keys to whatever service you’re trying to access. So biometrics are an extremely convenient way of verifying people’s identities or authenticating them, because people always have their face with them. They always have their fingerprints with them.

So it is a modality that is really heavy on the convenience side of things. And then the other thing is more security. One simple example: it’s a very, very effective protection against phishing. Let’s go for a topical example – someone could try to pretend to be from the track and trace team tackling COVID, and try to convince you to give them your details, or give them something that will compromise your identity later. Or someone might call you, pretend to be your bank, and get you to give them a unique PIN code to get access.

You can’t do that with biometrics. You can’t just give them your face or give them your fingerprint. So it protects users even against themselves in situations that they might not realise they’re under attack.

Oscar: Yes, that’s true. You cannot give away your face or your finger, exactly. And that brings up one question that we have seen in relatively recent years – that some people can, for instance, use a photo of you to authenticate. In some cases that happens. Some systems have allowed that by …

Susana: That’s called spoofing.

Oscar: So why does that happen?

Susana: That means that that particular authenticator is not of great quality. So when you’re building a biometric system, you’re building two things. One is a comparison engine of some sort. You want to take whatever was used to create that identity – either the face that you registered when you set up the phone, or the identity document in our case – and then make sure that it’s the same face as the user that’s in front of the camera right now, or putting their finger on the sensor, or whatever biometric you’re using.

So that’s the comparison part and that’s generally the easy part as long as you’re fairly robust to things like noise and light and reflections and sunglasses and things like that. So that’s bread and butter.

The second thing that you need to do is make sure that the biometric that’s being presented in front of the camera or the sensor is real and that’s called anti-spoofing. It’s also called liveness detection and that’s something that as an industry, there has been a lot of investment into. But not all biometric authenticators have very strong anti-spoofing capabilities.

So what we’re looking for is: is it a photo of a photo? Is it a photo of a printout? Is it a photo on a digital screen? Is it a 3D mask? Is it a latex mask? Is it a deep fake? All sorts of crazy ways in which people try to be someone else. And yeah, that’s an area where we at Onfido are very proud to be very strong. But not every biometric authenticator will be very good at this – there are certification programmes and things like that you can go through to show that you can do it very well.

Oscar: So that’s been a failure of design in one of these two elements that you mentioned – the comparison and the anti-spoofing, checking for liveness.

Susana: Yeah. So it would successfully compare, but then it wouldn’t detect that that face is not a real face, for example. And that is a crucial element of creating a good, solid biometric authenticator.
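To make those two components concrete, here is a minimal sketch of how a verification pipeline might gate matching on liveness. The function names, thresholds and scores are illustrative assumptions for this article, not Onfido’s actual implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class VerificationResult:
    live: bool         # did the capture pass anti-spoofing?
    match: bool        # does the face match the reference?
    similarity: float  # raw comparison score, 0..1

LIVENESS_THRESHOLD = 0.9  # illustrative values; real systems tune these
MATCH_THRESHOLD = 0.8

def verify(capture: bytes, reference: bytes,
           liveness_score: Callable[[bytes], float],
           face_similarity: Callable[[bytes, bytes], float]) -> VerificationResult:
    """Gate on liveness: a perfect face match against a printed photo
    or a screen replay must still be rejected."""
    live = liveness_score(capture) >= LIVENESS_THRESHOLD
    similarity = face_similarity(capture, reference)
    return VerificationResult(live=live,
                              match=live and similarity >= MATCH_THRESHOLD,
                              similarity=similarity)

# Toy usage: the spoofed capture matches perfectly but is still rejected.
result = verify(b"selfie", b"document-face",
                liveness_score=lambda c: 0.2,       # model says: looks like a replay
                face_similarity=lambda c, r: 0.99)  # the faces are identical
assert not result.match
```

The design point, as Susana describes, is that comparison alone is not enough – without the liveness gate, a matching system is open to exactly the spoofing she outlines.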

Oscar: Yes, because I believe some brands with a good reputation have fallen into that trap not long ago. So I hope they are doing better nowadays.

Susana: Are you thinking about phone manufacturers in particular?

Oscar: Yeah, I think so.

Susana: Yeah. So those manufacturers actually have an advantage that not all biometric providers have because they control the hardware. So they can do very smart things and Apple in particular is excellent at this. They have specialised hardware that allows them to do things like infrared and other things that you don’t necessarily get as a software-only vendor. So there’s a lot of opportunity for those hardware manufacturers and us as an industry as a whole to get a lot better at this. But it is a fundamental component and without anti-spoofing, biometrics are essentially useless.

Oscar: OK. You mentioned privacy as one of the advantages of biometrics. But there are also some concerns – concerns about biometrics being used in a bad way, because they’re becoming a more widespread identifier of the person. So what are your thoughts about those concerns?

Susana: I think it’s worth separating the different use cases of biometrics. So as we talked about, biometrics for unlocking your phone is one aspect and the company that owns the phone or created the phone doesn’t know who you are. So that’s a very Privacy by Design kind of use case, which uses biometrics.

Then you’ve got the Onfidos of the world that do identity verification, where we are trying to verify your claim. I say I’m Susana and then we go, “Yeah. You are Susana.” So we are not attempting to prove that you are someone else, and we’re not trying to spy on you. We’re literally trying to verify your existing claim. So that is very bounded in terms of privacy.

Surveillance is a third scenario, in which case privacy starts to get a little bit invaded, you could argue, because all you’re doing is walking down the street, minding your own business, doing whatever you need to do, and suddenly someone is watching you and attempting to recognise you.

So that is a different usage of biometrics, and when I think about it, I’m a little bit concerned about how we sometimes confuse these three use cases and then say biometrics are bad. Biometrics are an excellent technology, and it’s about how we put in place the right guidelines and the right regulations to make sure that these three use cases – and there are many more – are done in a way that complies with what people are trying to achieve and protects their rights and their freedoms.

So there has already been some really great work by governments in creating regulations – for example, the GDPR from the European Union, and also some great laws coming out of California that start to create much more granular guidance for biometrics depending on their usage.

So for example, when biometrics are used to uniquely identify and recognise someone, such as in surveillance, they’re in a different category of personal data. They become a special category of personal data, and that means the scrutiny and the types of controls that need to be put in place to be compliant with the GDPR are much, much higher and much stricter, whereas biometrics used for other use cases that are not about surveillance have looser, more generous controls.

Oscar: OK. So mostly the concerns fall into this third use case…

Susana: Absolutely, and there’s already some great work to mitigate those risks. I think because the onus is on individual governments to do so, we have a little bit of a patchy regulatory framework at the global scale, right? So you’ve got some great work being done in Europe. You’ve got some great work starting in California and some other states in the US. Other countries are not necessarily implementing these types of measures and regulatory frameworks at the same speed, but I think we’re going in the right direction, and there’s a lot of cross-pollination and learning from each other that’s already happening.

Oscar: And in this context of surveillance, do only governments have the capability to do that, or is it possible that, let’s say, someone could put up cameras, build a system and do some surveillance themselves?

Susana: So to build a surveillance system, you need two things. You need some sort of engine that is able to look at a capture, either through a camera or some other sensor, and match the faces in it – because faces are typically what’s used as a biometric for surveillance – against something, and then say, “Yes, I found a match.”

Now that something is the critical part. So the something could just be a picture of my best friend and I’ve decided that I want to be alerted every time my best friend comes to my front door. So I put a little camera in front of my front door and then I wire this biometric system to return a match when it finds my best friend and I get like a push notification. That is a little bit of a surveillance use case in that my best friend doesn’t know this is happening. There’s just a camera. He has not provided any consent and he’s being matched without making any claims. He’s not saying, “I am Susana’s best friend. Is that true?” which would be an identity verification use case. He’s just walking out through the door and I’m getting told that that’s my best friend.

So that is one thing that I could do very easily. But that doesn’t scale very well, right? That’s just one person. What governments have that individuals don’t necessarily have are databases, and this will very much depend on the country. For example, in the UK this is not something we do have – databases of faces associated with names, right?

So when you start linking those biometrics with an identity, that’s when you start having the power to watch over people and see what they’re doing. Those types of activities are highly regulated in Europe, and then it will vary around the world – who has that data, how do they store it and how do they use it?

Oscar: Yes. OK. Very nicely illustrated. I mean you could make some homemade surveillance but it doesn’t scale.

Susana: Yeah. So I mean you could do it for all of your friends. You could download everyone’s picture from Facebook and say this is my friend, this is blah, blah, blah, and then every time they show up at your door, you get a little notification on your phone. You know, that could be a fun pandemic project really. Now I kind of want to do that myself. But you see what I mean. The real value is not on the matching ability. It’s on the database itself.
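As a toy illustration of why the database, not the matcher, is where the power lies, here is a sketch of a 1:N watchlist search. The embeddings and threshold are made-up placeholders standing in for what a real face-embedding model would produce:

```python
import numpy as np

# The "something" to match against: names mapped to enrolled face
# embeddings. In a real system these vectors come from a face-embedding
# model; the values here are toy placeholders.
watchlist = {
    "best friend": np.array([0.9, 0.1, 0.3]),
    "neighbour":   np.array([0.2, 0.8, 0.5]),
}

MATCH_THRESHOLD = 0.95  # illustrative

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray) -> str | None:
    """1:N search: compare one captured face against every enrolled face.
    The capability (and the privacy risk) scales with the size of the
    database; the matching code stays the same at any scale."""
    name, score = max(((n, cosine(probe, ref)) for n, ref in watchlist.items()),
                      key=lambda pair: pair[1])
    return name if score >= MATCH_THRESHOLD else None

# Embedding of a camera frame that lands near the friend's enrolled vector.
print(identify(np.array([0.88, 0.12, 0.31])))  # -> best friend
```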

Oscar: Yeah, exactly. That’s why governments have the power to do that. OK. Yes. That’s very interesting as well. And you say that regulations are already taking the right steps to protect people, right?

Susana: Yeah, absolutely. So the usage of biometrics is regulated quite well in particular parts of the world, as I mentioned. The area that I think is a little bit murky, and that we’re actually helping to flesh out a little bit more, is how you even create biometric systems. So to go a bit deeper – how do you create the algorithms and technology that are able to do this surveillance, or this matching, in the first place?

A lot of those systems are based on machine learning, and machine learning is a very, very data-hungry technology, right? It needs to learn from annotated data, and for the process of collecting that data with the correct consent, and the process of training those algorithms, there isn’t much of a regulatory framework to help companies like us – and the many others doing biometrics around the world – do that in a way that matches what the regulators would expect, because the regulators haven’t really crystallised what best practice would look like.

So Onfido has actually been working with the Information Commissioner’s Office in the UK on developing machine learning algorithms, particularly with regards to algorithmic bias. And we were doing so under the supervision of the regulator, because we want to help them create regulations that are informed by industry practice. So they created this regulatory sandbox and invited a bunch of companies, and that meant that we got guidance from them on how we should do things, and they got guidance from us on how we think these regulations can best protect consumers while also making it possible for businesses to build technology that, at the end of the day, provides value for us all.

So that work has been really fulfilling, and we have some early indications that some new regulations might come about based on those experiences and the shared learning. I think when we have these types of industry and regulatory bodies coming together, we can make really great strides in creating regulation for things that the regulatory bodies wouldn’t even have considered, right?

For them, the first priority was to regulate the use of biometrics. But then we went to them and said – you know, to create that technology in the first place, there’s a lot of personal data that is needed. We want to have a conversation, because we want to make sure that we’re doing this right by the regulation but also by the users.

Oscar: And you mentioned algorithmic bias. That’s another topic that has been in the news in recent months – I would say the last couple of years – and yeah, that’s also something that is making people suspicious about biometrics. So do you think this poses a barrier for biometrics to become more widely used as an identifier?

Susana: I think algorithmic bias has manifested itself in a lot of ways and places. One of the things that typically gets talked about the most is race. Typically, because the companies creating the algorithms are using data sets that are more representative of the countries they’re based in, the algorithms often reflect the data set that went into them and are much more likely to perform better for one particular ethnicity than for others.

But ethnicity is not the only one. There’s also age and gender and other demographics. The National Institute of Standards and Technology in the US, NIST, has recently put out some great studies assessing some fairly common algorithms against not only ethnicity but also age and gender and other demographics, and they talk about these as demographic differentials. They don’t use the word “bias” because bias seems to imply that it puts people at a disadvantage, and they talk about differentials because sometimes there is an advantage to the system performing differently for a certain demographic.

So it might make it easier for you to open a bank account because you’re of that particular demographic, whatever that is, or it might make it harder. It depends on which way the difference in performance swings.

One thing from their study that was really interesting was that, of course, there are racial differences, but when you combined race and age, the group that performed the worst across the board was older white males – which they call the “Santa Claus effect”. So there was definitely a difference in performance there, and often those people are also underrepresented.

So going back to your question – do I feel like this poses a barrier? There are two sides to it: does it pose a barrier to adoption because of consumer sentiment, or from a regulatory point of view? On consumer sentiment, obviously people are already talking about this, which is fantastic, because it puts pressure not only on companies to address the issue head on, but also on regulators to start talking about algorithmic fairness, right?

So when you’re using something to fundamentally allow people to access services, or when you’re doing surveillance exercises trying to find people of interest, for example, it’s really important that fairness is one of the buying criteria. Can you talk to the vendor that you’re considering, and can they show you that they’ve done their homework – that they’ve tested across a dataset that is representative of the use case you have in mind, right?

All of those things are coming more to the surface, and I think the fact that we’re talking about it is fantastic. I think a lot of it will have to do with transparency, right? Like, how can vendors really show that they’ve done this work, and how can regulators force the industry’s hand to make sure that this is addressed, not only at a company-by-company level, but so that it becomes the norm, the standard?

This is how a well-performing authenticator or verification system should work, and we at Onfido have been doing a huge amount of work on this and, as I mentioned, collaborating with the ICO to make it happen. It’s just a good thing that as an industry we’re tackling this head on now.

Oscar: So it’s a good thing that people talk about that, because there will be an impact on the technology providers – the ones who create the technology and the products; on the regulators, because they have to make sure that the technology vendors build the right products; but also on the buyers, right? People in organisations who are going to buy these products are going to be much better informed.

Yeah, excellent. I would also like to hear about Onfido – can you briefly tell us what your service is? How do you combine all these things we have been talking about, like biometrics, and what is special about your offering?

Susana: So Onfido is, as I mentioned, an identity verification provider. Typically, when you’re trying to rent a car, or open a bank account, or, I don’t know, transfer some money to a family member who lives abroad – many, many different types of scenarios – historically you would have had to go somewhere, and at some point they would say, “Please show me your identity document, because I either need to know who you are from a regulatory point of view, or I need to know that you’re old enough, or I need to know that you have the right to drive this particular type of vehicle”.

So those are just some examples of what people would typically do. What Onfido offers allows you to do that from the comfort of your own home. So instead of going somewhere, you take a photo of your government-issued identity document – that could be a driver’s license, your passport, your residence permit, a visa, whatever.

So you take a photo of that and then you take a photo of yourself, and that’s basically it. Onfido would then use machine learning and specialist analytics to make sure that those pieces of information are genuine – that the document is real and that it belongs to you. And “belongs to you” is what the biometrics are used for: comparing the face on the document to the face in your selfie, and then making sure that this face is real, right?

That’s the anti-spoofing which we talked about – making sure that your face is not a photo of a photo, a photo of a friend, a mask, a deep fake, all of those wonderful things that fraudsters try to use to pretend to be you. And we do so with technology that is very simple to integrate, through an SDK, or you can integrate directly through our API.

Then you get all of those results via the API as well, so you can build your own flows, but also through a dashboard, so you can investigate what kind of answers we’ve given back – and they’re very detailed as well. So you can build very custom flows based on what we said: either give feedback to the user, onboard them, block them, or raise the risk level you associate with this person. Yeah. So that’s what we’ve been building.
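As a rough sketch of what such an integration could look like – the base URL, endpoint paths, field names and response shape below are hypothetical stand-ins, not Onfido’s actual API:

```python
import requests

API = "https://api.example-idv.com/v1"            # hypothetical IDV service
HEADERS = {"Authorization": "Token <api-token>"}  # placeholder credential

def verify_applicant(document_path: str, selfie_path: str) -> dict:
    """Upload a document photo and a selfie, then request the checks."""
    with open(document_path, "rb") as doc, open(selfie_path, "rb") as selfie:
        doc_id = requests.post(f"{API}/documents", headers=HEADERS,
                               files={"file": doc}).json()["id"]
        selfie_id = requests.post(f"{API}/selfies", headers=HEADERS,
                                  files={"file": selfie}).json()["id"]
    # One check covering document authenticity, face match and liveness.
    return requests.post(f"{API}/checks", headers=HEADERS, json={
        "document_id": doc_id,
        "selfie_id": selfie_id,
        "reports": ["document_authenticity", "face_match", "liveness"],
    }).json()

result = verify_applicant("passport.jpg", "selfie.jpg")
if result.get("result") == "clear":
    print("onboard the user")
else:
    print("give feedback, block, or raise the risk level")
```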

Oscar: And then the user just needs a device – like a laptop or a phone connected to the internet, with a camera – and that’s it.

Susana: That’s right. Either a phone or a laptop with a camera – or even a laptop without a camera, in which case they would receive either a QR code, so that they can use their phone to take the photo, or even a text message. Obviously we prefer it if you use your phone, because the quality of the cameras there is much, much higher and allows us to be more accurate in the types of fraud we can detect, but we work very well with webcams too.

It’s all about widening access and giving people the ability to – I don’t know. During the pandemic, they can’t go see a doctor to get their prescription. They might sign up to one of our clients, verify their identity through Onfido and then be able to order their repeat prescriptions online through an app or through a website or even sign up, verify their identity and then have a virtual consultation with a doctor.

So we are essentially allowing people to access all of these services that they would have had to access in person before, and we’re very proud to be helping in a pandemic-type situation. We’re very fortunate to be in a position to be able to help.

Oscar: Yeah, exactly. I mean, many people cannot travel right now, for instance, and will not travel for months, yet they can do this identity verification quite easily. Finally, I would like to ask you – thinking of end users, a normal person – if you could give us a tip, some practical advice for anybody to protect our digital identities.

Susana: OK. So one thing that I think about when I’m accessing services online is: how much information am I sharing, and why? One thing I notice is that when you’re filling in a form – maybe I’m ordering something on the internet – they ask so many questions of me, and I’m just mechanically, automatically going through them and answering all the questions. Where was I born and blah, blah, blah.

A lot of the time, you realise that a lot of those answers are not actually needed. So one thing I would encourage people to think about is, “Hey, can I get away with giving them less information?” Sometimes they only need your name and a password. Then they give you all these other fields to fill out that they don’t actually need, and you can click “Save” without filling all of that in. That protects your digital identity, because your digital identity is almost anything that you do online, right?

It builds a profile of what you care about, what you do and what you buy. So the less information you give, the more protected you are. The other thing you could do is very similar. When you’re logging in with Facebook, or Google, or LinkedIn – when you’re creating that link – often it says, “Hey, you’re about to give permission for this company to do so and so with your LinkedIn profile,” and often it gives you controls over what you want to share. So I would also encourage you to review those settings before you blindly click “Yes”, because many times you can get away with just giving that third-party company your name. But they want more, right?

They want your date of birth. They want your friends list, because they want to market to them. So just be conscious of what is strictly necessary, what isn’t, and what you can keep to yourself, because that company doesn’t really need to know all of those things.
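For anyone building the other side of those “log in with…” links, the same data-minimisation principle applies when requesting permissions: ask only for the scopes you need. A minimal sketch of an OAuth 2.0 authorisation request – the endpoint, client ID and redirect URI below are placeholders, not any particular provider’s values:

```python
from urllib.parse import urlencode

# Placeholders: substitute your provider's real authorisation endpoint,
# your registered client ID and your redirect URI.
AUTHORIZE_URL = "https://provider.example.com/oauth2/authorize"

params = {
    "response_type": "code",
    "client_id": "<client-id>",
    "redirect_uri": "https://app.example.com/callback",
    # Request only what the app strictly needs - a basic profile.
    # Adding scopes for, say, a date of birth or a friends list should
    # be a deliberate choice the user can see and refuse at consent time.
    "scope": "openid profile",
    "state": "<random-anti-csrf-token>",
}

print(f"{AUTHORIZE_URL}?{urlencode(params)}")
```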

Oscar: Yeah, definitely. In both cases, the less information you give to companies, the better you are protecting yourself. OK. That’s very, very nice. Thanks a lot, Susana, for sharing everything about biometrics. It was very, very interesting. So please let us know how people can get in touch with you and where to follow your work and what Onfido is doing.

Susana: The best place would be to either go to the Onfido website. That’s onfido.com and you can follow us on Twitter or if you’re more into the technology side of things, we’ve got an excellent technology blog on Medium. If you look for “Onfido Technology Blog,” there should be some really interesting content there too.

Oscar: Excellent. Well, again, thanks a lot for this conversation Susana and all the best.

Susana: All right. You too. Take care.

Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up-to-date with episodes at ubisecure.com/podcast or join us on Twitter at @ubisecure and use the hashtag #LTADI. Until next time.

[End of transcript]