Let’s Talk About Digital Identity with Katryna Dow, founder and CEO of Meeco.

Katryna talks to Oscar about her career (including inspiration from Minority Report), Meeco’s personal data & distributed ledger platform, the importance of data minimisation to inspire trust in organisations, and cultural differences in attitudes towards digital identity.

[Scroll down for transcript]

“The greatest way to overcome this privacy paradox is transparency.”

“Where regulators have moved to increase the data transparency and data rights of individuals, these need to actually be part of the solution architecture.”

Katryna Dow is the founder and CEO of Meeco, a personal data & distributed ledger platform that enables people to securely exchange data via the API-of-Me with the people and organisations they trust. Katryna has been pioneering personal data rights since 2002, when she envisioned a time when personal sovereignty, identity and contextual privacy would be as important as being connected. Now, within the context of GDPR and Open Banking, distributed ledger, cloud, AI and IoT have converged to make Meeco both possible and necessary.

Find out more about Meeco at meeco.me.

For the past three years, Katryna has been named as one of the Top 100 Identity Influencers. She is the co-author of the blockchain identity paper ‘Immutable Me’ and co-author/co-architect of Meeco’s distributed ledger solution and technical White Paper on Zero Knowledge Proofs for Access, Control, Delegation and Consent of Identity and Personal Data. Katryna speaks globally on digital rights, privacy and data innovation.

Follow Katryna on her blog at katrynadow.me, on LinkedIn and on Twitter @katrynadow.

We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @ubisecure!

Go to our YouTube channel to watch the video version of this episode.

Let's Talk About Digital Identity

The podcast connecting identity and business. Each episode features an in-depth conversation with an identity management leader, focusing on industry hot topics and stories. Join Oscar Santolalla and his special guests as they discuss what’s current and what’s next for digital identity. Produced by Ubisecure.

 

Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Oscar Santolalla: Hi and thanks for joining today. Today, we’re going to have a very interesting conversation about how many technologies and business ideas converge into products that help people directly to protect their data and their identity. For that we have a very special guest. Our guest today is Katryna Dow.

She is the founder and CEO of Meeco, a personal data and distributed ledger platform that enables people to securely exchange data via the API-of-Me with the people and organisations they trust. Katryna has been pioneering personal data rights since 2002, when she envisioned a time when personal sovereignty, identity and contextual privacy would be as important as being connected. Now within the context of GDPR and Open Banking, Distributed Ledger, Cloud, Artificial Intelligence and the Internet of Things have converged to make Meeco both possible and necessary. For the past three years, Katryna has been named as one of the Top 100 Identity Influencers.

Hello, Katryna.

Katryna Dow: Hello, Oscar. That introduction makes me feel I’m going backwards and forwards in time at the same time.

Oscar: Very nice talking with you now, Katryna. It’s super interesting having this conversation with you. I know there are so many things we can talk about. So, I would like to hear from you – what was your journey into this world of digital identity?

Katryna: So, I don’t know where to start, because I’m not sure it’s something where I ever consciously woke up one day and went, “Oh, you know, I really want to work in the identity space.” And I think that may be true for a lot of people, maybe even people you’ve interviewed previously. It actually unfolds out of something that is either driven by something you’re trying to do in society, or related to commerce, or related to access to services.

And then all of a sudden you have this question of who or what are you? Are you supposed to be here? Are you allowed to have access to this place or this thing? And now you have access, what are you allowed to do? And kind of that idea of us coming up against some kind of perimeter where we need to prove or state who we are and then have access. It happens to us every single day in every way whether or not it’s jumping on a bus or paying something or opening a door or joining a call with a password. This idea of this entrance is something that is a theme for us in the physical world, in the digital world. And therefore I think, one of the things is when we say identity it really just means so many things. And it has so many applications to so many aspects of our life. And I think rarely does it mean the same thing in every circumstance or certainly the same thing to everybody.

Oscar: But were your origins in legal, in tech? How did you get started?

Katryna: Yeah. So, I think where it started for me, actually, is that I’m a big sci-fi fan, and I guess the film that really changed the trajectory of my career and led to me starting Meeco was Minority Report. I don’t know if you remember that film, or if listeners remember it. It came out around 2002, I think. And in fact, Jaron Lanier, who went on to write a number of books specifically around privacy and personal data, was the technical consultant on that film.

And it featured lots of fantastic, super futuristic technologies. And the interesting thing, if you listen to Jaron Lanier speak now, is that for the film he invented all of these cool things where everyone’s data was being aggregated and everyone could be tracked and traced. His aim was actually to scare people about what that future would look like, but the technology looked so cool that I think, from that time on, people started to implement it.

And for me, I remember watching that film and there are a couple of key scenes where you have Tom Cruise running through a mall with all of these digital messages being streamed into his eyes, all these advertisers, based on his digital footprints. And I just remember being in the cinema just staring at the screen and thinking, “You know, is this really the future?” And in some ways, it may look like a marketer’s dream. But how do we navigate that? How do we navigate that in decision making? How do our brains cope with that? How do our senses cope with that? How does our sense of self cope with that?

And I remember coming out of the cinema and thinking, “I really enjoyed the film. But it freaked me out and someone should do something about this.” And it never dawned on me that it would be something that sort of changed the direction of my life in years to come. But I could say that there was absolutely that moment, walking out of the cinema, standing on a street corner, looking up to the sky and thinking, “Is this the future we’re heading towards?” And then it really started to bother me. There were many, many aspects of that film around the idea of access to our information, our digital identity, our physical identity, our freedom of movement. There were just so many parts of it that started to bubble up for me as problems to solve or things to understand, which eventually led to me starting Meeco almost a decade later.

Oscar: Super interesting. So you are here, among other reasons, because of Minority Report. I would like to hear more – you started talking about the problems that have appeared over these years, and now you have solutions for those, and they are behind your company, Meeco. So I’d like to jump into that. Tell us about Meeco. What are you doing? What are the use cases you are working with?

Katryna: Yeah. So, one of the things that we say is that Meeco is simply the API-of-Me. And the reason we say that is that pretty much everything we do today creates data – even the fact that you and I are having this digital phone call, and you’re recording it, means we’re co-creating data and information. So everything we do creates data, and we’re creating this digital twin or digital replica of ourselves every minute of the day. And I guess the idea first came from the perspective that, at the time I started the company, my background was in financial services, working in strategy. I was noticing that a lot of the products and services that we were creating every day were not really helping the people that we were trying to help. They weren’t helping customers.

And a big part of that was because we were building things that we thought people wanted or needed without necessarily understanding what was happening in people’s lives. And it seemed to me that data or understanding what was going on for somebody was a really important part of it. So that was one aspect.

The other aspect was obviously this thing with Minority Report that had been driving me crazy. Another component of it was the idea that we’d already seen significant business value in organisations being able to integrate digitally. So the idea that you could have a financial institution talking to an insurance company or talking to a government department through a technical integration obviously led to really great outcomes.

And so, for me, I was thinking, OK, if we’re becoming more and more digital ourselves, wouldn’t it make sense that we could integrate directly? Hence the idea of the API-of-Me: if we could bring our digital identity, or the relevant parts of it, and our data together into an API, we could then plug that into an organisation and exchange it directly with the people or organisations we trust. And why? Because that would give us a greater chance of having some context and control over the kind of information we share. But it would also mean that we could start to connect into more meaningful relationships where, by using that information, hopefully we could make better decisions and have better outcomes.

So the idea initially was around this question of what becomes possible if you have access to data and information. And in early 2013, I wrote a manifesto around the idea that up until now, governments, organisations and social networks have had the ability to bring data together. What if you and I had that same power? So that was the early concept.

And at that time, to sort of validate whether or not the world was going that way, Facebook was about to do its IPO. So I think their IPO was around May 2012. And of course, there was massive growth in the organisation and so it was one of the things that I wanted to look at in terms of validating this hypothesis. So, if we were able to have access to our data, if we have secure tools and a method of exchanging or sharing that, if we wanted to use data to make better personal decisions, better family decisions, better society decisions, what was happening in the data space?

And the first thing I did was look at Facebook’s IPO documents and I jumped straight to the risk area to look at the risks that they were signalling to the market that may impact either the reputation of the company or their share price in the future. And there were four things that really jumped out at me as worthy problems to solve and the first one was privacy. This idea that individual users of the platform at some point might be concerned about their privacy.

The second thing was that people might actually come to understand the value of their data, and there was a risk that the business model Facebook was built on might be something that individuals would come to see as not in their best interest – which led to another risk around the possibility of regulation. So regulation that would protect data rights or protect your privacy.

And then the fourth really important thing at that time was that Facebook was moving more and more from desktop to mobile, and so many aspects of their business model were becoming more opaque – you know, under the bonnet. A lot of what was going on around data collection and data use in advertising was not so visible on a small screen. And so for me, I was looking at those four things and I thought, “Well, what would be the opposite of that?” The opposite would be a platform that was privacy by design, security by design.

It would be a platform that didn’t read, mine or sell personal data. It would be a platform that took into account the importance of regulation and protections. And it was a platform where the focus of the data use was for the specific benefit of the data controller. And it was each of those factors that became the inspiration for saying, OK, why don’t we start to build this as a technology platform?

Oscar: Yes. And these four risks that Facebook identified at the time of the IPO – you analysed them. So it means that now Meeco is a product that somehow addresses these four risks. From the point of view of Facebook, they are risks; from the point of view of the people, they are what would protect us. So now you have a product – Meeco as a product could enable an individual to take control of their identity and data.

Katryna: So we started with that, Oscar. We started with the idea – well, in fact, if I go back to 2012, 2013, I think, like any start-up story, you get one thing right and five things wrong. So the early days were absolutely pioneering. We were trying to understand how we could realise the vision of those four things. And we certainly developed a product – in fact, in early 2014 we developed a platform that could do all of those things I just described.

However, what we realised was that we were still really early from a market point of view. And one of the things that was missing was helping organisations, helping government, helping society understand that shift in power towards individuals being part of the value chain, being able to collaborate directly, but also use cases that focused on mutual outcomes.

So, if I fast forward to where we are now, it’s not that any of our technology has necessarily changed but I wouldn’t describe Meeco as a technology now. I think what I would describe Meeco more accurately is a really important set of tools or a toolkit. And so, what we’ve done over the last couple of years and really importantly more so through COVID is realise that yes, you can build a product and you can try to bring a product like that to market and focus on either a single use case or set of use cases, or you could take a step back and think what if you took the components of that product and turned those into tools so that you could actually help solve these problems in lots of products. And really that’s what our focus is.

So we have a suite of APIs. We focus on the collection and protection of data, so encryption; on the exchange, so fine-grained consent – the ability for you and me to connect and for me to decide what I want to share and for how long; and a whole range of modular solutions that are available for a bank to use to build a more privacy-enhancing experience for their customers.
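
To make the idea of fine-grained consent a little more concrete, here is a minimal sketch of the kind of structure such an exchange could be built around. It is purely illustrative – the type and field names are hypothetical and are not taken from Meeco’s actual APIs.

```typescript
// Hypothetical illustration of a fine-grained consent grant.
// None of these names come from Meeco's API; they only sketch the idea
// that a share is scoped to specific attributes, a purpose and a duration.

interface ConsentGrant {
  grantor: string;      // the individual sharing the data
  recipient: string;    // the organisation receiving it
  attributes: string[]; // exactly which data items are shared
  purpose: string;      // why the recipient may use them
  expiresAt: Date;      // the grant lapses automatically
  revoked: boolean;     // the grantor can withdraw at any time
}

// A grant is only usable while it is unexpired, unrevoked and the
// requested attribute was explicitly included.
function mayAccess(grant: ConsentGrant, attribute: string, now = new Date()): boolean {
  return !grant.revoked && now < grant.expiresAt && grant.attributes.includes(attribute);
}

// Example: share a verified address with an energy provider for 30 days.
const grant: ConsentGrant = {
  grantor: "user-123",
  recipient: "energy-provider-abc",
  attributes: ["address.street", "address.postcode"],
  purpose: "connect electricity service",
  expiresAt: new Date(Date.now() + 30 * 24 * 60 * 60 * 1000),
  revoked: false,
};

console.log(mayAccess(grant, "address.postcode")); // true
console.log(mayAccess(grant, "date_of_birth"));    // false – never granted
```

The point of the sketch is simply that a share is scoped to named attributes, a stated purpose and an expiry, and can be revoked by the person who granted it.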

So a good example of that is KBC Bank here in Belgium, which has our technology inside three of its retail bank brands, specifically to allow their customers to have a digital vault where all of the data is directly under the customer’s control – the bank can’t read that data and we can’t read that data. And it’s up to the customer to decide how they want to use it. So that’s an example of how our privacy by design, security by design APIs can be used to develop something specifically for an end customer.

And another example, also here in Belgium, is a really exciting product that we will co-launch in a few months called Mix It, which is a media platform specifically for children. And as we often say, the features of Mix It are all the things it doesn’t do – so no tracking, and parental controls that make sure a child only has access to content that has been approved by a guardian or parent. And again, it’s all our back-end technology: our APIs, our encryption, our privacy by design, security by design capability.

But the whole idea of what you can do with that is really what our focus is now. So for us, it’s here are these foundational tools that give people access and control and then how might you start to build applications that can be more inclusive, more privacy enhancing and ultimately create better outcomes for the service provider and definitely for the individual customer.

Oscar: Sure. I understand more now. So nowadays Meeco has, as you said, a toolkit, a set of tools that address different problems. For instance, who is the customer of Meeco, and who is the one who is going to use the API, implement the API or integrate it? It’s an organisation, correct?

Katryna: Well, yes, we often say that we play this interesting position of kind of Switzerland, this neutral territory. Because if we go back to our original manifesto which still guides every decision we make, our focus is on empowering individuals – there’s no question about that. But we recognise, and this was part of our learning process, that you can actually make a much bigger difference if you’re able to take that capability and that power in those tools to where customers are already, where citizens are, where students are, where patients are, rather than expecting people to dramatically change their behaviour.

And I think that’s one of the big challenges, you know with technology in general but certainly around privacy and helping everyday people to understand the benefits of being able to have greater control or transparency over their information. So in that regard, we’re very, very clear that our end-customer or the person we serve is always the individual that has those digital rights, always.

However, we recognise one of the ways to do that is to enable applications to be built that make people data happy, or regulators data happy, or families data happy, or cities data happy. And what we mean by that is take all of those four foundational ideas that we had right at the beginning of building the company and our technology and make sure that they’re available to be embedded in existing applications but to give very different outcomes. So those different outcomes are greater transparency. Those outcomes are around privacy, security. Those outcomes are around ensuring that individuals recognise, before they say yes to something, whether or not that is going to result in a better outcome, or whether or not it enables them to do something with information in terms of decision making or access that they haven’t been able to do previously.

So, our focus is always on how do we serve the customer, the patient, the student, the citizen and therefore how do we make these tools available into the various technologies that people are using every single day by adding some additional choices for them around that control and transparency.

Oscar: And now, focusing on the identity part of the set of tools that Meeco has – how is the identity created? Could you tell us about that? What is created?

Katryna: Sure. So there are a number of things that we do from an identity point of view. The most independent set of capabilities supports self-sovereign identity. We follow the W3C standards with regard to verifiable credentials and SSI – so the ability for us to generate a wallet for verifiable credentials that you can use to prove your identity. We also provide a key store to help people bootstrap and get themselves out of trouble if they were to lose that identity in some way, by being able to regenerate keys and take some of the complexity out of SSI. And then the other thing that we do is marry that together with secure data storage, or a secure enclave, where data can be saved and controlled and then also integrated into other solutions. So, in terms of identity, that’s one aspect.
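
For readers less familiar with the verifiable credentials Katryna mentions, the sketch below shows roughly what a credential looks like. The overall shape follows the published W3C Verifiable Credentials data model; the issuer, subject identifiers and claim values are invented for illustration.

```typescript
// Rough shape of a W3C Verifiable Credential (simplified; proof details omitted).
// The structure follows the W3C VC Data Model; the identifiers and claim
// values below are made up for illustration only.

interface VerifiableCredential {
  "@context": string[];
  type: string[];
  issuer: string;              // DID of the issuing party
  issuanceDate: string;        // ISO 8601 timestamp
  credentialSubject: {
    id: string;                // DID of the holder
    [claim: string]: unknown;  // the attested attributes
  };
  proof?: {
    type: string;              // e.g. a signature suite
    created: string;
    verificationMethod: string; // key used to sign
    proofValue: string;         // the signature itself
  };
}

// Example: an issuer attests that a holder has passed an age check,
// so the holder can later prove "over 18" without revealing a birth date.
const ageCredential: VerifiableCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "AgeOver18Credential"],
  issuer: "did:example:bank-issuer",
  issuanceDate: "2020-09-01T00:00:00Z",
  credentialSubject: {
    id: "did:example:holder-123",
    ageOver18: true,
  },
};
```

A wallet holds credentials like this one, and the holder presents them – or proofs derived from them – to prove something about themselves without handing over the underlying documents.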

The other part of identity is to be able to embed our applications and tools inside existing infrastructure. So for instance, into your mobile banking applications so that you would log on to the bank exactly as you do now. You would rely on all the bank’s security architecture, except that you would find yourself transferring to a secure space inside a secure space that was completely controlled by you in terms of your data and information. So, sometimes it’s a case of identity standing alone and independent. And other times it’s about layering that identity into secure spaces that give you the ability to work within an existing environment.

So for instance, within the context of your banking application or in an ecosystem – for instance, within the context of the relationship that you may have, through your mobile banking identity, with other service providers – along with the ability for you to control the information that you want to share, or to have access to data from other parts of your life and bring that together with the information that is available to you from that service provider. And I guess the best, simplest example of that is obviously what’s happening with Open Banking.

Oscar: OK, interesting. So you said the way you are addressing this is by thinking of the individual, but you are operating with the organisation mostly. And what about the balance between the privacy needs that people have and the data needs of the organisation?

Katryna: That’s an excellent question, because one of the things that we know is that organisations are always looking for more data as a means of being able to personalise outcomes, so there’s often a strong argument for why they would like more data. But we also recognise – and I was actually reading a report just this week, published by the Australian government; it’s a post-COVID study, and by post-COVID I mean it’s very recent, not something that was done before March but within the last month or so – looking at trust, specifically trust and privacy.

And one of the really interesting statistics that jumped out at me – and this is in the Australian context – is that up to 81% of Australians in the research felt that when organisations ask for data that’s not directly related to the service they are providing, they consider it misuse. So I think this is the paradox that we have here. We have organisations that want access to more data in order to understand the customer better, or the patient better, or the student better, or a city better – to plan, you know, mobility or access to services or public transport. So you see that there is this interest in having the data, but at the same time, without that sense of transparency or a clear value proposition, it’s leading to greater mistrust.

And so, this is one of the ways that we believe very strongly our technology addresses this privacy paradox – by bringing the individual directly into that value chain and making it clear what information you want and why, and how that is going to create some beneficial outcome. And the beneficial outcome doesn’t always have to be for the individual.

A perfect example is how you may want to share data for city planning, so that public transport runs more effectively. Or, how you may want to participate in a clinical trial to make sure that something like a vaccine is more effective. Or how you may want to take intergenerational information that’s been important for you because you have small children, and you may have had an illness with one of your parents and you recognise that there is a great benefit in actually understanding the health and well-being of the family from an intergenerational point of view.

So, one of the things we always talk about is equity and value. And we’re not talking about money. We’re not talking about grabbing your data and selling it, because in some ways that’s a bit of a race to the bottom. The ability to help people make better decisions, or to contribute to decisions in their family or their society, is actually just one of the ways that people can collaborate and use their data.

So, the important thing is that transparency. And if we go back to that statistic of 81% of people mistrusting and being concerned, we actually find it flips over to greater than 86% when organisations are very transparent about why they are asking for something, how they’re going to use it, and what the value proposition or the benefit is for you to participate.

And so, the greatest way to overcome this privacy paradox is actually transparency for organisations to say, “In order for us to serve you, these are the things we would need to know and this is why and this is how we will use this information to help.” Which leads to another really important factor which is data minimisation. Because when you then are able to enable the customer to directly collaborate with you then you only need to collect what you need for a specific period of time for a specific outcome.

And so, not only does that reduce the burden for the organisation in terms of data collection, risk, fraud, storing unnecessary data and the compliance obligations that come with unnecessary data. It also reinforces the trust, because now what you’re asking for is what you need, and you’re fulfilling an outcome which, if it is satisfactory, leads to the opportunity to build on that. So by taking this perspective and allowing the individual to actually be part of that collaboration, it does lead to better outcomes for both parties, and it also helps an organisation manage its compliance responsibilities and the risks associated with over-collecting data.
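
As a small illustration of what data minimisation can look like in code, the sketch below keeps only the fields that a declared purpose actually requires and records a retention deadline. The purpose catalogue and field names are hypothetical – it is just one way to express the principle Katryna describes.

```typescript
// Hypothetical sketch of purpose-based data minimisation: keep only the
// fields a declared purpose needs, and record when they must be deleted.

const requiredFieldsByPurpose: Record<string, string[]> = {
  "newsletter-signup": ["email"],
  "identity-verification": ["fullName", "dateOfBirth", "documentNumber"],
};

interface MinimisedRecord {
  purpose: string;
  data: Record<string, string>;
  deleteAfter: Date; // retention deadline agreed with the user
}

function minimise(
  submitted: Record<string, string>,
  purpose: string,
  retentionDays: number
): MinimisedRecord {
  const allowed = requiredFieldsByPurpose[purpose] ?? [];
  const data: Record<string, string> = {};
  for (const field of allowed) {
    if (field in submitted) data[field] = submitted[field];
  }
  return {
    purpose,
    data,
    deleteAfter: new Date(Date.now() + retentionDays * 24 * 60 * 60 * 1000),
  };
}

// A newsletter signup keeps the email address only, even if the form
// collected more, and schedules its deletion.
const record = minimise(
  { email: "reader@example.com", dateOfBirth: "1980-01-01", gender: "F" },
  "newsletter-signup",
  365
);
console.log(record.data); // { email: "reader@example.com" }
```

Anything the purpose does not require is simply never stored, which is what reduces the compliance and breach exposure discussed above.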

Oscar: Yes, as you said, transparency is very important, so people feel that they are revealing some of their data but for a good purpose. And yeah, that also made me remember that sometimes, on both commercial and public sites, you are asked for a long list of data – and if it feels like too much, it’s already tedious. And at this point, with people getting aware of these breaches and companies misusing data, it already feels like, “Oh, it’s too much.” So data minimisation is super important, as you said. But also, I was thinking, when organisations tell you, OK, we are going to use this data of yours for these reasons and that is for, let’s say, the public benefit – how does this have to be communicated? Just in a form, or should there be some easier way for people to…

Katryna: Yeah, I think it needs to be embedded in a really beautiful and simple digital experience, and I think that’s the key thing. And look, lots of our research and studies have validated this, as have many of the use cases we’ve built together with our partners that are deploying our technology. They show time and time again that when you build that transparency into a digital experience, and you make it clear what you need to collect and why and for what purpose, and you make that understandable and really easy for an end user to process, it becomes much more likely that the individual will be able to make an informed decision and engage.

So, part of the idea of making that simple and clear is that it leads to people being able to make good decisions and timely decisions. The more complexity there is, the more likely – well, you know what it feels like if you don’t understand something, or something seems like just a wall of terms and conditions, or makes you feel uncomfortable or afraid: the easiest thing to do is to say no. And so the focus is always on that simplicity and making it really clear what the decision is about.

And also, if you marry that together with the idea of data minimisation, it comes back to your comment – yeah, there are so many times that you apply for a service and there are so many questions asked that don’t seem relevant at all. And I think we’ve got so used to answering those questions or filling out those forms without stepping back and going, “Hold on, I’m subscribing right now to a media platform – does it really matter whether I’m a man or a woman? Does it really matter what decade I was born in, or what date I was born?”

And you understand that some of those things may be really important if you’re focusing on KYC, you know where you have to know the customer, you’re providing a regulated service and you need to be sure that it’s really you. But there are many, many services where we’ve got used to giving away really intimate, important information about ourselves that doesn’t really make too much of a difference with the service and what we’re trying to achieve.

And I think that’s a big part of helping organisations to recognise that if you then focus on the things you really need for personalising the service in a way that has a good outcome, then you’re much more likely to have people participate in sharing that information in that context.

Oscar: Yeah, yeah, definitely – as you said, very simple, very engaging and well-designed user experiences. Now, turning a bit of attention to differences in geography. So, you’re coming from Australia and now you are living in Brussels. Here in Finland, we’re in the European Union, so things are a bit different, I guess. You might know the differences better. What are the cultural differences around identity and data privacy – how do people think differently, act differently?

Katryna: Look, I think one of the things that’s very different – like a big difference between Australia and Belgium, or Australia and the EU – is that we don’t have the idea of a national identity card. I’ve been going through the process for the last year and only about a week ago received the final part of my visa, which resulted in an identity card. So I think that’s one of the biggest differences, and that seems to be just normal in a European context. You have an identity card, you can use it at the border, you can use it to access government services. There’s information coded on that card which is completely personally identifiable – your name, your date of birth, the rights that you have within the context of the country in terms of living and working.

So I think one of the big differences is this idea of national identity. And of course, we have that in Australia because we have a passport of course, or driver’s license which is issued state by state. Or we have what we call a Medicare card which is the access to a medical system. So, we have those identifiers and of course those could be federated that make it very easy from a government perspective to know who you are in that context. But I think that’s one of the biggest differences.

I think the other difference is that there is more bureaucracy and process that I notice in comparison to maybe how things are done in Australia but in some ways the integration of that bureaucracy and process also enables you to do more. So, on the one hand, again if I come back to the idea of the identity card, it then connects you with a lot of services where you’re able to be recognised immediately and that eligibility is recognised immediately. Whereas in the context of where I come from in Australia, some of those things are more siloed and so you have a specific identity that works for one part of an essential service but doesn’t work in another.

And for me, I’m not sure, Oscar – I’m not sure whether I think one is better than the other. I can see positives and negatives for both systems. And I think for me it also comes back to this idea of governance, separation of concerns and data minimisation. You can have a great federated system if you manage the governance and the access management well. Or you can have a highly siloed system that is not very well managed, and despite it not being connected to other things, you have somebody’s identity very much at risk even though it’s only in one siloed database. So I think for me that’s more about the security architecture than necessarily the way the system works.

Oscar: OK. Well, interesting. And going now into thinking about business opportunities – for people who are listening to this and work in, let’s say, organisations, and who want to follow the principles that you have been talking about, keeping privacy first and focusing on the individual – what are the opportunities that organisations have today?

Katryna: I think the interesting thing is – and there are always three categories when it comes to thinking of any emerging technology. So there is the group of people that are interested in things because of the innovation, right? So, what’s happening? What’s available? What’s possible? How do we keep our organisation or our cohort at the edge of what’s happening from an innovation point of view?

Then you have the middle, where things may become mainstream, and one of the things that often creates that mainstream adoption is regulation, or the way things are done. Meaning – a really good example, for instance, is Estonia and their digital identity scheme, and the way that many of the services that you want, including your tax return or doing anything with the government, go through that integrated system. And so the adoption is directly related to the motivation for somebody to get a rebate on their tax, OK? So, if you need to do that, then you adopt.

And then you often have the last group, which is interested in adopting technologies because there is no other choice – meaning that they face a fine from a regulatory point of view, or maybe they have been fined, or they’ve had a data breach, or they’ve had a privacy problem, or there’s been a loss of trust. So I think, for us, what we found over the last couple of years is that more and more we were working with that first group.

So, a really good example with KBC here in Belgium is that for the last five years they have been voted the most trusted brand in Belgium – so not just the most trusted bank brand, but the most trusted brand. So it makes sense that they would focus on doing something that would give their customers a greater sense of that trust, privacy and control. And last year they were also voted the best digital bank. So that’s a really good example of an organisation that has a strategy around trust and therefore makes decisions for its customers that are in line with that.

However, what we see increasingly now are things like Open Banking, GDPR, in Australia the Consumer Data Right, and in California the CCPA. So what we’re starting to see is that where regulators have moved to increase the data transparency and the data rights of individuals, these need to actually be part of the solution architecture. And that’s really where Meeco comes in: finding this lovely balance between innovating for the customer and providing better outcomes, but also recognising the obligations that you have from a compliance point of view – not only to your customers to protect them, not only to your organisation to do the right thing, but also because you’re working within a regulated market where it’s critical that you are operating within whatever those guidelines are, you know, data guidelines, financial guidelines.

So what we increasingly find now is that organisations are looking for these kinds of tools, and they’re looking at how they can be embedded in existing applications to give greater transparency to customers, but also to create better outcomes for customers and actually innovate with some amazing new capabilities that weren’t possible without that foundation of trust and control.

Oscar: Yes, definitely. Very good examples – as you said, that one bank in Belgium is the most trusted brand. Wow, that’s a really great example. Katryna, the last question for you: could you leave us with a tip, a piece of practical advice that anybody can use to protect our digital identities?

Katryna: OK. So, this may be a slightly different way of coming at this answer. But I always like to think of my digital identity as if it were a part of my physical self – you know, an extra finger, an extra toe. And if you start to think of your digital identity in the same way that you might think of your physical self, and somebody actually wanted you to, you know, put your finger on a reader or have access to your physical self in some way, there is a moment where we always stop and think, yeah? If somebody wants to touch you physically, you think about it, yeah?

Oscar: Yes.

Katryna: You have a response. And I think what’s happened is that all too often we’ve become so acclimatised that somebody asks for something digitally and we just give it. We don’t stop and have that little pause where we think, “OK, why? Why am I doing this? What is it needed for?” And I think a good practice – a practice that I do myself all the time – is that if I’m asked for something digital, I stop for a moment and think: if somebody were asking this of me in a physical sense, would the answer still be yes or no? And it may well be that the answer is yes, but you are completing that process in a very, I guess, conscious way. You’re clear about what you’re doing and why you’re doing it.

And if the answer is no, then it just gives you an extra 30 seconds to consider whether or not it’s the right thing to be doing digitally. You know, is this the right website to be putting my credit card into? Or is this the right service for me to be providing my date of birth and my nationality to? Whatever it is – and it may well be yes – it’s a good habit to cultivate: just stop, think about it in a physical sense and then work out whether or not you want to complete the transaction.

Oscar: Oh yes. So you do it every day.

Katryna: Every day. Every day.

Oscar: Wow.

Katryna: I think there’s a part of my brain that goes crazy – whenever I go to do anything digitally, there’s another part of my brain going, “Oh, here she goes again.”

Oscar: Well, it’s a good one. It’s a good one. Yeah, I hadn’t heard that example before – it’s definitely a good practice. I will definitely try it. Thanks for that. Well, thanks a lot, Katryna. It’s been fascinating talking with you. Please let us know how people can learn more about you, the work you are doing at Meeco, how to get in touch, etc.

Katryna: Yes, Oscar. It’s very simple. You can visit our website, which is Meeco, M-E-E-C-O.me, and there you’ll find some information for developers. So if there are things we’ve talked about today that inspire you, and you want to build a product or a service, and you care about your customers, you care about their privacy, you care about their digital rights, then we would love to talk to you. We’d love to see how we can help.

You may be an end customer who is interested to see the types of applications that are already using our technology, which include, as I said, products in the financial space and in the media space for children. So you may just be interested in choosing products that give you greater control and choice. Or you may be an organisation that is starting to look at its overall digital strategy, where the things you care about are getting this balance between transparency and happy customers and also doing things correctly from a regulatory point of view – then it’s a way of bringing all those things together.

Oscar: OK. Perfect. Again, Katryna, it was a pleasure talking with you and all the best!

Katryna: OK. Thank you so much, Oscar.

Oscar: Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at ubisecure.com/podcast or join us on Twitter @ubisecure and use the #LTADI. Until next time.

[End of transcript]