Let’s Talk About Digital Identity with Lisa LeVasseur, Executive Director at Me2B Alliance.

In episode 38, Lisa and Oscar discuss the Me2B Alliance and how it aims to make technology better for humans, plus the businesses (Bs) which are shining a light on privacy issues and giving the Mes more control.

[Scroll down for transcript]

“We used to call ourselves something like the ‘organic food label’. But that’s actually not right. We’re more like independent automobile crash testing.”

Lisa LeVasseur is Executive Director at Me2B Alliance, a non-profit organisation that is setting the standard for respectful technology. An MBA technologist with a background in Computer Science and Philosophy, Lisa began strategic work in cellular telecom industry standards in the late ‘90s while at Motorola. Since then, she has participated in 3GPP, 3GPP2, MEIF, WAP Forum, IETF, W3C, IEEE and Kantara Initiative.

Find out more about Me2B Alliance at me2ba.org. Join as a ‘Me’ or a ‘B’ at me2ba.org/membership.

We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @ubisecure!

Go to our YouTube to watch the video transcript for this episode.

Let's Talk About Digital Identity
Ubisecure

The podcast connecting identity and business. Each episode features an in-depth conversation with an identity management leader, focusing on industry hot topics and stories. Join Oscar Santolalla and his special guests as they discuss what’s current and what’s next for digital identity. Produced by Ubisecure.

 

Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Oscar Santolalla: Hello and thanks for joining today. Today we are going to discuss something pretty different: the ethical aspects of technology, the technology we are already using, much of it brought to us by big tech and by many organisations around the world. And we are going to hear a vision for how technology could treat people in a more respectful way.

For that, I have a very special guest who is Lisa LeVasseur. She is the Executive Director at Me2B Alliance, a non-profit organisation that is setting the standard for respectful technology. An MBA technologist with a background in Computer Science and Philosophy, Lisa began strategic work in cellular telecom industry standards in the late ‘90s while working at Motorola. Since then, she has participated in several other standards organisations such as 3GPP, 3GPP2, MEIF, WAP Forum, IETF, W3C, IEEE and Kantara Initiative. Hi Lisa.

Lisa LeVasseur: Morning. Or evening!

Oscar: Yes, exactly. We’re at opposite ends of the day: it’s quite early for you, and the night is falling here in Helsinki. So it’s a pleasure talking with you, Lisa. Welcome, and let’s talk about digital identity and this very interesting concept and project you are embarking on, Me2B. But first I would like to hear more about your beginnings and how things led you to the world of digital identity and this latest project of yours.

Lisa: Sure. Thanks Oscar. Thanks for having me. I’m really honoured to be here talking with you. So how I got involved in this world was back in 2009, I started working on a product that was designed to put families really in control of their information and the services that they use, whether those services were in the brick-and-mortar world or online services.

And it was through research in that project where I really became aware of – I think it was initially Doc Searls and I maybe became aware of some trust framework stuff and then I sort of unlocked the door to this whole world of people working on identity management and identity standards and realised that there was a whole world of people sort of on the leading edge of this work. That’s how I kind of stumbled in. It was probably around 2012 or so.

Oscar: At that time you were the product manager, building software, building product? That was your role at the time?

Lisa: That’s right.

Oscar: And how did that evolve to today, Me2B, which is relatively new, right?

Lisa: Yeah. Well, interestingly enough, having this sort of long experience in industry standards and being one of four people on the planet who actually like industry standards work, as far back as 2009 I actually had this idea when I started to define this product, because I had the ecosystem in mind and I was like, “Well, what we really need is a new kind of standard.” We need a standard that measures beyond just bits and bytes and protocols, into the softer aspects, like the user interface and the usability and eventually the ethics of it.

So I had this idea – and it was really kind of quaint and naïve – that all we needed to do in order to really stimulate this market for more ethical technology was build a standard. Build a standard. Build a specification and start certifying, and voilà, we would have what we need.

It has been much more difficult than that. As I mentioned, I started thinking about this in 2009 literally. I have slide decks with a precursor to this idea and then it had just been percolating, percolating, percolating and I was working on the product.

Fast forward to 2018, I went to IIW – you know, the Internet Identity Workshop – in October, and at that point I had a fully fleshed-out idea of the organisation. I pitched it there because it’s an unconference; I ran a couple of sessions and got a lot of support, and then went back to the foundation where I work and said, “So, I’ve done this thing. I’ve created this organisation. I’m going to do it no matter what. So I just wanted you to know.” Luckily they got behind it and they’ve been supporting us as kind of seed investors in the Me2B Alliance, and we started in earnest in 2019.

Oscar: OK. So the idea was in your mind, and of course you were iterating on it through the work you were doing over those years, until this unconference, the Internet Identity Workshop. You showed it, you got a lot of support, and I guess some people who had similar ideas were also able to tell you and support you. So today, how would you really define Me2B?

Lisa: Yeah. So let me kind of separate the semantics or syntax of this a little bit. Me2B is really a qualifier. It’s really a shorthand that encompasses an entire ethos, a sort of ethical point of view, and it’s not really different from other ethical points of view like VRM – you know, Doc Searls’ VRM – and MyData. We’ve listed all of the organisations that are working on better technology in some form or fashion and there are hundreds of them. Hundreds of organisations around the world, and we all sort of have the spiritual alignment that we want to make technology better for humans. But to get to the real heart of the ethos that we’ve come up with in the Me2B Alliance, one of the first things we had to do was really figure out an ethical foundation.

So in 2019 at the IIW in April, we ran a bunch of sessions there. We ran a session on sort of a new social contract for the internet. We had some really great minds in that session talking about the possible ethical frameworks. We looked at moral foundations theory. We looked at a lot of different things, and after a lot of synthesis and just sitting with all this information, where we’ve ended up is that – you can’t see me, but I’m going to pick up my phone now and say: this is not a tool. This is not a tool.

These things that we use, they’re not tools. We think they’re tools. We treat them like they’re tools. But they’re not. They’re relationships. They’re becoming more humanesque. We’re talking to them. We’re gesturing to them. They’re responsive; there’s personalised responsivity. It’s more like a two-way human relationship.

So if we hold technology to that level and say, “Ah, this is a relationship,” well, at a minimum, it should treat me like a polite stranger. At a maximum, it should be a trusted good friend. And if we go even a little bit further down this path – and we did, into human psychology, interpersonal psychology – we see that there are norms. There are actual scientific and sociocultural norms for how to create a healthy human relationship and what its attributes are.

So what we did is we took those attributes and distilled them into a list of what we call the rules of engagement, and we are quite literally measuring those attributes in connected technology. Right now we are measuring mobile apps and websites, but ultimately anything that can be connected and that has a user interface, a user experience, we can test.

Oscar: And a relationship between a person and a company or a business – is that the type of relationship?

Lisa: Yeah. Well, that’s the tricky part and that’s the tricky part of the acronym. If you go on our website, you will see some educational tools. We’re really starting to hone in on a very clear vocabulary because I will say this. We do not have language to adequately describe what’s happening to us in the digital world.

And we have had to create the language so that we can actually measure things, because you can’t measure when the words are too broad and not nuanced and specific enough. So with Me2B, you have to sort of suspend disbelief a little bit.

The Me2B relationship is actually a collection of relationships and some are with the B. So the legal side of it is with the B, the business, right? You sign a contract with the business behind the product. You have an experiential relationship with the product itself. We call that Me2P, me to product.

There are technology enablers that are necessarily along for the ride and we call those Me2T and then for all of those Me2P and Me2T relationships, there are integrations. We no longer write pure software, right? We integrate. It’s an integration activity almost more than anything else, right?

I’ve been in software since punch cards, so for me, it’s a substantial evolution of how things have changed. So in those Me2P and Me2T layers, there are also B2B relationships, which are the integrations. Those are the invisible ones – I call them strange bedfellows.

When I say a Me2B relationship, there’s really a lot. It’s like an onion with a lot of layers. It’s Me2B in the sense that the B is responsible for the behaviour of the product. It’s responsible for the legal terms of using the product. It’s responsible for those integrations in a data controller sense perhaps.

But it is that whole network. It’s a whole network of relationships. So it’s a little fuzzy, which is why we had to get deeper and more specific.
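
To make that vocabulary a little more concrete, here is a minimal sketch in TypeScript of how this network of relationships might be modelled. The type and field names are hypothetical illustrations added for this write-up, not definitions from the Me2B Alliance.

```typescript
// Illustrative only: a rough model of the relationship layers described above.
// All type and field names are hypothetical, not Me2B Alliance definitions.

type RelationshipKind =
  | "Me2B"  // legal relationship with the business behind the product
  | "Me2P"  // experiential relationship with the product itself
  | "Me2T"  // relationship with technology enablers along for the ride
  | "B2B";  // invisible integrations between the B and its partners

interface Relationship {
  kind: RelationshipKind;
  counterparty: string;   // e.g. the business, the product, an embedded SDK
  visibleToMe: boolean;   // Me2P is visible; most B2B integrations are not
  dataShared: string[];   // categories of data flowing in this layer
}

// A single "Me2B relationship" is really a bundle of these layers,
// with the B ultimately responsible for the behaviour of the whole network.
interface Me2BRelationshipBundle {
  responsibleB: string;
  layers: Relationship[];
}
```

The point of the sketch is simply that one “Me2B relationship” decomposes into visible and invisible layers, with the B accountable for all of them.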

Oscar: Right, right. Yeah. And many relationships, as you said, and of different types. OK. Tell us a bit about the organisation as well. What are you doing?

Lisa: Yeah. So because I come out of standards and because I love standards, we are a kind of standards development organisation. We are organised like a standards development organisation. We just opened up Me and B memberships. Last year we were in soft launch with membership.

One of the critical points of the ethos that I didn’t mention is the underlying belief that these healthy Me2B relationships, all of them across the layers, are better not just for Mes but also for Bs.

So that’s a critical part of our ethos, and so we feel it’s really crucial that we’re a finger on the scale on the side of Mes, because of the power asymmetry right now with surveillance capitalism – you know, that whole dynamic.

But we can’t solve this without Bs. So we really feel like this is a sort of yin and yang kind of relationship. We need both Mes and Bs. We need the users of technology, the people using technology, as well as the makers of technology.

So we’re a standards organisation that is really bringing in both, with perhaps a little more focus on the Me side. We’ve got four working groups. We’ve got one working group working on the certification criteria. We’ve got a Mes working group that is working on educational support for Mes. We’ve got a Bs working group working on educational support for Bs. And then we’ve got a policy and legal working group, which is working on educational materials and policy work, mostly in the US, because as I think you well know, we are somewhat behind in the US in terms of our policy, and this year looks to be a very big year. We have some very strong ideas about the regulation or legislation we would like to see come up. We really want to start educating people to look at it through our lens. We think our lens is really powerful. It has really held up.

We’ve started testing products and it’s really holding up. I think a big problem with this space is that a lot of us viscerally know that technology isn’t treating us right, but translating that into a practical ethic has been very, very hard. So this framework that we’ve got is holding up well. And that’s how the organisation is structured. In the US, we’re a non-profit, a 501(c)(3) organisation.

Oscar: OK. Excellent. Already very active, with several working groups. Also, to understand this vision, let’s try to see it in products that already exist. I’m not sure if you already have certified products, but let’s say products or tools that exist today that people could use. Could you mention a few examples of software or technology that somehow embrace the Me2B vision today?

Lisa: Yeah. And we haven’t officially certified or published any testing results. So this is all untested – just my own sensibilities about what tools I like to use, what relationships I like to build out.

So really it’s the people who are building at least pretty privacy-aware tools and technologies. The browser is a very important animal in the ecosystem, right? It has access to a lot of information about us, and frankly the relationship we have with a browser is one of the most intimate relationships we have, whether we know it or not.

So I think browsers are really important to scrutinise, to make sure that they’re treating you like a polite stranger or a good and trusted friend. So the two browsers – actually, there are three that I feel OK about: the Brave browser, Mozilla’s Firefox browser, and also – I don’t use Apple products, but I know Apple’s Safari is doing a lot of great things and really proactively setting privacy policies. They’re doing respectful defaults, I think, in a good way.

Then Apple in general – Apple is on the right path with their privacy nutrition label in their App Store. They’re on the right path with that. You know, so it’s the companies that are really marketing around and positioning on privacy. There are other categories too, like Digi.me and Meeco, which are ecosystem enablers, so that people can be in charge of how their data is getting shared across other apps.

There are other things like browser extensions. There are a lot of really good browser extensions. I think a lot of people are pretty aware of those, with the likes of Ghostery and the EFF’s Privacy Badger. Then Terms of Service; Didn’t Read – that’s another good extension to help us stay more aware of what’s going on.

I will say that one of the primary objectives of the certification mark itself that we’re developing, and the testing that we’re developing, is really about shining a spotlight on things. Like shining a light into the dark corners and letting people know what’s really happening.

So for a lot of these tools, there are two facets. They’re either shining a light on a dark corner, like Terms of Service; Didn’t Read shining a light on the terms of service, or they’re giving us more control over our destiny, like the Apple default settings, which are more respectful.

And I think that’s a good list. Eventually – and I do want to make this point – by creating a standard, it’s my aspiration that 5, 10 years from now, this respectful technology is the norm and not the exception.

Oscar: Exactly. Today it’s the exception, as you said. Thanks for sharing these. That helps illustrate what technology and companies can do – doing technology the right way – and which ones you can use. I’ll definitely have to try a couple of those I haven’t tried yet. What kind of software do we really need to fulfil the vision that nobody, absolutely nobody, is offering today?

Lisa: Well, the thing that I work a lot on is IEEE P7012, and that standard is machine-readable personal privacy terms. So think about signing up for a new relationship, right? The first thing that happens – we call this the “Me2B marriage” – the marriage is creating credentials. Here’s the tie-in to identity, by the way.

So the Me2B marriage is when you say, “You know what? I want to be remembered, recognised and responded to by this service, by this product.” Those are Joe Andrieu’s three functional characteristics of identity. I want to create the credentials. I want to get married, and then the marriage certificate is the terms of service, right? And those come from the vendor and they’re designed to really keep the vendor safe, right?

It’s a legal instrument that’s designed to keep the vendor safe. So in P7012, we’re defining the flipside of that, right? What if individuals could offer their own set of permissions? It’s like that’s great. That’s great that you want all that. Here’s what I’m allowing. Here’s what I’m granting.

So you can think of this almost like a reverse EULA or a reverse terms of service. But it’s a way, a mechanism, to actually assert your own preferences and permissions – legally binding preferences and permissions.

So what we need then is a tool that can actually do that – like a software agent, my Privacy Pal or whatever, my little trusted agent that has a duty of loyalty to me. That’s important. Most companies don’t have a duty of loyalty to the end user, the individual.

They have a duty of loyalty to the company. So we do need a new kind of software agent that could, for example – you know, we think about decentralised identity or bring-your-own-identity credentials – bring my own identity credentials and bring my own terms. That’s really crucial. That’s maybe the most crucial thing for really having an impact on the power asymmetry between Mes and Bs.
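
As a rough thought experiment, here is a minimal TypeScript sketch of what individually asserted, machine-readable privacy terms and a loyal software agent might look like. The data shapes and the function below are hypothetical illustrations, not anything defined by IEEE P7012 or the Me2B Alliance.

```typescript
// Illustrative only: a hypothetical shape for machine-readable personal privacy
// terms and a simple agent check. These names are not taken from the IEEE P7012
// draft; they just sketch the "reverse terms of service" idea described above.

interface MyPrivacyTerms {
  principal: string;             // the Me asserting these terms
  allowedPurposes: string[];     // e.g. ["service-delivery"]
  prohibitedPractices: string[]; // e.g. ["third-party-ad-tracking", "data-sale"]
  retentionLimitDays: number;    // how long data may be kept
  legallyBinding: boolean;       // offered as binding terms, like a reverse EULA
}

interface ServiceDataPractices {
  purposes: string[];
  practices: string[];
  retentionDays: number;
}

// A trusted agent with a duty of loyalty to the Me: before the "Me2B marriage"
// (creating credentials), it checks whether the service's declared practices
// fit within the terms the individual is offering.
function agentApproves(terms: MyPrivacyTerms, service: ServiceDataPractices): boolean {
  const purposesOk = service.purposes.every(p => terms.allowedPurposes.includes(p));
  const practicesOk = service.practices.every(p => !terms.prohibitedPractices.includes(p));
  const retentionOk = service.retentionDays <= terms.retentionLimitDays;
  return purposesOk && practicesOk && retentionOk;
}
```

In this sketch the agent simply declines the “Me2B marriage” when a service’s declared practices fall outside the terms the individual is offering; a real implementation would also need a way to exchange and record the accepted terms as a binding agreement.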

Oscar: So that would be a service, an agent – so someone would need to be running that.

Lisa: Yeah. Somebody, something, some kind of entity that can have a duty of loyalty – and not just a duty of care but a duty of loyalty. Meaning like a real estate agent or a financial agent: at least in the US, those entities usually carry a duty of loyalty. I’m not familiar with all the global regulation around that.

So we really need something like that. We really need something like that. The JLINC protocol and what they’re doing, and even Digi.me – those ecosystem types of things, and probably Solid, where you can at least be in control of your information – it’s kind of like that. So Digi.me, for example, has their own overarching developer rules, right?

So if you’re going to develop a Digi.me app in that ecosystem, you’ve got to abide by their general, human-respecting privacy terms to even build an app in that space.

So those are kind of like that. The thing is, you have to opt into an ecosystem, and I’m a fierce independent in a lot of ways. So I say, “Why should I have to join something to be treated right?” Those are great products – I want to also say that they’re great and they’re absolutely necessary. But again, looking further down the road, I would love to just always be treated right and not have to join something.

So I would like to be able to assert my own permissions wherever I go. Not just within like a specific ecosystem.

Oscar: Yeah, exactly, because there are a lot of tools that a lot of people use, and in many cases you are either in or out – take it or leave it – and mostly it’s the big companies, big tech, who decide that. So what will happen with big tech in this vision? How will we make big tech join the vision we are talking about today?

Lisa: Well, I think they will join because we will – fingers crossed – succeed, and we will shed a lot of light on this and build a lot of education and awareness with people. We have this diagram, in my mind and also on our website, where you start with a certification, which is really awareness. It’s an awareness-building tool, and people then start to become aware that, oh, these things aren’t really treating me right – but some are.

So then they start to demand. You know, they start to choose, and that hopefully drives more demand and more choice, until eventually we have a lot more choice. And I think once that awareness and that dissatisfaction become very wide scale, there will be a lack of tolerance for certain behaviours, as there should be.

In some sense, I feel like technology is hiding behind the inherent opacity of it. We don’t understand it. We mere mortals don’t understand what’s happening under the hood of technology. But once we can start to understand that, which hopefully our educational tools and certification and testing will illuminate, people will get more vocal and demand better.

Oscar: Yeah, it’s true what you said. Technology moves so fast, especially for the ones at the forefront, the ones who have the most popular services and applications, etc. So it’s difficult for everybody else to catch up and really understand what is behind these tools, this technology.

Lisa: Yeah. I want to add one more thought to this. We’re just on time with this. I think about the arc of technologies: a new technology is introduced; then we understand the potential harms; then we regulate; and then we get mature, right?

So we standardise – and the order may shift a little bit. But one of the ways we describe our work, and I should have mentioned this earlier when I was describing the Me2B Alliance: we used to call ourselves something like the ‘organic food label’. But that’s actually not right. We’re more like independent automobile crash testing. There are many of those organisations, right? And they didn’t happen until the automobile was out and manufactured and had some sense of scale, I think. I’m not entirely sure, but I think that’s how the arc of time went.

That’s very much what we’re doing: our testing is looking for the risks and potential harms to people, and we won’t be the only one. It’s too big. There are lots of different kinds of risk. We’re in whole new territory, right? Look at what’s happening with freedom of speech in the US right now, and the confusion around this and the responsibility of the technology platforms.

There are certain things we are not testing, like deepfakes. There are organisations that are becoming expert in that, and we’re hoping that there will be lots of other organisations testing other things. We’re down in the plumbing right now. If you think about the content and the potential harms of the content, that’s way up the stack for us.

So yeah, we think there will be lots of organisations doing this kind of technology crash testing.

Oscar: So, independent crash testing for technology. Yeah, I like the analogy. It’s definitely pretty good.

Lisa: It didn’t come to me until last year. I have been looking at this for years and years and, yeah, it was really hazy. It has been a refinement exercise. You know, initially we were like, “Oh, Me2B, we’re going to validate the B,” and then I was like, “No, that’s not it – I’m not interested in that. It doesn’t scale well either, so I really don’t want to do that.”

But yeah, it has been a continuous refinement and then yeah, so crash testing feels good.

Oscar: So the Me2B Alliance is doing crash testing. And how can people – both individuals and organisations – join forces with the Me2B Alliance? For those who are listening and saying, “OK, this is a good cause,” how can they join forces or contribute one way or another?

Lisa: Yeah. You can join the alliance. We’ve made it very approachable for both Mes and Bs to do so. We’re also in a soft launch – we are having a little bit of technical difficulty, so if you have some patience, you will see some things, is all I will say. But you can join through our website; www.me2ba.org/membership I believe is the full link.

You can join as a Me or a B, or you can just join us. We’re just publishing a public calendar and we’re doing a little bit of refinement on the website. Every other month we have a public call that is kind of just an update on what’s happening.

So we’re due to have our next one on the first Monday – I think that’s February 1st – at 8:00 AM Pacific Time, whatever that maps to in your location.

We also have a lot of educational materials. There’s a tutorial or two on the website. In our library, we have a lot of the tutorial presentations in the presentation field. The most recent presentation that has been recorded is the one from the W3C credentials working group. So that’s the latest and greatest.

Oscar: Well, excellent. A final question for all the business leaders who are listening to us now: what is the one actionable idea that they should write on their agendas today, especially as it’s the beginning of a new year?

Lisa: I think the deep question is to have an honest, clear-eyed look at the things that you make, the things that you build – whether it’s your website or a product – and ask with honesty, “Is it treating people respectfully?” And the sister question to that is, “Do I really know all of the integrations and all of the things happening in the software that I’m building?”

Again, whether it’s a website or a standalone product or service of some sort. We’re in early certification – we’re hoping to launch later this year. But in the testing that we’ve done so far, we’re seeing that our greatest value is that the makers of technology don’t really know, in some cases, what’s happening with the integrations and what’s happening on their websites and apps.

Oscar: Yeah, definitely a very good reflection. That’s true, because as you mentioned earlier, a long time ago you knew exactly who was making the product, the software – you knew exactly what the pieces were, because the same company would build all the pieces. But today it’s just the opposite. We are taking parcels from different open source projects or providers and assembling much of our own software. So there’s a lot of integration, APIs, et cetera, and you don’t know exactly what each company is doing.

Lisa: One of the critical things that we’re unearthing in our testing right now – and this relates to identity too, and it’s something I raised with the W3C credentials working group – is that there are two parallel universes when we use technology.

There’s the one that we, as the users of technology – the individual side, the Me side – experience viscerally. Like, oh, you’re asking me to share my location. You’re asking me to create an account. I see that. I see what I’m sharing. It’s the known, visible world.

But the more potentially harmful and risky parallel universe is the invisible world. Largely this comes through a lot of the MarTech and AdTech technology. There’s this whole invisible layer that is actually identifying you with specificity, globally. So when we think about identification, there’s the sort of identity and access management at the visible layer, and then there’s this invisible layer that is happening unbeknownst to us, and sometimes unbeknownst to the companies that have built the software. It’s that layer that really needs more understanding and more visibility.

Oscar: Oh, thanks a lot, Lisa, for this conversation. It was super interesting hearing about this Me2B project that you are leading today, doing this fabulous work, and I really hope that everything crystallises as you said, and that in five to ten years we will have the level of respectful technology that we should have today. Please tell us how people can find you or the organisation. What are the best ways to find you on the net?

Lisa: Yeah. So the website, www.me2ba.org, is the best way to get connected with us.

Oscar: Excellent. Thank you Lisa again and all the best.

Lisa: Oscar, thank you so much. It was wonderful catching up with you.

Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at ubisecure.com/podcast or join us on Twitter @ubisecure and use the hashtag #LTADI. Until next time.

[End of transcript]