Mariana Olaizola Rosenblat of the NYU Stern Center for Business and Human Rights and Kate McNeece discuss the risks surrounding foreign influence and extremism on interactive digital media platforms.
In March 2024, the Canadian government issued new guidance concerning the foreign investment review of investments in Canadian interactive digital media companies. In this episode, we dig into the reasons for the government's concerns with Mariana Olaizola Rosenblat, the author of the recent report "Gaming the System: How Extremists Exploit Gaming Sites and What Can Be Done to Counter Them": What is interactive digital media anyway? How is it susceptible to foreign influence? And what can companies – and investors – do to address these risks?
00:00
Counterfactual Podcast
Welcome to Counterfactual, the podcast brought to you by the Competition Law and Foreign Investment Review Section of the Canadian Bar Association. Counterfactual takes a fresh look at issues relevant to business, competition, and related areas of regulation, and explores the real and hypothetical worlds to gain practical insights and debate policy. We hope you enjoy the show.
00:27
Interested in learning more about competition and foreign investment issues in the tech space? Register for the CBA's 2024 Competition Law Spring Conference taking place on May 2nd in beautiful Montreal. This year's conference will explore the application of competition law to technology transactions and innovation, covering hot topics such as acquisitions of emerging tech companies, video gaming, artificial intelligence, and the anticipated explosion of competition litigation. With the impending changes to Canada's competition laws, sign up for the pre-conference festivities on May 1 before they sell out. Younger members of the bar are also invited to the young lawyers symposium immediately following the conference. For more information, a complete agenda, and to register for the conference, please visit cba.org/sections/competition-law.
01:32
Kate McNeece
Welcome to Counterfactual, the official podcast of the Canadian Bar Association’s Competition Law and Foreign Investment Review section. On today’s episode, we will be discussing national security issues in the interactive digital media space with Mariana Olaizola Rosenblat of the NYU Stern Center for Business and Human Rights. Interested in continuing the conversation or have a great idea for a podcast topic or guest? You can now contact us by email at podcastcommittee@CBA.org. Thanks as always for listening – this is Counterfactual.
02:13
Welcome to the Counterfactual podcast. I'm Kate McNeece, a partner at McCarthy Tetrault and the Chair of the Podcast Committee. I'm very pleased to be joined today by Mariana Olaizola Rosenblat, a policy advisor on technology and the law at the NYU Stern Center for Business and Human Rights and the author of the 2023 report "Gaming the System: How Extremists Exploit Gaming Sites and What Can Be Done to Counter Them." So Mariana, thank you very much for joining us today.
02:42
Mariana Olaizola Rosenblat
Um, it's my pleasure. Thank you for having me.
02:45
Kate McNeece
So the reason that we reached out to you to have this conversation on the podcast is that here in Canada there's a recent policy memo that came out under our Investment Canada Act, which talks about the interactive digital media sector and the potential national security considerations that could come up in reviews of investments into that sector in Canada. And we understand that you're an expert in the "interactive digital media" space. Can you tell us a little bit about your background and your research, and how you came to these issues?
03:15
Mariana Olaizola Rosenblat
Sure. Well, I am a lawyer by training, and I have focused on human rights in my career. In the earlier part of my career, I focused on the protection of migrants, refugees, and stateless persons. It was while working for the UN refugee agency, UNHCR, and then teaching a human rights clinic at the University of Chicago that I realized the immense impact of technology and technology companies on the realization of every single human right – from the most obvious, like the exercise of civil liberties, freedom of expression, association, data privacy, public participation and so on, to social and economic rights like health and education. So literally the whole panoply of human rights. I then decided to redirect my advocacy efforts from targeting governments to targeting businesses. Some people might not know this, but of the largest 200 institutions in the world, 157 are corporations, not governments. Yet companies are seldom the targets of human rights advocacy. So I decided to join this small but growing community of business and human rights professionals trying to influence companies directly – in my case, tech companies – so that they have a more positive impact, or less negative impact, on rights.
04:42
Kate McNeece
That's really interesting. I think as competition and foreign investment lawyers we're rarely in a position where we really talk about human rights. Unlike in some African countries, positive human rights aren't really a part of our regulatory sector. But I think that's a good introduction to why we care about these issues – why we care about interactive digital media. In Canada we often use video games as a shorthand for what we're now calling interactive digital media, which I think largely has to do with some rules we have around cultural businesses and the fact that those are the types of interactive digital media that are often caught by our cultural review regime. But it's certainly clear, especially over the last several years – the pandemic seems to have accelerated a trend that was already starting – that we're all spending so much time on our phones, in video games, in the metaverse, in all of these different types of interactive technologies. So, can we talk a little bit about the spectrum of interactive technology that your research suggests could be susceptible to human rights abuses, extremism, or foreign influence? What should we really be thinking about in terms of the breadth of that category?
04:54
Mariana Olaizola Rosenblat
Right. So, the policy statement that you shared with me defines interactive digital media quite broadly, as technology platforms that can be used for entertainment, education, training, and e-commerce. We don't know – or at least I don't know – how the Canadian government will interpret and apply that definition, but it potentially encompasses any type of online platform that hosts user-generated content. So this would include social media apps like Twitter (now X) and TikTok; social networking platforms like Facebook and Discord; messaging apps like WhatsApp; metaverse apps; and e-commerce apps. All of these are susceptible to foreign influence to a degree and in different ways. The problem of foreign influence on traditional social media is relatively well known and relatively well studied, I would say since the revelation of Russian interference in US elections and elsewhere in 2016. My research in part focuses on uncovering efforts at manipulation using other, less studied and less well-understood technologies. So let's start with video games, one obvious target of this policy statement.
07:19
Mariana Olaizola Rosenblat
Video games are the dominant entertainment sector of the twenty-first century. Few people realize that the video game industry generates considerably more revenue globally – around two hundred billion US dollars annually – than the film and music industries combined. And in terms of population, over 3 billion people – more than one-third of the world's population – play video games. I'm most familiar with the statistics in the US, but I've also read that in Canada, 23 million people – 61% of the national population – play video games. So video games are a huge and growing industry, hugely influential, especially among younger generations; in Canada, 89% of children and teens reportedly play games. So let's get to how games can be exploited.
The first thing to understand is that games are about much more than entertainment. They are really social platforms. Most games today enable simultaneous text and/or voice communication, and this is a really great way to socialize: imagine that you're playing a game and at the same time talking in real time with a stranger or a friend online. Companies realize that this kind of real-time communication enhances network effects, so more and more game companies are enabling voice chat especially. People just play more when they are in a social environment and feel like they're building a community. Gaming is a giant community – or I should say it's many communities – and bad actors like state influence operators, extremists, and even terrorists go where their target audiences are. In the case of extremists, which I looked into recently, gaming spaces are very appealing because that's where they find large numbers of highly engaged youth and children. And these are youth and children who are yearning for a community and a sense of belonging, so they're primed for the type of propaganda that these extremist groups channel through online games.
10:04
Kate McNeece
As we're talking about this influence of extremists, in your report you reference a number of different means by which they can use these communication platforms to reach audiences, influence people, or carry out bad acts, I guess. So there's maybe controlling a game itself and sharing thoughts or views or perspectives through the game content; then there's engaging in the games as a player, as a participant, and communicating with people and sharing views that way; and then you have the actual communications technology built in there to organize, or to get people to do things in the physical world – the non-metaverse, I guess. So are there any things that I'm missing here? What are the other avenues that could be used to influence people through this video game community?
10:57
Mariana Olaizola Rosenblat
Well, the avenues that are known include the ones you mentioned, and one that maybe was not included: the avenues are cross-platform and dynamic. So one theory that has recently been supported by some evidence is this idea of the radicalization funnel, where online games – these are in some cases huge arenas where people, strangers, meet – are the beginning of the funnel. Extremists will throw out provocative statements and certain controversial narratives, and those who respond positively to those narratives and theories will be drawn to engage in progressively more private settings, such as on gaming-adjacent platforms – and I can speak about those. This function that video games play, as sort of the initial arena where people come into contact and are then taken to more and more secluded places online or even offline – that's one major way that extremists operate.
12:18
Kate McNeece
That's really interesting. So you referenced in that answer these adjacent platforms – I'm not sure I'm using the right term there. Can you talk a little bit about those? I'm an old person, comparatively I think, so I don't do a lot of video gaming, and the idea of just sitting down and playing Mario Kart with your friends is what I think of as video games. We've talked about video games and how they themselves are these huge social community platforms – can you talk a little bit more about the ecosystem around the games?
12:51
Mariana Olaizola Rosenblat
Yeah, sure. So gaming-adjacent platforms are any online platforms that are popular among gamers. It's a really fluid category. Some apps, like Discord, come to mind most readily. Discord is a social networking app that was created by gamers for gamers. It has since branched out – people can create chat groups about anything, any hobby they have, political conversations, anything in the world – but it still has a very large gaming constituency, and even the site's architecture and aesthetics were inspired by gaming sites. So Discord, I would say, is the prototypical "gaming-adjacent platform".
13:45
But there are others. There's Twitch, where gamers stream themselves live and audiences just watch them playing for hours on end. Reddit is potentially a gaming-adjacent platform, but I would say Discord is the primary one; it's just so popular among gamers. And the way Discord is connected to the gaming ecosystem is basically this: some games are what one would call a persistent social space, where you can just play the game indefinitely, but a lot of games start and end in a matter of 30 minutes or an hour, and they don't really lend themselves to community building. So what gamers do is, they might enjoy playing a match with somebody, or they have some friends they like to play with, and then they establish their community on Discord, this adjacent platform, where they continue discussing the game. They might branch out into other topics, or say, okay, we're going to play on this date at this time. It allows for this whole planning and coordination, which is totally innocuous when you're talking about playing a game, but Discord has been used – not often, but on occasion now – for extremist mobilization and indoctrination, and eventually to incite people to violence and other harmful activities. So it's the community-building aspect that Discord brings to the table, and any other platform that provides similar functionality can serve as a gaming-adjacent platform in this way.
15:42
Kate McNeece
That makes sense. You mentioned as well other technologies like secure messaging. Again, I think of secure messaging as something I'm using to communicate with friends or people that I know. So are those mostly potentially problematic as offshoots or spillover effects from these larger online communities – where, as you say, you go down the funnel, you're talking during the game, then you're talking on the Discord, then you exchange phone numbers and you're texting – or are there other unique ways that a messaging platform, rather than a larger multi-person community, can be used for these types of messages?
16:16
Mariana Olaizola Rosenblat
Right. So I have been investigating encrypted chat applications like WhatsApp, Viber, and Signal, but not necessarily in their role as gaming-adjacent platforms. They may or may not be, and that's precisely the point about encrypted chat applications, messaging applications: the content in them is encrypted, which makes it impossible or near impossible for anyone to know what's happening inside them. Right now I'm investigating foreign influence in encrypted messaging apps. It's very difficult, again, because we as researchers – and the platforms themselves – don't have access to the content shared. We do have some anecdotal evidence from places like India, Brazil, and elsewhere that encrypted chat apps are key channels for the dissemination of propaganda and some extremist recruitment. But that's the very important challenge of encrypted chat apps: they may or may not be facilitating all sorts of harm, and we just don't know, because, again, the content is encrypted.
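To make the platform-blindness point concrete, here is a minimal sketch of why end-to-end encryption leaves the platform (and any researcher) with nothing but opaque bytes. It uses the Python "cryptography" package's Fernet scheme purely as a stand-in; real messengers such as Signal and WhatsApp use the far more elaborate Signal protocol, but the server-side effect is the same.

```python
# Minimal sketch of why end-to-end encryption blinds the platform.
# Fernet (symmetric encryption) stands in for the Signal protocol used
# by real messengers; server-side, the effect is the same: opaque bytes.
from cryptography.fernet import Fernet

# The key lives only on the users' devices; the server never sees it.
shared_key = Fernet.generate_key()
sender = Fernet(shared_key)

ciphertext = sender.encrypt(b"join the group call at 9")

# All the platform (or a researcher) can observe: opaque bytes + metadata.
print(ciphertext[:24], b"...")       # unreadable without the key

# Only a device holding the key can recover the content.
receiver = Fernet(shared_key)
print(receiver.decrypt(ciphertext))  # b'join the group call at 9'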
17:26
Kate McNeece
Yeah, that makes perfect sense. So we've talked a little bit about how these technologies are susceptible to foreign influence or extremism, but are there aspects of the businesses themselves, or of how these platforms are trying to monetize, that disincentivize addressing bad behavior – that lead businesses to kind of turn a blind eye to this type of behavior?
17:54
Mariana Olaizola Rosenblat
So business models differ, and the incentives or disincentives for addressing harmful activity vary accordingly. Generally, detecting abuse takes resources. There is no way around it: companies have to invest in content moderation or other types of moderation, and this applies to all interactive media where users can add content. The regulatory landscape matters a lot too, because when a business stands to suffer legal penalties for doing or not doing something, it has a strong incentive to comply. But in the absence of regulation, incentives for tech companies really cut both ways. I'll focus on gaming here, since some of the same logic applies to other sectors, but just to add some specificity.
Game companies have incentives to grow their user base as much as possible. Some games – a lot of games – are free to play, so the companies monetize in other ways. The most common ways to make money in gaming are through in-game purchases – say, advanced weapons, or so-called skins, which are cosmetic enhancements for avatars and virtual personas – and access to premium content. So companies have a strong incentive to enlarge their user base and to keep that user base.
And here's where the incentives diverge a little bit. In gaming, there's a traditional core constituency that's averse to moderation and to greater inclusivity in games. This is the prototypical gamer: white, male, teenage, kind of rebellious. That's definitely a stereotypical picture, but it's true that it forms the traditional core constituency of video games. The demographics have changed a ton: more women – at least in the US, they form something like 48% of gamers – and gaming has become more and more diverse ethnically, in terms of gender, and so forth. So companies want to keep their core demographic, but they also need to increasingly cater to a growing, diverse user base. There was a recent controversy that makes this very tangible: the Sweet Baby Inc. controversy – I don't know if you followed that one.
20:42
Kate McNeece
I don't know about that one. I remember hearing about the Gamergate controversy, which I think was on those topics – where they were, I guess, stalking female video game journalists, putting their addresses online, and harassing them – but I haven't heard of this other one.
21:00
Mariana Olaizola Rosenblat
Exactly. Well, this recent one is sort of the new iteration of Gamergate. It was smaller in scale, but it implicated a Canadian video game narrative development company, so it's maybe relevant to your Canadian audience. This video game narrative development company was accused by a group of gamers – this happened a month ago or less – of forcing "wokeism" into the gaming industry.
Basically, they theorized that this company was forcing game developers to integrate more diverse characters and make the narratives too inclusive, and that this was bad. So a group of gamers, organizing on Discord and Steam – another gaming-adjacent platform – started attacking minority developers, journalists, and gamers, some of them my friends and colleagues. This has obvious echoes of Gamergate, which, as you rightly pointed out, was the 2014 harassment campaign against women game journalists and developers who had dared to criticize misogyny in gaming. So this puts the gaming companies – returning to the business models – in a tight spot, because they have to speak out against such harassment and such use of their platforms, but they also don't want to alienate their traditional constituency, and they have to figure out that balance.
But on the whole – and recent research bears this out – abuse mitigation and inclusion are good for business. Game companies have realized this, and they have moved on from denying abuse of their platforms and the need to moderate, to now at least attempting to show that they're ready to tackle the problems. Of course, some efforts are more rhetorical than concrete, but they are moving in a positive direction.
23:05
Kate McNeece
I think that's a great segue into what we have down as our next topic, which is mitigation of these concerns. In our foreign investment system, if you have a transaction that could cause a national security issue – that could be injurious to Canada's national security – one way of mitigating that is for the government to block the transaction: just say, you might be a bad actor, you can't own this video game company or this digital platform or social media business or what have you. I think we're seeing a little bit of that in the US – we're recording on April fifteenth, so I think it's not front-page news every day anymore, but the ByteDance/TikTok legislative proposal in the United States, for lack of a better word.
So that's one way you could deal with it. The other way we could deal with it from a foreign investment perspective is for an investor to enter into an agreement with the government saying, here are the steps X, Y, and Z that we will carry out to ensure that you are comfortable this does not injure Canada's national security. So if you're an investor who thinks they might run into this issue, or if you're a business who just wants to address this, what do you think are the best concrete steps, as you say, to address these types of issues of foreign influence, extremism, and harassment that can plague these platforms?
24:21
Mariana Olaizola Rosenblat
Each technology is different, so I'm happy to go one by one, but let's start with video games. I think the most basic thing is for the company first to have clear policies prohibiting extremism and harassment and requiring transparency around foreign influence, and then a workable system for conducting thorough moderation of the space. Not just text-based moderation: as I mentioned, a lot of games have simultaneous voice chat and also user-generated imagery, so companies need to be comprehensive in their moderation, and proactive, not just reactive. There are generally two types of moderation on online platforms. One is reactive, where companies allow users to report harmful activity, and then the company reviews it and decides, okay, do they need to ban the user, take some action, remove a post? But in gaming – and this applies to metaverse applications, which are kind of an extension of gaming – everything occurs in real time, and content and behavior are ephemeral, so the moderation needs to be proactive. That's really difficult to do at scale, but more and more technologies are coming out for doing voice moderation in real time. All of this takes investment, manpower, and adequate technology, so I would say the first thing to ensure is that the company has a robust system of moderation. Alongside that, they would have to be transparent about how well the moderation is working, and not just conduct internal research.
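As a rough illustration of the reactive/proactive distinction described here, the following is a minimal sketch in Python. All names (ReportQueue, looks_violating, deliver) are hypothetical, and the keyword check stands in for the machine-learning classifiers – over text, transcribed voice chat, and imagery – that a real pipeline would use.

```python
# Sketch of the two moderation modes. Hypothetical names; a real system
# would use trained classifiers, not a keyword blocklist.
from collections import deque
from dataclasses import dataclass

@dataclass
class Message:
    user: str
    text: str

def looks_violating(text: str) -> bool:
    # Stand-in classifier. Real pipelines run ML models over text,
    # transcribed voice chat, and user-generated imagery.
    blocklist = ("extremist slogan", "recruitment link")
    return any(term in text.lower() for term in blocklist)

# Reactive moderation: nothing happens until a user files a report;
# a reviewer then works through the backlog after the fact.
class ReportQueue:
    def __init__(self) -> None:
        self.pending: deque[Message] = deque()

    def report(self, msg: Message) -> None:
        self.pending.append(msg)

    def review(self) -> list[Message]:
        actioned = [m for m in self.pending if looks_violating(m.text)]
        self.pending.clear()
        return actioned

# Proactive moderation: every message is screened before delivery,
# which is what ephemeral, real-time chat requires.
def deliver(msg: Message) -> bool:
    return not looks_violating(msg.text)  # False = blocked in real time

if __name__ == "__main__":
    msg = Message("player1", "DM me for the recruitment link")
    queue = ReportQueue()
    queue.report(msg)      # reactive: depends on someone reporting
    print(queue.review())  # actioned only when reviewed later
    print(deliver(msg))    # proactive: caught before delivery -> False
```

The design point of the sketch is the timing: the reactive path acts after harm has circulated, while the proactive path sits in the delivery loop, which is why it is so much more expensive at scale.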
But I would say a very important point is to give outside experts and academics access to the platform – maybe anonymized data, so as not to compromise the privacy of users, but some way to study what the risks are. The reason the more traditional social media is well studied is that it's easier to study, because the content is generally public. But on games and gaming-adjacent platforms, for the most part – and I'm thinking of Discord specifically, not Twitch, which is more public – the content is hard to study, so the companies really do need to allow access to researchers. Some legislation in Europe already goes in that direction, compelling companies to be transparent in pretty rigorous ways.
27:05
Kate McNeece
It would seem to me that that would be helpful on a number of axes, because not only would people like you – academics, researchers – have the ability to identify risks or externally monitor what's going on, which is something we see in a lot of human rights contexts, but doesn't it also send a signal to the market of, "We're doing this, we are acting in a responsible way"? Rather than everyone racing to the bottom by trying not to moderate, so that they attract as many people as possible, including those who are against moderation, they could demonstrate that this is market-standard, accepted behavior, and be more of a rising tide.
27:46
Mariana Olaizola Rosenblat
Exactly. A lot of the transparency that we see from social media platforms, and online platforms in general, is very superficial. They'll put out these transparency reports that show how often they've encountered X type of abuse and what they've done about it, and they're just charts and numbers that tell you very little about how effective their moderation actually is, because there's no denominator. They just say, we found a thousand cases of extremism. Okay, but what was the actual total on your platform? We don't know; we have no way of knowing. In the case of gaming platforms, I know of only one major gaming company that has actually partnered with academic researchers to study extremism, and I'm hoping that more will follow that example. But right now gaming companies are very tight-lipped and opaque, so it's quite difficult to know what is happening, the scale of what's happening, and what some potential solutions might be.
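To see why a raw removal count says so little, here is a tiny worked example with invented numbers; the point is that only a denominator – a prevalence estimate over independently sampled content – lets outsiders judge effectiveness.

```python
# Why transparency reports need a denominator. All numbers are invented.
removals_reported = 1_000        # what a typical report discloses

# Two platforms could both report 1,000 removals, yet differ wildly:
sample_size = 2_000_000          # independently audited sample of items
violations_found_a = 200         # found in platform A's sample
violations_found_b = 20_000      # found in platform B's sample

prevalence_a = violations_found_a / sample_size   # 0.01%
prevalence_b = violations_found_b / sample_size   # 1.00%

print(f"A: {prevalence_a:.2%} of content violates; B: {prevalence_b:.2%}")
# Same headline removal count, a hundredfold difference in how much
# violating content users actually encounter. Without prevalence,
# "we removed 1,000 cases" is unverifiable.
```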
28:54
Kate McNeece
Yeah. What about for some of these other platforms – something like a Discord, a gaming-adjacent platform, or even an encrypted messaging platform? What do companies there do when they may have less control over, or less access to, the content that could be causing the problem?
29:09
Mariana Olaizola Rosenblat
That's a great question, and Discord and encrypted chat apps are different, because Discord is not encrypted. They've made that choice, so the platform actually has access to all the content, and they say so, but they choose not to go into what they call private chat rooms. They're not private – they're just invite-only – so the company has set up this expectation of privacy for users, that it will not be listening in on or reading conversations on Discord. Which I guess would be fine if they hadn't set up the platform in a way that's very easy for extremists to exploit. We've seen a number of cases already where mass shooters planned their attacks on Discord – I'm thinking of the Buffalo shooting in New York; El Paso, Texas; Highland Park, near Chicago – and most recently some very high-profile leaks on Discord. There is illegal, very harmful activity happening, and Discord actually has access to those rooms. People like myself can even enter some of those chat rooms. And yet they just won't – they've decided as a matter of policy that they're not going to proactively moderate.
30:36
But if you think about terrorist mobilization or extremism in those very exclusive areas of Discord where bad actors are organizing, no one is going to report anything to the company, because everyone in the group is implicated. So there's really no way for the company to learn through user reporting what's happening; it has to do so proactively, and there are ways to do that, in my view, while preserving a reasonable degree of privacy.
When it comes to encrypted chat apps, that's a different ballgame, precisely because of the encryption. I can't say I have any solutions to the problems of extremism and terrorism on encrypted chat apps. I would just say that these applications that enable encryption – and that are really meant for, or most appropriate for, one-to-one or small-group conversations – have, because of the need to make money, made their platforms more and more like social media, adding functions and features that allow more and more people to congregate. Thousands, in some cases – Telegram groups can be as large as 200,000. And that's a context where encryption really doesn't make sense, because where in the real world can you have 200,000 people congregating without anyone knowing what's happening? So I'd say the solutions are different for each, and one has to take a nuanced approach, but the companies, because of their business incentives, are not really taking the problems seriously enough.
32:23
Kate McNeece
Yeah. So one possible solution you raised to that specific issue – businesses not feeling particularly incented to monitor or moderate the content on their platforms – is a regulatory, legislative, or policy approach: basically forcing them into, or rebalancing the incentives to incent them to undertake, some of these activities. What in your view are the best proposals for legislative or policy approaches? Are there any governments around the world – not just Canada, with the policy we discussed before – doing things legislatively to address these issues?
33:08
Mariana Olaizola Rosenblat
It may be too early to tell, but I think the effort in Europe under the Digital Services Act – the DSA – is the most promising so far. The DSA applies to all online platforms, with different obligations for very large online platforms and other platforms. But it's very comprehensive, and it mandates that these platforms be transparent in pretty concrete ways. They have to allow vetted researchers to access the platforms and the data there to conduct studies. It also mandates that the platforms conduct risk assessments, so they have to assess and map risks, including extremism and terrorist recruitment, and disinformation as well. I would say that's an approach worth following.
The enforcement is still unclear, because it's a piece of regulation that was passed relatively recently. And I won't suggest that regulating these types of platforms is easy. Especially in the US – and perhaps in Canada, I don't know the Canadian legal system well enough – platforms that traffic in user speech are very difficult to regulate, precisely because of the guarantees of free expression. You wouldn't want the government either to be listening in on or closely monitoring speech on online platforms, especially ones that are supposed to be more private. And you also wouldn't want the government to proscribe certain broad categories of speech, like extremism and hate speech. Those are more or less well defined, but I wouldn't necessarily trust the government: in some contexts, governments choose to define as extremism what is actually pro-democracy activism. I don't think that's the case in the US or Canada, but there's always that kind of aversion to giving government such power. So regulating these platforms is actually tricky, but it can be done, mostly by mandating transparency, which I think can do a lot, and maybe some other procedural requirements, like publishing risk assessments, as Europe has done with the DSA.
35:42
Kate McNeece
Is this the type of industry where you think a code of conduct might help – you know, an industry-wide code of conduct, such as it is? Are we past the point where that could be helpful? Should this be something we're thinking about for emerging technologies that maybe we don't yet see in great use – wearables, I guess, is what immediately comes to mind? Should we be trying to build in codes of conduct or ethical standards at the outset of these technologies, to mitigate these risks going forward?
36:16
Mariana Olaizola Rosenblat
Yes, if by code of conduct you mean sort of the best practices – so in games we have codes of –
Kate McNeece
Yeah.
Mariana Olaizola Rosenblat
So in the US, tech companies tend to just go forward, roll out their technologies, and hope for the best; they don't really proactively think about potential harms before they roll something out. This is very clear in the case of metaverse technologies, for example. Meta and other companies have rolled out headsets that do really invasive data tracking of bodily movements – that's sort of a separate topic – and that also pose serious safety risks. But they've decided to just go ahead and roll out the technologies without first coming up with best practices and doing thorough harm assessment and reduction. They do some, and they tend to promote those efforts rhetorically.
37:26
With any code of conduct, you would have to make sure that it's not high-level, rhetorical, and nice-sounding, but actually concrete and robust, with a way to hold different industry actors accountable. One institutional model that has some promise, I think, is the multi-stakeholder model, or cross-industry collaborations, where companies join and decide: we're going to abide by these standards – and the more specific the standards are, the better, in my view, though that seldom happens – and then there's some kind of mechanism to measure how each company is doing, and some external parties to monitor that. It's like peer-to-peer accountability. In the absence of regulation, I think that model can be helpful. But again, codes of conduct that are sufficiently specific, with a way to measure companies' performance against them – I think that's the only way to make them effective. Otherwise, companies are very eager to sign on to non-enforceable commitments and very general principles that, in the end, nobody can really say they're not upholding.
38:44
Kate McNeece
All right. Well, it sounds like there's a lot to consider and a lot of work to be done in this area, whether from the top down – government-wise, in the foreign investment context – or from the businesses themselves.
So look, I think this is a really interesting topic. Obviously, where we are in Canada – it's April fifteenth right now, and I think it [the IDM policy] came out in the first week of March – we really haven't seen this policy being applied at this point, and we likely won't be able to see its effects until well down the road. But it will be interesting to watch such a multifaceted consideration: you have privacy concerns on one side, the need for enforcement on the other, and this need for proactive moderation, which is very difficult to do when balancing those other two aspects. It'll be very interesting to see how this field continues to develop, and I certainly look forward to reading your future work as you continue to study these issues.
39:53
Mariana Olaizola Rosenblat
Thank you. There's no shortage of work in this field, which is why I'm in it, and yes, I'm also excited to see what new problems arise and what potential solutions emerge.
40:04
Kate McNeece
Yeah, well, on that hopeful note, I think we'll cut it off. Thank you so much for joining us today, and thank you to our audience for listening. Have a great day.
40:16
Mariana Olaizola Rosenblat
Thank you.
40:18
Counterfactual Podcast
Thank you for listening. Counterfactual is produced and distributed by the Competition Law and Foreign Investment Review Section of the Canadian Bar Association. The opinions expressed by the participants in this podcast are their own and do not necessarily represent those of their employers or other organizations. If you enjoyed this podcast or would like to join the Canadian Bar Association, please visit www.cba.org/sections/competition-law.