Misinformation Effect on Innovation with Doowan Lee
“The quintessential definition of innovation is essentially: How do you solve large-scale problems cheaper, faster, and better than anybody else?” Doowan Lee, Strategic Advisor to the Institute for Security and Technology
From today’s episode you’ll learn:
Say it first, say it loud, say it often, and say it with others. This is key advice Doowan Lee shares with psychological operations professionals, and it’s applicable to any innovator trying to gain traction in our digital age. “Short and distort” is a dangerous but effective strategy in digital communication. To combat it, Doowan, Strategic Advisor to the Institute for Security and Technology, must balance careful deliberation with answering misinformation quickly and succinctly. Timing is of the essence: while Doowan acknowledges the importance of storytelling, framing, and motivation within innovation, none of them land without it. Even amid a plethora of disinformation and sprawling media coverage, Doowan Lee believes that truth and innovation will prevail in open societies. Listen in and share with us your own questions and advice on how you and your innovation teams are tackling disinformation in this fast-paced era.
Interested in the articles and information discussed in this episode? Check them out: Freedom House, “Democracy Under Lockdown.”
Doowan Lee is the Strategic Advisor to the Institute for Security and Technology. He specializes in disinformation mitigation, balancing deliberation and truth-seeking with the fast-paced world of information sharing made possible by the latest technology. Doowan is driven to bring scale and parity to influence intelligence to support organic discourse, national security, and democracy.
TRANSCRIPT
This episode, misinformation effect on innovation is powered by data storytelling training from Untold Content and Data+Science. Transform your data into powerful visual stories by learning best practices in data visualization and technical storytelling. Whether you’re a PowerBI or a Tableau person—or just want to better communicate your data—this workshop will inspire you to see the stories that lie in the data. Learn more.
Katie Trauth Taylor: [00:00:04] Welcome to Untold Stories of Innovation, where we amplify untold stories of insight, impact and innovation. Powered by Untold Content. I’m your host, Katie Trauth Taylor.
Katie Trauth Taylor: [00:00:19] Our guest today is Doowan Lee. He is the senior director of research and strategy at Zignal Labs, a leading digital media analytics and influence risk solutions company powering open society and national security. Doowan is an influence intelligence expert. He is also an expert when it comes to disinformation analysis and national security. Doowan, I’m so grateful to have you on the podcast to talk more about these topics.
Doowan Lee: [00:00:44] Thank you for having me, Katie. I’m very excited. I think, you know, I always tell people that a good story is only as good as how far it travels. So thank you for giving me this opportunity to share my personal story.
“What led you to the work that you do today?”
Katie Trauth Taylor: [00:01:00] Yes. Could you please start there because you have such an incredible personal journey. I’d love to hear what led you to the work that you do today.
Doowan Lee: [00:01:09] Yeah, thank you. You know, I was born in South Korea and I came to the United States to go to graduate school, but that does require a little unpacking. So when I was in Korea, it was quite a few years ago. And back then, South Korea was not as democratic or as open as most people tend to perceive it right now. So back then, I was a very committed pro-democracy activist. And when I graduated from college, I wanted to essentially pursue the same passion, only now to better understand how the dynamics of social mobilization work. And that’s what led me to come to the States to go to graduate school. I went to the University of Chicago. My training focused on two areas. One, how external state actors essentially exploit organic political and social movements in pursuit of their foreign policy objectives. So some call it proxy warfare, but it’s not just warfare. I think it is very pervasive, whether we are experiencing an intense conflict or even during peacetime. The other side of my training was quantitative modeling and computer simulation. So typically there is not a lot of reliable data on what we call proxy warfare. So essentially, I use a lot of what we call agent-based modeling to understand how these dynamics propagate not only in the physical environment, but also in the information environment. And I ended up getting my first teaching job at the Naval Postgraduate School, and I was very lucky to get the job. And because of my training, I ended up developing and executing federally-funded quantitative [unclear wording] projects on behalf of the US government. And now I work in Silicon Valley mostly doing the same thing. But I must say that every day I am excited to go to work because I work with really talented people, you know, dedicated people. And I feel like, you know, given this rise of disinformation in 2020, especially with the pandemic, my personal circle of wonderfully unforeseen accidents is coming to a close, so to speak, and it’s becoming a really rewarding, self-sustaining feedback loop. And I think that serendipity kind of brought me to you as well, to Untold Stories [of Innovation]. So I’m very excited to be here.
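For readers unfamiliar with the agent-based modeling Doowan mentions, here is a minimal, hypothetical sketch of how such a model of information spread can look: agents on a random contact network pass along an item with some probability and eventually lose interest. The network size, probabilities, and three-state design are illustrative assumptions for this sketch, not parameters from Doowan's actual research.

```python
# A minimal, hypothetical agent-based sketch of how an item of misinformation
# might propagate through a contact network. All parameters are illustrative.
import random

random.seed(7)

N = 500            # number of agents
AVG_DEGREE = 8     # average contacts per agent
P_SHARE = 0.12     # chance an exposed agent starts sharing the item
P_FORGET = 0.25    # chance a sharing agent loses interest each step

# Build a simple random contact network (Erdos-Renyi style).
neighbors = {i: set() for i in range(N)}
p_edge = AVG_DEGREE / (N - 1)
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < p_edge:
            neighbors[i].add(j)
            neighbors[j].add(i)

# States: 'unaware', 'sharing', 'moved_on' (an SIR-like information model).
state = {i: "unaware" for i in range(N)}
for seed in random.sample(range(N), 5):   # a handful of initial spreaders
    state[seed] = "sharing"

for step in range(30):
    next_state = dict(state)
    for i in range(N):
        if state[i] == "sharing":
            # Expose neighbors; each may start sharing in turn.
            for j in neighbors[i]:
                if state[j] == "unaware" and random.random() < P_SHARE:
                    next_state[j] = "sharing"
            # Sharers eventually lose interest, but the exposure persists.
            if random.random() < P_FORGET:
                next_state[i] = "moved_on"
    state = next_state
    reached = sum(1 for s in state.values() if s != "unaware")
    print(f"step {step:2d}: {reached} of {N} agents reached")
```

With these illustrative parameters, even a modest per-contact sharing probability on a well-connected network reaches most of the population within a handful of steps, which is the kind of propagation dynamic such models are used to explore.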
“Social media platforms promise to bring us closer together. But in many ways they can really divide us, too. These divisions are exacerbated by misinformation and disinformation as content is rapidly shared.”
Katie Trauth Taylor: [00:04:04] It’s wonderful to hear that professional journey and that you’re now in a place where collaborations are thriving. And I think, based on our earlier conversations leading up to this podcast recording, it has a lot to do with being united in a mission to really combat disinformation and to ensure that democratic ideals are still upheld as we pursue a future with AI and other innovations. And one of the things that you and I had chatted about before this call is that we live in this age of rapid and globalized content dissemination. You know, content-driven innovations like social media platforms. They promise to bring us closer together. But in many ways, of course, as we all know at a personal level, they can really divide us, too. And the division is not really as simple as maybe a disagreement, right, between neighbors standing across from each other on their stoops or their lawns, but rather these divisions are exacerbated by misinformation and disinformation as content is rapidly shared. Sometimes we share it without necessarily being able to track or understand the origins. And it’s very difficult for the everyday citizen to know how they’re contributing, or not, to some of these problems. Can you share more with us for those who maybe are a little overwhelmed by this topic? What is misinformation? What is disinformation, and what are the risks that they pose?
Doowan Lee: [00:05:30] Yeah, I think that’s a very, like, well-framed question, Katie. Let me start with some of the definitions. So misinformation is essentially intentionally false, no, sorry, misleading information, but it doesn’t really have a palpable intent, right? So, to me… So there are a lot of people who subscribe to conspiracy theories online, and that has become even more pervasive this year because essentially a lot more people [unclear wording] and then essentially they spend more time in front of their computer, right? But it is, you know, decentralized and uncoordinated. I mean, I don’t think a lot of our citizens intend to do harm to our national unity. So to me, the main differentiator between disinformation and misinformation is that intent, right? Are you trying to do harm to perhaps our political process or public health? So there are a lot of unwitting super-spreaders, so to speak. But to me, I think that it’s an important notion to take into account because every time we had an innovative communication platform, you know, think about, you know, the printing press, right?
Katie Trauth Taylor: [00:06:53] Sure.
Doowan Lee: [00:06:53] Think about radio or think about, you know, film and everything. You know, whenever we have these innovative moments, political exploitation is almost inevitable, right? So if you think about the 1930s and 1940s, you know, we’ve seen this rapid weaponization of mass media in the form of radio, film, etc. So this has been with us for a long time. I think you’re right. What is really different this time is not so much the trend of exploiting mass media or mass communication, but the pace. Right. So to me, like, one thing that I always want to share with my friends and policymakers is that: don’t try to counter all disinformation and misinformation. Essentially, we need a good taxonomy of disinformation along agency, intent, and coordination. So that sounds a little bit academic, which is a nicer way of saying not very clear, right? So think about the 2016 presidential election, where our intelligence community came out with a fairly strong consensus that the Kremlin was meddling in our political process. And to me, that is different from perhaps people just talking about how Bill Gates created the Coronavirus, right?
Katie Trauth Taylor: [00:08:30] Sure.
“I think we can essentially just focus on the most damaging disinformation campaigns as opposed to trying to clean the entire information environment.”
Doowan Lee: [00:08:31] Because they are very different along the three variables: agency, intent, and coordination. So let me start with the Kremlin example. Clearly, there was a very identifiable state actor doing this to essentially disrupt our political unity, the intent. Right. And it’s not just the Kremlin. The Kremlin relied on this organization called the Internet Research Agency, the IRA. And the IRA has a great many proxies in the information environment, essentially amplifying both the far left and the far right of our political spectrum. So to me, it’s really understanding the scale of this threat coming from disinformation, right? And if you understand who’s doing this, right, what the strategic intent behind this disinformation is, and how well coordinated this campaign is, I think we can essentially just focus on the most damaging disinformation campaigns as opposed to trying to clean the entire information environment. Now, trying to clean everything is a very dangerous proposition from my perspective, because we shouldn’t throw out the baby with the bathwater. Right.
Katie Trauth Taylor: [00:09:52] Yes!
Doowan Lee: [00:09:52] And just because our organic discourse is severely compromised by disinformation doesn’t mean that we can essentially enact policy mandates that may compromise or undermine freedom of speech, right? So I think there’s a very delicate balance to strike. And to me, it really boils down to agency, intent, and coordination.
Katie Trauth Taylor: [00:10:19] That’s such an excellent point. And I really appreciate the two examples to bring that to life, because, as you were saying, the Bill Gates example and some of the misinformation or disinformation out there about how he had sort of created the Coronavirus, that was far less coordinated, right? Was it sort of dispersed pockets of conversation?
Doowan Lee: [00:10:43] Yeah. So to me, you know, yes. I mean, it was not like it was inconsequential. It did have some consequences, because what we’ve seen this year is this really interesting, you know, convergence of what I call useful idiots. Excuse my language for a second. But the idea here is that people who subscribe to different conspiracy theories are always looking for what I call, like, authentication or validation. Right. So people who didn’t believe in public health mandates or requirements during the Corona pandemic, right. So when they saw this news, you know, they saw that as a sign of validation not to wear a mask when they’re in public and so on, so forth. It did have public health consequences. But to me, it was not as great as what we experienced in 2016 and what we are most likely to experience in three weeks. So that is one thing that keeps me wide awake at night these days: essentially thinking about how to mitigate this collusion of authoritarian regimes, the agency, working together to undermine our democracy.
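Doowan's agency, intent, and coordination taxonomy can also be read as a triage rubric: score a campaign on the three variables and spend effort on the highest-scoring ones first. The sketch below is a hypothetical illustration of that idea; the scoring scales and example entries are made up for this example, not an operational model.

```python
# Hypothetical sketch of the agency / intent / coordination taxonomy as a triage aid.
# Scales and example entries are illustrative only.
from dataclasses import dataclass

@dataclass
class Campaign:
    name: str
    agency: int        # 0 = no identifiable actor ... 3 = attributable state actor
    intent: int        # 0 = no discernible harmful intent ... 3 = targets elections or public health
    coordination: int  # 0 = organic, uncoordinated chatter ... 3 = synchronized bots, trolls, proxies

    def priority(self) -> int:
        # Simple additive score; the point is the ranking, not the exact arithmetic.
        return self.agency + self.intent + self.coordination

campaigns = [
    Campaign("State-backed election interference network", agency=3, intent=3, coordination=3),
    Campaign("Organic 'Bill Gates created the virus' chatter", agency=0, intent=1, coordination=0),
]

# Spend mitigation effort on the highest-scoring campaigns first,
# rather than trying to clean the entire information environment.
for c in sorted(campaigns, key=lambda c: c.priority(), reverse=True):
    print(f"priority {c.priority()}: {c.name}")
```

The ranking, not the scores themselves, is the point: effort goes to campaigns that combine an identifiable actor, harmful intent, and coordinated amplification.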
Katie Trauth Taylor: [00:12:11] You know, that is one of the questions that’s been at the front of my mind actually is what keeps you awake at night? You know so much about the research. You’re right at the front of the field when it comes to how we understand disinformation within social networks or mass media and otherwise. So would you mind sharing some of the things that actually keep you awake at night as you study and lead in this space?
Doowan Lee: [00:12:36] Yeah, so, I mean, that’s going to get me all pumped up a little bit.
Katie Trauth Taylor: [00:12:44] Sorry.
Doowan Lee: [00:12:45] No, no, no. It’s all good, Katie. I’ll tell you what. You know, I really like working and collaborating with people who are a little bit angry but who know how to channel their anger. Right. Because to me it really signals they care. They care about something and they see something is not working. And their frustration leads to some, like, you know, anger, right? And perhaps that’s what keeps me awake at night, because I think we’re not just responding to a threat, but responding as a nation. And this is much broader than our sort of parochial national interest. We have to remember that we built this global liberal order, and essentially it was the mirror image of our ideals. So what keeps me awake at night is this global backsliding of democracy. And Freedom House just came out with a really good report. I encourage everyone to look it up. It’s called Democracy Under Lockdown. It’s essentially about how authoritarian regimes are passing new laws in the name of public health to systemize and propagate what we call surveillance technology. So this is also what gets me a little bit worried about contact tracing. And the idea here is that, you know, how do we preserve privacy when we’re trying to promote data-driven solutions to large-scale public problems? Right. And right now, I think we are doing a better job at problematizing and surfacing this debate. So we need to think about data and privacy. We need to think about, you know, essentially, information-sharing technology, as well as freedom of speech. But that’s just us, right? On the other side of this equation, the Chinese Communist Party and the Kremlin have been really pushing what we call surveillance technology, enabled by data analytics as well as artificial intelligence. And it’s not just China and Russia. They are also exporting these technologies overseas. And so, you know, I’m pretty sure a lot of our listeners are very familiar with Huawei and 5G technology. And the idea here is that there will be competing governance styles around 5G technology and surveillance technology. And it is not just about shielding ourselves from this kind of weaponization of technology. Think about, you know, perhaps half the global population who will be subject to these kinds of state-driven surveillance programs, right? A lot of at-risk communities will be further threatened. A lot of minority rights will be further compromised and undermined. And yeah, I think that’s what keeps me wide awake at night, because it’s not going to just stay on the other side. Think about the McCarthy era, right? Think about, you know, during the Cold War… I mean, during World War II, the Japanese-American internment camps, right?
Katie Trauth Taylor: [00:16:25] Yes, yeah.
Doowan Lee: [00:16:26] When the other side, you know, essentially undermines our democratic ideals, it does impact us in some way. And to me, like, you know, it’s about preventing that, you know, overcorrection on our side. Right. You know, because I have kids, and I don’t want them to experience any of those, you know, terrible historical mistakes made in the past. So, yeah, that’s what keeps me wide awake at night.
“Storytelling, as we see when it comes to disinformation and the spread of misinformation, can actually become a weapon.”
Katie Trauth Taylor: [00:16:56] Thank you. Thank you for going there with me. I think it’s critical for innovation leaders, even those who aren’t directly grappling with these issues… Actually, I think we could make the argument that for really every innovation team, if they’re not grappling with these issues, that doesn’t mean the issues don’t exist or that they don’t have relevance to their innovative work. And I know that on this podcast in particular, we try to focus in on storytelling and what role it plays. And oftentimes as we do that, storytelling can sort of get championed by myself, you know, and our team at Untold in particular, as this very positive thing. But actually, as we see when it comes to disinformation and the spread of misinformation, storytelling can become a weapon, especially for rapidly spreading extremism or hate. And so I’d love to hear more of your research on that. And then, of course, we can talk about the implications that all of this has for innovators as well.
Doowan Lee: [00:18:02] Yeah. So let me start with essentially how perhaps foreign extremist actors exploit disinformation, you know? So I want to unpack this question into two sections. I’m still a recovering academic, so I may come across somewhat boring sometimes.
Katie Trauth Taylor: [00:18:26] I’m right with you there. I love it.
Doowan Lee: [00:18:30] The first part is: we need to be a little bit careful about how we are obsessing with disinformation, especially this year. This is not a new thing. If you actually look up the origin of this term, you know, Stalin coined the term “dezinformatsiya.” Essentially, it was a very powerful element of what the Soviet Union used to call “active measures.” So the whole notion of disinformation goes back, you know, quite a ways, right? And we have to understand that this is just one element of what foreign policy experts may call “sharp power,” and that is: how do you use information to perhaps coerce, you know, or influence, you know, overseas communities or political communities and so on? So to me, it’s really about affecting somebody else’s political decisions. So I think that is one thing that we need to understand, and that is, it’s not just disinformation. Disinformation usually is a component of a much broader information operations campaign.
Katie Trauth Taylor: [00:19:53] Yes, yes.
Doowan Lee: [00:19:53] So that is one thing I do really want to highlight, because, again, going back to agency, intent, and coordination, if we do not see those overarching characteristics of disinformation, we’re going to fail to see which ones are more threatening to us and which are not, right? So that’s one thing in terms of how perhaps, like, foreign or extremist actors exploit disinformation. I always like to, you know, highlight sort of this notion I coined quite a few years ago, which was “WMD.” Typically it stands for “weapons of mass destruction.” But right now, I think disinformation is truly becoming a weapon of mass destruction, right? So what do I mean by WMD?
Katie Trauth Taylor: [00:20:44] And if… I hate to interrupt you, but…
Doowan Lee: [00:20:46] Of course, of course.
Katie Trauth Taylor: [00:20:46] …you actually studied weapons of mass destruction and you also were part of some of the teams who were able to track down the terrorists involved in the attacks on 9/11, right? You have this interest.
Doowan Lee: [00:21:01] I mean, I wouldn’t go as far back as 9/11. I’m not that old, Katie. I’ll take some credit for the Islamic State. So, like, you know, I worked very closely with the U.S. government in terms of, you know, illuminating and, like, mapping essentially this global network of foreign fighters that were traveling in and out of Syria between ’14 and perhaps ’17. So that is something that I’m very proud of. And in fact, when I was working with a certain command in the U.S. government, I deployed communication devices in order to help our special operators interface more effectively with their local partners. So this is something that I feel comfortable saying, you know, I know a little bit about. I’m just trying to be a little bit cautious because, you know, this is essentially just one program in a big campaign. Right. And so WMD, traditionally, right, is something that I’ve cared about. But over the years, I’ve come to realize that, you know, information operations are becoming, really, the primary weapon of mass destruction right now, right?
Katie Trauth Taylor: [00:22:31] Right. And I just wanted to share that background of yours because you don’t say that lightly. You understand the gravity when you say that you have a significant amount of professional expertise and knowledge of what that really means to compare those things.
Doowan Lee: [00:22:45] I don’t know. I mean…
Katie Trauth Taylor: [00:22:47] You’re quite humble.
Doowan Lee: [00:22:47] You’re being overly generous, overly generous.
Katie Trauth Taylor: [00:22:51] To match your humility, as always.
Doowan Lee: [00:22:54] No, no no. You know there are true experts out there.
Katie Trauth Taylor: [00:22:57] Sure.
Doowan Lee: [00:22:57] I would not, like, count myself as one of them. I’m just a quantitatively-driven analyst, at best. I’m just being, like, being real, Katie. But there is a very important analogy to make. And that is: you know, I did support some of what we call “CWMD” projects, you know, “counter weapons of mass destruction” programs. And, of course, you know, nukes are the most conspicuous weapon of mass destruction. And the reason that it’s so dangerous is that it leaves a lot of residual radiation, right? And once it’s out, the half-life is like hundreds, if not thousands of years, right?
Katie Trauth Taylor: [00:23:44] Yes.
Doowan Lee: [00:23:45] So once it is out it’s really hard to contain and it, you know, affects a lot more people than the immediate, like, you know, radius of detonation, right?
Katie Trauth Taylor: [00:23:59] Yes.
Doowan Lee: [00:23:59] The reason I’m making this analogy is that disinformation is very much like that. Its half-life is so persistent, right? Just like, you know, you cannot get rid of radiation from a nuclear detonation very soon, right? Disinformation is very much like that. It stays radioactive [for] so long. Right. It remains there to keep us divided, right, and subverted for many years, if not decades, after that disinformation is detonated in our organic discourse space. So to me, it’s a reasonable analogy, right?
Katie Trauth Taylor: [00:24:43] Yes. OK, so I’ll finally ask you to break down this metaphor. So WMD, what do these parts stand for?
Doowan Lee: [00:24:50] So, like, you know, this is something I coined when I was working with a component of our DoD [Department of Defense]. And it stands for, essentially, the “word,” the message, right? That’s the “W.” “M,” the “messenger,” right, how the message is spread. And the “D” stands for the “deed.” So let me unpack, you know, what I mean by that. And typically, this is how disinformation spreads as well. I intentionally said earlier that a good story is only as good as how far it travels, because precisely that’s one of the first principles of information operations. So especially for, like, you know, violent extremist organizations, the word will have, you know, three components. First, you know, self-victimization. Self-victimization is a powerful, powerful mechanism, because essentially you’re trying to show that it is their right to fight back. Right. Whether it’s real or not is inconsequential. But all powerful stories will typically start with self-victimization, right? Think about Luke Skywalker, right? The rise of the evil Galactic Empire, right? And everyone is bowing to the emperor, right? So I think that is a really important notion. However, there is a secondary notion that I also highlight about the word, and that is the dehumanization of others. So, you know, I don’t have to give you historical examples, because I think most people are very familiar with this notion, right? And they go hand in hand, right? The more, you know, self-victimized you are, the more you can dehumanize the others, right? And I think this is what’s happening in our organic discourse at this point as well. If you think about the extent of our current political polarization, you will see victimization and dehumanization taking place on both sides of this ideological spectrum. And to me, the last one is essentially the sense of honor and duty. Right. It’s always pervasive, like, you know: to restore order, or to restore the collective or public life, or to restore peace, or to restore our national pride. So typically the story will have these three elements. And for the “M,” the messenger, I think this is what’s really different this time, right? Because, you know, even if you have a great story, perhaps you may have the best story, right? If you don’t have a lot of people sharing the story, essentially you’ll be drowned out, right? So outpacing the other side is key when it comes to storytelling or information operations. Essentially: how do you accelerate better than others? I think this is where information communication technology is playing a disproportionate role right now, because there are so many available automated mechanisms. Right. So colloquially, we call them trolls, bots, sock puppets. Right. Or even hacked accounts. Right. Why do they matter? Because essentially they let you spread your word faster than the other side. Right. And to me, this is the real threat of current disinformation, the reason being we are trying to stay with the truth, right? I think that is the soft underbelly that the Chinese Communist Party and the Kremlin have found to exploit. And the last part is the deed, right? Because, you know, you can talk all you want, but unless you can show that you’re actually walking the walk, it’s hard to be very persuasive, especially to those who are either sitting on the fence or on the other side of the fence, right?
And I think this is what the Islamic State has done so well, and that is, if you look at their propaganda products, you know, they may stage spectacular attacks or operations. Right. But to me, that is not the main effect they’re looking for. Why do they always have camcorders, or why do they always videotape what they do? Because in terms of operational security, that would make no sense, right? But that’s the content they can use for recruitment. Right?
Katie Trauth Taylor: [00:29:50] Right.
Doowan Lee: [00:29:50] To solicit donations, right? To essentially… to amplify their message where they don’t have a lot of control, right? So to me, WMD, yes, disinformation is becoming this massive weapon of mass destruction. But also, you know, there is one difference between nukes and disinformation. And that is, you know, I’ve worked very closely on the North Korean nuclear weapons program. So, you know, one thing that is very obvious is: can they deliver the warhead? Right. So the delivery system is almost as important as the warhead. Right.
Katie Trauth Taylor: [00:30:32] Right.
Doowan Lee: [00:30:32] So a lot of times they’ll spend more money and more resources building this delivery system, right, whether that’s a submarine or a ballistic missile system. Right. That is very engineering-intensive, very costly. But disinformation is cheap to deliver to another country, to another environment. Right. So it is perhaps what I call the most cost-effective remote weapons system. Right. So we send drones, right, to go after bad actors in overseas environments. Right. That is not easy, right?
Katie Trauth Taylor: [00:31:16] Mmhm.
Doowan Lee: [00:31:17] But sending a conspiracy narrative over the Internet to, you know, affect foreign audiences or foreign political communities, right? You know, that is so cost-effective and so fast, right? So even sending a nuke to another country takes hours, no, not hours, minutes, right?
Katie Trauth Taylor: [00:31:39] Yeah.
Doowan Lee: [00:31:39] Sending a tweet is instantaneous, right?
Katie Trauth Taylor: [00:31:42] Yeah. Yeah.
Doowan Lee: [00:31:43] So it’s not just an effective weapon of mass destruction, but it is also perhaps the most effective, what I would call, remote warfare weapon system.
Katie Trauth Taylor: [00:31:54] You know, can you dive into this a little more and share some of the tactics that are used by foreign governments to spread disinformation? You’ve talked about social media. Yeah, I’d just be curious what other tactics?
“Say it first, say it loud, say it often, and say it with others.”
Doowan Lee: [00:32:07] Yes. So I’ll try to make it a little bit colloquial, because I always want, like, you know, my audience to have something to latch onto, something to remember. Right. So when I used to train and educate psychological operations professionals, I used to tell them to remember four things: Say it first, say it loud, say it often, and say it with others. So let me unpack what I mean by these four things. So, “say it first” means that you really need communication specialists, or, like, you know, essentially experts who can produce their own interpretations of events faster than anybody else. First-mover advantage in the information environment is almost everything. You say it first, right? And then essentially you’re putting the other side on the defensive, right?
Katie Trauth Taylor: [00:33:13] Mmhm.
Doowan Lee: [00:33:13] And they’re busy catching up as opposed to projecting their own narratives or interpretations. Right. So one thing that the Chinese Communist Party and the Kremlin do really well is having this army of communications specialists and they are active and agile. Something happens, they come out with a statement like this *snaps,* right?
Katie Trauth Taylor: [00:33:38] Mmhm.
Doowan Lee: [00:33:38] And we’re busy fact-checking whether their statement is correct or not. And the thing is, it’s very shallow. Right. But they are fast, agile. It’s both quantity and quality, because I think, as you know, revolutionary, like, you know, party-states, they understand the importance of propaganda. So, you know, for us, like, public affairs officers are almost always secondary to kinetic operations. Right. It’s the other way around, especially with some of the authoritarian regimes that we are competing against. So that is: say it first. Right. Which will require retooling how we understand information operations quite a bit. We are still very analytic and very reactive. Right. As opposed to trying to get into the fray before anybody else, right?
Katie Trauth Taylor: [00:34:34] Mmhm.
Doowan Lee: [00:34:34] We’re getting better. We’re getting better. Our public affairs officers are getting much better than before. And a lot of our government agencies, and also, like, even public sector organizations, are very active on social media. So let me move to the second part, which is “say it loud,” right? Again, the notion here is this volume that can essentially drown out competing narratives or competing interpretations of the same event, right? And this is where I think especially the CCP and the Kremlin employ a large number of automated dissemination mechanisms, like I said earlier: bots, trolls, sock puppets, hacked accounts, etc. Right. And then, “say it often.” Like, you know, if you actually calculate the frequency of their statements and how their statements are being amplified by these automated accounts, it is much higher than on the other side, the democratic side of this equation.
Katie Trauth Taylor: [00:35:44] Mmhm.
Doowan Lee: [00:35:44] So volume does matter. Frequency does matter. Right. And the last part is “say it with others.” And that is, again, I really want to re-emphasize this notion of third-party validation, right? If you say it yourself all the time, that is just your opinion. But if others come to your aid, right, and amplify your narrative, right, that essentially gives you this perception of representativeness. Right. And I think if you look at the CCP’s, like, public statements or their, like, information operations, they employ a lot of their partners, like third-party countries, organizations. They have a lot of journalists, scholars, or thought leaders on their payroll. Right. Not because they love these people, but because they understand the importance of third-party validation. So I think these are some of the tactics they’re using very well. You know, I am a perpetual optimist. I think we’re getting better. We’re getting better. I think, you know, I do think that innovation is on our side. I think an open society promotes innovation much better than a closed society does. So they had to learn it the hard way, experimenting with a lot of different tactics, but I do think the truly innovative side of this art is still on our side. We just haven’t tapped into it yet.
Katie Trauth Taylor: [00:37:25] So let’s go there and let’s talk about innovators, innovation teams, innovation leaders, and what they should be doing in terms of institutional priorities around disinformation, and what threats or risks are posed to their companies. I think we’ve touched on the role, of course, that the public sector plays in this in terms of national security. But what about industry, and what threats exist when it comes to disinformation? You actually shared with me a wonderful research study on this, and we’ll link that in the show notes here, as well as the other references that we’ve touched on. But this research study actually shows there’s a significant amount of revenue loss and risk and threat to the corporate world when it comes to disinformation.
“I want to highlight that disinformation is also a dual-use threat. So it’s not just dual-use technology.”
Doowan Lee: [00:38:16] Yeah, I think, you know, one thing that we have to really appreciate is that, you know, I love talking about dual-use technology, because to me, like, you know, information communication technology is a very powerful dual-use technology, right? Additive manufacturing is a dual-use technology. But I also want to highlight that, you know, disinformation is also a dual-use threat. So it’s not just dual-use technology. We also have to think about dual-use threats. Right. So disinformation, or, you know, disinformation nested within perhaps a broader information operations campaign, is not exclusive to the public sector or national security. So there are two, I mean, important notions to take into account. Number one, you know, how strong we are externally is just a function of our economic and societal vitality. Full stop. In fact, national defense is very expensive. It is the most expensive item in our national budget. Bar none, right? So maintaining our economic vitality is perhaps the most important element of our national security. In that regard, how disinformation undermines our economic integrity is something that we’re not paying enough attention to. So I’ll give you a few examples. Even just for 2019, you know, the estimated damage to our economy from disinformation amounts to about seventy-nine billion dollars.
Katie Trauth Taylor: [00:40:15] Wow.
Doowan Lee: [00:40:15] That’s a huge chunk of our financial resources, right?
Katie Trauth Taylor: [00:40:18] Mmhm.
Doowan Lee: [00:40:18] And, you know, we called it “short and distort,” right? So there are a lot of stock market manipulators out there, too. So imagine, right, you know, imagine that you have a competitor in your industry, Katie, right? And tomorrow somebody is spreading this rumor that you caught the Covid virus, right? And, essentially, you’ll be compromised for the next few weeks. Right.
Katie Trauth Taylor: [00:40:48] OK.
“What would be the intent of that disinformation campaign?”
Doowan Lee: [00:40:49] What would be the intent of that disinformation campaign? So perhaps they will reach out to the same sort of partners or clients that you’re working with. And then say, “hey, Katie is compromised, so you should come work with us.”
Katie Trauth Taylor: [00:41:06] Sure.
Doowan Lee: [00:41:06] So this is, in fact, very pervasive. I cannot give you specific names, but several CEOs in our country have faced these kinds of disinformation campaigns this year alone.
Katie Trauth Taylor: [00:41:19] Wow.
Doowan Lee: [00:41:19] Essentially. And immediately that company’s stock would dip, right?
Katie Trauth Taylor: [00:41:25] Mm hmm.
Doowan Lee: [00:41:26] And then you have to recover and you have to come out with your corrective statement and everything. It’s a huge resource-suck to that affected company, right?
Katie Trauth Taylor: [00:41:36] Absolutely. Yes. Yes. You think of the stock market just having so much of its, you know, vulnerability sort of comes from the fact that it’s driven by perceptions…
Doowan Lee: [00:41:48] Exactly!
Katie Trauth Taylor: [00:41:48] …To a big extent.
Doowan Lee: [00:41:49] And I’ll give another example. So there was like, you know, a big industry conference in Las Vegas. I’m not going to say which company, because it’s still pretty sensitive.
Katie Trauth Taylor: [00:42:07] Sure.
Doowan Lee: [00:42:07] But I can say… what’s the electric car company? Tesla! So Tesla had a demo car at this show, and a Russian robotics company released this staged video where the Tesla vehicle ran into this, you know, essentially automated, like, you know, gadget, trying to show that, you know, essentially Tesla’s autopilot system was not working properly. It was a PR stunt just to get their name out, right?
Katie Trauth Taylor: [00:42:50] Mm hmm.
Doowan Lee: [00:42:50] It worked really, really well. However, on the flip side, for the next 24 hours, Tesla’s stock price dropped by two percent.
Katie Trauth Taylor: [00:43:00] Wow. So once again, “say it loud. Say it first.”
Doowan Lee: [00:43:04] They just released this staged video. Right. And then essentially Tesla, you know, lost a lot of value in the stock market, at least for the next twenty-four hours, right?
Katie Trauth Taylor: [00:43:16] Right, right.
Doowan Lee: [00:43:17] For, like, you know, “short and distort” traders? That’s like an eternity.
Katie Trauth Taylor: [00:43:23] Mm hmm.
Doowan Lee: [00:43:23] You can do a lot of things in 24 hours. Right.
Katie Trauth Taylor: [00:43:26] Right. Right.
Doowan Lee: [00:43:27] This threat is real. And of course, trying to come out to correct that misperception is always very time consuming and resource intensive.
Disinformation & Misinformation Effect
Katie Trauth Taylor: [00:43:38] Absolutely. You know, I’m thinking, too, if we zoom in particularly on innovation teams: we’ve talked on this podcast before about strategic latency, or the unknown security and safety threats underlying the innovations that we create, and how we can’t always see those threats at the beginning when we’re first coming up with an innovative idea. So would you have advice or insights to share with innovators? What’s their responsibility when it comes to, you know, foreseeing those potential future risks and threats?
Doowan Lee: [00:44:14] Yeah. So, you know, I think about that a lot because I interface with a lot of, like, founders, or, like, startups and also their backers and donors, right? And I would tell them, like, you know, to think about three things. Right. And I call them political opportunity, network awareness, and strategic framing. So let me unpack what I mean by that. Political opportunity means that, you know, again, let’s go back to the notion of dual-use technology and dual-use threat. Essentially, if you’re really good at innovation, that means there’s a heightened probability that malign actors will use your innovation. It’s as simple as that, because innovation, by definition, is something that you can do at scale. To me, the quintessential definition of innovation is how you solve large-scale problems, you know, essentially at cost. Essentially: how do you solve large-scale problems cheaper, faster, and better than anybody else? To me, that is really what innovation should stand for. Right. But think about, like, you know, drone technology. So, like, you know, we pioneered this technology. We integrated it with our overseas operations really well. Um, but it’s not something you can just contain. So from, like, ’15 and ’16, we’ve seen violent extremist organizations strapping commercial drones with explosive devices and then flying them to a security checkpoint…
Katie Trauth Taylor: [00:46:15] Right.
Doowan Lee: [00:46:15] …. or to a military base or police station.
Katie Trauth Taylor: [00:46:18] Mm hmm.
Doowan Lee: [00:46:19] And it doesn’t take a lot of engineering because the engineering is already baked into this technology. Right?
Katie Trauth Taylor: [00:46:24] Right. Right.
“Innovation means that you’re actually taking a lot of engineering into a deployable solution.”
Doowan Lee: [00:46:25] So innovation means that you’re actually taking a lot of engineering into a, you know, deployable solution, right? It’s as simple as that. You know, it has to be deployable and it has to be cost-effective, right? And that means it has a lot of technology and engineering baked in. Now, it can just be put to different purposes. Again, dual-use threat. Right? So to me, like, you know, what I mean by political opportunity is always factoring in the possibility of political exploitation, right? That your innovation, your technology, will be exploited, right? And the idea is to not kick the can down the road, to understand that notion of, you know, political opportunity up front. Right. Because that leads to the second component, what I call network awareness. And that is: hey, if my innovation, if my technology is most likely to become a dual-tech, no, dual-use technology or dual-use threat, who are the stakeholders? You know, who will shape perhaps public oversight, public policy, and so on, so forth? And it does a couple of things for you if you are an innovator, and it doesn’t have to be technological, right?
Katie Trauth Taylor: [00:47:55] Mm hmm.
Doowan Lee: [00:47:55] We experience, like, you know, organizational innovations every day too, right? It can be organizational, it can be normative, it can be technological. But when you exercise this network awareness, that is: who are the other stakeholders outside my domain expertise who still govern how this technology, this innovation, will be used, right? It does two things for you: a) it broadens your stakeholder space, which is always a good thing, whether they are your future customers, partners, or, you know, donors, it doesn’t matter. Right. Essentially, you are showing a lot of political maturity by doing so, right?
Katie Trauth Taylor: [00:48:40] Yes. Yeah.
Doowan Lee: [00:48:41] Second, and this is essentially what I try to do in Silicon Valley: it gives you this thought leadership dividend. And that is, you’re not just building something. You’re essentially articulating the entire ecosystem behind something you’re building, right?
Katie Trauth Taylor: [00:49:02] Yes.
“Thought-leadership is perhaps the most cost-effective marketing strategy.”
Doowan Lee: [00:49:02] So this is not something nebulous or like, you know, soft because I’ll get very pragmatic with this statement: thought-leadership is perhaps the most cost-effective marketing strategy.
Katie Trauth Taylor: [00:49:16] Sure, yes!
Doowan Lee: [00:49:17] That expertise, that, like, you know, ecosystem. Right. People come to you. Right. Just like disinformation is the most cost-effective remote warfare weapon system, thought leadership is also the most cost-effective marketing strategy. The last part is what I call strategic framing. And that is, you know, I always talk about three things, right, when it comes to storytelling: the problem, the solution, and the motivation. So let me unpack what I mean by them. And I think that this is something that I’ve been sharing with a lot of my friends, both in the government and the private sector, especially innovators, when they try to tell their stories. They always try… they’re too enamored with their own solutions.
Katie Trauth Taylor: [00:50:14] Mm hmm.
Doowan Lee: [00:50:14] So, I have the fanciest hammer, right? And this is the shiniest, it’s the lightest, it’s the most powerful hammer. And this is why you need to listen to me. Right. I think that should be the second step. The first step should be the problem: you know, what are you solving?
Katie Trauth Taylor: [00:50:30] Yes!
Doowan Lee: [00:50:31] What are you helping address? Because going back to dual-use and dual-threat, right? A hammer can build a house or a hammer can demolish a house, right?
Katie Trauth Taylor: [00:50:43] Absolutely.
Doowan Lee: [00:50:45] So you have to do a lot of really intensive market research to do so. Right. And that is: what is the extent of the problem you’re trying to solve? What is the gravity, right? And who actually has the money to solve this problem, right? So again, you have to understand the problem, not just the conceptual component, but the spread of it, right? How widespread it is and how intense it is. So, again, like, you know, a good cop show will always start with a crime scene, right?
Katie Trauth Taylor: [00:51:21] That’s right.
Doowan Lee: [00:51:22] Storytelling for innovation is no different, right?
Katie Trauth Taylor: [00:51:24] Thank you for that metaphor. That’s wonderful.
Doowan Lee: [00:51:28] Like, you know, graphically and grotesquely….
Katie Trauth Taylor: [00:51:30] Right!
Doowan Lee: [00:51:30] Describe the crime scene. The broken thing, right?
Katie Trauth Taylor: [00:51:33] Yes.
Doowan Lee: [00:51:34] Because once you see something that is so terribly broken or terribly costly, what’s the next question that comes to anybody’s mind? And then how do I solve it?
Katie Trauth Taylor: [00:51:45] Yes. Yes.
Doowan Lee: [00:51:46] Like, you know, who has done it? Who’s the, you know, who’s the perpetrator of this crime and so on and so forth. So that’s what I call the diagnostic frame. And the second part is, OK, like, you know, what’s the solution?
Katie Trauth Taylor: [00:51:59] Yes. What’s the prognosis?
Doowan Lee: [00:52:01] Yes, people will ask for solutions once they understand how terrible this problem-set is. People will become naturally curious about how do I fix it? And this is when you present your prognostic frame. And this is not as simple as it sounds, because you also have to think about, you know, what I call the disproportionate impact that your solution provides, right? And that is, you know, people are not interested in linear solutions anymore. And again, this is where you have to do a lot of thorough market research. Right. So, you know, say the damage is about a hundred billion dollars a year. Right. And then my solution costs only one million. So do the math. How much are you saving a year? Right. So establishing that disproportionate impact of your solution is really key. Right.
Katie Trauth Taylor: [00:52:57] Definitely.
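The “do the math” here is just a leverage ratio. A tiny sketch using the hypothetical figures from Doowan’s example (a hundred billion dollars of annual damage against a one-million-dollar solution):

```python
# Back-of-the-envelope "disproportionate impact" math from Doowan's hypothetical figures.
annual_damage = 100_000_000_000   # size of the problem per year (his example figure)
annual_solution_cost = 1_000_000  # yearly cost of the proposed solution (his example figure)

leverage = annual_damage / annual_solution_cost
print(f"Each dollar spent addresses ${leverage:,.0f} of the problem")  # 100,000 to 1
```

Even recovering a small fraction of that damage would dwarf the solution’s cost, which is the disproportionate impact he wants innovators to establish.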
Doowan Lee: [00:52:58] On top of that, you know, you have to understand, and I can list, three things about your solution space. Number one. Right. What are the required domains of expertise to do it? And that is, are you an engineer? Are you an organizational specialist? Or are you just a conceptualist, like a big thinker? Right. So, again, each solution requires different components. And also you have to demonstrate this supreme expertise, that you know which components you need, right, to provide this disproportionate impact. So it’s not just the impact, but also understanding which parts can be combined the right way to generate that, you know, unparalleled impact. And the last part is what I call the motivational frame. And that is: why should I care? Right. So, you know, this is what I call, you know, the Type III error. And that is, you can have the right data, you can have the right algorithms, you can have the right, you know, machine-learning algorithms, artificial intelligence. But, you know, why should I care, right? If it’s for the wrong problem, right? Why should I care? And to me, this is where you want to be a little bit normatively driven, right? And that is, you know, for example, why do I care about disinformation? Because it undermines democracy and open society, right? Democracy and open society are far removed from, perhaps, detection algorithms for disinformation. But these are the normative motivators of what I pursue, right? So if you don’t articulate this motivational aspect of your storytelling, you know, you’re going to lose out on a lot of, you know, relevant stakeholders, because most people do not understand technical complexities, but they deeply care about whether you’re passionate or not. Right. And passion cannot come from just technology or the problem-set, right? It has to come from some kind of sense of duty, honor, right? Or public good, right? So I always encourage my friends to think about these three things when they tell their stories, especially about innovation, because to me, [unclear wording] two things that I really deeply care about in innovation. And that is, you’re doing essentially two things: you’re solving problems, right? I mean, that’s it, you’re solving problems. If you’re not solving problems, you’re not in the business to begin with.
Katie Trauth Taylor: [00:55:51] Absolutely, yes.
Doowan Lee: [00:55:52] Number two, you are essentially doing what we call “decision science,” right? How do you solve problems? By enabling people to make better decisions, right? So to me, like, you know, when I tell stories, I always have that in mind, even when I do this interview with you, Katie, right? You know, there are always two questions in the back of my mind, and that is, by having this conversation, what is the problem we are trying to solve together, right?
Katie Trauth Taylor: [00:56:18] Yeah.
Doowan Lee: [00:56:18] And what are we trying to do to help others make better decisions about innovation, disinformation, or foreign information operations and so on, so forth. So to me, I think those are the two main sort of principles I want to leave on the table at the end of this interview, and that is focus on what problems you’re solving and focus on how you are enabling others to make better decisions.
Katie Trauth Taylor: [00:56:51] Absolutely, yes. Thank you, Doowan. This has been a packed hour together. I wish we could talk into the day and much more about all of this, but I’m very grateful for the frameworks that you’ve provided here, especially this last one that we’re leaving listeners with, which is, you know, build your innovation narrative around the diagnostic frame. What’s the problem? The prognosis, you know, what solution is available? And motivation. Why should we care? And what’s the public impact that you’ll leave behind even if you fail?
Doowan Lee: [00:57:23] Yeah. Thank you so much for having me, Katie. I think it’s been, you know, the fastest one hour I’ve had in a long while.
Katie Trauth Taylor: [00:57:30] Yes.
Doowan Lee: [00:57:32] But then again, I’m a recovering academic, so…
Katie Trauth Taylor: [00:57:35] Yes!
Doowan Lee: [00:57:35] …I can [unclear wording], you know, for hours on end. So thank you for having me. It’s been a pleasure to talk to you about this topic.
Katie Trauth Taylor: [00:57:44] Thank you so much. And thank you everyone for listening in. We’ll leave all of the resources and readings that we mentioned in this conversation in the show notes. Please reach out with any other questions and advice that you have on how you and your innovation teams are tackling disinformation in this very speedy era that we live in. Thank you so much for being on the podcast.
Doowan Lee: [00:58:04] Thank you so much. You have a wonderful day and we’ll talk soon.
Katie Trauth Taylor: [00:58:07] Thank you.
Doowan Lee: [00:58:08] All right. Bye bye.
Katie Trauth Taylor: [00:58:11] Thanks for listening to this week’s episode. Be sure to follow us on social media and add your voice to the conversation. You can find us at Untold Content.
You can listen to more episodes of Untold Stories of Innovation Podcast.
*Interviews are not endorsements of individuals or businesses.