Exploring the Implications of Innovation with Zach Davis

“Why is innovation happening? I used to think that the answer had to do with necessity. That technology and innovation unfolded when people had something that they needed. The more I studied it, the more I realized that a lot of this innovation comes from ‘why? I was messing around with this. I wanted it. I was curious. I was interested. I didn’t intend for this to happen, this was serendipitous, this all just kind of came together.’” –Zach Davis, Senior Fellow at Lawrence Livermore National Laboratory

Innovation is almost always looked upon as a positive thing. However, what happens when a brand-new innovative idea has unforeseen implications? What happens when innovations are not necessarily created to have a positive impact in the first place? Today, Zach Davis, Senior Fellow at Lawrence Livermore National Laboratory, delves into understanding the potential threats and risks of innovation. Zach specializes in analysis of weapons of mass destruction and emerging technologies. He shares how the business world communicates with the government and vice versa, as well as the complexities of innovative technologies. While necessity is the mother of invention, what is necessary might not strictly be what is beneficial and inspirational. Instead of strictly necessity, let curiosity, passion, and interest spark innovative ideas. And with the global community brought on by technological innovation, storytelling and who gets to craft the narratives are essential questions to explore when considering “strategic latencies,” aka the ways innovation can be utilized and explored. Make sure to check out https://cgsr.llnl.gov/ for more on Lawrence Livermore National Laboratory and the complexities of innovative technologies. Interested in looking further? Take a look at an example of the opportunities and pitfalls of the global dissemination of information in the article “Stealth Research: Lack of Peer-Reviewed Evidence from Healthcare Unicorns,” as well as Samuel Huntington’s intriguing predictions regarding conflict and information in the post-Cold War landscape: “The Clash of Civilizations and the Remaking of World Order.”

Zachary Davis specializes in analysis of weapons of mass destruction and emerging technologies. He possesses regional expertise in South Asian security. At Lawrence Livermore National Laboratory he manages a program on the military effects of emerging technologies. Dr. Davis teaches courses at the Naval Postgraduate School on counterproliferation and interdiction. He is the author of numerous articles and chapters, and the editor of volumes on nuclear proliferation, South Asian security, and strategic latency.

Listen to the Podcast

TRANSCRIPT

This episode is powered by Untold Content’s Innovation Storytelling Training. Increase buy-in for your best ideas in this immersive and interactive, story-driven experience, where your teams refine storytelling techniques for their latest projects, prototypes, and pitches—and get inspired by 25 epic examples of impactful innovation stories.

Katie Trauth Taylor: [00:00:04] Welcome to Untold Stories of Innovation, where we amplify untold stories of insight, impact and innovation powered by Untold Content. I’m your host, Katie Trauth Taylor.

Katie Trauth Taylor: [00:00:19] Our guest today is Zach Davis. He is a senior fellow at Lawrence Livermore National Laboratory and a research professor at the Naval Postgraduate School in Monterey, California. Zach, I’m so excited to have you on the podcast to talk about something that we actually haven’t covered much on this podcast to date. And it’s sort of, I would say, the dark side of innovation, if you like that way of phrasing it. But I think there’s sort of an abounding and popular narrative here in the US, and maybe even globally, that innovation and progress and technological advancement is always good. And the culture in Silicon Valley sort of exacerbates this or amplifies it. We say “move fast and break things.” But your leadership is actually, and really importantly, in the opposite line of thinking, which is: what are the threats, what are the risks that are impactful to our national security or to our companies or to our livelihoods when certain outcomes of innovation don’t go the way they were planned, or when things are actually invented for the sake of being harmful? So tell us more about your work. I find this to be incredibly fascinating, and I’m really excited to dive in.

Zach Davis: [00:01:38] Well, thanks, Katie. Yeah, technology is always a double-edged sword, and it always has a dual use nature to it. And in fact, as you mentioned in the defense industry and in the United States government, we’re specifically looking for innovation that can be applied towards what you might call the dark side. That’s our goal. And there’s a long history of this. In fact, its – technology seems to always reflect human nature more than… More than anything else. I’m reminded of that first opening scene in 2001: A Space Odyssey, where the apes are in front of the monolith. And it takes them about one minute when they pick up that bone to start banging it on the bones and looking around to see who they can hit with it. So this really is something inherent in human nature. And the technology really just is an expression of human nature. And with all of its, you know, with all of its flaws and itself having a dark side and a light side. Right?

Katie Trauth Taylor: [00:03:14] Yeah. What led you to be interested in this line of research in this line of leadership?

Zach Davis: [00:03:21] Yeah, well, I started off, I mean, when I finally got around to going to college, which didn’t come naturally to me early on, but when I did, I thought I wanted to go into the sciences. I was really interested in science and in physics. And being a surfer, I wanted to study oceanography and I thought that would be a good way to combine my interests. And so when I started going to school and going to those classes, it didn’t take me long to realize that I was a lot more interested in the effects of technology, the political consequences, the decision making, the outcomes of technology than the technology itself. And so that led me to start trying to look at science policy and technology policy and the decision making that went around that. And that, of course, pretty quickly leads you to nuclear weapons and consequential or strategic technologies that sort of have that capacity to change the world and take it off what you might think of as a sort of steady progress and take the world off into different directions. And so I got interested in nuclear weapons and nuclear strategy and nuclear policy and nuclear history and how those technologies started off and came to be and how they evolved. I mean, nuclear weapons and nuclear technology are a good example of what I was saying about dual use. Right? It can be a wonderful source of energy and is applied to medicine and is a real benefit to mankind and humanity, but it also carries inherent within it this ability to destroy everything that’s been built and created. And so it’s sort of the quintessential strategic latency type of technology.

Katie Trauth Taylor: [00:05:35] Yeah. So you’ve written a lot about strategic latency. For people, for listeners, who maybe aren’t as familiar with the term: could you help define it for a layperson?

Zach Davis: [00:05:45] Yeah. So it goes by a lot of names and different, you know, different institutions, different researchers have different words for emerging technology, emerging and disruptive technology. We hit on the term strategic latency for a couple of reasons. One is we use this very overused word, strategic, right? So what is strategic? And what it means for us is that this is something that is especially consequential, not just every technology, but we want to distinguish how those technologies that have that world changing, something that can shift the balance of power, something that would… Could not be ignored if that was [unclear wording] defining and world power distribution defining. And so that kind of winnows down your list, of course, there’s still a debate about which technologies belong in that category or not. And there are a number that are now sort of candidates for that capability. But nuclear weapons, of course, are sort of the poster child for that kind of strategic capability that is inherent in the technology. The second word, latency, which is another amorphous term and means different things to different people, but latent, for us, refers to that underlying and sometimes yet unexploited, unappreciated aspect of the technology, because quite often these technologies don’t have the obvious application that now, it all, in retrospect, seems clear to us that these things had this latent potential to be used in these ways. But, oftentimes, these technologies lay sort of dormant until they meet up with other technologies, or that’s where we get into the realm of innovation, right? Someone comes up with an idea like, “well, you know, we could…” And then there’s a chain of events that brings that latency to the fore and then it can be exploited either for commercial or military or creative purposes. 
And so there is a lot of latency around in the technologies that there are lists and lists of technologies that could be exploited in different ways. And today we have the whole biotech field is really genetic engineering going off with latent potential. And there’s quantum computing, hypersonics, space is another area where there’s a tremendous explosion of latent technologies that have the potential – they could be on the list. But yet we don’t know. And maybe 20 years ago, people were talking about, you know, cyber and the Internet. And this has tremendous potential to change the world. And it wasn’t clear how that was going to unfold with iPhones and the like. And it has. And so these things are hard to predict and they are latent until, you know, the human motivation puts that intent behind it and starts to innovate and create new applications.

Katie Trauth Taylor: [00:09:47] So by definition, then, with strategic latency there’s some element of time involved in sort of doing research on this. So a lot of innovation teams are focused on “futuring,” or trying to predict future trends, especially in consumer behavior, that sort of thing. What is, in your mind, the relationship between strategic latency research and the work of futuring that happens inside of innovation?

Zach Davis: [00:10:13] Well, that’s a really interesting topic area because people are in love with prediction, right? People, they want it and they want to know the future and…

Katie Trauth Taylor: [00:10:27] They want to be the one who got it right.

Zach Davis: [00:10:29] They do, they want to cash in on it or they want to write science fiction. Right? They want to… They want to see the future. And there’s a great – Jules Verne and H.G. Wells and Isaac Asimov – they all had great ideas about the technology of the future. And today we have kind of a cottage industry of people trying to predict. But, I think more specific to the defense application world, you know, there’s the question of strategic warning, right? What’s coming? What… If they could do this, can they do that? And so there’s a focus on what other innovators, other countries, other companies, other groups, other competitors are doing and “what if?” What if they were able to put this with that? And, you know, so prediction also is – sorry to say – hard, and there’s a sort of unicorn that we’re chasing in prediction: that maybe with enough data, right? Maybe with big enough computers and enough data and the right algorithms, you know, we could come up with a prediction as to how these technologies are going to join up with human motivations and produce something we haven’t seen before. And is that going to be dangerous and is that going to be applied to military purposes? And so, you know, prediction in the US government really is the realm of the intelligence community, where you’re looking at what’s coming and do we need to warn people about it? But it’s also in the realm of defense planning, right? Because you’re building today or you’re planning today for things that won’t come into existence for, sometimes, years and years; it takes a long time to build a big naval or space system. And so you want to know what the battlefield looks like now, so that you can prepare for it, but of course, you can’t. So it’s a robust industry these days, this notion of foresight and predicting what’s coming and how it’s going to manifest. And, of course, in the private sector, that is business intelligence. Right? That’s what it’s going to be doing.

Katie Trauth Taylor: [00:13:32] Yes. Well, there’s an additional irony and tension here between the motivations of the business world and innovators and the government’s role in national security. There’s really a clash here in terms of how we think about innovation. The innovator is using data and prediction to anticipate what might be profitable and to understand changing behaviors that might lead to a need for certain new technologies, without necessarily having to think about the consequences of that to national security. And then, on the other hand, you have national security – defense and national leadership and government – needing to have safety as their major priority while also still contributing to economic growth. So these things are in such tension, it seems to me that they really speak different languages.

Zach Davis: [00:14:32] Well, you’re so right about that. And it used to be I mean, at least during the Cold War, that many of the big technology innovations originated in the government. They were the result of big government programs, you know, the Manhattan Project.

Katie Trauth Taylor: [00:14:53] Yes.

Zach Davis: [00:14:55] Stealth or the iPhone. And so these government agencies and government projects were the birthplace of a lot of these world-changing technologies, and that meant, for a time, the government could control those things, right? You could have export controls and you could have secrecy and you could say, you know, not everyone is allowed to know about this and you could control it. Right? That was a way to control the broader effects of the technology. But of course, everything is spun off and the government is really no longer the, you know, the sole source of these kinds of big technological innovations. And it’s really all in the private sector. And that’s where Silicon Valley comes into the picture. And, of course, you know, there it is a different culture, but they have fundamentally different goals than the government. As you said, you know, the government’s main purpose is to protect the people and the nation. And the companies are primarily focused on making money and shareholder value. And so there’s a really big gap. Right. And it is a gap between the cultures. And you said, I mean, the language that people use is, you know, it’s hard to even talk across that gulf of government-speak and defense-speak and intel-speak and Silicon Valley entrepreneurial culture that has grown up with its own language and its own way of doing things. But interestingly, you know, the government, like I was saying, you know, wanting the best newest technology, wanting to understand the battlefield of the future and prepare for it is… At least understands that they need the private sector now. And interestingly, the private sector is global. These Silicon Valley firms are global. They are not US firms in many ways. They’re fully globalized and they have a global customer base and their talent base is global. They get people and experts from the world. 
And so you’ve got a situation where these formerly secretive defense industrial complex managers are forced to reach out to the primary source of innovation and try to bridge that gap, and so there have been a number of efforts put in place specifically to do that. So there’s something in Silicon Valley here called the “Defense Innovation Unit.” It was created in the last administration specifically to build a bridge and get to know and, as they put it when it was established, be a kind of a, you know, a consulate, be a kind of an embassy for the Defense Department in Silicon Valley and reach out and understand, you know, their needs.

Katie Trauth Taylor: [00:18:41] The terminology, the metaphor alone speaks to the seriousness of the divide that it is a consulate, as if we were sort of entering into foreign territory and having to speak a totally different language.

Zach Davis: [00:18:54] It’s true. And so you saw a couple of real tense moments in that… In particular, you know, with artificial intelligence, with A.I., and the Defense Department realizing that they can’t live without that, that the future battlefield will be very heavily dependent on these tools for a variety of purposes and that there was no recourse. They had to go to Silicon Valley. The CIA already had an outpost and has an outpost in Silicon Valley called In-Q-Tel. And their business model was to look at the technologies that could be applied to their needs, to intelligence needs, and then make sure that those companies didn’t die. Right? So they were in the role of kind of a venture capital investment entity that would make sure that these companies that were struggling – that were really on to something that the government could use – didn’t just die in what they call the “Valley of Death” between a great idea and a first round of funding: “we just couldn’t make it work. We just couldn’t find the customer base, so we had to close it up.” So that model has been successful. The Defense Innovation Unit model is a little bit different, but they also are looking at startups and they’re looking at ways to build these partnerships and bring the Silicon Valley class into the inner circle of what their defense needs really are. And with A.I., that all kind of focused on something called “Project Maven,” in which Google was helping, because they are a leader in this field, and Google was helping the Defense Department mainly with the massive flows of data. Right? I mean, that’s really the battlefield of the future, these massive flows of data coming from multiple sources, because it’s a multi-domain battlefield, is what we call it now, where, you know, you’ve got the space assets and ground, and unmanned vehicles of various types, massive data flows coming in.
And so Project Maven was helping with that, and you saw kind of a rebellion among some of the Google staff. “[unclear wording] kill people, I’m not doing that,” and so it was maybe an instructive experience for both sides and more people in the valley side are coming to understand the totality of what defense means and the defense side, people are coming more to understand the priorities and ethics of the private sector.

Innovation Storytelling Training Podcast Ad

Katie Trauth Taylor: [00:22:36] I’d love for you to keep sharing a little bit more about that. You’ve written about key insights from leading experts on the threats, opportunities and the national security challenges posed by emerging and disruptive technologies. And you’ve already offered some powerful examples, but could you share some more stories coming out of this particular area?

Zach Davis: [00:23:02] Well, yeah, I mean, that’s part of what’s interesting about this whole topic, right? It is so expansive, right? It cuts across every stovepipe and, you know, every culture and every country. So I guess the way to think about it is, on the human side, you know, what are people doing, right? What are some of the innovative technologies that are being developed, you know, not in places that we know, but in other countries? And of course, there’s a lot of attention being paid, these days, to China because they are clearly making the investments. They are putting their money where their mouth is and they’re pouring resources – and by that I mean human resources, but monetary resources too – they’re putting a lot of effort into these key technology areas, and that includes the biotech field. But computing power, you know, neurology, it’s just across the board, and they’re putting the pieces together and they’re getting their people trained in the best universities around the world. And they’ve clearly made it a goal to be a leader in a number of these key technologies. And that has people concerned because of the military implications. And one of the interesting aspects of this global competition is that these country cultures are also evident in the competition. So you look at Russia and it’s a whole different thing. And this question of the private sector becomes really important, whereas China has unleashed these companies, although with a tether – and you saw this in the TikTok and WeChat concerns – in that some of their technologies may actually still have a link to the government. And so that raises the question: well, why? What are they doing with this data? Is this going to be weaponized? In Russia, you just don’t have the private sector, and that’s really hurt them. And it’s made it very hard for them to compete in certain areas.
Now, of course, they have world-class brainpower, but they just don’t have the financial resources or the entrepreneurial class, and they lose those people to the West, where they go off to make money, right? And so, you know, Russia’s going to be… Going to be hurt by this. And so I guess one of the most interesting aspects of this is the cultural dimension: that, as globalized as everything is and as globally available as all this technology is, in the defense world it still comes down to countries. And how different countries make decisions and have a good relationship with these companies and are able to rely on them and have a mutually beneficial relationship, or whether you have to go outside of your own country and pay the price for global technology. So it’s interesting. And, of course, the other way you can categorize it is by technologies, right? And a lot of these fields are culturally different, right? So the biotech industry is really different – the biological sciences operate by a different set of scientific norms and principles, and, you know, you go through a different training in order to be credible and productive in that field. So you get into a lot of other sort of disciplinary or multidisciplinary controversies as well. Right? The ethics, the norms and even treaties. Right?

Katie Trauth Taylor: [00:27:55] We’re in a post-Theranos world, with some of that sort of blind forward momentum without enough, to be frank, medical rigor or scientific review. I think that the culture in that space is starting to change and validation is more critical. But still, even just last year, there was a great research study published about the lack of peer-reviewed studies around health care unicorns.

Zach Davis: [00:28:24] Right.

Katie Trauth Taylor: [00:28:25] And I’ll link it in the show notes because it’s a great read, but I think the culture is starting to change. But it’s a slow change. And again, that’s really critical, not just to… Not so much… To the national security interest in terms of the validity of the medical products that we’re putting out there for the world.

Zach Davis: [00:28:44] Well, that’s one of the interesting aspects of this: the, you know, the moral questions that come up, which to some extent are embedded in treaties and practices and ways in which the scientific community has operated. But especially if you look at the bio sciences, you know, there were expectations that certain things ought not to be done, that we shouldn’t do certain things. And a lot of those barriers have been tested and kind of bypassed. And so that’s an interesting question in the defense and diplomacy world; a lot of those norms of behavior about things that, you know, should not be done are embodied in treaties. So there’s the Biological Weapons Convention, the Chemical Weapons Convention, the Nuclear Nonproliferation Treaty, the treaty against testing nuclear weapons in the atmosphere – things that humanity had decided were probably better left undone for the benefit of all. And I think it’s fair to ask what happens if these norms and practices and institutions and laws and agreements are being tested and proven to be unenforceable and proven to be… When push comes to shove… No longer being practiced and no longer being respected. So you see things like chemical weapons coming back. And that was something that people thought, you know, had been dealt with. That the Chemical Weapons Convention, you know, that… There’s an enforcement and verification agency. But when these international multilateral institutions are reliant on great powers to enforce these ideas, you know, it comes down to great power politics. And right now, great power politics are not putting multilateral agreements ahead of…

Katie Trauth Taylor: [00:31:36] Perhaps what might feel like short-term.

Zach Davis: [00:31:39] More immediate.

Katie Trauth Taylor: [00:31:39] Right. Right. I was going to say in terms of putting that ahead of perhaps what might actually be short-term personal gain or… Interesting.

Zach Davis: [00:31:48] Right. Yeah.

Katie Trauth Taylor: [00:31:50] Could you speak a little bit to – I know your areas of expertise are weapons of mass destruction and some more sort of physical threats – but we are also living in an age of disinformation and data being used to influence and change public opinion or the mindsets of people. Is that an area that you touch on in some of your research as well?

Zach Davis: [00:32:18] Well, I’m one of these drifters who… I am a jack of all trades and a master of none, but it allows me to go and get the right people to answer those kinds of questions. The new book that we have done for the Special Operations Command has a huge section on that question of disinformation. And, of course, you know, the obvious current examples of how easy it is – the globalization of the information flow has just democratized disruption. And it’s too easy. And you don’t have to really have a lot of resources. It’s not like discovering a new element or fielding a new weapons system. This has really opened up the field because it’s just so easy. And you know how I started off talking about these tools being really merely a reflection of human nature. And I think that’s what you see in the disinformation side. Which is, you know, these flaws, these deficiencies, I mean, it’s… Right? Throughout political philosophy, it’s in the Bible, it’s in all of our understanding of human nature that we have these fears and insecurities and prejudices and confirmation bias. And that’s just a part of being human. And so these tools, these global tools that can get into your pocket, that can reach billions in a nanosecond, have opened up this new realm of influence operations. And again, you know, no norms to guide this, no sense of restraint. And we have hurtled forward in discovering the latent potential of these technologies and their influence on individual, group, and collective consciousness. And of course, that has tremendous implications for democracy, but also for leadership. When you can undermine decision-making by introducing controversial ideas that are not supportive of leadership directions or leadership, you know, priorities… And it’s just too easy, and we seem to be, globally, willing to, you know, to let this tsunami overwhelm us.
And what comes on the other side when there’s no ability to distinguish between truth and lies or interpretations? It comes down to critical thought. I mean, I asked my kids about this, too, because they, you know, they’re on all these social media platforms. And I’m not, and I don’t understand. And they say, “hey, you know, Dad, did you hear about this?” And I go, “well, that sounds like… That is stupid. That can’t be right.” “Yeah, well, you know…” Fortunately or unfortunately, this new generation is at least extremely cynical, I find. And that may be our saving grace, right? They don’t believe… They know it’s all B.S. They don’t believe anybody. They don’t believe me. They don’t believe you. They don’t believe it. It’s a little disheartening because you don’t want to raise a generation of cynics. On the other hand, they at least have this critical capacity to know that all this stuff that’s flowing into their phones and computers and across the, you know, the universe may or may not be true. The problem is they don’t believe anybody, right?

Katie Trauth Taylor: [00:36:42] Right, right, right. Well, and I think we’re just now, after all these years, living in a world that is so dominated by social media, we’re just now seeing some of those companies start to block misinformation or take more proactive measures. And that is a problem. But at least we’re getting in that direction now where… It’s taken a long time. But they are starting to… It seems as though… The innovation community, perhaps thanks to some pushback from national security and government leaders, is starting to be more prepared to defend and protect against the widespread rapid fire dissemination of incorrect information.

Zach Davis: [00:37:33] Well, I think that there’s a new global competition for soft power. And all of these, you know, this connectivity and the mechanisms that enable it are opening new possibilities for influence and for global power. And where we used to talk about soft power in a… As being somewhat secondary to hard power, military, economic, real tangible forms of power and soft power being, you know, cultural and conceptual and, you know, other forms of influence that come from global connectivity. And I think in particular that you notice that Chinese entities have now purchased most of the big Hollywood movie studios and are exercising influence over content. And so one of these words that I, I hate but I have to use is that there’s a global competition for narrative. Right? Who’s telling the story? And I guess that is just right in your wheelhouse, Katie.

Katie Trauth Taylor: [00:39:00] Right.

Zach Davis: [00:39:01] Because it’s all about the story and who tells the story and how you tell the story. And the United States, during its time of preponderance throughout the Cold War, really had that primary position of being able to tell the story and control the narrative. And so it was American movies and American authors and American TV and American products. So Batman and Levi’s and John Wayne. And we built a narrative of America and the American century. And I think that in the contest for global power now, that’s all being contested. And several rising powers – China, but also Russia and Iran and many other countries that are rising powers – have a different story to tell. And they say, yeah, we heard the American story and we heard the colonialist story. Great, nice story. But you’re done. We’re done listening to you. And so global media is trying to tell different stories. And so, you know, India has its own story and they have their own sense of the world order and their own sense of history and what the future should look like. So rising powers are trying to change the narrative, and they recognize that this global connectivity and the contest for soft power is really where the hearts and minds of the world are going to be changed. And whether the new order that is coming will favor… And unfortunately, this comes down to either national or cultural or ethnic… There was a big article that came out around the end of the Cold War by an international relations scholar. His name was Samuel Huntington, very influential. And he posited that now that the Cold War was over and the restraints on these big national and ethnic rivalries had been eliminated, the future would now sort of devolve into, unfortunately, kind of something more along the lines of what we’re seeing now. That there would be, you know, a Chinese narrative, a European narrative, a South American narrative, a South Asian narrative.
That the big divisions in the world would manifest in this sort of globalized competition for, you know, the hearts and minds of the world. And that’s kind of what’s happening. I mean, anyone can tell the story now.

Katie Trauth Taylor: [00:42:22] Right. Right. And like you said, it’s been far too easy for one nation or entity to gain soft power through the use of disinformation. All of these topics make you feel a little bit helpless at an individual level if you’re not the one in control of national security decisions. But I’d like to think that innovators and CTOs and CIOs and managers, the people working from the bottom up of the innovation sphere, can hear this and find some takeaways for themselves individually. At Untold, we like to research the concept of public intellectualism: how do we change that concept from being a cool A-list of people showing up on the news circuit to a concept that we can all apply to our daily actions in the world? How are you promoting a world where knowledge is shared and valued and the truth is sought? What’s your role in that, and how do you take ownership of it? All of that to be said, I would love for us to riff together at the end here about personal responsibility, truly, because so many of these forces feel out of any individual’s control. And yet within our own spheres of influence, we do have some control over whether we allow our innovation stories to discuss the dark side, and whether we build that ethical touch point into our innovation processes whereby we could kill a project because of its implications for the betterment of society. So share with us some of your thoughts on that. How do we as innovators prepare to see the dark side, communicate it, and voice it in a world that does not value that?

Zach Davis: [00:44:26] Ooooh! Yeah, Katie, that’s a big one. So, I think that moral reasoning, the moral aspect of all of this, does come down to the individual level, and morals are passed along through storytelling, as you have defined it. All of these concepts, all of the distinguishing between right and wrong and what should and should not be done, do come down to individual decisions. It’s like I said, nation-states have less control than ever before. Big institutions, big global entities, big concentrations of power are less influential now. There’s a dispersion of power outward, and that’s how this globalism really took hold. So if there’s good news in that, it’s that decision-making and authority and agency, whatever you want to call it, has devolved down to the individual and local group level. That is different from the nation-state; nation-states pursue what they must. But each of us has that decision-making authority, and I would go back to where we started, the dual-use nature of technology. People ask “why,” right? You can ask, what’s driving all this? Is it the dark side? Is it the money? Is it greed? Why is this happening? And I used to think that the answer had to do with necessity, right? That technology and innovation unfolded when people had something that they needed. They needed to get a task done, they wanted to do something, and so they needed to innovate to achieve their goal. And the more I studied it, the more I realized that a lot of this innovation comes just from “why?” Because I was messing around with this. Because I wanted it. Because I was curious. Because I was interested. Or, I didn’t plan on this.
I didn’t intend for this to happen; this was serendipitous; this all just kind of came together. So I think what that means is that, at the individual level, much of this innovation and many of the big developments, even those that are commanded and resourced and driven by larger interests, big Silicon Valley entrepreneurs, governments, terrorist groups, they have their needs, but it does flow down to the individual. This is all human. This is where we get focused on the shiny objects and we love our technology. But at the end of the day, this is all just human, and it’s in the hands of individuals.

Katie Trauth Taylor: [00:48:53] Right. Right. There’s a moment I’ll never forget. When I was in college, we were reading Holocaust literature with an English professor, and we put down our books and she looked up and she said, how does it feel to know that a Nazi scientist performing unethical experiments on Jewish populations went home at the end of every night, fed their family, said, hi, honey, got in bed, wound down, and started another day? So like you said, necessity drives action, and ethics become more of a gray area when they’re in contrast with our immediate needs. That’s a really, really dark example, but I think it reminds you how horrors, or poor and irresponsible actions, get justified because of that. It comes down to checking ourselves individually and being brave enough, especially if you are an innovation leader, to make sure that our stories of impact cover not only the glossy, beautiful possibilities and the stakeholder interests, but also public health, the public good, and the global good.

Zach Davis: [00:50:12] Right. Exactly.

Katie Trauth Taylor: [00:50:14] It’s tough. It’s tough to go against necessity, whether it’s at the highest levels or in your own personal life. What other advice do you have for innovators, as someone who studies these topics constantly?

Zach Davis: [00:50:32] Well, that’s a tricky term. I find that people mean different things when they talk about innovation. If you’re in business, it means one thing: a disruptive business model, right? We need to innovate. Uber is innovative not because they came up with any new technology, but because of the business idea.

Katie Trauth Taylor: [00:51:02] Yeah.

Zach Davis: [00:51:03] That’s what they think of: a better way to make money or save money. In science, innovation is new knowledge. Let’s discover something new, a new element or a new process, a new way of discovering. So when scientists innovate, it’s: let’s try something different, let’s experiment. And of course, that means you don’t know what’s going to happen. And then on the battlefield, when you’re talking about innovation, a lot of the time you’re talking about tactics and strategy. And all of this involves the adversary or your community, right? None of this happens without the surrounding environment, and you’re affecting others. So you are embedded, whether that be in a competitive sense in the business world, where you want to put those people out of business. That’s what Uber did with taxis, right? That was an institution that had long been a part of society, and it was like, just wipe them out. And people were a little bit sad about that. On the battlefield, a lot of the time it means new ways to prevail. But also, one thing that is perhaps paradoxical but positive is the whole notion of deterrence, right? If you get really good at fighting war, then it definitely influences the mindset of potential adversaries: let’s not have a war, I don’t think that would work out well for us. That’s sort of the paradox of nuclear weapons. The ability to destroy everything turns out to be the secret to the long peace, right? Global competitors make a rational decision that war can no longer serve the purpose for which it’s prosecuted. You can’t win; we can’t come out of it better than we started; let’s just not do it; let’s find a way to avoid war. So I find this notion of innovation really elusive, very evasive, and something that is natural. It’s a natural outcome of curiosity, of necessity, of creativity. It’s a part of human nature, right?
It’s a part of what we do to survive. And so I’ve often been kind of perplexed by prescriptions for “how to innovate.”

Katie Trauth Taylor: [00:54:18] Yes!

Zach Davis: [00:54:18] And “how to create an innovative workplace,” “how to foster innovation,” how to tease it out of people. Come on, you can do it! Let’s innovate, think outside the box, let’s try. I just think it’s a very hard thing to capture. It seems to me more like a unicorn, something that occurs naturally and will always occur. And you can’t repress it either. You don’t want terrorists to be innovative, and yet one of the best case studies of innovation in war is the IED, right?

Katie Trauth Taylor: [00:55:05] Mmhm.

Zach Davis: [00:55:05] And that’s very low tech. It was something that a number of terrorist groups around the world came upon as a way to deal with the overwhelming conventional military power that they wanted to disrupt and defeat. So, you know, terrorists are really good at innovation.

Katie Trauth Taylor: [00:55:33] Necessity, once again, necessity. Goodness. Thank you. I know we could keep going, and we’ve talked for an hour, I can’t believe it. It just flew by for me, and I’m sure listeners feel that way as well. Zach, I’m so grateful to have had you on the podcast talking about all of these issues. And I agree with you, we need to continue to fight for nuance in our understandings of innovation. Where can listeners find you?

Zach Davis: [00:56:04] Well, the Lawrence Livermore National Laboratory has a little think tank where I work, the Center for Global Security Research, CGSR. The website is “CGSR.LLNL.GOV.” We have a lot of publications on various topics and people who cover various issues. We’re working on a big biosecurity project right now, and also on a big climate, environment, and security project. There are lots of publications and lots of interesting lectures that you can click on and learn something from. So I’m reachable through there.

Katie Trauth Taylor: [00:56:54] Wonderful. Thank you so much, Zach Davis, for being on the podcast and for joining us today. Talk to you soon.

Zach Davis: [00:57:02] Thank you. Katie, thanks for letting me ramble.

Katie Trauth Taylor: [00:57:05] Thanks for listening to this week’s episode. Be sure to follow us on social media and add your voice to the conversation. You can find us at Untold Content.

You can listen to more episodes of Untold Stories of Innovation Podcast.

*Interviews are not endorsements of individuals or businesses.
