In this week’s episode, Jeremi and Zachary are joined by guest Ellen McCarthy to discuss the problems of disinformation in the world today.
Zachary sets the scene with his poem, "Like a Ball of String."
Ellen McCarthy is the Chairwoman and CEO of the Truth in Media Cooperative and Noodle Labs. Ms. McCarthy has over three decades of national security service in a variety of leadership roles. She has served in many high-level government positions, including: Assistant Secretary of State for the Bureau of Intelligence and Research; Chief Operating Officer of the National Geospatial-Intelligence Agency; and Director of the Human Capital Management Office and the Acting Director of Security and Senior Policy Advisor in the Office of the Under Secretary of Defense for Intelligence.
Guests
- Ellen McCarthy, CEO of Truth in Media Cooperative and Noodle Labs
Hosts
- Jeremi Suri, Professor of History at the University of Texas at Austin
- Zachary Suri, Poet, Co-Host and Co-Producer of This is Democracy
[00:00:00] Intro: This is Democracy, a podcast about the people of the United States, a podcast about citizenship, about engaging with politics and the world around you, a podcast about educating yourself on today's important issues and how to have a voice in what happens next.
[00:00:28] Jeremi: Welcome to our new episode of This Is Democracy. This week we are going to discuss the problems and opportunities of disinformation in the world today. And we’re very fortunate. We’re joined by a new friend, someone I’ve just had the chance to get to know, who I think has done some of the most interesting work in both the public and private sector on understanding this problem and thinking about how a democracy can insulate and protect itself against the disinformation coming from places like Russia and China, which I’m sure we’ll talk about.
[00:00:55] Jeremi: This is Ellen McCarthy, who is the chairwoman and CEO of the [00:01:00] Truth in Media Cooperative. What a great title. Truth in Media Cooperative and Noodle Labs. Ellen, thanks for joining us.
[00:01:06] Ellen: Thank you so much for having me. It’s great to be here.
[00:01:06] Jeremi: Ellen has an incredibly distinguished background. Uh, I’m so glad I’ve had the chance to get to meet her.
[00:01:13] Jeremi: She has over three decades of national security service and a variety of really impressive leadership roles. Many of them, many, many high level roles, uh, including assistant secretary of state for the Bureau of Intelligence and Research. And those of you who don’t spend a lot of time with the state department, INR as it’s called, is really the gold standard in many ways for serious analytical research on the world.
[00:01:34] Jeremi: She was, uh, Assistant Secretary of State in the Bureau of Intelligence and Research. She was Chief Operating Officer of the National Geospatial-Intelligence Agency, and Director of the Human Capital Management Office, and Acting Director of Security and Senior Policy Advisor in the Office of the Under Secretary of Defense for Intelligence.
[00:01:53] Jeremi: And, and I could go on, uh, so many other, uh, similar titles. One day you'll all be seeing her as Secretary of State or [00:02:00] Secretary of Defense, I'm quite sure. Uh, we're fortunate. Thank you for taking the time to join us today. Zachary, we have your poem to start us out, of course. What's the title of your poem today?
[00:02:10] Jeremi: Like a Ball of String. Like a Ball of String. Let's hear it.
[00:02:15] Zachary: That truth is free, they often choose to say. In truth, the truth is simply more complex than can be captured in a single text. It must be sought again each dawning day, and caught in hand, one can't just wish or pray. It must be badgered, be harassed, and vexed, made spellbound, distracted, and fiercely hexed till donkeys can be made to cease their bray.
[00:02:40] Zachary: And yet it's true, truth is a simple thing, That comes and goes, is never in between. So must birds die, or never cease to sing, So must a river flow, or fade to green, So must a man embrace, or learn to sting, Learn not to sting. Embrace what can be seen and [00:03:00] follow the truth like a ball of string.
[00:03:01] Jeremi: Hmm. I love that last line.
[00:03:05] Jeremi: Follow the truth like a ball of string. I also love, uh, your double usage of truth. The line, In truth, the truth is simply more complex. What are you, what are you saying in this poem, Zachary?
[00:03:16] Zachary: I think my poem is about how, uh, how difficult it can be sometimes to pursue the truth, um, and how it requires a sort of constant pursuit and dedication, uh, to truth telling, and also to seeking the truth, but how at the same time, it's something simple. It's something that one chooses, that one has, or one seeks, or one doesn't, uh, and at some level, it's a very sort of simple distinction.
[00:03:41] Jeremi: Yeah. Ellen, your thoughts on this?
[00:03:44] Ellen: I also love the use of the word bird. Yes. Um, you know, you’re aware there’s a movement called the birds don’t fly movement. Are you, are you familiar with that? Yes, yes. I don’t know if that was what you were thinking at the time. I know it was far more philosophical than that, but that’s what [00:04:00] struck me.
[00:04:01] Ellen: Yeah.
[00:04:01] Jeremi: Were you thinking about that, Zachary?
[00:04:04] Zachary: Uh, not really, but I think it fits perfectly. It fits perfectly.
[00:04:07] Jeremi: And just explain what, what the birds don’t fly movement is.
[00:04:11] Ellen: Do you want me to explain? Yes. So there’s a, you know, I’m not going to, I don’t know the whole history, but I do know that there’s a young man a little older than Zachary who just thought for fun, he would put out sort of the message that birds are really created by the government, that they really don’t fly.
[00:04:26] Ellen: They’re small drones put out by the government to monitor people. And you know, he shared it in social media and he started picking up a following. And the next thing you know, he’s, he’s traveling around, there’s a whole movement and there are people who are believing right now that birds don’t fly.
[00:04:41] Ellen: That they're drones and machines, that they don't really fly, that they are government drones. And he did it as a joke, sure. But it also highlights sort of this world we live in right now. It's so easy to almost brainwash people with things that are not true, with stories that are not [00:05:00] true.
[00:05:00] Jeremi: And how do you understand that phenomenon? There's always been an element in our society, and in every society, that believes in conspiracies, the most famous being the JFK assassination, that somehow that was organized by the government. There have always been people who distrust authority, but it does seem this is different today, at least different from recent memory.
[00:05:19] Jeremi: How is it different? How do you understand that?
[00:05:21] Ellen: So, you know, I love it: in our country, we've never really trusted anything, you know, we've never really trusted the media. We've always had conspiracy theories. Politicians have always been guided by persuasion, and they've always used whatever tool they can to try and manipulate people to come on board with their thoughts.
[00:05:40] Ellen: So none of this is new. I mean, none of this is different now. But what's different now is the scale of information. It's the amount of information and the speed at which information is moved around and shared. And I think on some level, our brains are just [00:06:00] not wired to actually be able to incorporate all of this information.
[00:06:03] Ellen: There's a push component, in that there's just more information out there that doesn't meet standards, that would not be marked as quality information. But then there's also this: we're just not wired to take in all the information, all the sensory stuff that we're faced with every day.
[00:06:20] Ellen: And so at some point, we just stop, you know.
[00:06:24] Jeremi: right, right. And we just believe what we believe. We look to a familiar face or something that reinforces our emotions. Is it true that foreign actors are taking advantage of this? And I know it’s controversial to talk about this, but you have the advantage of having studied this as almost no one else has.
[00:06:41] Jeremi: Is it true that Russia has been manipulating parts of our political system?
[00:06:46] Ellen: Yes, yes, yes, yes. And yes. And that's, you know, that's the thing that gets me the most right now. You know, we usually in our country tend to change and get better and stronger when bad things happen, kinetic things, when planes fly into [00:07:00] buildings. But right now we're in a state of cognitive war, and I don't know if people realize it. We're being taken advantage of.
[00:07:07] Ellen: So this incredible open society that we live in, this very transparent freedom of speech, the fact that we're very open, is also our greatest vulnerability in terms of how we're taken advantage of. And so, we're not perfect. And the fact is, you know, things happen, bad things happen, and then you've got these outside entities that take advantage of it, and they can do it quickly and easily and cheaply.
[00:07:36] Ellen: So you know, there’s a police shooting in a, in, in Chicago, or there’s a riot in another town, and the fact that very quickly you can, you can take that story and you can add to it, you can, um, manipulate it, you can now with the explosion of AI, you can, you can Media, you can create videos and you can take this bad situation and [00:08:00] inflame it and make it even worse.
[00:08:02] Ellen: And and so now we’ve gotten to a point where we’re not at all willing to accept that we do make mistakes because what we see is just so bad and so big and we’ve become so polarized as a result.
[00:08:14] Jeremi: Wow. Zachary? Yes.
[00:08:17] Zachary: Um, you mentioned AI and the influence that technology can have on this issue. Uh, to what extent do you think technology is shaping the way that we view the world, and really augmenting this issue in a way that maybe we haven't seen before?
[00:08:33] Ellen: So using AI as an example, using technology as an example, I love technology. I love new things. So if you think about it, so many people say that the state of our country is completely the fault of social media. And if you think about it, social media was a couple of guys who actually wanted to date girls, and so they created a dating app, and it was meant for good.
[00:08:54] Ellen: You know, it was not intended to come out and create this divided country that we're [00:09:00] in right now. Most technology, most new things are done for good reasons. They're done to help. And the problem is that there's always a jerk out there who's just going to take advantage of this really good thing and use it for bad.
[00:09:11] Ellen: So you know, Zach, you mentioned AI. I actually think AI is going to be amazing. So I come from the intelligence community, where I was an intelligence analyst, and I had to sift through lots and lots of information, um, using structured analytic techniques, um, you know, looking at the sourcing of data, verifying the sourcing, and doing peer review on this.
[00:09:35] Ellen: And then when I served up a piece of knowledge, of intelligence analysis, to the Secretary of State or to a submarine officer or to a law enforcement officer, when I passed them that information, it was the best, most unbiased we had. Now, it wasn't always right, but it was the best that we had at the moment.
[00:09:53] Ellen: And the intent was to give that person an opportunity to see it, read it, hear it, and give them decision advantage. [00:10:00] And so who's doing that now? I mean, now there's so much information that it's just gotten harder and harder to do that. So being able to sift good quality information from not such good quality information is really, really hard.
[00:10:21] Ellen: And, you know, look at some of the major conflicts we're involved in right now, Israel with Hamas, Russia with Ukraine. Even on the ground, we're having a really hard time understanding what is quality information and what's not. And that's very scary when we're making operational decisions, we're making policy decisions, or we're just the average citizen trying to figure out where we stand on this.
[00:10:46] Ellen: Right, right. And so it's hard to be informed.
[00:10:50] Jeremi: So how are we addressing that? It seems to me there are two levels of this problem that you've broken down very, very intelligently. You know, one is having good enough information to make our [00:11:00] own decisions about who's responsible.
[00:11:02] Jeremi: Are there tunnels under hospitals, things of that sort, right? But then there's also the question of, as you pointed to earlier, bad actors overseas. I think you called them jerks.
[00:11:12] Ellen: Yeah, they're jerks. Yeah, who are just taking advantage of our incredible...
[00:11:16] Jeremi: And encouraging hate, encouraging division. And now with
[00:11:18] Ellen: AI, they can do it quicker, cheaper. It used to be hard to do that, but now you could be a disenfranchised teenager in, name the country.
[00:11:25] Jeremi: There’s an asymmetrical advantage. It’s less expensive to send, and less expensive and easier to send propaganda than to defend against it. So you’ve been in the bowels of the beast at the State Department and the Pentagon. What are we doing to respond to that?
[00:11:39] Ellen: So not enough. I mean, I think we need to do more.
[00:11:41] Ellen: Again, I use the example that we're very good at building boats and building guns. But this is different. When you're dealing with information, it's really a touchy issue, by the way, because it's dealing with people. And so, you know, anytime the government is involved in anything related to [00:12:00] information, there's this huge sensitivity to, you know, our constitutional rights and freedom of speech.
[00:12:05] Ellen: We're very careful when the government gets in and starts messing with information, and rightfully so. Very rightfully so, which is why I think that this is actually a problem that's bigger than the government. It's a private sector, we-all-need-to-get-together-and-figure-this-one-out kind of problem.
[00:12:21] Ellen: But um, you know, you have capabilities. So the State Department has the Global Engagement Center, which is very focused on, you know, ensuring that other countries aren't writing things about the United States that are not consistent with who we are, or really what we're trying to do. They're trying to help with the narrative.
[00:12:40] Ellen: But it's not enough, because there's so much out there right now. And in an AI world, your ability to sort of spread that faster and deeper is just making it harder and harder to keep up with. So I think there's a different way than what we're doing. Sure, sure.
[00:12:53] Jeremi: Zachary?
[00:12:54] Zachary: Yeah, I wanted to ask, um, I think a lot of young people hear this and maybe [00:13:00] even more quickly than their parents have realized the dangers of misinformation and disinformation in the world.
[00:13:07] Zachary: How do we prevent this from, uh, making young people lose trust in media outlets, in trusted people in their community? How do we prevent this knowledge, and this recognition, from driving young people into their own sort of silos?
[00:13:25] Ellen: So I think it’s already happened. I think that’s, and, and Zachary, the data already shows it, you know, so you’ve got all these opinion polls out there, um, and I, and by the way, I’m not, I know this sounds very negative.
[00:13:36] Ellen: I think there’s a positive to all of this, but, um, so right now you look at Gallup, you look at Quinnipiac, you look at all the polling organizations, and they’ll just show you the trust in. Almost every institution in our country, regardless of demographic, regardless of your Gen Z or a baby boomer, is at the lowest it’s ever been, and it continues to climb.
[00:13:55] Ellen: So that’s trust in media, trust in government, trust in religion, trust in academia, trust in science, [00:14:00] like we just don’t trust in health, you know, we, we are, it’s the lowest it has ever been, and that, that actually really scares me because I just don’t understand how You know, democracy continues to thrive when you don’t trust science or you don’t trust innovation, um, or you don’t trust decision making.
[00:14:18] Ellen: I, I, and so that, that, that really does. And we’re heading into an election year. And so election integrity is something that I’m very worried about right now. And it’s not about being political because both sides of the aisle. It’s a foundation of our democracy. So what I say to a young person is, and by the way, young people already know this.
[00:14:39] Ellen: So at my cooperative, we actually have an initiative called NextGen. We're a network of networks, composed of media organizations, academia, technology, um, science, and our focus is building standards and measures for quality information. [00:15:00] So our view is that instead of fighting mis- and disinformation, which quite frankly hasn't been working, all the data shows that we're not trusting anything anymore.
[00:15:10] Ellen: Maybe we need to create a new approach. Maybe we need to democratize and monetize quality information. And you know, that actually kind of makes sense, because if you think about it, Zach, I bet if somebody told you that President Trump wears diapers, you probably would go look for everything on President Trump wearing diapers.
[00:15:33] Ellen: And what you're doing when you do that is you're monetizing, you're actually contributing to, you're putting money in the pockets of somebody who is putting out disinformation. You're feeding that beast, because that's why we live in the environment we live in right now. We have democratized poor quality information.
[00:15:51] Ellen: People make too much money. They thrive off of putting out information that doesn't meet quality standards. And so the minute you move your eyes [00:16:00] to that, the minute we say, hey, Zachary, that's bad information, you're looking at it and you're giving them money. So maybe the answer is you figure out a way to create demand, enable demand, for quality information.
[00:16:09] Ellen: demand, enable demand for quality information. And the example I use is food. So I’m old. I grew up with Wonder Bread. Now I have kids that are college age. They have never eaten Wonder Bread. Now, why is that? It’s because in the last 40 years, I know what GMOs are. I know what saturated fat is. I know what sugar content is.
[00:16:32] Ellen: And by the way, my kids do too. Now that doesn’t mean you can’t buy Wonder Bread, but they, you also have other choices. And. And we have a market for food that is whole grain and healthy. And that doesn’t mean every once in a while I’m not hankering for a piece of Wonder Bread, but I can make those choices.
[00:16:48] Ellen: Why can’t we do the same for information? Why can’t we give you and your counterparts, um, access to quality information in a way that you want and you demand and you can get [00:17:00] without having to pay two, three, four hundred dollars a month for subscriptions? to information that falls behind firewalls or is you have to pay for and so that’s what our focus is right now.
[00:17:11] Ellen: We’re actually running a project out of the Belfer Center at the Kennedy School and we’re testing this theory that there is. that you can create or enable demand for quality information, you can move eyes, you can create a market, and when you create a market, you incentivize investment, you also move dollars from bad information to quality information, and we’ve picked it as our first policy area because organic food was not created overnight.
[00:17:38] Ellen: I don’t think quality information is going to be created overnight, but we’ve picked election integrity as our first policy or it’s kind of a big one right now. Um, You know, billions of people over 40 countries are heading to the polls in this environment of polarization and A. I. We’re very worried about the integrity of those elections in our country.
[00:17:59] Ellen: Elections tend to be [00:18:00] run at a state and local level, at a time when there is all this incredible uncertainty and there are tools that can mess things up, and they just don't have the resources, the cyber resources, or the people to help manage through this election process. And so what we're doing at the Kennedy School is identifying three or four baseline standards that every sector can agree to, because academia has its standards, technology has its standards, media and journalism has its standards.
[00:18:29] Ellen: Let’s see if we can harmonize three or four that will apply to everyone. that they all agree to, and we’re also building an AI enabled dashboard that is modeled off the dashboard that Johns Hopkins put out during the pandemic. And so it’s a resource that we’re going to, that will be open to voters and election officials and anybody.
[00:18:48] Ellen: And so if they want to identify what is in the information they're consuming about election integrity, they have a place to go to do that. Now, this is just our first little baby step, but the theory is, [00:19:00] if people are curious and they really want to understand what's really going on, and if you give them something that's trusted, because they've been involved in the development of the standards and measurements and they're familiar with this tool that will be AI enabled, then eyes...
[00:19:14] Jeremi: Will move.
[00:19:15] Jeremi: Isn’t the problem, though, that it’s a segmented market? So, you know, New York Times subscriptions are higher than they’ve ever been. Wall Street Journal subscriptions are pretty high, uh, New Yorker subscriptions are very high. But at the same time, millions of people are listening to Steve Bannon and Newsmax and all sorts of, uh, crazy stuff.
[00:19:34] Jeremi: And so it’s a segmented market and I’m sure you will succeed, you probably already are, in getting lots of eyeballs. But it’s probably already the people. Who don’t fall into the trap of disinformation.
[00:19:46] Ellen: Using my food analogy, yeah, there are people who still buy Wonder Bread. Exactly. So, you know, I may not get to the Steve Bannon acolytes, but I want to prove that there actually are people out there who maybe can't [00:20:00] afford to get behind the New York Times firewall, but they still want to know.
[00:20:03] Ellen: Is this polling site closed? Is this candidate still in the race? You know, who's still running? And I think our focus needs to be more on the consumer of data than on regulating the producer of data. I think that's the other thing that's critically important.
[00:20:19] Ellen: It’s about enabling and creating demand. It’s almost starting a movement where, Zach, I want you to get all your friends out there and say, I want to support this effort to, to create economic incentive to invest in. information that meets standards that you agree with. Now, you may not agree with the information.
[00:20:38] Ellen: This gets back to my intel roots. I all the time would provide intelligence analysis to people who didn’t want to hear it, and the same thing applies to just the media environment. You get to be snarky. You get to be critical. You get to question things, but question it based on an understanding that what you’re working off of is based on standards and measurements that [00:21:00] you agree with and and go from there.
[00:21:02] Ellen: Inform the debate.
[00:21:02] Jeremi: So how do you deal with an issue like vaccines, where there are people who, and clearly you know this better than I do, no matter how many facts you provide them with, they don't believe that vaccines work?
[00:21:14] Ellen: You know, I’m not sure I’m gonna in all honesty, I’m not sure you get and you know, there’s always been people like that, you know, so that’s again, that gets to the history of there has been yellow journalism, we have believed in conspiracy theories, we’ve never trusted government, that’s kind of who we are.
[00:21:27] Ellen: And I don’t think that’s ever going to change. What’s changed now is the scale and speed of our ability to move that information and the impact it’s having on trust and everything. I think it’s not about the data itself. It’s about the fact that now we no longer trust the institutions that are keeping us together.
[00:21:45] Ellen: And so can you imagine in a world where there’s not quality information, we’re not making great decisions on economics, on innovation, on investment, to your point, on health, we’re making poor decisions. And I know there’s always going to be some people who are going to be [00:22:00] anti vaccine, but I have to believe that if there is some tools, some Capability, something that everyone agrees that if we all know what a GMO is, we can all also know what provenance of data is.
[00:22:13] Ellen: And when we make a decision to consume something that day, we want something where we understand the provenance of that particular article or image or, you know, AI-created video. We just want to understand where it came from.
[00:22:26] Jeremi: It makes a lot of sense, especially for an educated audience. And, of course, education is part of this.
[00:22:31] Jeremi: One of the things that we stress as historians when we’re educating and writing is about the importance of knowing who produced the information. And I know that’s crucially important in the intel world. So, uh, is that part of your project, where there’s a clear, uh, origin address?
[00:22:47] Ellen: So, you know, as you’re looking, so, you know, people ask, well, well, tell me what kind of standards and measurements you’re looking at.
[00:22:52] Ellen: And I’ll tell you, I think sourcing or provenance is probably, and the measurement of that is probably the number one. Um, and, you know, [00:23:00] so from, as a former Intel analyst, um, sourcing was everything. You know, you wouldn’t just write an Intel report on the first source that you got on the first collection on the first article,
[00:23:11] Jeremi: Or the first thing you googled, as my students do, at the top of the Google results. You had to...
[00:23:14] Ellen: I mean, you had to validate that source. You had to share that source. Um, and that's actually an interesting standard. So instead of maybe having to buy a subscription to name-the-paper, maybe there's a source that is consistently verified against this list of standards, that is always providing timely, reliable information. Maybe it's synthetically produced, who knows. But now you know that if that information came from a source that is consistently meeting standards, that's somebody you should consider in making a decision on whether to get the vaccine or not.
[00:23:51] Jeremi: That makes a lot of sense. Uh, so what are the opportunities? We always ask this question in our interviews with people who have such knowledge as you do, Ellen. [00:24:00] Uh, what are the things that young people who are listening right now can do? How can they get involved? What difference can they make?
[00:24:05] Ellen: Oh my gosh. So in my effort at the TIM Cooperative, um, I mentioned we have this NextGen initiative, because the reality is that the way young people consume information is different from the way I consumed information. And so, you know, if we are trying to create quality information, I need to understand what you, Zachary, look at as a standard for quality information. How do you consume it?
[00:24:28] Ellen: You know, a lot of what we're trying to do right now is really looking at how you build media capabilities at the local level. And there's a reason for that. So you look at polarization and where we are right now, and it really starts at the local level, right? And it tends to happen in areas where there is no local media.
[00:24:52] Ellen: Um, you know, we have these news deserts, and it makes perfect sense if you think about it. So there's no local media, there's no storytellers in your [00:25:00] tribe anymore. And what happens is people are more easily recruited to extreme views. They tend to be less civically engaged. And the really interesting data point is these towns have a really hard time economically, with bond ratings and securing loans.
[00:25:16] Ellen: And that's where this all begins. And as somebody who started as a local journalist, it makes perfect sense. If you have no one there who's keeping your school board accountable or your local government accountable or helping with the local business, you don't have that connection. And by the way, all the data shows that while we don't trust the federal government, we still kind of trust anything local.
[00:25:37] Ellen: So the data shows that, to me, you need to create a way to share information at the local level that can be supported economically. And that's not the old local newspaper model. It's something new. I'm just not sure what it is. And where we can use help is, you know, we're creating these standards.
[00:25:56] Ellen: We're socializing these standards. We're creating this market for [00:26:00] information quality. I need help with: what are the tools? What are those great ideas? You know, we're looking at a dashboard, but I don't think a dashboard is ultimately going to be that thing. Um, so, you know, how do you create quality information?
[00:26:16] Ellen: How do you create a market? And something we're doing at the TIM Co-op is a series of tech challenges, or symposia, where we bring in folks who have great ideas and we invest in them to propagate those ideas and continue building them. And so that's where I could use the help.
[00:26:33] Jeremi: Absolutely. And you didn't realize this, Ellen, but Zachary is a local journalist. He's been writing on state and local politics in New Haven and Connecticut. Zachary, is this why you're drawn to local journalism?
[00:26:45] Zachary: I think so. I think it's still a place where one can interact with these issues in a personal way.
[00:26:50] Zachary: I think we still trust people that we meet, that we know, that we shake hands with. Um, and I think as a local journalist, but also as someone who consumes local journalism, [00:27:00] it's much easier to trust something when you know that someone you know well, or someone you feel like you know well through the pages of a newspaper, talked to someone firsthand in your community, or when you do that yourself.
[00:27:14] Zachary: Um, and I think that there's still a space in which we as human beings trust that instinctually more than we do this sort of national or international news, which is really hard to wrap our minds around.
[00:27:27] Ellen: And the data shows it. So you’re absolutely right. And thank you for what you’re doing. Keep doing what you’re doing.
[00:27:32] Ellen: And I think that’s, so when I look at creating and enabling a market for quality information, it really does. It’s as if you look out in the longterm, it’s, it’s really about, it’s about. Empowering, enabling, investing in those trusted capabilities that provide quality information and then pointing people’s eyes to that.
[00:27:52] Ellen: And so that’s,
[00:27:53] Jeremi: Yeah, so, Zachary, do you think that's possible? Do you think what Ellen is laying out here, which is actually a very idealistic [00:28:00] vision, creating a more vibrant marketplace of local journalism and empowering new actors, not the old fuddy-duddy local reporters, do you see that as a possibility moving forward?
[00:28:13] Zachary: I do. Um, but I also think that it's important for those of us who are consuming media to still look to the local sources that are still there. Um, I think certainly the evidence shows that the market for local journalism is waning and that there's less of a media landscape at the local level, but there are a lot of places, I think, in the United States where local journalism is still alive, if not thriving.
[00:28:38] Zachary: And I think that, in addition to this admirable project to try and combat disinformation and misinformation at a national and international level, as consumers of local media we can actually make a huge difference just by supporting public media organizations and local journalism in general. [00:29:00] Do you agree?
[00:29:01] Ellen: I do, I actually do agree. And, you know, I think maybe I am being a little bit idealistic, but I also think this is actually a problem that can be fixed on a very big level, and here's why. Um, so now, getting back to the big social media organizations, the big platforms, that right now are being told, you know, we're trying to regulate the supply of content.
[00:29:24] Ellen: We’ve got government coming in and saying, we’re going to do this, we’re going to do this, we’re going to do this, and it’s just not working. Um, you know, the, the, The Fox Newses did not want to be the truth police. The platforms did not want to be the truth police. Social media does not want to be the truth police.
[00:29:37] Ellen: They just don’t want to do it, and it doesn’t work. You moderate content, you fact check, and it doesn’t work. So that’s why all those organizations are getting rid of their trust and safety people. But I think if you can create this market for these, these, these are the things we expect, and information that meets quality standards, almost like the good housekeeping seal of approval, social media will be very much on board.
[00:29:59] Ellen: They This [00:30:00] is about taking a page from the book where they make money and just moving it to another set of products. So I, I can see that if, you know, and we’re looking at this, we’re looking at this little test case of election integrity. If people move towards information that says these polls are still open, they’re, the social media platforms will be all on board with that.
[00:30:20] Ellen: So, I mean, I think it's wholly fixable, and I think our timing is perfect for this discussion, by the way, because everybody is afraid of AI. You know, the world is ending, we're going to lose our jobs, everything's over. We're like this with every new invention, by the way. I don't believe that's going to happen for a second.
[00:30:38] Ellen: I actually think it's paramount that we as a country, though, understand it and set up the framework. I don't think the government's going to do it, but we somehow have to get our arms around this. But I think AI is going to be a resource for incredible good. Sure, sure.
[00:30:57] Ellen: In terms of helping with this problem.
[00:30:59] Jeremi: [00:31:00] Absolutely. It allows us to sort through information with incredible power. Now, of course, what we're sorting for is the question. I mean, I guess the last question for us really, and probably the most important one, is: where is the capital for this going to come from?
[00:31:14] Jeremi: Traditionally, we have had in the United States an aversion to government-sponsored media. On the other hand, though, one of the most important innovations during the Cold War was that the federal government invested, uh, especially during Lyndon Johnson's presidency and thereafter, in National Public Radio and the Corporation for Public Broadcasting and things of that sort.
[00:31:36] Jeremi: And most of us have at some point watched the BBC. I think that's Zachary's favorite source of news still, right? And Deutsche Welle. So many societies have, uh, government-supported, high-quality news that works because it's not incentivized to get clicks. It's incentivized to pursue the truth in one way or another.
[00:31:55] Jeremi: Is that part of this? Do we need government funding? Where's the money gonna come from?
[00:31:58] Ellen: So I don't think we need... Well, right now I [00:32:00] don't think the government wants to be involved in this at all. In fact, in this effort that we've got right now, we do have government serving as observers. But, you know, anytime you get government involved, especially on the topic of election integrity... While I think the Department of Homeland Security is very interested, state and local governments are very interested in what we're doing right now.
[00:32:18] Ellen: I think I’ve talked before about, you know, we need to treat information like a critical role. infrastructure like food, water, nuclear power, and if you look at most of this country’s critical infrastructures, they started as private sector, self organized efforts. The electric grid was a bunch of power companies, minus Texas, who got together.
[00:32:38] Ellen: They created their own standards, their own certifications. If there was a blackout, they would go and investigate. They held one another accountable. They would find one another. The government would encroach. They’d say, no, no, no, no, no. We’ve got this. We’ve got this. It started in 1968, the National North American Electric Regulatory Cooperative.
[00:32:55] Ellen: But it was only in the 90s that then they found where there were certain areas where government absolutely had [00:33:00] to become engaged. So if there’s blackouts in major cities that are affecting the entire country, We need to we need some government regulation, and I think the same thing can apply here. So right now, I don’t think we need a lot of government involvement, but in terms of the financing for this, you know, I see the long term economic model is, as I as I mentioned, just the just the act of moving eyes from One platform to another creates, there’s a, there’s a resource model there.
[00:33:27] Ellen: But I also talked about, you know, incentivizing investment. Part of the problem we have right now is there are already some pretty good media outlets. All Sides is one that I'm a big fan of. They can't scale because they can't compete. So now, if you have created these standards that are being taught to, that technologies are being developed to, that journalists are abiding by, you know, maybe now you can start better scaling your investment.
[00:33:53] Ellen: Already, by moving eyes and creating demand, which will create investment. You're building, you're [00:34:00] building organic food. So we're building the
[00:34:01] Jeremi: organic food market. You know, it’s such an interesting way you describe this because in some ways the patron saint of our podcast is Franklin Roosevelt, and if you think about one of the great accomplishments of the New Deal, it was rural electrification.
[00:34:14] Jeremi: So Lyndon Johnson and others of that generation grew up.
[00:34:27] Jeremi: But also had government assistance, right? So it’s this balance. It’s what I hear you saying is a public private partnership. Right, so
[00:34:32] Ellen: if you use the food analogy, it’s just a perfect one. Also subsidized. I might like my McDonald’s hamburger, or I might like my organic chicken, but the one thing I know that no matter what I eat, I’m not going to die.
[00:34:43] Ellen: Because the government is involved in making sure it's not poisoned.
[00:34:45] Jeremi: Exactly, the FDA.
[00:34:47] Ellen: We will need government involved, but we just don't know where it is right now. Can I also share with you the reason I love FDR? Have you ever heard of Wild Bill Donovan?
[00:34:54] Jeremi: Of course, OSS, the basis for the CIA. He is my hero.
[00:34:57] Jeremi: Your job exists because he [00:35:00] created the Office of Strategic Services.
[00:35:01] Ellen: Right, and I think we need another Wild Bill Donovan.
[00:35:04] Jeremi: Zachary, final words, uh, for you. Do you think this vision that Ellen's laid out, I think, so articulately and so creatively, do you think this is a vision that can draw young people like yourself who care about news, care about truth, and care about fighting this disinformation?
[00:35:21] Zachary: Yes, I think so. And I think that, um, I really like the approach of using technology to control technology, or using technology as a step forward, as opposed to trying to prevent what seems at least like inevitable technological advancement. Um, and so on that note, I think young people should...
[00:35:43] Zachary: Young people in particular who are interested in technology, as many of us are, should see this kind of work as part of that field, but also as maybe part of the mission of technology in our world today.
[00:35:59] Jeremi: You know, I think [00:36:00] today, Ellen, you have helped us really unpack one of the most important concepts in our democracy.
[00:36:05] Jeremi: And it does go back to FDR. It goes back to James Madison too. That democracy is a constantly unfolding system. It’s a system that has guidelines and principles. But with every new generation, we’re reinventing the uses of technology, the uses of resources to protect our democracy and to take it into a new context.
[00:36:24] Jeremi: And that’s what you’re describing. We’re not going to go back to the world of the journalism of the 1960s and 70s with Walter Cronkite. My students don’t even know who Walter Cronkite is. We’re not going to go back. to that world. We’re probably not going to go back to the world of the New York Times.
[00:36:37] Jeremi: The New York Times is probably, as much as I love it, an archaic institution. It’s going to be a different kind of media and information scape. And the way we beat the jerks of disinformation, as you called them, is by providing an alternative mode and an alternative incentive structure. And I think you’ve laid that out really well.
[00:36:55] Jeremi: Thank you so much for joining us. Thank you. Thank you both. It’s really been a wonderful [00:37:00] discussion. Thank you, Zachary, for your moving poem as always. This was a particularly good one, I think, and thank you most of all to our loyal listeners for joining us for this episode of This Is Democracy.
[00:37:18] Outro: This podcast is produced by the Liberal Arts ITS Development Studio and the College of Liberal Arts at the University of Texas at Austin. The music in this episode was written and recorded by Harris Codini. Stay tuned for a new episode every week. You can find This is Democracy on Apple Podcasts, Spotify, and Stitcher. See you next time.