Categories
podcast

Can blockchain revolutionize journalism? And make it less polarizing?

Can blockchain tech reinvent journalism—and reduce toxic polarization in the process? In this episode, Zachary Elwood talks with Don Templeman, founder of Aemula, a radically new kind of news platform. Inspired by the decentralization and transparency of cryptocurrency and other blockchain-based technologies, Aemula aims to create a bias-resistant newsroom of the future—one where algorithms are public, incentives reward nuance, and toxic polarization is nudged downward by design. Whether you’re a blockchain skeptic or a media reform enthusiast, this is a conversation about what’s broken in journalism—and one bold idea for fixing it.

Episode links:

Resources related to or mentioned in this talk:

TRANSCRIPT

(Transcripts are automatic and will contain errors.)

Zachary: Hello. This is the People Who Read People podcast with me, Zach Elwood. This is a podcast about understanding behavior and psychology. You can learn more about [email protected]. There’s a guy in New York City named Don Templeman working on a news site that might just be the future of journalism: a complete rethinking of how news should work, a new paradigm, at the risk of using an overused word.

You might make an analogy to Bitcoin and cryptocurrency just as that is an entirely new way of thinking about money in an attempt to make currency trustless and decentralized. This is doing the same thing for journalism. [00:01:00] I am a little hesitant to use that analogy because so many people have a negative view of cryptocurrency or maybe don’t understand why so many people see it as exciting.

But leaving aside your views on crypto, the important part is that this is a dramatic re-imagining of our news system from the ground up. Don Templeman’s project is called Aemula, which you can find at aemula.com. That’s A-E-M-U-L-A. Don thinks it’s possible that he’s creating the newsroom of the future, and after meeting with him and talking to him about the news ecosystem, and about technology and politics and polarization dynamics, I think it’s possible he’s right.

I’m impressed with Don and think he’s onto something very big and very important. I think he has a lot of smart ideas, and I think no one else is doing what he’s doing. [00:02:00] Taylor Dotson is the author of The Divide, which I think is one of the best books about American polarization. Taylor also expressed his support for Aemula, saying, “Don Templeman is laying the foundations for a trustworthy informational environment at scale. The digital newsrooms of the future will look something like Aemula.”

I wanted to try to instill in you some of the excitement I had on learning about Don’s work. You’ll like this episode if you’re interested in better ways of doing news and journalism, or if you’re interested in how blockchain technology can be used to create healthier social incentives.

Or if you’re curious to know why people are excited about blockchain technology and why so many see it as a game changer with broad applications in many industries. If you didn’t already know, I’ve written two books on polarization and for the last 1.5 years I’ve been working on that pretty much full time with some [00:03:00] nonprofits and doing my own work.

Like with my Substack and my podcast and various interviews and writings. I myself focus on cultural change as a way to improve things in these areas as opposed to systemic changes. It is not that I don’t think systemic changes have their place, it’s just that I think most systemic changes are unlikely to succeed because we’re so polarized that we’ll never agree on making those systemic changes.

For example, let’s say that we were a hundred percent certain that ranked-choice voting would lead to less polarization and discord, which is not certain; many people would disagree with that. But let’s say we were certain: I don’t think we’d ever see Republicans and Democrats get on the same page to pass something to change things in that area.

I think toxic polarization leads to us becoming polarized over pretty much everything of significance. So [00:04:00] even if theoretically many people supported something and were in agreement, as soon as a Democrat or a Republican leader becomes associated with that idea, we’ll likely become polarized over it.

That’s just what polarization tends to do to us. It makes us fight over stuff in unreasonable ways. As I’ve talked about in a previous episode, we can have a tendency to instinctively think something like, well, if the bad guys are for this thing, we should be against it. So that’s one of the reasons I focus on cultural change: trying to arouse a general demand for less contemptuous and toxic ways of engaging.

I have a past episode where I talk with David Foster about cultural change versus systemic change. David has worked on proposing changes to media and news systems, and again, to be clear, I’m not saying that systemic change focus is a bad thing or a [00:05:00] waste of time. I think we need people thinking about all these things.

I just personally think that there’s a lot of low-hanging fruit in the cultural-change area, and we need more people working on that. But the interesting thing about Don’s project Aemula, what made it exciting to me, is that it is a private-sector thing, not something that needs to be mandated by the government or passed via legislation.

And so if Aemula became successful and became used by millions of people, eventually it could really shift incentives and change the culture without anyone ever being able to say this was forced upon us, or that it was associated with one side or the other. All it has to do is what it sets out to do: be a great news site that people want to use. Its other benefits, the ones about better, less polarizing incentives and ways of engaging, will unfold indirectly, just [00:06:00] as a part of it becoming popular.

So, okay, what is Aemula? Well, it’s a news platform, but it operates completely unlike other news platforms. It might be easier to walk through some of the ways it works that make it unlike other news platforms. For one, it is decentralized.

You might have heard this word used to describe cryptocurrency: crypto is a decentralized currency. But what does decentralized mean? It means there is no one actually in charge of it. Don sets up the way the system works, for example, the way that Aemula decides to promote submitted articles to readers.

And then Aemula operates on its own. It has its own baked-in rules that cannot be changed, although some rules can be changed by a community of people who vote to change them, similar to some other blockchain-based services. Aemula is also [00:07:00] transparent. Its algorithms, how it works, are in full view, visible to all.

There’s nothing hidden. So these two things about Aemula, the fact that no one’s in charge of it and the fact that it is transparent, help build trust. Unlike other major sources of news now or in the past, there are no editors deciding what to feature. This means that it takes away perceptions of bias.

It removes the tendency to be angry at the news platform itself, or the editors, or the owners, for their bias or their propaganda or their malice or these kinds of things. Now, people may not like the way Aemula works and the content it exposes to them, but that’s a different story. They can trust that the algorithm is transparent and public, and if Aemula is working properly, people will like the things it surfaces to them and want to keep using it.

And because it’s decentralized and operates on its own, it is infinitely scalable. [00:08:00] Unlike existing traditional newsrooms, Aemula’s billing and money distribution is also entirely automated and transparent. Subscriber money goes into a pool, and then content creators get automatically paid based on how much engagement their articles get.
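That pool mechanic can be sketched in a few lines. This is only an illustration of the pro-rata idea, not Aemula’s actual on-chain logic; the function name and the single engagement score per author are assumptions made for the example.

```python
def distribute_pool(pool, engagement_by_author):
    """Split a subscription pool among authors, pro rata by engagement.

    pool: total subscriber money collected for the period.
    engagement_by_author: hypothetical per-author engagement scores.
    """
    total = sum(engagement_by_author.values())
    if total == 0:
        # No engagement this period: nothing is paid out.
        return {author: 0.0 for author in engagement_by_author}
    return {
        author: pool * score / total
        for author, score in engagement_by_author.items()
    }

# With a $1,000 pool and alice earning 3x bob's engagement:
payouts = distribute_pool(1000.0, {"alice": 300, "bob": 100})
# payouts == {"alice": 750.0, "bob": 250.0}
```

Because the same rule runs for every author and the inputs are visible, anyone can recompute any payout themselves, which is the transparency property the episode emphasizes.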

But wait, you may be saying or thinking, there’s no one in charge? That sounds like it would be pure chaos. Wouldn’t it turn into a madhouse? Wouldn’t it be like an out-of-control 4chan or Reddit thread or something like this? But that’s where Aemula is meant to shine. It uses a sophisticated content recommendation protocol that tries to both, A, give people what they want based on other articles and writers they’ve liked,

while also, B, moving their content recommendations in less polarized, less fringe directions, and more in the direction of ideas and news that have appealed to a broad range of people. Now keep in mind that when I say polarized or [00:09:00] fringe here, we’re talking also about contempt and animosity and just plain obliviousness here.

We’re not talking about ideas. So it’s not just about moving people towards ideas and stances that are moderate or in the middle, so to speak. It’s about moving people towards less contemptuous takes and coverage, coverage that understands and respects a broad range of views, that is not oblivious to the ways that many people see issues and stances.

Another way to understand this is from the journalist’s perspective, the view of someone who submits an article to Aemula. Because of how the algorithm works, that journalist or pundit will have an incentive to try to speak to a broad range of people, as opposed to just people on one political side, as opposed to just venting to and speaking to one’s allies.

The algorithm creates the [00:10:00] incentive to try to reach more people and be more persuasive. I think that’s what’s exciting about Aemula. It is meant to create a self-sustaining, infinitely scalable system that has much healthier, more social, less polarized incentives than our current news ecosystem. If it were to become popular, it could create a seismic shift in how people create and consume news.

It would give power to journalists and pundits who take more respectful, less polarized, more nuanced approaches. It would lead to less polarized, hateful discourse. It would lead to more nuanced discussions. It would lead to more creative compromises becoming visible. As I said, I don’t often get excited about ideas for changing the system, but I’m excited about Don’s project.

I recommend you sign up to Aemula on the main site, which is aemula.com. Again, that’s A-E-M-U-L-A. And I recommend that you also sign up for the [00:11:00] Aemula Substack. Okay, here’s the talk that I had recently with Don Templeman, founder of Aemula. Hey Don, thanks for joining me.

Don: Yeah, thanks so much for having me on. Always a pleasure.

Zachary: Oh yeah, my pleasure. So maybe we can start with, as you and I have both seen, trying to explain this to some people recently, it can be kind of hard to communicate your vision for this thing in short form, which I think speaks to what a paradigm shift it is in how people think about news and journalism.

But maybe we could start with an analogy or two, and I’ll give you a rough sense of how I see it. I kind of view it as, sort of like how the constitution of a country can help set things up to run in ways that are [00:12:00] self-policing and help create good outcomes. Basically, what you’re trying to do is something like that: create a self-sustaining system with good incentives for a news and journalism platform.

Is that a good analogy? And maybe you can talk about that.

Don: Yeah, exactly. What we’re trying to do is set up an ecosystem, an incentive structure, with rules that allow writers to go out and produce high-quality independent journalism, and readers to be able to consume it and know that they can all trust that it’s a credibly neutral platform.

It’s a high-trust environment, and with constitutions, that’s essentially the intent when countries use them. But what you’re relying on with constitutions is the execution of that vision, of those rules and values that you’re setting up from the start. You’re relying on other people to execute that vision.

So you’re relying on the judiciary process to actually work and make sure that it’s not [00:13:00] corrupted, or on executive functions, everything like that. Whereas what we’re trying to do is remove that reliance on trusting individual other humans, by building it on decentralized technologies that are inherently trustless.

So you’re not necessarily having to rely on other individuals to make sure that you can trust the information you’re getting. You can see that this is a program that is running, that there isn’t any way for a malicious actor or outside influence to censor or push specific narratives. You can just join the ecosystem.

It’s open to everyone, and everyone can rest easy that everything is running according to the rules that we’ve all accepted.

Zachary: Yeah, I think that’s where people, especially people who aren’t that familiar with blockchain or decentralized structures, can struggle with this, because I’ve seen, when I was explaining this to some people, that a lot of people will think, well, somebody’s gotta be in charge of these [00:14:00] editorial decisions, of what we promote or what articles get promoted.

But maybe you could talk a little bit about what the decentralized technology really means. I guess an analogy for it is how Bitcoin or other cryptocurrencies create a self-sustaining or self-policing system that works on its own. That’s what you’re trying to do for journalism, right?

Don: Absolutely. And I think the Bitcoin analogy is good because it provides a simple structure that you can use to start to understand a lot of the mechanisms that we’re using, where we’re building Aemula on Ethereum. So with Bitcoin, it’s a digital currency. And traditionally with currencies throughout human history, what you’re relying on to be able to transact, and to make sure that no one is creating their own bank balances, no one’s printing their own money, is trusted third parties.

These are [00:15:00] institutions like banks or governments or federal reserves that are maintaining the currency system. And what Bitcoin is doing for currency is removing that trusted intermediary and saying that you can trust this digital protocol where everyone is coming together and collectively agreeing on the state of the protocol, essentially the transactions that are happening, everyone’s bank balances.

So that we can all go and transact freely without having to rely on banks or governments or middlemen. The same thing is happening with Ethereum, and what they’re trying to do is create a world computer, essentially one computer that everyone can come in and work with. It’s essentially a network. So think of traditional corporate internet platforms, think of Facebook or Substack.

These companies are running their platforms on their own servers that they fully control. So Facebook controls the gates: they can say who has [00:16:00] access to Facebook, they can delete posts on Facebook, they can change the algorithms that Facebook is running and using to create your feeds for you, because they fully control it.

Whereas if we build a protocol on top of Ethereum, no one inherently controls that computer that we’re using to run this program on top of. So no one controls who has access. No one has the ability to remove content or censor content. So everyone is able to come and contribute freely and work together to collaborate on the shared mission, which is essentially the protocol that we’re putting together for Aemula specifically: a protocol for producing and distributing independent journalism.

Zachary: Right. So anyone can submit content to it, and the system automatically promotes content to people that it thinks will be interested in it. And maybe we could talk a little bit about, ’cause I think a lot of people hearing that, they’re like, well, nobody’s in charge, it’s just gonna devolve into [00:17:00] madness and chaos, and, you know, we’ve seen how these things can play out.

So maybe you could talk a little bit about how the algorithm is promoting content, and how you’re trying to reward people that speak to a broader range of people, to try to break the usual incentives for speaking to bubbles and such.

Don: Yeah. I’ll start very high level, because I think curation is such an important aspect of news, and of interacting with information online in general, because we’re in such a digitally interconnected global society, where most people feel like they have some pulse on what is going on globally. But when you think about the worldview that you’ve created, you really are only able to create it based either on stuff that you’ve directly experienced in your own life, which is a very small subset of that information, or on information that has [00:18:00] been reported to you by third parties.

And in most cases, these are third parties that are strangers to you: news reporters, people in different countries reporting the news, people on social media that are sharing posts. And so a lot of the time you’re relying on and trusting these strangers on the internet to provide you information that you’re then using to generate your worldview, that you go out and share with other people and use to form the basis for your own belief system.

Zachary: Right. We’re all products of our surrounding ecosystem.

Don: Exactly. And historically what we’ve had to do is rely on trusted intermediaries to handle that curation process. So with legacy institutional publications, you’re relying on the credibility of the New York Times, the Wall Street Journal, the Washington Post, because they have such a long track record of generating high-quality professional journalism.

I know that if I subscribe to the [00:19:00] New York Times, I’m trusting that their editorial board is going to go out, sift through all of the information being generated in the world on a daily basis, and condense and curate it into something that is relevant to me, that is engaging, and that is something I can trust to actually build my own belief system off of.

A similar process happens on social media, but that is more algorithmic, where you’re saying: I will create a Facebook account, I’ll create an X account, and I will read the stuff that comes up in their news feed. I’m trusting their algorithms to promote content to me that I will find engaging, and I’ll follow people and subscribe to people that align with my beliefs.

And I’ll use that to form the basis for my belief system. However, you need to start to look into the incentive structures of how these different institutions are forming these curation algorithms. So with Facebook, with X, they’re all free to use, advertiser-driven. [00:20:00] And so they’re trying to optimize their algorithms to promote content that captures your attention, which they can use to sell advertising.

And you’re more of a product in those ecosystems, where I’m giving you my user data, I’m giving you my preferences, and they’re using that to sell to advertisers to target you with personalized advertising. The type of content that ends up getting promoted by those algorithms is more of that clickbaity, rage-baity type stuff: it gets people arguing in the comments, it gets people sharing it with their friends, it gets people talking about it. And that’s the amplification cycle of this inflammatory environment, where all of a sudden everything online seems so much more polarized than it is in our actual day-to-day lives of interacting with individuals.

Because if you’re actually just sharing what the vast majority of us actually experience and believe, it’s not really that exciting, and not something that gets people retweeting things on X. [00:21:00] So what we’re trying to do is create a different incentive structure with the algorithms that we’re using to curate content, because if we’re not relying on an editorial board to go out and cherry-pick articles, we’re going to have to rely on algorithms, just given the vast complexity of all the information being generated on a daily basis.

So the first thing is, all of our algorithms we want to be completely open source, fully transparent. Anyone can come on and see: why am I seeing the content I’m seeing? Writers can see: what are the goals I’m trying to hit, and how can I receive larger promotion for my work on the platform?

So everything needs to be completely open source. And for open-source, fully transparent algorithms to actually be usable, they need to be human-readable and easy to understand. So you can’t rely on machine learning, artificial intelligence, these black boxes where you don’t really know what’s going on inside.

You just know their goals. So we’re trying to have a very [00:22:00] simple, human-readable algorithm. But then on top of that, our main goal is to be able to reverse these trends of polarization that we’re seeing in the media. And we can do that by promoting articles that receive a lot of diverse support, that are written from a more moderate perspective, that have gone through a peer-to-peer editorial process, that may have received feedback, that may be backed up by more research.

So we can include all of these as inputs into how we’re ranking articles in our system. But what we’re able to do, without getting too into the details on this now, is this: we, as Aemula in this context, don’t need to know any underlying data about the users. We don’t need to know any underlying content of the articles.

We’re just looking relationally at how people are interacting with specific articles, how people are interacting with specific authors, [00:23:00] because then we’re able to back out and see roughly who agrees with whom on the platform. We don’t necessarily need to know what their perspectives are. We don’t need to know if they’re left-leaning or right-leaning.

We don’t have to try to put content labels on these types of perspectives, but we can roughly see where people fall in this general population of platform users. And we can map out what the actual central consensus viewpoint on the platform is, and where the fringes are. And once we have this map, we can say that people who are writing from this moderate center are likely making better arguments than people who are maybe getting a lot of engagement from a small group of people but are all on the fringe. And we can look at the people who are agreeing with those articles and see if they’re coming from diverse backgrounds and different pockets of ideologies.

Then whoever’s writing that article is probably making arguments that are based in fact, sound arguments, reasonable arguments that are easy to engage with for people of [00:24:00] all sides. Once you are able to promote and rank articles based on that diversity, you’re able to start to create new bridges, new pathways for people to discover new perspectives, because it allows them to slowly, over time, start to become exposed to new perspectives. Rather than showing someone an argument from the complete opposite side of the aisle, where they’ll probably quickly dismiss it and discredit it as false even if it is making strong, sound arguments, we can show them something that is adjacent to them, or slightly more moderate than their current point of view, that they’re likely to agree with.

And then over time you can slowly depolarize the entire ecosystem.
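The ranking idea Don describes, rewarding articles whose supporters are spread across the map rather than clustered on a fringe, can be sketched as a toy scoring function. Representing each supporter as a single coordinate inferred from who-agrees-with-whom data is a deliberate simplification for illustration; Aemula’s real protocol is not specified here.

```python
from statistics import pstdev

def article_score(supporter_positions):
    """Toy ranking: volume of support times diversity of support.

    supporter_positions: one coordinate per supporter, inferred
    relationally (who agrees with whom), with no content labels.
    """
    if len(supporter_positions) < 2:
        return 0.0
    # pstdev measures how spread out the supporters are.
    return len(supporter_positions) * pstdev(supporter_positions)

# Four supporters clustered on a fringe vs. four spread across the map:
fringe = article_score([0.90, 0.92, 0.95, 0.91])
broad = article_score([-0.6, -0.1, 0.2, 0.7])
# broad > fringe: diverse support outranks an equally large fringe cluster.
```

Note that the scoring needs only the relational positions, not any labels on the content itself, which matches the "no left/right tags" point above.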

Zachary: So, yeah, I think the interesting thing about this, the thing I think a lot about, kind of relates to my experience in poker and thinking about game theory. I think a lot of people think [00:25:00] that if you made your algorithm, your strategy so to speak, public, and everyone knows it, that it opens it up to gaming and exploitation.

But the thing that you’re trying to do, and that other companies try to do with these kinds of open-source strategies, is to create a strategy with built-in incentives, so that even if someone was trying to game it, that’s a good thing from your perspective, right? If somebody is trying to game this algorithm, they’d be trying to create content that speaks to a wide variety of people, and that’s a good thing.

So I think you’re creating the very incentive that you want to see, even if people try to exploit it, right? Am I understanding that correctly?

Don: Exactly. We want you to try to game the system, because you’ll see people posting on X or people posting on Substack about how to growth-hack your audience, what games you have to play on the platform to try to gain exposure.

What are the games you have to play on the platform to try to gain exposure? And with ambulance, like the incentives we’re putting in place are if you’re [00:26:00] trying to gain the system, if you’re trying to increase your exposure, increase your monetization, earn the financial rewards that we’re trying to put out there, you’ll start to be riding from a depolarized perspective.

Inherently, uh, which we think will benefit people over the long run because we want individuals bringing their own unique perspectives. If you’re relying on humans to generate content, they’re gonna be bringing their own biases no matter where they fall. But we think if the ecosystem has those incentives towards depolarization, then that will happen as a second order effect, and everyone is able to still have the liberty to write and interact freely.

But if that’s where the money is, that’s what people will start to align their engagement with. And most of the time people are moving the other way, where they may have a more moderate perspective, but then they’ll try to use more inflammatory language or make everything more over-exuberant, like the [00:27:00] MrBeast-ification of YouTube, just to try to get clicks and engagement. Whereas we can allow people to actually just share the more moderate point of view, and they’ll actually receive more monetization that way.

Zachary: Yeah, and it’s worth throwing in, too, for people hearing this: people hear moderate or centrist and they start thinking, well, you’re trying to change people’s beliefs towards some moderate or centrist beliefs.

But I think the important thing is that a lot of what we’re talking about in terms of moderation is the contempt that people have for other views. It’s not necessarily a moderate or in-the-middle stance that somebody has, or that might be popular. Somebody might have a view that many people think is even extreme, but they’re expressing it in a persuasive way and not demonizing groups of people. It’s all about how you express it. So I just wanted to make that clarification, because a lot of people will kind of mingle beliefs with this level of engagement and contempt. And I think it’s important to [00:28:00] point out that there are all sorts of views that could gain purchase with the audience.

Right, exactly. So, yeah, and I wanted to say too: for me, I think so much about the incentives, the systemic incentives, and that’s why it’s hard for me to get too upset about people’s behavior. I just see so many ways that this systemic thing that we’re in, this toxic-conflict kind of scenario, with the various incentives of structures like media and politics, has so many systemic elements to it. And so many people, I think, focus on specific people as being powerful agents. For example, they might say Fox News or MSNBC are making bad decisions and polarizing us. But I think it’s important to see that they are part of a system, and they are playing by the rules of that system, [00:29:00] whether they know it or not.

There’s a range of people: some trying to specifically rile people up, some who are just biased, some who actually believe what they’re doing. But there are various ways that the system incentivizes polarizing behaviors. And I think when you see that clearly, as I think you and I see that aspect of things clearly, it really shows the importance of creating better incentives and not getting so bent out of shape about specific actors.

And it’s like, can we work on these foundational incentives? That’s what’s so exciting to me about your thing, because I actually see very few systemic fixes that could actually work. For example, if it was a government-based system, we’re so polarized that it’s very unlikely we’d ever get on the same page about passing some big systemic change government-wise. That’s why I think ranked-choice voting is kind of a dead end, because I think it’d be very hard; [00:30:00] we’ll become polarized over that in various ways. So the various things that people might propose, I think, are difficult to get passed. But I like the organicness of yours, and the fact that it might grow organically and be a real paradigm shift.

But yeah, there’s just so many incentives baked into these various systems. Yeah.

Don: Yeah. And that was sort of the genesis of the idea: I was starting to realize that everything I was reading online or in the press was seemingly more polarized than my actual day-to-day experience of speaking with friends, meeting new people, and actually talking about things.

And I may be an optimist in this regard, but I think most people, when you’re interacting directly, are able to find some common ground. And if you actually spend the time to have that conversation, typically things aren’t as inflammatory as they seem online. And when I started to try to understand [00:31:00] what is actually driving this, what the underlying motives are that are driving people to be more polarized, there were a lot of contributing factors. But what seemed to be one of the largest contributors was just the incentive structures of our media systems.

With corporate social media platforms, you’re relying on advertising-supported, free-to-use platforms, with these algorithms that are incentivizing people to promote more inflammatory, clickbait-type articles. With institutional publications, you’re running into this audience-capture type scenario, where they’re having to carve out their own niche within these media markets, and the audiences that they then capture, they’re trying to support by providing them information that aligns with their audience’s beliefs. And then they’re hiring journalists that are able to write from those [00:32:00] perspectives, the editors on these editorial boards are selecting information to align with those perspectives, the investors, the entire ecosystem they’re creating, is all aligned to support this niche perspective of their audience.

And so that’s why you start to see this fragmented landscape, where it’s so difficult to be a reader of one publication, then switch and start reading another, and you’re like, this seems like a completely different world over here. You switch between worlds. Yeah. Fox News and MSNBC, it’s always completely different

Zachary: narratives.

Yeah,

Don: exactly.

Zachary: Yeah. And I think too, it’s also just the various incentives, you know, not even intentional. Like, there are a lot of true believers. Say you’re, you know, an editor or a journalist at the New York Times and you find it really hard to understand the, quote, other side’s point of view. That can’t help but, you know, leak into the things that go out.

And so there are compounding aspects of how we form these two [00:33:00] divergent narratives, with true belief or, you know, kind of subpar incentives working together. And yeah, it’s just a whole stew of things: biases and bubbles of information and lack of understanding the better arguments on the other side, et cetera, et cetera.

Um, yeah. So I wanted to ask you, I think one thing when people hear about this kind of approach, and I’ve seen this in other digital efforts too, is that people have a response sort of like they have to AI, where they’re like, oh, you’re trying to take all the soul and humanity out of, um, you know, news and journalism.

You know, you’re destroying the traditions and the humanity and the human choices. But I think that’s not seeing what your vision would be. Maybe you could talk a little bit about what you see that kind of argument is missing.

Don: Um, I’m glad you mentioned AI in that context, because that is what we’re trying to [00:34:00] avoid.

We’re specifically building an ecosystem that is verifiably human and relying on human-generated reporting. Because we want to set ourselves up in this new age of AI. It’s out there: you can get AI summaries on pretty much any news event that’s happening. If you Google something, the first thing you see now is Google’s AI-generated summary.

We’re moving away from actual human reporting. And if we want to focus on human flourishing and human creativity, and humans actually reporting their experiences through the news, we need a more human-centric ecosystem rather than trying to go these AI routes. And so what we’re trying to do is leverage this decentralized technology to support humans in actually generating human-created content.

And one of the interesting things that we’re able to do with that is we can verify someone as a real human. There are multiple [00:35:00] methods to do this. I think the one that most people may be familiar with is Sam Altman’s Worldcoin, now just rebranded as World. Uh, I don’t know if you’ve seen this, the one where you scan your retinas and then you get a proof that you’re a real human.

Uh, a lot of people have issues with scanning their retinas and how that data is used. But there are multiple different methods where you’re able to, on your own device, prove that you’re a real human, and then use that proof, without giving up any underlying personal data, to say: this is my account, I’m a real person, I’m not a bot, I’m not an AI.

And we can use that proof as verification of our users, that they’re real people. And so what we’re able to do is assign a higher reputation to people who have verified as being real humans and not AI bots. And we’re starting to see more and more prevalence of AI agents operating online.

We’re already seeing it on X, where [00:36:00] there are a bunch of autonomous AI bots that are posting. Even on other platforms like Farcaster, if you’re familiar, people are already complaining about getting in arguments with someone and then realizing that it’s actually just an AI bot that they’re arguing with.

We can start to verify that we are a fully human-centric ecosystem, so that when our readers log in, they know: oh, I’m reading a real news report that was generated by an actual journalist, and I can trust that this is real information and not something that is being spouted by some AI bot.
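To make the proof-of-personhood idea concrete, here is a toy Python sketch of the pattern Don describes: a platform checks an issuer’s attestation and rejects duplicate registrations without ever seeing the underlying biometric data. Everything here is illustrative; the HMAC stands in for a real zero-knowledge proof, and the names are invented, not Aemula’s or Worldcoin’s actual protocol.

```python
import hashlib
import hmac

# Hypothetical issuer key. In a real system this would be a public key,
# and the proof a zero-knowledge proof, not a shared-secret HMAC.
ISSUER_KEY = b"issuer-secret-demo-key"

def issue_personhood_credential(device_scan: bytes) -> dict:
    """Issuer attests 'this is a unique human' without storing the scan.
    Only a one-way nullifier derived from the scan leaves the device."""
    nullifier = hashlib.sha256(device_scan).hexdigest()
    tag = hmac.new(ISSUER_KEY, nullifier.encode(), hashlib.sha256).hexdigest()
    return {"nullifier": nullifier, "tag": tag}

def verify_personhood(credential: dict, seen_nullifiers: set) -> bool:
    """Platform checks the attestation and rejects duplicate humans,
    learning nothing about the underlying biometric data."""
    expected = hmac.new(ISSUER_KEY, credential["nullifier"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, credential["tag"]):
        return False  # forged or corrupted credential
    if credential["nullifier"] in seen_nullifiers:
        return False  # same human trying to register a second account
    seen_nullifiers.add(credential["nullifier"])
    return True

seen = set()
cred = issue_personhood_credential(b"alice-iris-scan")
print(verify_personhood(cred, seen))  # True: first registration
print(verify_personhood(cred, seen))  # False: duplicate rejected
```

The key property, mirrored from the conversation: the platform stores only the opaque nullifier, never the scan itself, yet can still enforce one account per human.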

Zachary: Yeah, that was one of the misunderstandings when I told someone about this just yesterday. They thought it was using AI to, like, parse different viewpoints and do something in the middle. So yeah, that’s another common misunderstanding.

Don: Actually, just one quick point: we’re trying to do the inverse of that. I think a lot of companies are trying to skew towards, like, how can we leverage [00:37:00] AI in our platform, because that’s the hot buzzword in venture capital at the moment, or if you’re trying to raise money or hire people.

We’re trying to do the opposite. Because if you look at these AI companies, if you look at these large language models, the information that they’re able to give is based on their training data sets. And for the most part, you can start to see the differences in answers that these LLMs give just based on their training data.

Like, for instance, Google’s Gemini is based on Google-indexed sites. You have Perplexity, which got that massive scandal for having indexed New York Times paywalled articles, and so the New York Times was suing Perplexity over that training data. OpenAI now has access, through Microsoft, to GitHub, so they have all of these code repositories.

So different LLMs have different answers just based on their training data sets, and it really just shows that AIs, [00:38:00] in their current iteration, are just ways to collect data and summarize it for the people giving these prompts. But for them to be able to fully understand the human world, like stay up to date with current events,

they need some way to determine what is high-quality, relevant information.

Zachary: Yeah.

Don: And so if we’re able to provide: hey, we have this platform, it’s all content that’s generated by people that we’ve verified as humans, it’s gone through this robust moderation protocol, and we understand the context of it, like whether this article reflects a more widely considered, accepted-as-true belief.

We can use that to provide to LLMs as a basis for fundamental training data, so that they can actually be better at summarizing and giving us better context in our daily lives, so they’re more powerful tools. So we’re more trying to create a fundamental training data layer for AI systems, rather than using AI systems [00:39:00] to help curate content.

Zachary: Yeah, I mean, if you’re successful at this, there are just all sorts of ways that you could use the content that’s marked as high value, or persuasive to many people. There are so many ways you could use that in other ways. Yeah,

Don: absolutely. And the important thing, like I mentioned with the New York Times and Perplexity, and how Perplexity was using unlicensed articles from the New York Times: what we’re able to do, since we already know the verified owners of all of the content.

And we know that they own the copyright of that content, so they then have the full freedom to say, I want to license my content to LLMs. We can facilitate that for them and then pass through the payments directly to the authors, as the owners of the actual underlying content, which is much more difficult to do in a centralized experience.

So, like, if you’re on Facebook or on X, in your user terms you’re pretty much signing off that [00:40:00] anything I post on X, I’m agreeing to just let it be ingested by Grok as its training data. We can say that anything you’re posting to Aemula, you now have full control to determine if you want it to be licensed.

And if it does get licensed, then you get paid. And so you’re compensated for actually giving this information to be used as training data.
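As a rough illustration of the licensing pass-through Don describes, here is a toy Python registry: verified ownership lets an author opt in to training-data licensing, and fees route directly to that author. The class and method names are invented for the example; a real version would live in smart contracts, not a Python object.

```python
class LicensingRegistry:
    """Toy model of author-controlled training-data licensing."""

    def __init__(self):
        self.owners = {}       # article_id -> verified author
        self.opted_in = set()  # article_ids licensed for LLM training
        self.balances = {}     # author -> accumulated licensing fees

    def publish(self, article_id: str, author: str) -> None:
        # ownership is recorded once, at publication time
        self.owners[article_id] = author

    def opt_in(self, article_id: str) -> None:
        # only the author's explicit choice makes licensing possible
        self.opted_in.add(article_id)

    def license_to_llm(self, article_id: str, fee: float) -> None:
        # unlike scraping, licensing fails without author consent,
        # and payment passes straight through to the verified owner
        if article_id not in self.opted_in:
            raise PermissionError("author has not opted in to licensing")
        author = self.owners[article_id]
        self.balances[author] = self.balances.get(author, 0.0) + fee
```

The contrast with the Facebook/X terms-of-service model is the default: here, nothing is ingestible until the owner flips the switch, and the fee lands in the owner’s balance, not the platform’s.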

Zachary: Very cool. Uh, the other thing I can imagine people objecting to, which is just kind of a subset of how people object to depolarization and bridge-building type work, and there would be different ways this would show up on the left and the right, but there’d be that objection of, like:

you’re trying to control our thought, you’re trying to, you know, create some system to make us more moderate, in a political, middle-of-the-road way. But I’ll just give my reaction to that and you can respond to it. ’Cause I think that what that misses is [00:41:00] this is trying to create

a system that brings out the best of humanity. One way or another we’re gonna be influenced by the things around us, and this is creating a system that’s trying to bring out the best of humanity and to prevent us from splitting into these divergent narratives where we have so much contempt for each other.

And it’s not trying to control people, because at the end of the day, you’re gonna react to the things you react to on that platform and be shown things that correspond to what you like, even if it recommends some other things it thinks you might also like, that are a little bit more, you know, depolarized.

So I think the counterargument is like: no, it’s not trying to control you. You can use the system however you want. If you don’t want to use the system, obviously you can go use another system. It’s still a free world. But, you know, the goal is that it gives many people what they want, right?

Don: Yeah, yeah, that’s right. [00:42:00] We’re not trying to control everyone and, like, bring their perspectives into the center. What we’re more trying to do is create a more accurate representation of what people’s actual viewpoints are, because I would argue that corporate social platforms are doing a worse job: pulling you more towards the fringes.

Like, if you just create a new account on Facebook or X and you go and start interacting with posts, you’ll start to see how you get recommended down these pathways towards radicalization. And they’ve actually done studies on this, on TikTok and YouTube: like, how quickly do you get pulled into these more radical, fringe belief systems?

And when you look overall at, like, what the distribution of people who hold certain beliefs is: most people will be fairly moderate, and there are small amounts of people on the fringes. But if you look at the type of content and the voices that are getting promoted on social [00:43:00] media, you see that there’s a lot of weight given to those small fringe beliefs.

They’re getting an outsized portion of the voice on these social media platforms, versus all of these people in the moderate center that have expertise in certain areas. They have posts that deserve to get engagement, but they’re just not getting the clicks, ’cause they don’t drive that sort of ad engagement.

Mm-hmm. So the first thing is, we want people to have a more accurate representation of people’s real opinions on certain topics. The other thing is, we try to promote and recommend articles to people based on their current individual beliefs, because we want anyone to be able to sign onto the platform, no matter what their perspective is, and we can give them engaging, relevant content.

So if you are someone on the far left or someone on the far right, we can show you stuff that is slightly more moderate than your current point of view, but is still on your side of the spectrum. So it’s not just [00:44:00] promoting, like, centrist voices.

Zachary: Mm-hmm.

Don: And through that, you’ll slowly start to see people come more towards the center over time, and we would expect it to start to reflect more of what the true underlying population actually believes.

But over the years, if the problem then switches, and we feel that people are being kind of sucked too strongly into the center and that there needs to be more diversity of thought: the underlying algorithms are fully community governed, so the community could come together and say, we set up these algorithms with the goal of depolarizing the media landscape,

we now think that it’s too depolarized and that people need to start exploring new, different belief systems, so we can then vote to change the underlying incentive structures of the curation algorithms, to start to promote people to explore new viewpoints and promote people that may be speaking up from the fringes.

So that’s a problem way down the road. Yeah. Like, that would be a great problem to have.
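The community-governance idea Don sketches, where the curation incentives are parameters that verified users can vote to change, might look something like this in miniature. The parameter name and the simple-majority rule are assumptions for the sketch; a production version would be an on-chain governance protocol with quorums and timelocks.

```python
class GovernedCuration:
    """Curation parameters that only a community majority can change."""

    def __init__(self, verified_voters):
        # hypothetical knob: how strongly the feed nudges toward moderation
        self.params = {"depolarization_weight": 0.8}
        self.voters = set(verified_voters)

    def vote_on_proposal(self, param, new_value, yes_votes):
        # count only votes cast by verified community members
        yes = len(set(yes_votes) & self.voters)
        if yes * 2 > len(self.voters):  # simple majority passes
            self.params[param] = new_value
            return True
        return False
```

So if the community someday decided the feed was too depolarized, a passing vote could lower `depolarization_weight`; no operator intervention required, which is the point of the design.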

Zachary: Right? [00:45:00] It’d be like the cycle, you know. It’s kind of like in the, uh, what was it, 1950s in America, when they were like, the political parties need to become more polarized, they’re not different enough. And so, you know, in the utopia that Aemula creates in the future, one day they’ll be like, we need to make Aemula a little bit more polarized and differentiate views more.

And then the cycle can begin again. I’m just kidding.

Don: Yeah, hopefully we don’t go into a cycle of that. But that is the purpose. We’re not trying to push any narrative, we’re not trying to push any point of view on anyone. We’re trying to let everyone speak freely, speak independently from their own unique perspectives.

People can explore those perspectives freely. And if we run into issues with how the algorithms promote content, then people can propose changes, everyone can vote on them, and, uh, accept them. So it’s not some corporation trying to put this algorithm on everyone. It’s actually a fully community-governed process.

Zachary: Yeah. I was gonna say, even [00:46:00] sites that you wouldn’t think do this can get you down rabbit holes. Like Amazon, for example. Because I do polarization-related research, I was buying a few books about conservative Republican views, for research, and all of a sudden I was getting recommended, like, election-denial books and, like, you know, liberals-are-garbage-humans books. Just very quickly, and it’s like, that’s, uh, interesting.

But it’s, you know, understandable why that works. Even if they don’t want to do it, that’s just naturally how the organic incentives tend to work. Yeah. Uh, but I wanted to ask you, and obviously this is a joke, but I was curious if you thought about remaining anonymous and being, like, the Satoshi of, um, you know, blockchain journalism or something like that.

Don: I mean, I will say I wasn’t ever planning to be anonymous. But I will say there is merit [00:47:00] to the anonymity behind Satoshi Nakamoto, and really that helped to kind of create more allure and everything about it. Yeah. But there’s also this aspect where, with Aemula, we want it to sort of be a faceless organization, because the whole point is we don’t actually have control over what the narrative is.

So it doesn’t matter what my viewpoints are, because everything is community governed, everything’s community moderated, everyone’s free to join it, we can’t censor anything. So it doesn’t really matter what my beliefs are. So there is some merit to being like, I’ll stay anonymous and Aemula can just be this faceless organization.

It can just be a foundation that helps support this, without having anyone worry that my beliefs or biases are affecting the platform. But also, everything’s open source, everything’s fully transparent. So you can just go in and see that I verifiably am not able to go in and start to [00:48:00] make changes or censor the platform.

Uh, once we have fully launched our community governance protocol, that is. I will say we’re very early on in the process; it’s a small, kind of early testing phase of Aemula. So it is still fairly centralized, just given the size of the community, but we’re building in programmatic breakpoints, so as the community grows and reaches certain diversity metrics,

more and more control gets passed off to the community, to a point where I no longer have any ability to control how it’s operating.

Zachary: Right. Gotcha. Yeah.

Don: Yeah.

Zachary: Um, yeah, there’s trust involved in the transparency and the fact that nobody’s pulling the strings. Yeah. Um, so I was gonna switch to more blockchain-related questions, ’cause there are just some things I’ve wondered on my own, and I figure some other people might

wonder them too. So, I’ve always been a little confused about what makes blockchain so special, because I’ve seen some people say, like, oh, it’s [00:49:00] just a ledger, an append-only ledger, where you can only add to it and not edit or subtract from it. And theoretically, that’s a kind of ledger that already exists.

But maybe you could talk a little bit about what is so exciting about blockchain, what I’m missing there.

Don: Yeah, I think it helps to stay fairly abstract in general about it when building an understanding. But how I view what the underlying blockchain technology does is, it really facilitates coordination among individuals at scale, without having to rely on trusting other individual parties.

And that’s done through consensus mechanisms. Really, what blockchain means is, it’s a data structure: you have a block of data, and everyone, through this consensus mechanism, agrees that everything in that block is a valid transaction, there’s nothing nefarious going [00:50:00] on. We all agree, we all accept, that this is the current state of our ecosystem.

And then once everyone validates that block, it gets added to the chain. And since everything’s chained together, you can’t go back and try to change something in the past. Once it’s added to the chain, then it’s final, and everyone agrees. You don’t have to pay any more attention or thought to it, because it’s like, we’ve all agreed this is a valid block, we add it to the chain, and now our focus is on validating the next block.

We add it to the chain. Now our focus is on validating the next block. If that makes sense. We can speak more to like. Yeah, consensus mechanisms and like how it works in practice. I think I,

Zachary: I guess, and correct me if I’m wrong, but I guess the interesting, the exciting thing about this is: say somebody put this type of ledger, this append-only ledger, if I’m saying that right, say they put it on a server somewhere.

Like, the difference is that that would not be trustworthy, because whoever owns that server could go in and change it, right? Whereas this is creating a network-[00:51:00]based reality, a system that cannot be tampered with, because the community agrees on it. Right? Am I understanding that correctly?

Don: Yeah.

Yes. So it’s the network as a whole. So, like, we’ll use Ethereum, and we can use Aemula in this example as well, but the community as a whole has a universal state. So we’re all working off of the same computer, essentially. Mm-hmm. So it’s not like you have some state of the network and I have a state of the network and we can go do our own things.

It’s like we’re all coming to a consensus and agreeing like, this is the current state.

Zachary: It’s almost like a dispersed, uh, server in a way. Like an abstract server that’s distributed.

Don: Exactly, that’s the perfect way to think about it. And this state, what we’re all agreeing on, is essentially any interaction that you would do on a traditional server.

Like, I can go in, I can write data, I can read data. And what we’re all agreeing on is that no one went outside of [00:52:00] the guidelines of any program that’s running on the system, no one deleted something accidentally, no one is trying to write data that they’re not able to, no one’s able to, like, spend money that they don’t have.

So we all agree that everything that occurred in this block is a valid interaction, and then we can add it to the chain, and now it’s added to this universal state that we’re all working off of. But I realize that, like, transactions, interactions, data, it’s all kind of abstracted away, and it’s difficult to understand, like, what is the importance of this?

So in the context of Aemula: if you’re a writer writing to Aemula and you publish an article, your authorship of that article, and that article’s existence, is stored in the data of the Ethereum network, that Ethereum Virtual Machine, that distributed server that everyone’s working off of. And so if we want to be able to verifiably say that you’re the owner of this data and prove that you own that article, we don’t want anyone going [00:53:00] in after the fact and deleting your article off of the server.

We don’t want anyone going in and trying to censor your perspectives by saying, like, oh, we’re actually gonna take those articles down. So once you publish an article, everyone agrees that you are now the owner of the underlying data of that article. And once it’s added to the chain,

You can rest easy knowing that you have full ownership of your own underlying data.
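A stripped-down sketch of what "authorship stored on-chain" means in practice: an append-only record binding a content hash to its author, with no delete or edit operation at all. This is illustrative Python, not Aemula’s actual contract code, and the names are invented for the example.

```python
import hashlib

class AuthorshipRegistry:
    """Append-only authorship records; nothing can be edited or removed."""

    def __init__(self):
        self._records = []  # intentionally no delete/update methods exist

    def publish(self, author: str, text: str) -> str:
        # the article is identified by a hash of its content,
        # so the record also proves what exact text was published
        digest = hashlib.sha256(text.encode()).hexdigest()
        self._records.append((digest, author))
        return digest

    def owner_of(self, digest: str):
        for d, author in self._records:
            if d == digest:
                return author
        return None  # never published

reg = AuthorshipRegistry()
article_id = reg.publish("dana", "City council votes to expand transit.")
print(reg.owner_of(article_id))  # dana
```

On a real chain the registry’s state is replicated and consensus-validated by every node, which is what turns "no delete method" from a politeness into a guarantee.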

Zachary: Right. Yeah. And the exciting thing about all this is that, as we’ve seen with, um, cryptocurrency, even though it’s in its, you know, beginning stages, it gives an example of how a well-created system that catches on can really change incentives and change behaviors, which is what, mm-hmm,

you’re trying to do for journalism. I think, yeah, the thing that excites people is how you can create these systems that have their own life and really change incentives and change real-life [00:54:00] behaviors and change how people interact. I think that is exciting.

Yeah.

Don: Exactly. Everything is self-executing, so you’re not having to rely on trusting an intermediary. And that’s the beauty of it for journalism specifically: when you look at what creates a credible, trustworthy journalistic environment, you need it to be censorship resistant, and you need to be able to trust that there’s not outside influence.

You need to make sure that no one’s manipulating the narrative, and you wanna make sure that you can track people’s reputations, that no one’s running some massive misinformation campaign, and it gets taken down, and then they just go and create another account and do it again somewhere else.

We can verify that you’re a real person, and your reputation will be tied to whatever you write going forward. If you try to spread misinformation, that damages your reputation, and you now have to work to build a track record of high-quality [00:55:00] content to work your way out of that. Mm-hmm. And so it just creates this high-trust environment where you’re not actually having to rely on people saying, like, oh, trust me, I’m gonna work in your best interest.

Because while that’s all well and good, there are so many powerful incentives behind controlling and manipulating media narratives that any weakness will always be exploited, no matter what. Like, we’ve seen this with the New York Times, like in 2004, when there was the NSA surveillance story that they were pressured by the government to not publish until after the election.

So we’ve seen that with the New York Times, which you would trust. We’ve seen it with Facebook throughout COVID, when they were pressured by the government to suppress stories on COVID lab-leak theories and everything, and it came out after the fact that they were actually being pressured by the government to censor content.

And we’ve seen it with X, where Elon buys X and says that he wants to support free speech, [00:56:00] but then immediately gets into a free-speech legal battle with Brazil over them trying to censor or moderate content. And we’ve also seen it with Substack, when people were sharing, uh, like, Nazi-type articles, and

they were not trying to censor it, ’cause they said they were supporting free speech, and then everyone says, I think we can all agree that this isn’t something that we want to be sharing here. So anytime there is that point of weakness, it will be pressured. And what we’re trying to do is remove that point of weakness entirely,

Zachary: Right.

Don: And say, we can’t go in and censor. There is, I’ll say, a moderation protocol. So if someone is sharing Nazi beliefs, that is harmful content, it can be taken down and removed by that moderation protocol, but no one can go in and actually censor underlying narratives. So even if the government wanted to come in and say, we don’t want you posting this story, you can’t really come to Aemula and ask [00:57:00] that, because we don’t have the ability to control it.

Zachary: Mm-hmm. Well, this has been awesome. Yeah. Thanks for joining me, Don. Anything else you’d like to add?

Don: No, I think this is great. Like, I always appreciate any opportunity to talk about Web3, and Web3 and journalism specifically, and no better person to do it with than you, just given your experience in the space.

So, uh, I mean, your support and the invitation to come on definitely means a lot.

Zachary: Thanks. Thanks, Don. Okay, talk to you later. That was a talk with Don Templeman, creator of Aemula. Sign up for [email protected] or for the Substack, aemula.substack.com. And again, that’s A-E-M-U-L-A. If you enjoyed this talk, I have related episodes in the backlog.

For example, I have a talk with Isaac Saul, the creator of Tangle News, about polarization in the news and about how he sees Tangle News as trying to reduce polarization. You can see episodes and best-of compilations for my [00:58:00] [email protected]. You can check out my polarization-related books and other [email protected].

Thank you for listening. Music by Small Skies.