Live from Imagine IF 2025

This week we feature the “Open Communities in the Age of Control” panel, recorded live on September 20th at the Imagine IF conference in Nashville. The discussion dives into the erosion of trust in a digital age dominated by surveillance, opaque algorithms, and centralized platforms.

Trust Revolution host Shawn Yeager joins Matt Odell and Derek Ross to explore how broken incentives turn users into products, with censorship and deepfakes threatening livelihoods and verifiability. They advocate for open, user-controlled communities via protocols like Nostr, emphasizing personal responsibility, parental tools, and creator-owned ecosystems to reclaim digital sovereignty.

With broken money come broken incentives, and from that flow business models that turn us into the product.

— Shawn Yeager

Timestamps

  • 00:00 Panel introduction: Open communities in the age of control
  • 03:30 Broken money, broken incentives, users as products
  • 07:15 Surveillance, opaque algorithms, and censorship
  • 11:00 Deepfakes and the verifiability crisis
  • 15:00 Nostr: User-controlled communities
  • 18:30 Personal responsibility and parental tools
  • 21:00 Creator-owned ecosystems and digital sovereignty

Transcript

…applause for Matt Odell, Derek Ross, and Shawn Yeager. Yo, how's it going, guys? We'll be talking today about the future of digital comms, identity, social, and open communities, with my good friends Derek and Shawn here with us. I think an interesting place to start is diagnosing the problem. We've found ourselves in an increasingly digital world, and the status quo has just been built one foot in front of the other, without any real planning. And now we're here. So let's start off with Shawn. When you think about the current state of digital communities, identity, and social, where do you diagnose the problems?

I think, as with most everything, as an admitted Bitcoiner, it starts with broken money. And with broken money come broken incentives. And from that flow business models that turn us, as we all know, into the product. And there has been an increasing drive, I think, to milk more out of the consumer, the user, and then the advertisers and the businesses. Cory Doctorow has a colorful phrase to describe how these cycles roll out; maybe I won't utter it here. And so where we find ourselves is that not only are we not the user, we're the product, but I think increasingly we are seen as something to be packaged. We see creeping KYC. We see everything happening in the UK with the Online Safety Act.

And yeah, we're not in a good place right now. Very well said. Derek, how do you think about it? Well, I think that over the past few years, we've all come to a place where we know somebody, or we interacted with somebody online, followed somebody, who has been censored or shadow-banned, something along those lines. It's becoming more apparent, and it's accelerating. It's kind of odd to see it accelerating. Like Shawn just said, we're seeing it happen across the European Union, in the UK, and we're starting to see it even happen recently here in the United States, where people can have their entire livelihood, their business, taken away because they built their business on somebody else's foundation. They don't own that content.

They don't own their followers. They don't own their entire social graph, and it can disappear overnight. Years and years of hard work can be taken away from you, and you can't do anything about it, because you built your entire digital life on somebody else's foundation. It's becoming very apparent that there needs to be a better way. Yeah, I think there are a couple of issues that compound on top of each other and result in the current trajectory we're going down in terms of big tech and digital platforms. I mean, you guys honed in on censorship and control, which I think is one that people talk about a lot. So, Shawn, you've been exploring this intersection between AI and Bitcoin. And the other piece here

that is really interesting to me is this idea of deepfakes and verifiability. How do you think about that in the current paradigm? Well, just a brief bit of background, and hopefully not a shameless shill: the point of Trust Revolution is to pursue two questions. One is, how did we as developed nations find ourselves in low-trust societies? I think most of us can agree, and Pew Research and others would certainly back this up: we don't trust the government, we don't trust the media, we don't trust healthcare, we don't trust education, we don't trust each other, we don't trust across party lines. That's not a black pill; I think it's just observably true. The second, more hopeful question is, how and where can we reclaim the trust that we have given, or that has been demanded of us, and that has been broken? And how can we build trust where we believe it should be? So that's all to say, to your question: can we trust our eyes? Can we trust the media that we see and consume? I think what's hopeful there is the ability to use public-private key cryptography to sign, authenticate, and attribute media. I think

we're quite a ways away from that being large-scale. Once again, the incentives are not necessarily aligned for it to be widely adopted, but the tools are there. And the big question in my mind, to echo yours, is at what point do we reach the inflection where there is so much questioning and confusion about whether what I'm seeing is real that there's broader adoption of the tools we do have, like Nostr and these public key pairs, to address that challenge. But, I mean, aren't we kind of already there? In what way? There in terms of... I think most people, when you open your phone, you're asking, is that real? Oh, yes. We're very close, if not already across the chasm, right? Yeah, and I'll just say one quick thing there: much as in prior waves of technology, there has been the need to create a certain literacy and a certain ability to scrutinize.
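To make the signing idea concrete: below is a minimal sketch of sign-then-verify media attribution, using Ed25519 from Python's `cryptography` package. Nostr itself uses secp256k1 Schnorr signatures (BIP-340), so treat this as an illustration of the principle described here, not the protocol's actual primitive; all names are illustrative.

```python
# Illustrative sketch of "sign, authenticate, attribute" for media.
# Nostr uses secp256k1 Schnorr (BIP-340); Ed25519 stands in here because
# it demonstrates the same key-pair principle with a well-known library.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The creator holds a private key; the public key is shared with the world.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

media = b"raw bytes of a photo, video, or article"
signature = private_key.sign(media)  # only the key holder can produce this

# Anyone holding the public key can check the media is untampered
# and really came from that key.
try:
    public_key.verify(signature, media)
    print("authentic: signed by the holder of this public key")
except InvalidSignature:
    print("tampered or misattributed")
```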

I hope that it incentivizes and motivates people to become more thoughtful about what they consume and what they question or trust. I think, expanding on what you consume, that's a unique problem in itself, because what content I want to consume versus what content I'm forced to consume is very different. We are slaves to the algorithms and to what these platforms want us to see. We don't really have control over the content. We don't have control over our attention. And that's part of the problem too. If you don't want to see certain types of content, it's really hard not to see it on these existing legacy social platforms. You're being spoon-fed. So from a productive point of view, how do you mitigate that? How do you actually solve that problem? I mean, that's easier said than done. Yeah, it's easier said than done,

but we need tools for users that allow them to choose their own algorithm, to choose the type of content they want to see, to choose and curate their social feeds. Just because Elon and Mark Zuckerberg say this is the content you need to see doesn't mean that I want to see it. It doesn't mean that you want to see it. But I don't have a choice: if I use Instagram or Facebook or X/Twitter, I have to see that algorithmic content. I don't have the choice of, say, cat pics as my feed if that's what I want. Sure, I could easily browse a hashtag or something like that, but that's not a good option. We need more user tools. We need more user choice. And there are options out there that give users full control over what they want to consume, full control over their attention, because that's what these platforms are monetizing: our attention. We need a way to take that back. It's what my eyes see;

it's my attention. I should be able to designate what gets my attention. And do you think the friction point with that, because I do think that's the path forward, is that it requires a level of personal responsibility from the actual user? Yeah. Like, how do we handle that friction? There are some people that just want to scroll, right? They don't have time to build and curate their own feed. And that's fine; for that, you have a choice. The fact that you don't have a choice is the problem. If you want the spoon-fed content, great. If you don't want the spoon-fed content, if you want to be your own algorithm and be in control, you should have that choice among a wide variety of choices. The choices should be open and transparent, and you should be able to decide which path you want to go down. And I would say it's also experiential, in the sense that if you're not on Nostr, if you haven't tried Nostr... What is Nostr? What is Nostr?

We didn't even talk about that yet. What is Nostr? Well, like Bitcoin, and I'll let Matt speak to this, it is an open protocol. No one controls it, no one owns it, and therefore it is there to be built upon. And the reason I mention it is that most traditional social media and communication channels, one-to-many, are not only monetizing our attention; increasingly they're monetizing our outrage. And from what I've observed of people who experience an alternative, and Mastodon and others are out there, though I think we all agree Nostr is the way to go, once you remove the outrage, it is experiential: I feel better, or at least not worse, engaging with others on Nostr versus X versus Facebook versus the others. So that is all to say, I think part of the key is just giving people a sense of what that's like. And I think each of us can begin to rewire those receptors, those dopamine hits we're accustomed to getting. But it will take some time.
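The "choose your own algorithm" point is concrete on Nostr: a client asks relays for exactly the events the user wants, using a filter the user constructs rather than a platform-chosen ranking. A minimal sketch of that request, following the NIP-01 `REQ` message shape; the relay URL and the `#t` hashtag are placeholders, and the `websockets` package is assumed.

```python
# Minimal sketch: ask a relay for only the content you want (NIP-01).
# The relay URL is a placeholder; `pip install websockets` is assumed.
import asyncio
import json
import websockets

RELAY = "wss://relay.example.com"

async def cat_feed():
    async with websockets.connect(RELAY) as ws:
        # The client, not the platform, defines the feed: here, the 20
        # most recent text notes (kind 1) carrying an example hashtag.
        req = ["REQ", "my-cat-feed", {"kinds": [1], "#t": ["catstr"], "limit": 20}]
        await ws.send(json.dumps(req))
        async for raw in ws:
            msg = json.loads(raw)
            if msg[0] == "EVENT":
                print(msg[2]["content"])   # the note text
            elif msg[0] == "EOSE":         # end of stored events
                break

asyncio.run(cat_feed())
```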

I mean, you're drilling down on basically this concept of healthy usage of technology. Yes. Which I would say, as a society, we're probably very deep into unhealthy usage of these tools. I see this firsthand in my own life; I see this across all different aspects of society right now. We have a term for it nowadays: doomscrolling. It became that apparent. We have AI psychosis. Doomscrolling, everyone does it, or a lot of people do it; they know they're doing it and they continue to do it. But one aspect of this idea of digital health and healthy usage that I think is incredibly key for our society going forward: all three of us are parents. Specifically, I mean, I think adults use it in very unhealthy ways,

but the question is, how does that affect childhood development? And for something like Nostr, an open protocol that's not controlled by anybody, how do you think about handling that issue? We'll start with Shawn again. How does society handle that going forward, with kids growing up with basically a fire hose of information? Well, there's my little guy right there, my almost-four-year-old, so I'm a dad to a young boy. And so I have a bit of time, but I'll share an anecdote, which is that we, full credit to my wife, had given, close your ears, Lath, maybe an hour to two per morning of screen time so that, you know, she could have some space at home to do some things. It is remarkable, the change, and this will be obvious

to those of you who've done it, but it was remarkable to me that in saying no, in ending that and having zero screen time, the change in our son was incredible. And I personally don't know of any better reference point in my life than to have observed that firsthand. So I can only imagine what a device in the hand does to a young child. That's not a judgment of anyone who chooses to do that, but I just can't imagine the damage it will do. So I feel very passionate about our collective and, most of all, individual responsibility within our families to find better ways. So, I mean, right now we're seeing a lot of conversation about disenfranchised youth getting radicalized in internet communities. It's become a very sensitive conversation. Some of the quote-unquote solutions that have been proposed involve restricting

speech, restricting access, adding digital ID, adding age restrictions. I mean, we just saw Bluesky, I think in two states, add age restrictions to their app. Derek, what is the most productive path forward? Because I think the key here is that this is actually a problem. I do think disenfranchised youth are getting radicalized in niche internet communities. But when you're building out something like Nostr, an open protocol where you inherently can't age-restrict at a top-down level, what is the most productive path? How do we actually solve that in a healthy way? That's a very good question. And it's probably a very hard question. I think part of it goes back to what Shawn was alluding to: ultimately, parents should parent.

If kids are having issues online, getting radicalized by certain content, and you don't want that to happen to your kid, then you need to restrict access to certain applications. Now, that doesn't mean taking everything away, because we know that kids today are very social online, so you can still give them apps. The second part of this is that we just need more user controls, and we need more apps across the Nostr ecosystem that focus on restricting and filtering that type of content. Because Nostr is wide open and you can do anything you want with it, maybe somebody builds a Nostr application that is more suitable for the youth: it restricts certain types of content, it's bound only to certain content-filtered relays, and you can't use anything else but that. Now, the argument is, well, the kid can take the profile, the nsec, and just use another app.

But if you're the parent, you do the parenting and you lock down access to certain applications. You only give them access to the parent-approved app. I mean, they're your kids; you should be able to say what apps they use. A personal example: I didn't let my kids use TikTok for a very long time. My kids are now 14 and 16 years old, and they use TikTok now. But they wanted to use it years ago, when their friends were all using it at 10, 12 years old. And I said, no, you're not using that app, I'm sorry. And they complained a lot, and I was a parent and said, well, I'm sorry, you're not using it. I used my parental rights to restrict my kids' access to something I didn't want them on. Now that they're older, sure, I let them use it. And the same would go for any Nostr app: I would restrict and block access if I wanted to, because we have the tools to do that. But, as I said, on the other side we do need a Nostr client to step up and build a kid-friendly environment.
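No such kid-focused client is named in the conversation, but the building blocks described here exist at the protocol level. A hedged sketch of what the client-side rules could look like, assuming a parent-maintained relay allowlist and the NIP-36 `content-warning` tag for sensitive posts; the relay URLs and sample event are illustrative.

```python
# Sketch of rules a kid-focused Nostr client could enforce: a
# parent-approved relay allowlist plus dropping NIP-36 flagged content.
# Relay URLs and the sample event are illustrative.

APPROVED_RELAYS = {
    "wss://kids.relay.example",     # hypothetical moderated relay
    "wss://school.relay.example",
}

def relay_allowed(relay_url: str) -> bool:
    # The app only opens connections from this list; the parent,
    # not the child, edits it.
    return relay_url in APPROVED_RELAYS

def event_allowed(event: dict) -> bool:
    # NIP-36 marks sensitive posts with a "content-warning" tag;
    # a kid client can simply refuse to render them.
    for tag in event.get("tags", []):
        if tag and tag[0] == "content-warning":
            return False
    return True

incoming = {"kind": 1, "content": "look at this cat", "tags": [["t", "catstr"]]}
if event_allowed(incoming):
    print(incoming["content"])
```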

Well, and I think, just quickly, the thing that's so powerful about this, in my strong promotion of Nostr, or whatever may come after, is the ability for individuals, for parents in this particular case, to be given the tools to make the choice. Yeah, I think that's the core. It should not come from X. It should not come from the government. It should come from the individuals closest to, and most invested in, that little human's health. And I think Nostr is a prime example of what an open protocol does with regard to giving us that power. Yeah, I think you give parents tools so that they can parent better. Absolutely. And have them take responsibility. And it's bigger than Nostr, right? Absolutely. I mean, it's kind of bewildering that Apple doesn't have really granular controls built into the iPhone for parents to choose how their kids interact with these things. I think you bring it down almost to the OS level, right? Because I'm a tech nerd, I know how to go into my router and block my kids' devices from accessing certain websites. I'll say it's easy, but is it easy for everybody? Probably

not. So we need easier tools for everybody to use. Yeah, I agree. I mean, guys, this has been a great conversation. We've been a little bit abstract, so to bring it all back together and make it a little more actionable for people here who have never used Nostr and maybe want to play around and test it: I think the best way to learn is to get your hands dirty and actually use the tools. Shawn, what would be your recommendation to someone who's interested in seeing what's being built? Yeah, I'll steal someone else's metaphor: if you were a medieval king and you needed to issue a directive throughout the kingdom, to your military, to someone else, as you would probably recall, you would have a signet ring. That signet ring would be heated and pressed into wax, creating a seal on the letter that is then delivered to Matt, the general. My signet ring is my private key. It is difficult to mimic, difficult to forge, presumably hard to steal.

That's my piece of property that allows me to sign. The seal is the public key. And so, that is all to say, in these ways that have been created and recreated throughout time, Nostr gives you that ownership. Now, with that comes great responsibility: you own that key. You have that signet ring. And so, from that understanding that you can own your identity, that you can own the ability to attribute your creation or publishing of content, it can be quite simple. So I think Primal is brilliant. Full disclaimer: Ten31 is an investor in Primal. Fantastic application. So Primal.net, I think, is a great way to get started. I think it's one of the best consumer UXs. There are many others, depending on where you are on the spectrum, from "I just want it to work," Apple-esque style, to, like us, nerds who want to dig in. But I would say, in short: Primal.net, take a look.
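The signet-ring metaphor maps directly onto how a Nostr event is assembled under NIP-01: the event is serialized, hashed into an `id`, and sealed with a signature only the private-key holder can produce. A sketch of the `id` computation follows; the actual `sig` is a 64-byte BIP-340 Schnorr signature over that id, which requires a secp256k1 library and is left as a stub here.

```python
# How the "signet ring" shows up in a NIP-01 Nostr event: the id is a
# SHA-256 hash of a canonical serialization, and `sig` is the wax seal,
# a BIP-340 Schnorr signature over that id made with the private key.
import hashlib
import json
import time

pubkey = "ab" * 32          # placeholder 32-byte hex public key
created_at = int(time.time())
kind = 1                    # short text note
tags = []
content = "hello from the kingdom"

# NIP-01 canonical serialization: a JSON array with no extra whitespace.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()

event = {
    "id": event_id,
    "pubkey": pubkey,
    "created_at": created_at,
    "kind": kind,
    "tags": tags,
    "content": content,
    "sig": "<64-byte BIP-340 Schnorr signature over id, via a secp256k1 library>",
}
print(json.dumps(event, indent=2))
```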

Great recommendation. I think he handled that really well. Yeah. So while we have a little bit more time, just real quick: vibe coding, Nostr, AI, Bitcoin, that's where your focus is right now. Yes. Why is that powerful? Because Soapbox is building tools that allow people who are creators or have their own community to build an application. You can vibe-code it. You can build your own app for your own community. And because it's built on Nostr, you own all of that content. So instead of using Discord or Twitter or whatever for your community, you could use Shakespeare to build your own community app, customized how you've always wanted it to be. And you own it. You own all the source code. You own all the data. It's decentralized. You can do whatever you want with it, and nobody can take that away from you. Whereas if your Discord server gets taken down because you're a streamer or a musician or an artist or something, well, you're screwed. You can't do anything. But if you use Soapbox's tools and build with Shakespeare, you own every piece of the puzzle.

Yeah, and the key there is you don't need closed API access. You don't need to verify. You don't need to ask permission. You just do it. Yeah, you have the social graph, you have the identity layer, you have the comms protocol, all in Nostr, which is basically an open API for the world for that. And then on the payment side, you have Bitcoin, so you don't have to get a Stripe API key or something like that to integrate payments. No permission required; just go do it. Yeah, say you want to build a website that accepts Bitcoin payments for a product you're selling, or for your personal website or something. You don't need to know any code. You don't need to be a developer to do it. You just have a conversation with the AI, and you say, build me this website that does this thing: A, B, C, D. A few minutes later, boom, it's done, and it's yours, and you can do whatever you want with it. Love it. Can we have a huge round of applause for Derek and Shawn? Thank you guys. Thank you.
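To close the loop on the panel's "no permission required" point: publishing through Nostr's open API is a single `EVENT` message to any relay that will accept it, with no API key or developer account. A sketch under the same assumptions as the examples above (the `websockets` package, a placeholder relay URL, and a fully signed `event` such as the NIP-01 example sketched earlier).

```python
# "No permission required, just go do it": publishing to Nostr is one
# WebSocket message to any relay; no API key or developer account.
# Assumes a fully signed `event` dict (see the NIP-01 sketch above)
# and the `websockets` package; the relay URL is a placeholder.
import asyncio
import json
import websockets

async def publish(event: dict):
    async with websockets.connect("wss://relay.example.com") as ws:
        await ws.send(json.dumps(["EVENT", event]))  # NIP-01 publish message
        reply = json.loads(await ws.recv())          # e.g. ["OK", <id>, true, ""]
        print("relay replied:", reply)

# asyncio.run(publish(signed_event))
```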