🕸 Unpacking the Social Dilemma — Alignment & Algorithms with Vickie Curtis
S1:E15

Amelia [00:00:02] [Music overlapping with introduction to the episode] Welcome to Off the Grid, a podcast for small business owners who want to leave social media without losing all their clients.

Amelia [00:00:08] I'm Amelia Hruby, writer, speaker, and founder of Softer Sounds podcast studio. On this show, I share stories, strategies, and experiments for growing your business with radical generosity and energetic sovereignty.

Amelia [00:00:22] Download your free Leaving Social Media Toolkit at softersounds.studio/byeig and join us as we do it all off the grid [intro music jams and then fades out].

Amelia [00:00:36] Hello and welcome to Off the Grid, a podcast about leaving social media without losing all your clients. I'm Amelia Hruby. I am a writer, speaker, and the founder of Softer Sounds, a feminist podcast studio for entrepreneurs and creatives. I am also, as you might have guessed, the host of this podcast, and I am so glad that you're tuned in to today's episode.

Amelia [00:00:59] Today we're going to talk about social media and surveillance capitalism. Now, I've kept most of the social media discussion on this podcast on the level of personal choices. But if you've been listening closely, you know that I actually hold some pretty strong opinions about how harmful social media platforms are to our society and to ourselves.

Amelia [00:01:19] So, as we wrap up Season One of the podcast, I'm excited to have a guest today who can help us explore some of those harms and have a deeper and different sort of conversation about social media.

Amelia [00:01:31] Before we dive into our conversation with Vickie, I do have one reminder.

Amelia [00:01:36] As you've heard in the past few episodes, The Refresh is coming up in August. The Refresh is a three-workshop series hosted by me, Amelia, where it will reset your relationship with social media and envision an algorithm-free future for your business or algorithm-free-ish, kind of impossible to escape those these days [laughs wearily]. So, across those three workshops, we're going to Clear Your Fears of doing business differently, Weave the Web of your channels, community, and collaborators, and marketing, and then Make the Map of your business offerings and marketing efforts for Fall.

Amelia [00:02:12] I shared a lot more info about this in Episode 12, so if you want all the details, head there or go ahead and go to softersounds.studio/refresh to learn more and sign up for the workshop. All right, that's it.

Amelia [00:02:26] So, let's dive into today's episode. Today I am joined by Vickie Curtis. Vickie is an Emmy Award-winning filmmaker and writer. Her screen credits include Chasing Coral, The Weight of Water, Anbessa, and many more amazing films.

Amelia [00:02:42] She's most recently been filming in Guatemala, where she's directing Comparsa, a nonfiction film about two lion-hearted teenage sisters living in the outskirts of Guatemala City. But today, she's joining us on the podcast to talk about her Emmy Award-winning writing on a film you might have heard of— The Social Dilemma. Thank you so much for joining me today, Vickie. I'm so excited to have you.

Vickie [00:03:04] Thank you for having me, Amelia. It's lovely to be here.

Amelia [00:03:07] Thanks. So, I am very familiar with The Social Dilemma. I've seen it a few times now, but for folks tuning in who may not be familiar with the film, could you give us just, like, a quick overview of what it's about?

Vickie [00:03:19] Sure. Sometimes the easiest way in to talking about it is how the project got started. Our director, Jeff Orlowski, went to Stanford University for his undergrad degree and was there sort of right— right at the time that Facebook was launching as a brand-new baby platform for just a couple of colleges, you know, not monetized—

Amelia [00:03:42] Mmhm.

Vickie [00:03:42] Not a business yet, just this free website you could go sign up and make an account. And at that time, a lot of his friends were getting into this new world of social media and of tech and, you know, the new tech [laughs] being social.

Amelia [00:03:58] Yeah.

Vickie [00:03:58] And that meant jobs at Google and jobs at Twitter and jobs at Facebook. And a decade goes by [laughs] and those— those people are starting to think, "Oh, gosh, wow, this isn't what it— what I thought it was—"

Amelia [00:04:11] Mmhm.

Vickie [00:04:11] "I think we made something we didn't mean to make." Or, "I no longer feel great about the work that I'm doing." And so, he started having those conversations with some of the— some of the current friends, former employees—

Amelia [00:04:23] Mmhm.

Vickie [00:04:23] Of these companies. That's sort of how we started was— is there a story here as these people come forward and say, like, "It's designed in a way that is meant to misdirect your attention and it's designed in a way to make you more outraged, etc.."

Amelia [00:04:41] Mmhm.

Vickie [00:04:42] So, we started— we started to sort of dig into that and got to talk to more and more tech insiders about what was really happening. And these companies, Google and Facebook being the lead algorithm-driven companies—

Amelia [00:04:59] Mmhm.

Vickie [00:04:59] Out there, but also including Twitter and Snapchat and TikTok and these others as well. They're just very secretive about what they're doing. So, it was—

Amelia [00:05:08] Mmhm.

Vickie [00:05:08] Really amazing to have access to people who had been on the inside and could sort of reveal the business model that—

Amelia [00:05:15] Mmhm.

Vickie [00:05:16] Is being adopted. And not only what are the effects it's having on the users, but also the effects that it's having on society at large. And those are sometimes a little harder to see and understand. And so, we really wanted the film to be able to look at not just, like, the personal level of like, "Yeah, maybe you spend too much time on Instagram. You wasted half an hour on TikTok." Like, whatever [laughs].

Amelia [00:05:40] Yeah.

Vickie [00:05:40] Not worth making a movie about, necessarily.

Amelia [00:05:42] Yeah.

Vickie [00:05:42] But when we look at the effect on democracy, the effect on truth, the effect on our ability to communicate with one another and understand what's really going on versus—

Amelia [00:05:52] Mmhm.

Vickie [00:05:53] What is a lie that all of a sudden has this, like, huge ripple— ripple effect consequences, and it becomes an existential crisis. And— and that's really what drew me to the project was like the—

Amelia [00:06:03] Mm.

Vickie [00:06:03] Big picture aspect of— of what social media has done because there's— there's just never been another organization of any kind in human history that is as powerful or reaches as many people.

Amelia [00:06:16] Mm.

Vickie [00:06:16] There's never been a religion that had three billion members. So, it's sort of this crazy new phenomenon that is— that is worth exploring, I think.

Amelia [00:06:25] Yeah, yeah, definitely. I— so many pieces of what you just said really struck me, but one of them was that this film started from— an idea for the film started from conversations with employees inside these companies, because it's such a— kind of the reverse of the way I came into these conversations, which was that I felt like something was off, not simply like, "Oh, I'm wasting time on this app." But I could feel my attention being misdirected. I could feel my spending habits being changed. Like, I could very clearly feel the influences on me when I was on Instagram particularly— that was the main platform I used— but on any social media platform. I could notice how my Facebook feed was shifting when I was still on Facebook.

Amelia [00:07:12] So, I had this sort of, like, user experience where I was like, "Oh, something is— there is influence being exerted here." And then from my side, you know, all I could do is kind of say, this is my experience. Something I loved about The Social Dilemma is the film then fills in all of these gaps for me around [chuckles], "Okay, here's why you feel that."

Amelia [00:07:30] And the other thing that you said that really stood out to me or what I took from it was just that— what's at stake? One of the many things at stake here is truth. And, of course, you know, we live in a world of many and multiple truths. But— and I think the film really tries to point to this. Like, there are ways that we are lied to and told and taught and convinced to believe in lies through social media platforms.

Amelia [00:07:56] So, I have a tendency to go way to the end of the conversation and then I'll come back to the beginning [laughs]. But thank you for— for just bringing all of that together for us. So, you already kind of mentioned this, that you were really interested and drawn to the project because of the societal impact and the global impact we're seeing of social media platforms. But how did you get involved with the film and what was your role in The Social Dilemma?

Vickie [00:08:23] Well, the sort of core team was a similar makeup to a team of independent filmmakers that had worked together on Chasing Coral. So, I had been the writer of Chasing Coral, working with director Jeff Orlowski, producer Larissa Rhodes, editor Davis Coombe on that project.

Amelia [00:08:45] Mmhm.

Vickie [00:08:45] And— and so, we sort of got that group back together to— to tackle this— this project. It came— it came close on the heels of Chasing Coral. And so, it was that— that— it was really just from the—from the very beginning, I was actually on maternity leave with my [laughs and Amelia joins in] daughter for the first big meeting of the creative team to meet the subjects of the film, like Tristan, for the first time and try to wrap our minds around like, "Is there a story here? What is the story? Who are the characters?"

Amelia [00:09:16] Mmhm.

Vickie [00:09:17] That's my objective, obviously, like the producer's coming at it from a different angle. But that— so— so I popped into that meeting with a— with an eight-week-old baby—

Amelia [00:09:26] Mm.

Vickie [00:09:26] And then hopped back out to maternity leave for a little while. But yeah, getting to be involved from early, early stages was really great. I think it helped us— so my role was as the writer, and so always trying to help sort of steer, like, what is the beginning, middle, end of this? How can people digest it? What's an entry point—

Amelia [00:09:47] Mmhm.

Vickie [00:09:47] For someone who's not thinking about this at all? How do we keep it entertaining also for people who are thinking about the attention economy and feel like they have some basic knowledge of that?

Vickie [00:09:56] So, you're wanting to sort of think about all the different material you have and then which— which threads to follow. And it's an— it's always an imperfect process, especially in documentary. And you talk to whom—

Amelia [00:10:09] Mmhm.

Vickie [00:10:09] You have access to. And— and— but we did try to sort of think through from the very beginning, like, how can this unfold in a way that the audience will understand that—

Amelia [00:10:18] Yeah.

Vickie [00:10:19] It's a bigger deal than— than we think it is? And— and to your point like there— especially sort of lately, it might feel even more obvious to us how we are influenced when we are on these platforms like—

Amelia [00:10:33] Mmhm.

Vickie [00:10:33] How they're advertising things to us that we were just talking about or just searching or things that are obviously relevant to our lives and we can sort of piece together that they're surveilling us and feeding us back these advertisements. But there's, like, so much that we don't see. And I— and I love—

Amelia [00:10:49] Mmhm.

Vickie [00:10:50] Tristan— Tristan's example in the film that it's like we have these Paleolithic hard-wired brains that are going to just have vulnerabilities because our brains really haven't changed their makeup since we were like cave people.

Amelia [00:11:05] Mmhm.

Vickie [00:11:06] And we're wired for social interaction, and we're wired to care what other people think, and we're wired to look at something that seems dangerous or salacious, right? Because those are like protective mechanisms in—

Amelia [00:11:17] Mmhm.

Vickie [00:11:17] Ancient civilization. But once there is this powerful— this powerful force that understands that wiring, it can start to sort of play us like an instrument. And it's the same as the way Tristan explains it. It's the same as— as a magician. A magician can, no matter how smart you are, no matter how many PhDs you have, even if you are a rocket scientist or a brain surgeon, your smartness can't— can't peek behind the curtain, like the trick still works.

Amelia [00:11:45] Mm.

Vickie [00:11:45] We still don't know how the magician hid the ball or whatever— whatever the trick is, right? Because it's exploiting that sensitivity or that vulnerability that all human brains have.

Amelia [00:11:56] Mmhm.

Vickie [00:11:56] So, there's a lot that we just, like, don't even know they're tracking and there's a lot that we don't even feel they're influencing. So, everyone feels like they have a handle on their own social media use [laughs].

Amelia [00:12:09] Yes.

Vickie [00:12:09] Everyone's like sure it's not happening to them. Like they're not being radicalized or lied to.

Amelia [00:12:15] Mmhm.

Vickie [00:12:15] It's the other people [laughs] who are being radicalized or lied to.

Amelia [00:12:19] Yes.

Vickie [00:12:19] And I would just go on to say, like, I think we're all vulnerable to that to some degree.

Amelia [00:12:23] Yeah, I would definitely agree with that. I cannot tell you how many times I hear people kind of say like, "Oh, well, I don't really care if Facebook knows everything about me or if Meta knows everything about me. Like, that's fine. Like, you know, I don't need privacy, like whatever, they can know whatever they want." But it's not just simply them knowing things about you. It's what they use that information to influence you to do. And I think that you are completely correct that we are all being influenced and radicalized in these ways in many different directions and many different aspects of our lives.

Amelia [00:12:57] And I think that, you know, because the business model of social media platforms is so heavily based on advertising and data mining for the sake of advertising, or selling your information to advertisers— definitely one of the areas where I think we often see it show up, or notice it show up perhaps, is in our purchasing patterns and in, you know, seeing the ad for the same thing, you know, 50 times. As you said Tristan put it, like— you are susceptible to that.

Amelia [00:13:27] It's hard to ignore it when you see— you know, for me, I always use the example of like Girlfriend Collective leggings, which I now own like eight pairs [Vickie laughs] of because they just showed up so much, you know? And then eventually I give them my email for something and then eventually they use all of this sales psychology in addition to all of the sort of psychological tactics happening on social media. And eventually, I give them my money and then I keep doing that.

Amelia [00:13:54] And I think one of the things that's— just to stay with this conversation around advertising for a moment, one of the things that has happened since the film is the real— how do I want to explain this— when Apple changed privacy settings on their devices and no longer— and like made people opt-in or allowed consent to apps tracking them across their phones, it has devastated Meta's ad business. And I'm talking to business owners at all stages of business who are saying, like, you know, it used to cost me— The Wall Street Journal had a podcast, it just did a great episode on this, about like it used to cost me $14 to attain a customer. Now it costs me $100 because Apple is not simply allowing Meta and plenty of other brands and apps to just track your behavior.

Amelia [00:14:40] So, I got a little off the scope here of what we're here to talk about [chuckles]. But I just think it's so insidious and it really all wraps into, like, the ways if you can start to reclaim your attention. Then I— as soon as I started that process, I immediately noticed a— the ways I was being influenced and then b— the new reporting and coverage of these influential tactics, which I think increased tenfold after The Social Dilemma was released. So now that I share that whole spiel [laughs heartily and Vickie joins in]—

Amelia [00:15:12] I want to again— my— all of my podcast interviews are like, we sprint, and then we, like, back up a little [Vickie laughs]. I think that's my— my cadence. So, thanks for bearing with me. You talked about, kind of, how much intention went into crafting the story— of writing the story of the film, getting this kind of arc. Can you walk us through like what do you see as kind of the core story of The Social Dilemma?

Vickie [00:15:35] Oh, man [Amelia and Vickie laugh together]. I think for— for me, you know, I can answer that on so many levels.

Vickie [00:15:41] Like there's a thematic sort of thought level, an intellectual level, which I hope the audience is tracking.

Amelia [00:15:50] Mmhm.

Vickie [00:15:51] That— that we— it sort of zooms out from, like, your relationship with your phone. And some people care, right? Like some people have— are having a bad time with their relationship—

Amelia [00:16:03] Mmhm.

Vickie [00:16:03] With the platforms on their device. And they're— they're being bullied—

Amelia [00:16:06] Mmhm.

Vickie [00:16:06] They're feeling FOMO. They are feeling more anxious or depressed. They're— and they're able to track or—

Amelia [00:16:14] Mmhm.

Vickie [00:16:14] Maybe not able to track that that's— that that's— those things are tied. You know, their psychological state of mind is tied to what's happening on these platforms and the ways the platforms are plucking at their strings.

Amelia [00:16:27] Mmhm.

Vickie [00:16:27] And I will— I will sort of pause and back up again to say, like, you know, talking to these— talking to these insiders from these companies— they sort of gave the algorithm a goal. Right? They said, you know, you can give an algorithm any goal. You can show an algorithm your closet and then show it the weather and say, pick out the best outfit for me, right?

Amelia [00:16:49] Mmhm.

Vickie [00:16:49] And then they'll do a tabulation [chuckles]— said algorithm does a tabulation based on—

Amelia [00:16:54] Yeah.

Vickie [00:16:54] Whether it's sunny or rainy and windy and whatever. And then you sort of like— it spits out the right outfit for you. So, the algorithm in the case of social media was to keep people online—

Amelia [00:17:05] Mmhm.

Vickie [00:17:05] Just keep people on our platform, right? Because the more they're here, the more we can collect data on them, the more we collect data on them, the more we can understand them.

Amelia [00:17:15] Mmhm.

Vickie [00:17:15] More importantly, the more we can understand patterns across hundreds of thousands of people.

Amelia [00:17:20] Mmhm.

Vickie [00:17:20] So, in order to group people as like— these 10,000 are going to be susceptible to this, those 100,000 are more susceptible to that. And so, once they have sort of those unimaginable amounts of data— trillions of pieces, like, per day [laughs] coming in—

Amelia [00:17:35] Mmhm.

Vickie [00:17:35] An algorithm can process in a way no human being could process and start sort of sorting us. And the algorithm itself figured out that the best way to keep people online was by exploiting their fears, their doubts, and their insecurities.

Amelia [00:17:55] Mmhm.

Vickie [00:17:55] So, the algorithm wasn't asked to make us all angry [laughs], nor was the algorithm asked to make us feel connected and good. The algorithm was just asked—

Amelia [00:18:03] Mmhm.

Vickie [00:18:03] For our attention, and it took a shortcut because it realized it could keep our attention best—

Amelia [00:18:11] Mmhm.

Vickie [00:18:12] For the most part, on a grand scale, if it kept feeding us things that sort of preyed on our underlying fears, doubts, and insecurities. Whether those were fears and doubts about what a politician or political party was going to do, or fears and doubts about how we looked and [laughs] what our friends thought of us.

Amelia [00:18:32] Mmhm.

Vickie [00:18:32] So, from this huge— from the societal level, the global news level— what it's going to choose to show you.

Amelia [00:18:39] Mmhm.

Vickie [00:18:39] The posts your friends are posting that it chooses to highlight for you. You know, they're just trying to keep you on the platform. It doesn't care whether it's making you angry or— or not—

Amelia [00:18:52] Yeah.

Vickie [00:18:52] Making you sad or not, making you depressed, anxious, more likely to self-harm—

Amelia [00:18:56] Mmhm.

Vickie [00:18:56] Like it doesn't care. It doesn't actually know you're a person with a life. It just sees you as— as an objectified thing that is, like, to be exploited for time and attention. So, it wants your eyeballs, and it's willing— it doesn't care about you, so it's willing to do whatever to get your eyeballs.

Vickie [00:19:14] So, it's radically agnostic in that way in that it just like— it's not— it doesn't have a negative, it doesn't have a [laughs]—

Amelia [00:19:21] Yeah.

Vickie [00:19:22] Anything for you. It just has decided this is the easiest way to get its job done. That's sort of, you know, the film, I hope, is helping people understand that yes, they have their own personal relationship with their phone, that that is dictated by an algorithm that has been—

Amelia [00:19:37] Mmhm.

Vickie [00:19:37] Trained to do anything necessary in order to keep their eyes on the phone. That that algorithm mostly chooses to exploit fears, doubts, and insecurities [laughs wearily], and that the— and then zooming out from that, the collective result of that is that it seems that everyone is losing their mind everywhere all at once.

Amelia [00:19:57] Mmhm.

Vickie [00:19:57] It seems that democracies all over the world are losing their democratic values everywhere all at once. It seems that more and more fake news is proliferating everywhere all at once. You know, it's happening in Myanmar, it's happening in Indonesia, it's happening in Brazil, it's happening in the United States.

Amelia [00:20:14] Mmhm.

Vickie [00:20:15] So, it's rare for that sort of global phenomenon to be, you know— it's like, how is it possibly cropping up in all these places at the same time? Oh, yeah, we are all subject to these same algorithms. And so, I think they play a really big part in some of the trend— the global trends we're seeing towards unrest and away from democracy.
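
To make the mechanism Vickie describes here concrete for technically minded listeners: an engagement-maximizing feed can be sketched as a ranking loop whose only objective is predicted attention. The Python sketch below is purely illustrative; the item labels, the dwell-time numbers, and the rank_feed function are invented for this transcript, not taken from the film or from any platform's actual code.

```python
# A toy engagement-maximizing feed. The only objective is
# "keep this user here"; all names and numbers are hypothetical.
import random
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    emotional_tone: str       # metadata the optimizer never reads
    avg_dwell_seconds: float  # learned from past users "like you"

CANDIDATES = [
    Item("Calm nature clip", "soothing", 35.0),
    Item("Friend's vacation photo", "neutral", 22.0),
    Item("Outrage-bait political post", "enraging", 88.0),
    Item("Before/after body-image reel", "insecurity", 74.0),
]

def rank_feed(items: list[Item], explore: float = 0.1) -> list[Item]:
    """Order the feed by predicted dwell time (epsilon-greedy).

    Note what is absent: there is no term for truth, mood, or
    wellbeing. The 'enraging' and 'insecurity' items rise only
    because they historically held attention longer -- the radical
    agnosticism described above.
    """
    if random.random() < explore:
        return random.sample(items, k=len(items))  # occasional exploration
    return sorted(items, key=lambda it: it.avg_dwell_seconds, reverse=True)

if __name__ == "__main__":
    for item in rank_feed(CANDIDATES):
        print(f"{item.avg_dwell_seconds:5.1f}s  {item.title}")
```

Run it and the outrage and insecurity items land at the top of the feed, not because anyone asked the loop for anger, but because dwell time is the only number it can see.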

Amelia [00:20:36] Mmhm. Yeah. I think— I love the arc you just traced, and it is to me, like, the core and hallmark of the film— that movement from, like, us having our personal relationships with our phones to seeing global trends and how they are interconnected, all through these social media platforms. So there is a connection between, you know, me scrolling on a social media app and unrest in— in Myanmar, right? And those are connected through Meta at this point— which I guess wasn't called Meta back then [laughs].

Vickie [00:21:12] And I think that for me was a much more powerful realization than like caring whether or not I was wasting time on my phone.

Amelia [00:21:22] Mmhm.

Vickie [00:21:22] So, for the first half of making The Social Dilemma I still had— I still used Instagram on a regular basis because I was just habituated to it, like my finger—

Amelia [00:21:32] Yeah.

Vickie [00:21:32] Would go open that app if I was, like, peeing on the toilet [laughs]—

Amelia [00:21:36] [Laughs] Yes, the bathroom.

Vickie [00:21:37] You know, you just have these little breaks and your finger kind of almost does it without your—

Amelia [00:21:41] Mmhm.

Vickie [00:21:41] Without your conscious decision to reengage in this platform, you're just like, "I'll just check this because it's here."

Amelia [00:21:48] Mmhm.

Vickie [00:21:48] And I didn't feel like I was depressed or anxious, so I was like, "I don't really feel a personal need to step away from this platform."

Amelia [00:21:55] Yeah.

Vickie [00:21:55] And then when I fully saw the grotesque nature in which, like, at the very least, me signing into the platform was feeding a monster that was—

Amelia [00:22:08] Mmhm.

Vickie [00:22:08] Causing all of these problems around the world [laughs]. I was like, "Even if I'm not directly the target of this monster today, right now—"

Amelia [00:22:17] Yeah.

Vickie [00:22:17] "I'm still feeding it. And I don't want to be feeding it because I see the harm that it's causing elsewhere." So, I don't want to give Facebook or Google anymore [laughs] of my data—

Amelia [00:22:28] Yeah.

Vickie [00:22:28] If I can help it— because I just don't want them to keep— to sort of keep running amok like this. I don't want to be part of it. I don't want to be fueling it.

Amelia [00:22:37] Yeah, I felt the same way [Vickie laughs]. Partially because of this— this film [laughs] that you wrote and helped create.

Vickie [00:22:45] It's great to hear that— that those— that those things connected for— [laughs] for some of our audience members.

Amelia [00:22:51] Yeah, definitely. And— and I think that it— this movie, I think it is starting to become a movement— I think we could almost call it that— of people leaving social media, you know, for reasons at all levels of the scale we're talking about. Some people are leaving for their mental health, and we are seeing a lot more discourse around social— the impact that social media has on the mental health of individuals.

Amelia [00:23:13] Other people are leaving because of community-level experiences. Other people are leaving because of, like, platform-level problems or global politics. And it's exciting to me to see that happen. And I think, you know, the perspective I try to bring to Off the Grid is that because so many business owners are on social— business owners are on social media, we have to think— it's not just a personal decision, it becomes a business-level decision for us. And there's a whole 'nother layer when you're using social media to make money or to bring clients into your business and that becomes challenging.

Amelia [00:23:49] But I thought it was really promising when we saw major global companies like Lush leave all Meta companies last year after a whistleblower revealed that Facebook had internal data, that Instagram was incredibly harmful to particularly pre-teen and teen girls and people socialized as girls.

Amelia [00:24:11] And so, you know, Lush says that's our core audience but we are not going to take part in hurting our core dem— like the people that we serve. And so, we're out [laughs lightly]. And I just thought that was— that felt really powerful to me.

Amelia [00:24:24] And I think we can make those choices, whether you're a giant global corporation or, like, you know, me, a business of one in Nebraska [laughs and Vickie joins in].

Amelia [00:24:33] Going back yet again to something you shared before, you know, you started talking about the algorithm and the choices the algorithm makes. And one of the most striking elements of the film, everyone talks about it, is the way that your— you and your team chose to personify the algorithm as a human, as making these intelligent choices. So could you— well and I'll just add too like now we see that everywhere, like before The Social Dilemma, I don't think people talked about the algorithm [Vickie laughs] as, like, a sentient thing. And, like, after The Social Dilemma, it's like The Algorithm capital T capital A [Vickie laughs]. We all talk about it as— as personified. So, I think it's had a huge impact— that choice has made such a big impact. I'm wondering, could you talk us through how you came to the decision to do that?

Vickie [00:25:23] Sure, yeah. I think that at some point the director, Jeff, was really— you know, we were— we had all these amazing interviews but they're back— when you try to string them together, it's like back-to-back-to-back—

Amelia [00:25:36] Yeah.

Vickie [00:25:37] Technical interview with a— with a very smart person, sometimes using quite a bit of, like, Silicon Valley jargon. Right?

Amelia [00:25:44] Yeah.

Vickie [00:25:44] So— so as the writer, I'm like, "Hold up. Like, [Amelia laughs] we need some breath. We need some— some characterization either of these people who we're interviewing or in some other way."

Vickie [00:25:56] And Jeff had the idea of like, what if we were to go into our phone and actually sort of see what's happening in there—

Amelia [00:26:02] Mm.

Vickie [00:26:02] Like what's happening in there? What's happening on the other side of our device, if you were to follow through the device to the force that is the thing that's notifying you?

Vickie [00:26:14] What is that thing that's, like, sending a new notification and how are they deciding to send it? So— and obviously it's not interesting to make— to just film like enormous banks of computers—

Amelia [00:26:28] Yeah.

Vickie [00:26:28] As they— we can't relate to what they're doing when you, [laughs] you know—

Amelia [00:26:32] Mhm.

Vickie [00:26:32] And I think Justin in the film talks a little bit about how, like, there's enormous, like, under— banks of computers under the ocean and in these, like, deserts. And like we can't even believe how much actual, like— how many computers it takes to run the algorithm, because it is handling—

Amelia [00:26:50] Yeah.

Vickie [00:26:50] Such a capacity for data, right? That it's just like— it's definitely not one person or one building or one neighborhood worth of computers [laughs]—

Amelia [00:26:58] Yeah.

Vickie [00:26:59] More of a bank of computing technology that's going into this constant decision-making. So, the idea— Jeff had the idea of, well, what if we just had— we personified the part of the algorithm that is taking care of one person on the other side of one phone. And then he and I worked to think about, "Okay, but if we do that, then it's also important to see that one person on the other side of the phone so we can kind of see how it's having a real effect on their life."

Amelia [00:27:29] Mm.

Vickie [00:27:29] And at the time we were reading, like, all of these stories about how young people, especially young men, were getting radicalized online, going from some bad experience or some feeling of being misunderstood or some sense of isolation into finding belonging that was entirely directed by the YouTube—

Amelia [00:27:48] Mmhm.

Vickie [00:27:48] By the YouTube algorithm, because—

Amelia [00:27:51] Yeah.

Vickie [00:27:51] A huge majority of what people watch on YouTube is the video that the algorithm feeds them next.

Amelia [00:27:57] Yes.

Vickie [00:27:57] Not something you went to go search to find. So—

Amelia [00:28:00] Yeah.

Vickie [00:28:00] And then you— and then when you research— okay, well, what types of video does the algorithm send you next? It's like, by and large, something that's slightly more extreme than whatever you came to look for yourself.

Amelia [00:28:11] Mmhm.

Vickie [00:28:12] So, it's— it's designed as a [laughs] pathway for radicalization, because once you're radicalized, then you can kind of only find connection in this fringe online world. And— and that's what they want. They want you to feel like you have to come back to that— to the— to the online world, to the platform in order to get your dose of dopamine, in order to get your connection.
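
Vickie's "slightly more extreme next video" point can likewise be sketched as a drift process: if each autoplay recommendation is biased a notch above the intensity of the last watch, repeated hops carry a viewer far from their starting point. The step size, the intensity scale, and the next_video function below are invented for illustration and are not YouTube's actual recommender.

```python
# Toy illustration of recommendation drift toward the extreme.
# The 0.08 "bias" and the 0-to-1 intensity scale are hypothetical.

def next_video(current_intensity: float, step: float = 0.08) -> float:
    """Return the intensity of the recommended next video.

    In this toy model, the recommender has learned that content a
    bit more intense than the last watch retains attention best,
    so each hop nudges the baseline upward (capped at 1.0).
    """
    return min(1.0, current_intensity + step)

if __name__ == "__main__":
    intensity = 0.2  # the viewer started on a mild how-to video
    for hop in range(1, 11):
        intensity = next_video(intensity)
        print(f"autoplay hop {hop:2d}: intensity {intensity:.2f}")
```

Ten autoplays later the toy session sits at 1.00, nowhere near what the viewer originally searched for, which is the pathway-to-radicalization shape described above.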

Vickie [00:28:32] So, then we sort of built out this character of the algorithm, who is very much based on conversations we had, again, with tech insiders. They say like the algorithm builds a model of you and then it has a sense of— and then it constantly tests content against its model of you and its model of you is based on however many thousands of points of data it has on you.

Vickie [00:28:57] Oftentimes it knows you better than, like, your mother knows you. And that makes sense, right? Because it knows what you're doing 24 hours a day, seven days a week—

Amelia [00:29:07] Yeah.

Vickie [00:29:08] Before you, you know, in the case of young people, since before you even knew that you had given away any of that [laughs].

Amelia [00:29:15] Yeah. It knows I need leggings [laughs].

Vickie [00:29:17] Right?

Amelia [00:29:17] It's watching.

Vickie [00:29:19] And you didn't— and it has also categorized you based on what it knows about 20,000 people just like you, and a bunch of them bought the leggings. So, that's a good indicator that you probably—

Amelia [00:29:31] I will buy them.

Vickie [00:29:31] Will too, right [laughs]?

Amelia [00:29:32] And I did. Many times [laughs].

Vickie [00:29:36] And then you did. And they knew that so then they could find some new person that was kind of like you and start advertising leggings to her. And if it was just leggings like this would not be so scary. But it's also political ideologies and—

Amelia [00:29:50] Yeah.

Vickie [00:29:50] And misinformation and all kinds of things that are being marketed in the same—

Amelia [00:29:53] Mmhm.

Vickie [00:29:53] Way, not just leggings. So yeah, I guess that's sort of how we came up with it. It was, like, fun to think about the— the ways in which the algorithm has different goals at different times. Sometimes it's trying to engage you, other times it's trying to sell you something, other times it's trying to pull you into more connection— and that will also continue to engage you. So, it's growing its— it's growing its general web of connections.

Amelia [00:30:18] Mmhm.

Vickie [00:30:18] It's— it's sending you stuff it thinks you won't be able to look away from. And then it's choosing what you will be vulnerable to, of all of the things that its customers, the advertisers— whether they be leggings or a political candidate—

Amelia [00:30:33] Yeah.

Vickie [00:30:33] Want— want you to believe and do. And so, you know, the promise that the company is making to its customers— and the customers are the advertisers— is: we can change our users' behavior—

Amelia [00:30:49] Mmhm.

Vickie [00:30:49] Tell us what you want them to do, what you want them to buy, what you want them to think. We can get them to do that. You need—

Amelia [00:30:57] Yeah.

Vickie [00:30:58] 10,000 votes in such and such [laughs], you know, in such and such county. Okay? Like we have a lot of data on those people. We can find the ones who will be most susceptible to your messaging. Or you need to sell more leggings, we can find the women who will be most susceptible to your messaging, and we even know what day of the month she'll be most susceptible to your— to your messaging. So, we'll send it to her right before she gets her period [laughs] or right after—

Amelia [00:31:25] Yeah.

Vickie [00:31:25] She gets her period.

Amelia [00:31:27] Yeah.

Vickie [00:31:27] So, yeah. So, I think it's like— that's what we wanted to sort of personify— those three goals of the algorithm. And Vincent Kartheiser does a really fun, really fun job at being those three forces in the room that are on the other side of your phone.
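
The "model of you" and the lookalike targeting in this exchange (20,000 people just like you, and a bunch of them bought the leggings) can be sketched as nearest-neighbor matching over behavioral features. Everything below, from the feature names to the 0.9 cutoff, is a hypothetical illustration rather than any ad platform's real pipeline.

```python
# Toy lookalike targeting: score prospects by how closely their
# behavior vectors match known buyers. All data is invented.
import math

def cosine(u: list[float], v: list[float]) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Hypothetical features: [fitness posts liked, ad clicks,
# late-night scroll hours, athleisure searches]
KNOWN_BUYERS = {
    "buyer_1": [0.9, 0.7, 0.3, 0.8],
    "buyer_2": [0.8, 0.6, 0.5, 0.9],
}
PROSPECTS = {
    "user_a": [0.85, 0.65, 0.4, 0.85],  # behaves a lot like the buyers
    "user_b": [0.10, 0.20, 0.9, 0.05],  # not similar at all
}

def lookalike_score(user: list[float]) -> float:
    """Average similarity to known buyers: the model's guess that
    this user 'will too' -- susceptibility, not stated intent."""
    return sum(cosine(user, b) for b in KNOWN_BUYERS.values()) / len(KNOWN_BUYERS)

for name, vec in PROSPECTS.items():
    score = lookalike_score(vec)
    print(f"{name}: {score:.2f}", "-> show the ad" if score > 0.9 else "-> skip")
```

Swap the leggings for a political message and the same arithmetic finds the most persuadable people in a county, which is the part of the business model the film flags as dangerous.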

Amelia [00:31:42] Yeah, it's— it's so powerful to watch. It was— it is so powerful to watch. And I guess I haven't said yet: anyone who hasn't watched this, you should go watch it [Vickie laughs] on Netflix. If you have not, like— pause. You don't have to finish this interview. Just go watch [laughs and Vickie joins in] the film.

Amelia [00:32:00] Because it's so— it just puts together everything we're talking about so eloquently. And I think that what you're— again I just keep coming back to, you know, for the kind of the audience of this podcast, we're primarily business owners or people with personal brands who are using— if you're using social media platforms, using them to make money. And some people are doing that through paid ads. And in doing so, you are in some— you are pay— paying into this system then.

Amelia [00:32:29] Other people are simply— are not doing paid ads but are present on the apps and use them to make connections and use them to, you know, do other things. And I'm not, obviously, maybe it's not obvious, but I want to say clearly, I'm not trying to demonize those individuals.

Amelia [00:32:43] It's complicated to run a business within capitalism, especially if you're someone who is anti-capitalist or capitalist-critical, or just trying to be aware of what's [laughs wearily] happening. But I think this— you know for me, The Social Dilemma just really served as an invitation to think about all of the different ways that being a business owner also gave me some more capital than I perhaps had before, and so I could make different choices.

Amelia [00:33:08] I think one of the things I guess I'm wondering as a writer of this film, like for you personally or for the whole team that worked on it, what do you hope people take away or do after they watch The Social Dilemma?

Vickie [00:33:23] So, I think that this is a multi-faceted, systemic, and global problem—

Amelia [00:33:29] Mmhm. Yeah.

Vickie [00:33:29] And it is deeply interconnected with the— all of the problems that our economic system has sort of birthed—

Amelia [00:33:38] Mmhm.

Vickie [00:33:38] And that we are now—

Amelia [00:33:40] Yeah.

Vickie [00:33:40] Having a reckoning about and with. And so, there's definitely not like a list of like these three things will fix [Amelia laughs] everything—

Amelia [00:33:48] Yeah.

Vickie [00:33:48] But we have an impact campaign and a whole impact production team that is dedicated to the film having an impact [laughs] as— as it sounds.

Amelia [00:34:01] Yeah.

Vickie [00:34:01] It is what it sounds like [Amelia chuckles]. Having an impact beyond just people watching it and saying, "Whoa, that's crazy." Or, "Yeah, I already knew that," or whatever [laughs].

Amelia [00:34:09] [Chuckles] Yeah.

Vickie [00:34:09] And it's great if you can sign off your— your apps for a little while or forever, but there is a larger systemic change, obviously, that needs to happen. We want the film to be part of catalyzing those changes.

Vickie [00:34:24] It's— there's sort of three pillars of the impact campaign and you can learn all about it at thesocialdilemma.com.

Vickie [00:34:31] One being how we change how we use and interact with the platforms.

Vickie [00:34:37] Another, like, on the community-level and—

Vickie [00:34:40] Then also on how we regulate them— the community level involves also, like, how we design them in the first place, because there is—

Amelia [00:34:48] Mm. Mmhm.

Vickie [00:34:48] Also the ability for folks who are designing an app or designing a platform or currently working at these places to say, "This isn't working, this is not sustainable, this is not good for humanity, this is not in line with our goals or our ability to thrive. So, let's redesign it [laughs]."

Amelia [00:35:08] Yeah.

Vickie [00:35:08] Because there's a lot of great things about Twitter, Facebook, Google that you could have— you could have a powerful search engine that didn't track you and manipulate you—

Amelia [00:35:18] Mmhm.

Vickie [00:35:18] You could have a social network that allowed you to reconnect with old friends that didn't track you and manipulate you. So—

Amelia [00:35:26] Mmhm.

Vickie [00:35:27] The positive parts of these platforms are real. We don't need the negative parts in order to have the positive parts, I don't think. And so— so I think that's a part of it is how we use our phones, how we design this stuff in the first place, and then also how we regulate it, which I think—

Amelia [00:35:45] Mm.

Vickie [00:35:45] Shoshana Zuboff is another great resource for thinking about that regulation, as is the Center for Humane Technology. If you want to, like, dive in deeper, I think those are some—

Amelia [00:35:56] Yeah.

Vickie [00:35:56] Resources to— to think about how we could just create more freedom and liberty for the individual humans of the world by regulating what Facebook and Google and companies like them can do.

Amelia [00:36:09] Mmhm.

Vickie [00:36:09] Sometimes people think of, like, regulation as impeding freedom and liberty. But I think in this case, and actually, in quite a few cases, they would be protecting the freedom and liberty of three billion users by saying, "No, you're not allowed to take away agency from these 15-year-olds who are addicted to your platform [laughs]." And you need to find another way to monetize your business, basically.

Amelia [00:36:36] Yeah, I think that— I don't know, I just couldn't head-nod enough while you were talking [laughs].

Amelia [00:36:41] So, yes, I will direct everyone again to thesocialdilemma.com for the impact campaign on these different pillars. And I will also just add that I kind of a— talking about things being algorithm-free as a shortcut I've used on this podcast because we also have this shorthand like The Algorithm.

Amelia [00:37:01] But what you're pointing to is like algorithms in and of themselves do not have to be harmful or dangerous. They do not have to be manipulative in these ways or spread misinformation or influence us to buy things.

Amelia [00:37:14] Algorithms can be super helpful. Like, I like that if I watch a rom-com on Netflix, it serves me another one. That's great [Amelia and Vickie laugh together]. You know, that's a pretty, like, straightforward— you did this, so probably this [laughs again]. Fine.

Amelia [00:37:27] Or when I watched The Social Dilemma on Netflix, now it, like, serves me My Octopus Teacher. Beautiful [Vickie laughs]. Give me more.

Amelia [00:37:35] But I think that— I think about this a lot because working in podcasting, I work in an industry where recommendations aren't algorithm-ized yet, algorithmically served, like the major podcasting apps. It's still all, like, earned and paid placement. They don't— most of them do not have algorithms in place to serve us more podcast recommendations. And it's a complaint I get from a lot of podcast listeners that they want more things that will, like, track their behavior and serve them things that they might like [laughs lightly]. So, I think about this a lot and how different that is from something like YouTube where you are just being fed more and more extreme things. So—

Vickie [00:38:15] Yeah, like early days of Pandora or whatever, you know— if you use an algorithm to try to figure out what might be, like, good, useful information for someone, or what they might like— like, there's a lot of wiggle room in there that is different than trying to keep them—

Amelia [00:38:36] Mmhm.

Vickie [00:38:36] Listening to podcasts forever for their entire day [laughs and Amelia joins] and what you might be able to, like, do that in a humane way, right?

Amelia [00:38:48] Mmhm.

Vickie [00:38:48] Like you could design suggestions in a way that was not ultimately—

Amelia [00:38:53] Mmhm.

Vickie [00:38:53] Radicalizing or inhumane. And you can also design into whatever platform you're making, these stopping cues so that people step away.

Amelia [00:39:03] Mmhm.

Vickie [00:39:03] And you can also design like a perturbation where you show them something totally different and totally opposing of— of the view that they have been—

Amelia [00:39:12] Mmhm.

Vickie [00:39:12] The path that they went down. So, all of these things were made by designers who made them the way that they are [laughs].

Amelia [00:39:20] Yeah.

Vickie [00:39:20] And you could pay different designers, or designers designing from a different perspective, and— and make them a different way. Like, they are not set in stone. It is not inevitable that they act this way.

Amelia [00:39:32] Mmhm.

Vickie [00:39:32] It's a very weird and Wild West type of world that we're in, with this totally unregulated space around this particular business model. And so, it doesn't have to look like this.

Amelia [00:39:46] Yes.

Vickie [00:39:46] I think that's the most feminist— like, my earliest feminist realizations were like—

Amelia [00:39:50] Yeah!

Vickie [00:39:50] This is really arbitrary [Amelia and Vickie laugh together]. It doesn't have to be this way.

Amelia [00:39:54] When— I cannot co-sign that enough. That is like the entire message of this whole podcast and probably my entire business and life. Like it does not have to be this way. We can do something different. It can be different.

Vickie [00:40:10] Yes.

Amelia [00:40:11] [Deep sigh] And so on that note, Vickie, thank you so much for joining me on Off the Grid. I'm so grateful to you for working on this film, for being here today, and speaking to our listeners.

Amelia [00:40:23] Listeners, if you're tuned in, thank you for making it all the way to the end with us. You can find show notes with all of the links of everything we talked about at softersounds.studio/off-the-grid or just head to wherever you're listening to this and all the links will be there as well.

Amelia [00:40:43] Don't forget, while you're there to check out The Refresh, I hope to see you at those workshops in August. [Outro music begins to play] And until next time, we will see you off the grid, friends. Bye for now.

Amelia [00:41:04] Thanks for listening to Off the Grid. Find links and resources in the show notes and don't forget to grab your free Leaving Social Media Toolkit at softersounds.studio/byeig. That's softersounds dot studio slash b-y-e-i-g.

Amelia [00:41:20] This podcast is a Softer Sounds production. Our music is by Purple Planet and our logo is by n'atelier Studio. If you'd like to make a podcast of your own, we'd love to help. Find more about our services at softersounds.studio. Until next time, we'll see you off the grid.

Creators and Guests

Amelia Hruby
Host — Founder of Softer Sounds podcast studio & host of Off the Grid: Leaving Social Media Without Losing All Your Clients

Vickie Curtis
Guest