
Watchtime episode 11: The Social Dilemma – What’s the problem?


In this episode, we discuss one of the most popular and controversial Netflix documentaries: The Social Dilemma. We analyse how tech experts who once worked for some of the largest tech giants are now battling against them. Social media algorithms and people being treated as products are some of the topics this documentary covers. So, what is the real problem here?

  • Is it that advertisers have access to our data through social media channels? 
  • Is it the algorithm itself? 
  • Or are the platforms designed in a way that takes control over our behaviour?

In the end, the responsibility is ours to manage how we want to use social networking sites. Moreover, as fake news and misinformation become ever more widespread, another problem arises: who owns the real truth on social media channels if everyone has their own version?

Find the answers to all of this by staying tuned and following the discussion here!

*This episode was multi-streamed with Restream.

Transcript

Elliott King (00:07):
Hello, and welcome to another episode of WATCHTIME. My name is Elliott King.

Aleksandra King (00:11):
I’m Aleksandra King.

Elliott King (00:12):
… and this show is brought to you by MintTwist, the digital agency. Check us out at https://minttwist.com. And it’s multi-streamed across multiple video and audio channels with the amazing Restream software; you can check them out at Restream.io. So, Aleksandra, what are we talking about this week?

Aleksandra King (00:31):
Well, today we’re looking at a brilliant Netflix documentary that’s come out, called The Social Dilemma.

Elliott King (00:38):
It is a brilliant documentary. I have to say, I was absolutely fascinated when I watched it. And I think lots of people think the same, because it’s been very popular; lots of people have been talking about it. Now, why do you think that is?

Aleksandra King (00:51):
Well, let’s just discuss what it is first. So for those of you that haven’t watched it, it’s really about how these tech giants manipulate us. You know, it’s not only about them seeing our data and things like that. It’s about them literally manipulating our brains and changing our behaviour, which is very scary.

Elliott King (01:10):
It is, it is. And the programme starts off with a collection of ex-social media employees, so people who used to work at Facebook and Google and Twitter and so on and so forth. They have essentially switched sides and are now actively campaigning against the power of these networks and their influence over our minds.

Aleksandra King (01:31):
That’s right. Some of them developed things like the like button, and they have an excellent understanding of exactly what algorithms go into all this manipulation. So great insider information. We have the former president of Pinterest and ex-director of monetisation at Facebook, Tim Kendall, brilliant guy. We have the author of Ten Arguments for Deleting Your Social Media Accounts Right Now, Jaron Lanier. And we have the former VP of growth at Facebook, who is now CEO of Social Capital. So we really have some amazing people in this documentary.

Elliott King (02:07):
Yeah, they certainly know their stuff. But the interesting thing for me was, in the first five minutes of the show, the producers ask these individuals, “So, what exactly is the dilemma? What is the problem with social media?” And it’s very interesting, because they find it quite difficult to articulate what the problem is, certainly succinctly. So the rest of the documentary spends the best part of an hour and a half actually delving deeper into the potential problems. And they do this by interviewing these experts, but they also intertwine the interviews with a dramatised look at the impact on teenagers within a family setup.

Aleksandra King (02:51):
It’s a little bit like a vitamin deficiency, where something is starting to go a little bit wrong. You’re not quite sure what it is; you’re feeling a bit iffy, not quite right, but there’s nothing very distinct actually happening. So it’s very subtle, and it kind of creeps up on you until it’s too late. That’s how I see it.

Elliott King (03:13):
Absolutely. And for those of you out there who have children, young children: we’ve got children and they’re on social media, and it is a concern.

Aleksandra King (03:24):
And we have a constant battle, because he’s the tech guy; he loves devices. Any birthday, it’s like, “Yeah, I’ll get you a new device.” And I’m the one going, “Whoa, pull in the reins, enough, go play outside.” So, you know, we have different opinions, and I totally understand yours. You can’t cut kids off from technology completely. They need their friends; they need that social circle. And, you know, with a lockdown that’s even more important. However, we also need to make sure that they’re actually leading healthy lives and not just fulfilling the profit dreams of tech giants.

Elliott King (03:57):
Yeah. And actually, it’s a balance, like most things in life: if you get the balance right, you’ll probably be okay. But going back to the specific problem that they in the end articulated: most of the content that we see as users on social media is pushed in front of us in order to get us to form a specific opinion and actually to change our mind. Now, that might not seem completely obvious. But what I mean by that is that most of the content being shown to us is content that the algorithm believes we will enjoy, because the point of the algorithm is to keep us on the social network and engaged with the content.

Aleksandra King (04:49):
Yeah. I think the issue with that is that, for example, when the like button was developed, it was developed to spread positivity and love. And that’s not really what it’s being used for at all. The like button is used to track your patterns, and whatever you like, more of that will come your way, with the sole purpose of keeping you hooked on the platform, because the more you’re hooked and the more engaged with the content you are, the more money that company makes.

Elliott King (05:16):
That’s right. And to expand on that, you might think: why does the algorithm want to keep us on the networks? Why do they want to keep us engaged with the content? Well, it comes down to this age-old saying about social media that many of you will have heard: we as the users are not the customers, because we get the product for free. Actually, we are the product. And it’s very fashionable to say that the social networks are selling our data to the advertisers. I would say that’s not entirely true. In fact, it’s not true. The platforms have no interest in selling our data to advertisers. What they definitely do have an interest in is selling access to us, the right customers, to the right advertisers. They’re effectively a marketplace that’s matching advertisers and potential customers.

Aleksandra King (06:19):
Terrible. I don’t like to be thought of in that way, but…

Elliott King (06:22):
Unfortunately, I don’t think many people would argue with that, even the platforms themselves. So it’s kind of created this need: the algorithm has to keep us on the platform, because we can’t be served ads unless we’re on the platform. And moreover, it has to keep us engaged with the content, because advertisers certainly want to reach customers, but more importantly, they want to engage customers. They want customers to take some form of action. And so the algorithms have got very, very good at serving up content that will not only keep us on the platform, but that we will engage with and enjoy.

Aleksandra King (07:00):
Right. Well, what I found very interesting watching this documentary was that one of the former employees said, you know, trees are worth more to us dead than alive; whales are worth more to us dead than alive. And there’s a lot of money in mining our attention; our attention and our time are worth a lot more to them than our actual wellbeing as people. And that I find the most frightening. I don’t want to be mined for my time by anyone. I don’t like the idea of any social media platform even attempting to control me in any way like that. And although we all know this, it’s like, how do you actually stop it? How do you stop yourself from becoming addicted and mined for your time? How do you stop it?

Elliott King (07:55):
Well, the platforms themselves are talking about, you know, better AI. But for me, there’s a fundamental problem. And that is, when you’ve got a content platform and you’re monetising it by giving access to advertisers, and you’ve got an algorithm of the type that we’re talking about, it’s absolutely inherent that it’s going to create change in the users. If you think about marketing or advertising, the whole point of it is to elicit change in that target viewer. And frankly, it doesn’t really matter if the advert or the marketing material is at a bus stop or in the train station, in a newspaper, on traditional TV or on social media. But the specific problem we’ve got with social media is this ability for the platforms, in real time, to serve up different content to you based on your individual hopes, dreams and desires.

Aleksandra King (08:50):
Yeah. So if you take this content that you’re saying is being served up to us, and then you think about propaganda and fake news and all of that, you can see how we are in the situation that we’re in now, which is terrible, really. You have families that are completely torn apart and divided, because one half of a family could say to the other, “How could you even contemplate that side of politics? What planet are you living on? How can you have these views?”, when actually perhaps they might be the ones being manipulated in their own social media setting, because if they’ve liked a certain type of view, that will be the only view dished up to them, ever. And you could then say, well, social media is designed to make us narrow-minded, which is the absolute last thing you want if you want to have a democracy and a society that is empathetic, where we’re all understanding and communicate well. This social media, as it is now, is breeding division, breeding ignorance and a one-sided view. And that can’t be right in any way.

Elliott King (10:05):
And for me, you’ve just described it really well. This is the problem: because of the nature of the way the algorithm has to work, you could say that, from the point of view of the platforms, it has to give us what we want. And, you know, objectively, at first that might seem like a good thing, but this situation is the result.

Aleksandra King (10:26):
And, you know, I’ll tell you another thing. So, obviously I was on The Apprentice and I got a blue tick and all those wonderful things on Instagram. And then, you know, I found myself getting trapped in the Instagram thing. And also speaking to you, you know, promoting your platform and helping you grow, that’s all good stuff, but there was a point when even you said to me, you know what, if you want all your posts to do really well, then you’ve got to engage. You’ve got to put content out, and you’ve got to do it on a consistent and timely basis. And I remember I turned around to you and I said, no, I will not be governed by the number of likes I get on a picture. I will post when I want to post.

Aleksandra King (11:10):
And if I don’t want to post for three months, that is what I shall do. And it was difficult, because you do notice the drop in likes. But you know what? Don’t define yourself by your likes. Yes, get your engagement, but always be authentic to yourself. So probably step number one in healing this social dilemma that we’re in is: cut that cord, free yourself from that responsibility, use it in a more genuine way. You’ve got something great to post? Post it. Is it worth it? Is it going to make the world a better place? Do it. Is it going to cause polarisation and problems? Don’t do it. We all have choices to make.

Elliott King (11:49):
Yeah. Interesting. So you’re not advocating deleting your Facebook?

Aleksandra King (11:55):
I’m not saying that. I mean, it’s not all bad, you know. You want to be connected; you want to have these platforms, and businesses have to make money somehow. I’m not saying advertising is all bad. We’re just saying, ask yourself whether you’re doing the right thing or not, whether your content is genuine. And you know that when you check your screen time and you’ve been on there for three and a half hours, that’s a problem. That’s a huge problem for your kids if they’re getting addicted. So you’ve got to control this like you would anything else, any other drug, any other temptation. You’ve got to work at it.

Elliott King (12:29):
I think that’s very wise. If we can take some personal responsibility and manage ourselves, that works. And that can certainly work for, hopefully, most adults. I think the issue is younger users.

Aleksandra King (12:40):
Yes. And the issue is self-discipline, because if you can’t even get yourself into a gym, or you can’t do that little bit of exercise, how are you going to stop your addiction to social media? And I tell you, if you have that addiction, and if you’re feeling like a prisoner of your own social media, you have an issue to solve. You really do.

Elliott King (13:00):
Yeah. So what about regulation from governments and authorities? Do you think that’s…

Aleksandra King (13:09):
Well, actually the documentary does cover this, because if you take fake news, how can you regulate truth when everyone has their own truth? You know, come back to the family: one side of the family can say, “This is the right political view, you are completely wrong, how can you have another point of view?” And the other side will say, “No, you guys are wrong.” And everyone believes their own truth. And probably there’s a bit of truth on both sides, and there might be a little bit of fake news in between, but who is in charge of the truth? Google? Billionaires? Or your common sense? Just like within a family, communication is what you need to be doing, talking it through: let me hear your point of view. And so, when you like something, maybe like the opposition’s view as well, and then you might get a clearer picture in your feed.

Elliott King (14:05):
I think, for me, the truth about truth is that it’s always in the eye of the beholder, and different people have different truths, and that’s probably not necessarily a bad thing in itself. The problem, the dilemma, is when we get into these sort of reality bubbles, where our own view is constantly reinforced and never challenged. That’s clearly a problem.

Aleksandra King (14:26):
How can you listen to another person’s point of view if you’ve never, ever seen it in your feed? And listen, I feel sorry for them, you know, on Facebook. It just occurred to me the other day: I might only see ten friends’ posts because I like them, you know. What about all the other two hundred and whatever? I’m hearing nothing. I have no idea what they do, because Facebook thinks, “Oh, because you’ve liked that one, now you can only get that person.” It’s just not right.

Elliott King (14:49):
Yeah. So, we’ve hopefully covered some of the topics and given you some food for thought. What’s your sort of final view? Do you think we’re going to see an easy solution to this social dilemma?

Aleksandra King (15:06):
The solution has to come from each individual, and then their families as a whole, and then move into the wider community, and so on and so on. And really, like with vitamins and everything else, there’s an amount you’d be deficient at, there’s a nice medium amount, and then there’s too much. So just really exercise common sense with this. But monitor it, genuinely monitor it, and take it incredibly seriously, because it is an incredibly pressing and serious issue of our time, especially if you’re a parent.

Elliott King (15:43):
Yeah, that’s very wise. And from an individual perspective there are certain things we can do, as you rightly say. For me, it will be very, very interesting to see whether some of the government authorities around the world have a look at the way these businesses are set up and essentially consider splitting them up. So we’ve got the advertising side of the business, we’ve got the content side of the business, and we’ve got the algorithms working both sides of this coin for essentially one business. If you looked at potentially separating those things, it might mitigate some of these problems. You know, I don’t know.

Aleksandra King (16:24):
I don’t know. I wouldn’t necessarily rely on governments to do that.

Elliott King (16:30):
It’ll certainly take a while, if anything like that ever happens at all.

Aleksandra King (16:36):
Yeah. Take charge yourself. That’s what I say.

Elliott King (16:40):
So thank you very much for listening or watching, wherever you are.

Aleksandra King (16:44):
Thank you for joining us. We hope you found it informative and helpful in some way.

Elliott King (16:48):
Yeah. So, we had a comment from a viewer last week who said the show is wonderful and “light-heartedly informative”. So, I hope we’ve light-heartedly informed you with a little bit of interesting knowledge on The Social Dilemma.

Aleksandra King (17:04):
That’s right. Go and watch it if you haven’t; The Social Dilemma on Netflix is definitely worth a watch.

Elliott King (17:10):
Absolutely. Until next time. We’ll see you later.

Aleksandra King (17:12):
Bye.

Aleksandra King (17:13):
Thank you for listening to the WATCHTIME podcast brought to you by digital agency MintTwist.
