The Sidelines is the supplementary issue to every main edition of The Starting Block. Here you will find the interview transcript and more information about the conversation of the week. The interview is transcribed by Otter.ai and edited for length and clarity. All links provided come from me, and not the guest, unless otherwise stated.
Listen to the audio version here. You can also listen to the first part here or read the transcript here.
TRANSCRIPT
TINA CARMILLIA: Hello, my name is Tina Carmillia and this is The Starting Block, a weekly conversation on science and society with an emphasis on disinformation, data, and democracy.
Before we start, I’d like to let you know that the transcript and credits for this conversation are available on The Sidelines, the supplement to every main edition of The Starting Block.
Now, in the same lane as last week: filmmaker Ineza Roussille, to pick up where we left off. Our ‘hero’s journey’ this week takes us to online moderation, asking for our cats’ consent, and finding the Big Bad of the Internet.
Ready? Let’s go.
INEZA ROUSSILLE: Can I ask you some questions now?
TINA CARMILLIA: Yeah, absolutely, go on.
INEZA ROUSSILLE: How does doing this work affect you, personally? Are you completely paranoid about digital security all the time?
TINA CARMILLIA: When I started working on this, I had this very naive approach to it where I thought that I’ve got nothing to hide, so spy on me if you want. It’s a very bad approach to this.
[Edward] Snowden says that “arguing that you don’t care about privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”
INEZA ROUSSILLE: Wow, great quote.
TINA CARMILLIA: So, I stopped having that perspective because I feel like I do need to have an opinion about things, but because I didn’t come into this field with scepticism—I came with, like, an I’ve-got-nothing-to-hide kind of approach—it didn’t push me to the paranoid side. First I took this approach of “I want to learn everything.” I’ve arrived at the point where I realised that I can’t learn everything, I need to start forming opinions about things because things are moving a lot faster than the rate my opinions are forming. So I’m at the stage where, as much as I talk about new technologies and misinformation, I have to always remind myself that it’s not that they’re villains, you know? Technology is a tool, and we need to know how to use it because we’re the ones who are the toolmakers. I know it’s a wishy-washy answer but…
INEZA ROUSSILLE: No, no, I totally get it.
TINA CARMILLIA: I mean, I’m on all the platforms. I’m still on Facebook, I’m still on WhatsApp. I agreed to the terms and conditions that I argued against because I need to be on the platform, like, you are cornered.
INEZA ROUSSILLE: Yeah, totally.
TINA CARMILLIA: You’re given this illusion of choice. The thing is, you don’t really have a choice, you can’t exactly opt out of cookies on websites, because if you opt out you have no access to the website (Emily Stewart, Vox). So it’s an illusion of choice. I’m sure there’s a way to work around all of this where we don’t have to trade personal data and personal information for public information, for the news, for education, you know?
INEZA ROUSSILLE: But we’ve gone so far past that now. Do you ever think that we can actually roll it back to a time before cookies? It’s not gonna happen.
I constantly have this dilemma, as well, right? Facebook is the only social media I’m actively on. I have a Twitter thing but I hate Twitter. It’s so horrible and toxic. And on that front, that’s the only positive I can give Facebook, because Facebook—what I like about it is that it allows you to choose, for each post, if you want to make it public or private, right? You don’t have to choose between your whole account being public or your whole account being private. That’s what I like about Facebook, but other than that, I hate Facebook, as a company, as everything they stand for, I hate them. But at the same time, I’ve built 15 years of network on Facebook, and giving that up is so hard.
TINA CARMILLIA: It cannot be a one-person struggle because, yeah, you can leave Facebook and give a middle finger to [Mark] Zuckerberg with that departure, but no one’s going to notice your departure. So you need a mass action. Everyone has to do it, the same way as when the first iteration of WhatsApp’s new terms and conditions came out—because the wording was so poorly done, everyone thought that they’re giving away personal information to businesses, which, to be fair, is already being done (Manish Singh, Tech Crunch). But it’s being made more obvious with these terms and conditions that they’re forcing users to accept—and a lot of people made the switch to Signal and Telegram. So you need that kind of massive, widespread information blast. And you need the alternative already waiting for people to make that switch, because if you don’t have the alternative, something has to fill that void, and it’s going to be malicious actors.
INEZA ROUSSILLE: But I do feel like a lot of the time this kind of action—maybe it’s just me—but it feels very reactionary. So, people may make that behavioural change at that moment but it doesn’t last. Generally, people still use WhatsApp, even though they also use Telegram and Signal now, they’re still using WhatsApp, right? I don’t know, in the long term, whether that really had any effect on Facebook.
TINA CARMILLIA: But we’ve seen the deaths of a lot of platforms like MySpace and Friendster. Those were slow, miserable deaths that they suffered, and people made the switch from those platforms—
INEZA ROUSSILLE: To Facebook.
TINA CARMILLIA: Yeah.
INEZA ROUSSILLE: That was also when social media was kind of figuring itself out, right?
TINA CARMILLIA: Correct, yeah. The news feed was what was appealing about Facebook, and this is the villainous part of Facebook.
INEZA ROUSSILLE: It also felt like a safer—I don’t know, it probably was not safer at all then, but I feel like we were all just more naive. When you think about the things I put on MySpace and Friendster, or even the early days of Facebook, when you’d go out one night and put 200 photos on Facebook the next day? You would never do that now. That’s insane. Why would I give Facebook these photos and information? But it’s too late now, it’s already there.
I think one thing that definitely really annoys me—I’ve kind of gotten over it a bit now. Because I remember when Snapchat first started, and they had the whole, you know, disappears-in-24-hours thing? And there were a lot of security concerns at the beginning of that. And then I felt like people just got used to that behaviour. And then, of course, Facebook introduced it with InstaStories, everyone jumped on that, and then all the security concerns were completely out the window, and it became just completely normalised.
For me, I mean, firstly, I just have a lot of anxiety on social media, like people posting stuff without consent. But it’s also my job, right? Everyone I put on camera, I have to have explicit consent for and take responsibility for. What I saw with InstaStories is that people feel like they have a right to broadcast at any point, at any time, even in private spaces. If you’re in public spaces, then fine. But in people’s homes? I can have a bunch of friends over, and then at the end of the day, my cats will be on five people’s different Instagram accounts. And not one of them—
TINA CARMILLIA: Asked for your cat’s consent?
INEZA ROUSSILLE: Not just my cat’s consent, but my consent. Not one person thought, “Oh, this is someone’s private home, I should ask them for consent before I put their home up on my Instagram.” Even if it's going to disappear in 24 hours or whatever, I think that InstaStories kind of completely obliterated the idea of consent and private spaces, and that annoys me. That really annoys me.
TINA CARMILLIA: Because if it disappears in 24 hours, you can always do a screen recording. And there is an archive function for the stories. Even if it’s 24 hours to the public or to whoever is following my account, it’s forever for my account, and if you have access to my account, you have access to everything. The 200 photos from 2009? Still there.
INEZA ROUSSILLE: It’s also forever for Facebook right? We cannot be so naive to think that they are actually going to delete it after 24 hours. Like, no way.
TINA CARMILLIA: But we made those decisions a decade ago and we can’t take them back.
INEZA ROUSSILLE: Yeah, we can’t. It’s completely normalised now.
TINA CARMILLIA: Everybody’s probably just given up, you know? I’ve sold my soul, so now I’m just gonna continue doing it.
INEZA ROUSSILLE: I’ve kind of given up, because I’m always the weird one, who’s like, “Guys, consent?” And they’re like, “Ugh, okay.” It makes me feel like I’m the weirdo, the alien who’s like, don’t put that up on your InstaStories without consent.
TINA CARMILLIA: I was very apprehensive about Stories as well for similar reasons, but because I have to be on these platforms, I have two TikTok accounts—
INEZA ROUSSILLE: Why?
TINA CARMILLIA: Because TikTok is so algorithm-driven that I wanted to curate my feed based on certain personalities. So I have one that’s me, and another one, on another phone, that’s a made-up persona, so I can see the differences in the feed, in my recommendations.
I mean, I can choose to not be on these platforms, but along the way I learned how to minimise the risk and mitigate some concerns. There are certain rules that I have with anyone who lives with me. I don’t want people to be able to identify where I live based on a photo. So my Instagram Stories have images of beers that I drank, but you wouldn’t know where I drank them, because they’re just against the background of a tiled wall. When you look at the feed, you’d probably just think that I’m a beer drinker.
INEZA ROUSSILLE: I’m so glad you think like that too and it’s not just me. Because it’s not just the cat. It’s the entire house behind the cat.
TINA CARMILLIA: There have been many times that I’ve had to DM people saying, please take down that comment, or please take down that photo. I mean, images, that’s one thing, but sometimes it’s a comment. Sometimes I do everything I can to make it not obvious where I am or, you know, who I’m with or whatever. And then someone would comment, “Oh, isn’t that so-and-so? I saw them there this evening.” So you can place this person at this place so easily through that comment, and it might be an innocent comment, maybe acknowledging that we have a mutual acquaintance, but you could be endangering them.
INEZA ROUSSILLE: I think that’s totally fair. I wanted to ask you also, like, how different was radio from the kind of work you do now? Because I feel like you guys did a lot of talkback radio, and it’s a different form of having a conversation, as opposed to social media. How do you feel about that, do you feel it’s easier to control the information on radio, or not really?
TINA CARMILLIA: Moderation is such a tricky subject because when you talk about moderating conversations, everyone’s going to be up in arms about freedom of speech, and so on. But the thing about radio and other forms of traditional media is that you have gatekeepers, and I’m not using the gatekeeping word in a malicious kind of way. I mean, you’re screening these calls. I enjoyed being a producer more than I enjoyed being a presenter, and as a radio producer, it’s my job. I’m the one screening potential guests and speakers, doing the background checks, vetting their credentials. And when we put them on air, and we have phone lines open, I’m screening every call. I’m talking to them. Having a feel—and you need to have a nose for this kind of thing, especially when you don’t have any other cues but an audio cue, to be able to tell: are they going to trick me? Once they go on air, are they going to say something else and curse the government? And we’re the ones who are going to be fined, they would get away with it, we’re the ones who are going to be fined RM500,000 or whatever it was that Malaysiakini was fined for a user comment (Hidir Reduan Abdul Rashid, Malaysiakini).
So, with traditional media you have those roles, you have the news editor that gives you the okay for whether a story goes to print. With broadcasting, it’s the same, you have your editors and your producers that do the quality control, quality check around it.
But with social media, there’s no form of moderation. I mean, you have Facebook groups and stuff like that where you have moderators, you have online forums that have moderators, but they’re doing it voluntarily, doing it when they’re online, and people are online at different times. It’s a different kind of moderation, you cannot compare the two forms of moderation. So it’s very difficult with social media because you want to also say that it is a free-for-all space, but is it really? It’s not. It’s not, because sometimes it’s not about freedom of speech, but the monopoly on that freedom of speech—
INEZA ROUSSILLE: —Who has freedom of speech.
TINA CARMILLIA: Yeah. It’s tricky, moderating on social media, because you can post anything, and the moderation comes after, not before. Think about publication, let’s say for a newspaper versus for posts that you put up on Facebook. For print, you go through your editor, you go through rounds and rounds of vetting, proofreading, sub-editing before it goes to print, before it gets published. But if you publish anything on social media, it’s the reverse. You put it out first. And then someone has to complain to the Facebook complaints committee, whatever it is, and they will come back to you two weeks later saying, “Oh, we’ve reviewed this. There’s nothing wrong with it.” It’s a different process.
INEZA ROUSSILLE: In our context, I mean the Malaysian context, this is kind of inevitable also, right, because mainstream media is so controlled and so government-owned and restrictive. People are generally just going to go online and say what they’ve got to say anyway, right? I mean, like Fahmi Reza, for example, he uses the social media space to do his activism and troll the government in a way that he could never do in mainstream media. It’s a double-edged sword, right, that social media has given us so much freedom in that sense. I mean, me especially, in the work that I do, 95 per cent of the time, it’s not going to be content that I can ever put on mainstream TV or mainstream whatever, right. So, the online space has been a godsend in that sense, for not just me but a lot of, especially, marginalised voices to put their content out there and say what they want to say that, in traditional spaces, they couldn’t.
TINA CARMILLIA: But you see, maybe it’s just because the Malaysian authorities aren’t that celik IT (tech-savvy), because it is entirely possible to have your content censored. You can’t possibly censor everything, but that’s what happened in India with a lot of the comments or tweets that were critical of the government and the way they’re handling the COVID pandemic. The tweets were taken—they were not taken down, but they’re not viewable in India. And it gives you the illusion that your tweet was not censored. And because these businesses, these platforms—so, Twitter is not an Indian company, but they have to comply with Indian legislation because their business operates in that country. And so you can do the same in Malaysia, they will have to comply, otherwise they will not be able to do business in Malaysia.
INEZA ROUSSILLE: I mean, I don’t know how much our government does that. I mean, they have, you know, the MCMC laws, the Communications Act, and now the fake news law again. I feel like they don’t need to ask Google or Twitter or whatever to take something down, they can just arrest you and haul you to court (Sean Augustine, FMT).
TINA CARMILLIA: Correct, and force you to take it down. That’s so far the approach. It’s to call you in, take all your equipment, devices, and make you take it down. But like with Fahmi Reza’s Spotify playlists, you see what happened, right? They could make a request to Spotify and Spotify has to comply. And they complied very fast, whereas if we Little People made a request about something else, you’d be lucky to hear back from them. So the government can exercise that heavy-handed kind of approach if it wants to.
You asked me about paranoia—I’m not paranoid about technology, because I think technology is a lot easier for us humans to band together against, because technology is a non-human entity. But when it comes to human rights issues, when it’s marginalised communities and all that? That’s the part that always feels a lot more out of my control.
INEZA ROUSSILLE: Especially if the tech is modelled on human behaviour, right?
TINA CARMILLIA: Oh, yeah.
INEZA ROUSSILLE: I remember there was some kind of bot on Twitter that was speaking based on all the data on Twitter (Elle Hunt, The Guardian). The bot just became this horrible racist, anti-Semitic, homophobic monster. And it was like, yeah, that sounds right.
TINA CARMILLIA: And it’s easy to look at a bot and say there’s something wrong with the bot, but you don’t go beyond that, you just see the bot, as the—what’s the last boss?
INEZA ROUSSILLE: The final boss?
TINA CARMILLIA: Yes. You know what I mean? But the final boss is not the bot itself, it’s the people behind the bot, and how did we get there?
INEZA ROUSSILLE: I was watching this video from Vox the other day on CAPTCHAs and how CAPTCHAs have evolved, and it’s because these computers just got smarter and learned everything that was supposed to trick them. That was kind of scary. They’re just going to out-learn us and be able to imitate us so well, and how are you going to have a CAPTCHA when a computer can pass as a human?
TINA CARMILLIA: And we’ve been training them in the early stages, right. And again, illusion of choice—we don’t have a choice but to train them, because otherwise, we can’t log into our accounts.
INEZA ROUSSILLE: Exactly. I didn’t even know. It’s like, oh my god, every time I put in a CAPTCHA, I’ve been contributing to this machine learning and I didn’t even know it. I mean, that’s literally everything you do online.
Is the internet different on that side of the world, is what you see different?
TINA CARMILLIA: Even before moving here, I’d been following Canadian news quite a bit. I mean, a lot of stories around Big Tech and misinformation in online spaces are coming from the UK and the US anyway. But one thing I found quite interesting was Bill C-10 that’s being discussed here, where they are trying to control online content. I thought, this sounds like I’m back in Malaysia, because what this bill is essentially trying to do is to control the type of content you see on any social media platform, because there’s a rule here where you have to have a certain percentage of airtime dedicated to Canadian artists. So it’s very similar to what we have in Malaysia. This is why, with BFM Radio, 30 per cent of our programming is in Malay, which is usually midnight to, you know—when no one’s listening, but it’s to fulfil that quota, you need that 30 per cent of Malay content. So you have something similar here, but they want to extend it to online content as well. So, how do you enforce such a law?
INEZA ROUSSILLE: You will have to change YouTube’s algorithm, basically, right?
TINA CARMILLIA: And I’m just thinking, this is a very Malaysian move.
INEZA ROUSSILLE: It’s only about getting more Canadian content on social media? Say they managed to change YouTube’s algorithm or whatever, and you’re seeing 30 per cent Canadian content on your feed. It still doesn’t guarantee that you’re going to click on it.
TINA CARMILLIA: Exactly.
INEZA ROUSSILLE: So, it might just become like YouTube ads: it’s there but you just ignore it.
TINA CARMILLIA: To be fair, that kind of rule, if loosely implemented, is fair because I do believe in supporting local artists—
INEZA ROUSSILLE: Same.
TINA CARMILLIA: —wherever local is.
INEZA ROUSSILLE: But what do you think about, like, this whole antitrust stuff with Big Tech in the US?
TINA CARMILLIA: I think Big Tech has to be broken up, because this monopoly that’s going on is making Big Tech more powerful than governments, like with Facebook and Google threatening to pull out of Australia, to stop serving a whole country. And they have that kind of bargaining power against a big country. I think it’s pretty alarming.
Just think of that in an objective sense: to de-platform the President of the United States, without context, you don’t need to know it’s Donald Trump. I mean, he is an atrocious guy—but that kind of thing, 10 years ago, would be unthinkable.
INEZA ROUSSILLE: But at the same time, 10 years ago, the President of the United States using social media like Trump did was also unthinkable, right?
TINA CARMILLIA: True. But that’s a lot of power for corporations.
INEZA ROUSSILLE: Yeah, it is. It feels like a very American problem. Because, you know, corporations versus the American government, I mean, it’s a battle of two evils really, right. But unfortunately it’s not just an American problem, because these monopolies control literally the whole world. It just reminded me, I was watching The Mitchells vs the Machines, and the founder of the corporation that made the robots is literally a young guy in a hoodie called Mark. So like, yeah, that’s truly the villain of our generation.
TINA CARMILLIA: Yeah, I always struggled with finding the villain. When I did my Ring True mini-series after my press fellowship at Wolfson—I did the mini-series to kind of capture the essence of my old study on misinformation in science news. And I was going through the script with Caroline Oh, who was also the narrator. She went through my script, and she wanted to know: who’s the villain? We’re the heroes, [so] who’s the villain? And I always struggle with that kind of storytelling, you know, that you need a hero, you need a villain, because I think that the most effective stories sometimes don’t have a villain.
I keep coming back to the story of the Thai boys who got trapped in the cave (Helier Cheung and Tessa Wong, BBC). The whole world was rooting for them. And there’s no real human villain. I mean, it’s nature. So a good story doesn’t necessarily need to have a supervillain. People can still come together, but I don’t know what the compelling narrative is without a villain for the subject that I’m working on.
INEZA ROUSSILLE: It doesn’t necessarily need a villain, but villains do help mobilise, also, especially when there’s such an obvious villain like [Jeff] Bezos or something. That’s not hard. But yeah, I do agree that it’s usually way more complex than just hero versus villain.
TINA CARMILLIA: It’s the most used technique in storytelling, isn’t it?
INEZA ROUSSILLE: Yes. Hero’s Journey is Film School 101.
I mean, tech journalism and all that, if you’re in it, then you kind of know about it or you read about it regularly, but if not, I don’t think the average Internet user thinks about these things all the time, right? Is there much of a newsletter culture here in Malaysia?
TINA CARMILLIA: It’s growing. I’m seeing some journalism friends doing it. I use Substack as a platform. It’s a bit problematic right now with some internal issues, but when it came out, it was supposed to be for journalists to curate the news (Oscar Schwartz, The Guardian). They created that kind of messaging or angle to attract journalists and opinion column writers to have a place to host their writing, so it’s something like a competitor to Medium, but Medium is website-based, while Substack is a newsletter and goes straight to your inbox.
And then when Twitter rolled out Revue, it was so exciting, because I want to subscribe to all these newsletters. Because this is kind of like how misinformation spreads in WhatsApp groups and other social messaging groups. You have to be part of this Telegram group, or part of this WhatsApp group, to see what false information is being spread. And with newsletters, because it goes to your email, you need to be subscribed, and if the subscription tier is not free, you have to be a member. That is like a WhatsApp group—it’s closed, it’s only for those who are members of the group, right? I think newsletters are the next place to be.
INEZA ROUSSILLE: I didn’t even think of that.
TINA CARMILLIA: You create a following, it comes straight to your inbox, only you know about it. It’s that exclusivity, that makes it even more enticing.
INEZA ROUSSILLE: I guess the downside—I don’t know if it’s a downside—is that people can’t really respond. I mean, they can always reply to your email, of course but it’s not like they can immediately type a comment and then have a conversation. So does that make you feel a little protected in a way?
TINA CARMILLIA: It’s interesting to see how it plays out—maybe people are also starting to become more disillusioned with messaging apps because you’re just inundated by them. If you’re a member of so many groups, you get people commenting on and on and on and on. So maybe email newsletters? You find your sweet spot there.
INEZA ROUSSILLE: Yeah, no, I’ve learned so much from your newsletter over the last year.
TINA CARMILLIA: I’m glad.
INEZA ROUSSILLE: I get excited when I see it in my inbox and I don’t ever get excited about anything in my inbox.
TINA CARMILLIA: That’s really nice to hear.
INEZA ROUSSILLE: Yeah.
TINA CARMILLIA: And once again, that’s filmmaker Ineza Roussille. If you would like to join me on the show for conversations like this, get in touch here.
Don’t forget to subscribe, if you haven’t, and if you enjoyed this episode, consider sharing it with someone. ‘Til the next one, goodbye for now.