Disclose algorithms, ban micro-targeting says Columbia University professor Anya Schiffrin

Kyle Chua




Regulation will have a 'big role' in curbing disinformation, says the director of the Technology, Media, and Communications specialization at Columbia University

What will be Facebook’s role in the coming US presidential election? That seems to be the million-dollar question leading up to the November polls.

While the social media giant has taken steps to not get caught in another Cambridge Analytica scandal, its efforts might not be enough to prevent it from dramatically influencing the results of another election, said US professor, author, and former journalist Anya Schiffrin.

“Facebook has just taken advantage of this legal vacuum in our country to go ahead and really further poison an already polarized discourse,” she told Rappler executive editor Maria Ressa in an interview.

Schiffrin, who writes about journalism and is the director of the Technology, Media, and Communications specialization at Columbia University’s School of International and Public Affairs, believes that Facebook and other social media companies have to go “beyond disclosure” when it comes to running political ads. 

Beyond disclosure

What does this mean? 

These companies essentially have to do more than just disclose fact-checking information to the public; they must instead commit to far-reaching platform changes that can help combat disinformation.

“So what I’d like to see the platforms do is commit to a voluntary fairness doctrine where from the get go, they say, you know what, we’re not going to profit off this stuff,” she said.

“What we’re going to do is to have a news feed of quality information on the topics that matter, and that can be debates, town halls provided by the campaigns, provided by, you know, fair commentators and respected media, and just do away with all this garbage out there.”

She added that there are many things these companies can do not only to minimize their influence in elections but also to aid how democracy functions. (READ: Zuckerberg ‘stubborn,’ ‘very dug in’ says ‘Facebook: The Inside Story’ author Steven Levy)

Facebook, for instance, recently announced that they’re banning political ads on the platform a week before the US elections. Schiffrin, however, argues, “Why ban political ads a week before the elections? Why not ban them permanently?”

Regulation’s big role

Schiffrin said that when she was writing her PhD dissertation on online misinformation and disinformation, no one from Facebook ever wanted to speak to her. She suspects the primary reason is that the company was trying to avoid government regulation.

“They started putting forward this idea that the solutions were things like funding, fact checking. I was like, easy for that. You pay for that. And hopefully nobody will regulate you,” she said.

Facebook also put up an Oversight Board, expected to start operations later in the year, to regulate content. Facebook promises not to meddle but is funding the program to the tune of $130 million.

“So I think it’s greed. And I think that it’s ideology. And I think that they conveniently think everybody does this. It’s like the old (saying) if you have a hammer, everything looks like a nail.” 

“They were so committed to the idea that Facebook was great and Facebook could solve problems, social media would connect everybody, that they simply couldn’t see that they were actually spreading hatred and killing around the world. And then once they saw that, they couldn’t really handle it, and a lot of them, like Zuckerberg, seemed to have just doubled down.”

For her, the likes of Facebook aren’t real marketplaces for ideas, as they continue to profit from political ads and other information campaigns. This is where regulation enters the picture: it can force these companies to disclose their algorithms, ban micro-targeting, and enforce stricter privacy protections, among other measures.

“If democracies want to pass regulations, they’re going to have to do it. Ditto all the copyright, which is on the other side of the equation. That’s provisioning news and providing for a supply of quality news. All those tech companies have to pony up money and they should obviously be taxed.”

And for there to be a steady stream of quality news, journalists, who can verify and be critical of information, have to remain employed. 

“I think the main thing we have to do straight away is get more funding to local and quality journalism around the world,” she said.

“We need far more funding for quality information. And a lot of that is going to have to come from the tech companies.”

Likewise, Schiffrin thinks there has to be funding for systematized media literacy training, which helps people discern what’s true and what’s not.

Part of this is making sure the youth are doing something to better society instead of spending their time on social media, which can lead them down a rabbit hole of conspiracies and dangerous ideologies.

Schiffrin said she remains optimistic that these problems can eventually be solved, both because more recommendations are now coming from think tanks, academia, and civil society, and because she believes the next generation is capable of fixing them. – Rappler.com
