Tech platforms’ algorithms, and the critical data that could give people a much better understanding of how these systems work and how they make use of user data, remain largely a mystery.
There is legislation on the way that seeks to get tech platforms to make these elements more transparent. Chief among these is the EU’s proposed Digital Services Act (DSA), the final version of which is being debated by the European Parliament and the Council of the European Union. The Washington Post described it in a report as “the most aggressive attempt yet to regulate big tech companies.”
Rappler CEO and Nobel laureate Maria Ressa, in testimony given to the European Parliament on Tuesday, February 1, asked that this data be made open to journalists.
Ressa explained: “It isn’t just researchers who should have access. It isn’t just NGOs. Journalists should have access to this data…. The DSA has to open up the black box. I’m not a researcher, I’m a journalist who understands networked effects and who understands data. This is how we have survived in the Philippines. Should I have access to that data? I should.”
“We should get to a world where people understand exactly how their data is used because how can you have informed consent if you don’t know what it is; if you don’t know what it’s being used for. Education will catch up. Looking at this black box, it has to go beyond third party researchers,” she added.
The proposed Digital Services Act’s Article 31 says, “Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers.”
It also specifies the requirements for being a vetted researcher: “In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.”
Requests for access to data will be approved “for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks,” such as negative effects on privacy and freedom of expression, the dissemination of illegal content, disinformation, and the effects of content moderation systems and content recommender systems or algorithms.
Earlier in the session, Ressa emphasized the importance of looking at the problem of disinformation by focusing on the algorithm.
“We should be looking at algorithms, and models because that’s what has transformed our information ecosystem. The choices that were made, for example, the choice to make lies and facts identical – that’s a choice. A journalist wouldn’t make that choice because we’re accountable. But the tech platforms did. They don’t care; they’re truth-agnostic,” Ressa said. – Rappler.com