Most Americans these days get their main news from Google or Facebook, two services that rely heavily on algorithms. A 2015 study showed that the way a search engine like Google selects and prioritises search results about political candidates can influence voters’ preferences.

Similarly, it has been shown that tweaking the algorithms behind the Facebook news feed can influence voter turnout in American elections. If Mark Zuckerberg were ever to run for president, he would in theory have an enormously powerful tool at his disposal. (A recent article in The Guardian investigated the misuse of big data and social media in the context of the Brexit referendum.)

Algorithms are everywhere in our everyday lives and exert a lot of power in our society. They prioritise, classify, connect and filter information, automatically making decisions on our behalf all the time. But as long as the algorithms remain a ‘black box’, we don’t know exactly how those decisions are made.

Are these algorithms always fair? Examples of possible racial bias include the risk scores calculated for prisoners who are up for parole or release (white people appear to receive favourable scores more often) and the quality of Uber’s service in Washington DC (waiting times are shorter in predominantly white neighbourhoods). Such unfair results may not be due to the algorithms alone, but the lack of transparency remains a concern.
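Claims of unfairness like the parole-scoring example can be checked quantitatively. A minimal sketch, using entirely made-up counts and the ‘four-fifths rule’ common in US employment-discrimination analysis (the `disparate_impact` function and the numbers below are illustrative assumptions, not real data):

```python
def disparate_impact(outcomes):
    """Ratio of the lowest group's favourable-outcome rate to the highest.
    A ratio below 0.8 is a common red flag (the 'four-fifths rule')."""
    rates = {g: favourable / total for g, (favourable, total) in outcomes.items()}
    return min(rates.values()) / max(rates.values())

# Hypothetical counts: (favourable risk scores, prisoners assessed) per group.
outcomes = {"white": (70, 100), "black": (45, 100)}
print(round(disparate_impact(outcomes), 2))  # 0.64, well below the 0.8 threshold
```

A journalist who obtains outcome counts per group, for instance through freedom-of-information requests, can apply a check like this before digging into why the disparity exists.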

So what is going on in these algorithms, and how can we make them more accountable? 

A lot of interesting investigative journalism can still be done in this field. Generally, by ‘poking’ at the algorithms and seeing how they respond - correlating the output to the input - we can try to figure out how they work. Investigative journalists can play this game: collect and analyse the data, and determine whether the results are unfair or discriminatory, or whether they lead to other negative or undesirable consequences (censorship, law-breaking, privacy violations, false predictions…).
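The ‘poking’ approach above can be sketched in a few lines. Assuming a hypothetical opaque scoring function standing in for the real system (here `opaque_score`, deliberately biased for the demonstration; all names and numbers are illustrative), the audit varies one sensitive attribute while holding everything else constant and compares the averaged outputs:

```python
def audit_black_box(score_fn, profiles, attribute, values):
    """For each candidate value of the sensitive attribute, score every
    profile with only that attribute changed, and average the results."""
    results = {}
    for value in values:
        scores = [score_fn({**p, attribute: value}) for p in profiles]
        results[value] = sum(scores) / len(scores)
    return results

# Hypothetical stand-in for an opaque system, deliberately biased on 'group'.
def opaque_score(profile):
    base = 0.5 + 0.1 * profile["seniority"]
    return base - (0.2 if profile["group"] == "B" else 0.0)

profiles = [{"group": "A", "seniority": s} for s in range(5)]
means = audit_black_box(opaque_score, profiles, "group", ["A", "B"])
print(means)  # group 'B' averages 0.2 lower than 'A': evidence of disparate treatment
```

Because both groups are scored on otherwise identical inputs, any gap in the averages can only come from the attribute being varied, which is exactly the input-output correlation the article describes.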

There are plenty of methodological challenges to deal with, however. You can only really understand why you’re seeing the results you’re getting if you have deep technical knowledge of how a system was built. There are feedback loops between the algorithms and the people who design them. Algorithms form an unstable, dynamic system; results can change from day to day, so tracking over time may be needed. The appropriate sample size and dimensions need to be decided, as well as the variables to consider. And then there are plenty of legal and regulatory aspects to look into.

But perhaps most importantly, we need to ask ourselves what our expectations are. What do we consider to be ‘fair’ algorithms? Different people will have different views on that, but we probably shouldn’t let the algorithms keep deciding it for us.  

Any journalist interested in investigating algorithm accountability can go to algorithmtips.org for help to get started.

© Katrien Vanherck
