
How self-regulation has responded to news automation


Lauri Haapanen

Principal investigator of the research project 'News automation and ethical guidelines', conducted by the Council for Mass Media in Finland. The project is part of the larger project 'Media Councils in the Digital Age'.

Journalism has always been shaped by the adoption of new technology, and the algorithms that now support news production in various ways are no exception. News automation fits naturally into this ongoing process. While its history goes back decades – weather forecasts are an early example – in recent years it has become a small but growing part of the news production ecosystem, especially in domains such as sport, finance and election results.

News automation is the term given to algorithmic processes that convert data into news texts without human intervention. When it works well, it can speed up production, broaden coverage, enhance accuracy and enable new kinds of personalisation. This means that output can be tuned to users' personal interests, based both on the information they enter on a website and on their past online activities. In practice, personalisation can affect not only the start page of a website or mobile application and other services, such as lists of recommended articles, but also the content of a news item itself.
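To make the idea concrete, here is a minimal, purely illustrative sketch of template-based news automation: structured data goes in, a short news text comes out. The data fields, the weather example and the function name are invented for illustration; the systems actually used in newsrooms are considerably more sophisticated.

```python
# A hypothetical, minimal sketch of template-based news automation:
# a structured data record is turned into a one-sentence news text.

def generate_weather_item(data: dict) -> str:
    """Fill a fixed sentence template with values from a data record."""
    template = (
        "{city} can expect {condition} on {day}, with temperatures "
        "reaching {high}°C and dropping to {low}°C at night."
    )
    return template.format(**data)

if __name__ == "__main__":
    # Invented example record, for illustration only.
    record = {
        "city": "Helsinki",
        "day": "Tuesday",
        "condition": "light snowfall",
        "high": -2,
        "low": -8,
    }
    print(generate_weather_item(record))
```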

Naturally, ethical questions also arise. Who codes the algorithm? Is the raw data manipulable? Can targeting content differently to different people create so-called filter bubbles?

And we can ask, how have self-regulatory media councils responded to all this?

No one has an overall picture

To answer this question, I started by going through almost 500 codes of ethics – mainly with word searches for automat*, algorithm*, robot* and personali* – but nothing came up (with one exception, which I will return to later). I also conducted an online survey of the chairs of 35 European media councils, of which 20 responded. The answers confirmed that councils are still waiting and watching.
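For the technically minded, a sketch of the kind of word search described above might look as follows. The folder name, file layout and helper function are hypothetical, and any standard text-search tool would do the same job.

```python
# An illustrative sketch of searching a folder of codes of ethics for the
# stems automat*, algorithm*, robot* and personali*. Paths are invented.

import re
from pathlib import Path

STEMS = ["automat", "algorithm", "robot", "personali"]
PATTERN = re.compile(r"\b(" + "|".join(STEMS) + r")\w*", re.IGNORECASE)

def find_matches(folder: str) -> dict[str, list[str]]:
    """Return the matching terms found in each text file in the folder."""
    hits: dict[str, list[str]] = {}
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        found = sorted({m.group(0).lower() for m in PATTERN.finditer(text)})
        if found:
            hits[path.name] = found
    return hits

if __name__ == "__main__":
    for doc, terms in find_matches("codes_of_ethics").items():
        print(doc, "->", ", ".join(terms))
```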

The most common answer was that the questions are not yet relevant. As one respondent put it: “News automation is more or less non-existent in our media market, and news personalization is also very rare.” About one in three respondents said that although the matter is not yet acute, they have already discussed the challenges that news automation and personalisation might involve and the possible need for guidance.

The survey, along with interviews that I conducted with media practitioners, software developers and researchers, made it clear that no one has the big picture of the state of news automation. By and large, media players implement algorithm-driven solutions as they see fit, with no obligation to report to authorities, councils – or each other. Because of inconsistent labelling, it is also difficult for the audience to know what is going on.

The use of algorithms is journalistic work

The exception mentioned above is the Finnish Media Council, which published a Statement on marking news automation and personalization in October 2019. As with the other councils, there was no urgent need for a statement: in the Finnish mediascape, the use of news automation is still experimental, personalisation has become somewhat more common, and so far the council has received only one complaint related to the use of algorithms.

Nevertheless – and especially since technological and legislative developments concerning algorithms are moving fast right now – the Finnish Media Council decided to proceed proactively. The underlying idea was that if the media do not regulate themselves, someone else will do it sooner or later, and if journalists leave the regulation of algorithm use to legislators, the EU or platform companies, this will jeopardise the freedom of the press.

The council's remit extends to (almost all) journalistic media, but not, for example, to social media companies. As its title suggests, the statement makes recommendations on how news automation and personalisation should be marked. These recommendations are deliberately quite loose: digital development is fast right now, and in such an environment it does not make sense to tie practitioners down to overly detailed or binding guidelines that could hamper the development of the media industry and might no longer be applicable in a few years' time.

Instead, the key message is principled: the use of algorithms is part of journalistic work. This notion stems from the fact that algorithmic tools influence what is published, with what emphasis, and to whom – and these are not mechanical choices but lie at the core of journalistic decision-making power. Accordingly, the statement reads: “Media outlets must therefore ensure that their digital service developers also adhere to the Guidelines for Journalists when they independently make decisions that influence journalistic content.”

What kind of guidelines should there be?

To support the work of media councils in preparing for the future, my next step in the project is to put together a report on what constitutes the state of the art in news automation, what ethical considerations it gives rise to, and how codes of ethics should respond to those considerations. In other words, what kind of guidelines should there be?

For example, one might argue that when there is a journalist “between” the algorithm and the audience – that is, the algorithm is only an aid, and the journalist checks the text the “robot” generates before it is published – the existing codes of ethics are valid as they stand. Fine. But what about fully automated news production, which does not include the human component?

Or one might argue that the basis of self-regulation is to provide a guide for journalists, but that its effectiveness lies in the fact that the audience can complain if it suspects the guidelines have been violated. This is especially important because media councils deal mainly with cases about which complaints have been made. The guidelines should therefore be sufficiently detailed for members of the audience to grasp – a demand that becomes more complex when one takes into account that automation, not to mention the various forms of personalisation, takes place mainly behind the scenes.

I won't take up any more of your time now. These are just some ideas to keep you in the picture while you wait for the report, which will – hopefully – provide you with answers, too. It will be published on this website in September.

#PressCouncilsEU