While algorithms encroach and technology advances, Richard Sharpe demands journalism by humans, for humans.
An event occurred in October 2015 which will cast a long shadow over the work of journalism and the jobs of journalists. Associated Press (AP) had been testing a piece of software which generated stories from the quarterly reports issued by listed companies. These reports flood out four times a year and AP wanted a fast method of getting the news on the wires. It had already implemented software which could generate pyramid-style stories from this highly structured data, but it had retained a further stage in the process in which human beings checked the stories the software generated. In October 2015, however, AP ceased taking this extra step. The software would do it all: no human intervention at all. No salaries; no holidays; no maternity or paternity leave; no sick leave; 24 hours a day the software can produce pyramid stories reporting on the financial results which are seemingly indistinguishable from human output.
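The kind of system AP deployed can be pictured as template-driven text generation over structured fields. The sketch below is a minimal illustration only: the company name, figures, field names and wording are all invented for the example, and this is in no way AP's actual software.

```python
# A minimal sketch of template-driven "robot journalism".
# All names and numbers here are hypothetical.

def pyramid_story(report):
    """Generate a pyramid-style earnings story from structured data."""
    direction = "rose" if report["eps"] > report["eps_prior"] else "fell"
    change = abs(report["eps"] - report["eps_prior"]) / report["eps_prior"] * 100
    # Lead paragraph: the most newsworthy fact first.
    lead = (f"{report['company']} said {report['quarter']} earnings {direction} "
            f"{change:.0f}% to ${report['eps']:.2f} per share.")
    # Supporting detail follows in descending order of importance.
    body = (f"Revenue was ${report['revenue'] / 1e6:.0f} million, "
            f"compared with ${report['revenue_prior'] / 1e6:.0f} million "
            f"a year earlier.")
    return lead + "\n" + body

example = {
    "company": "Acme Corp",        # hypothetical issuer
    "quarter": "third-quarter",
    "eps": 1.32, "eps_prior": 1.10,
    "revenue": 412_000_000, "revenue_prior": 390_000_000,
}
print(pyramid_story(example))
```

The point of the sketch is how little intelligence is required: once the data is structured, the "journalism" is slot-filling, which is exactly why this layer of the job was the first to go.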
Computers take out “computers”
This is not the first time computers have taken over the jobs of humans. The first jobs they took were those of the original “computers”: the people who did thousands of small calculations for scientific or technical applications on hand-cranked calculators. Computers, the electronic type, could do this faster and more accurately.
In business the first application in the UK was the Lyons Electronic Office (LEO) computer in 1951. LEO ran the bakery valuations application, first on September 5 1951, and took over the job without manual checking on November 29 1951. Other jobs were later developed for LEO, including the payroll. Lyons even ran the payroll for other companies, such as Ford Motor Company in 1956. There went the jobs of wages clerks.
The starting point for the application of electronic computers to human jobs was Alan Turing’s 1936 paper, ‘On Computable Numbers’. In it Turing devised a theoretical machine to solve a problem in mathematics. He also showed that any mathematical problem which could be reduced to an algorithm, a series of logical steps, could be run on this theoretical machine.
Theory to practice
Turing turned theory into practice while working at Bletchley Park, where machines such as the Bombe and Colossus helped crack German military codes; then at the National Physical Laboratory and afterwards at Manchester University.
The more computing power there was, the faster algorithms could run, extending the range of jobs for which the computer could be used: more complex jobs could now be tackled in a time which made the application practical.
LEO, Colossus and other early computers depended upon the technologies developed by the radio and telecommunications industries. With the coming of integrated circuits, the computer industry started to replace single transistors with its own technology.
In 1965 Gordon Moore, a leading integrated circuit engineer, observed that the industry had doubled the number of transistors integrated on the same area of silicon every year. Later he revised this to every two years. More components on the same piece of silicon mean faster computing, and faster computing brings in new applications: any job can be transferred to a computer as long as it has an algorithm and the time in which that algorithm runs makes it practical. And as chips get smaller they need less power, until they can be run from a battery and become mobile.
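Moore’s observation is simple compound doubling, which a toy calculation can make concrete. The 1971 starting figure below is the roughly 2,300 transistors of the Intel 4004; the later numbers are what doubling every two years would project, not measured chip counts:

```python
# Moore's law as compound doubling: a toy projection, not chip data.
START_YEAR, START_COUNT = 1971, 2300  # Intel 4004: ~2,300 transistors

def projected_transistors(year, period=2):
    """Transistor count projected by doubling every `period` years since 1971."""
    doublings = (year - START_YEAR) // period
    return START_COUNT * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{projected_transistors(year):,}")
```

By 2011 the doubling rule projects a count in the low billions, which is roughly the order of magnitude the industry had in fact reached, and it is this relentless compounding that keeps pushing new layers of work within reach of software.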
The power of computer technology is not ever-increasing: there are finite limits to integration. It is just that semiconductor engineers have found ways to keep achieving Moore’s law, which was not originally a prediction but an observation of what engineers were able to do in the early days of integration. Other processor technologies are now emerging, in the form of quantum computing, which promise even faster processing than conventional electronics.
As the price of computing fell and continues to fall, and the speed increased and continues to increase, layer after layer of human labour could be transferred to computing: it became practical to do so and, more importantly, profitable. Employers sought out bottlenecks in production processes and tackled them in turn, starting with simple jobs and going on to the more complex ones as prices fell and speed increased. The bottleneck in the newspaper industry was the process of composing in hot metal. Most journalists looked the other way as the jobs of compositors were automated and the bargaining power of compositors was crushed. Today journalists cannot look away from the forward march of automation, since it is their jobs which are now on the line.
Humans using the tools developed by humans
This is not to say that there is an inevitable technological determinism about this process. It is, after all, humans seeking to improve productivity and/or profitability by applying technologies created by humans. And not all the applications are successful. There are many well-documented IT disasters in the public sector where the algorithms were not sufficiently robust and/or the processing power not fast enough to create a successful application. There are just as many failures in the private sector, but they mainly remain private. Not so in the financial sector, where there has been a litany of much-publicised IT disasters. For example, a group of academics and traders called Long-Term Capital Management thought they had developed software to play – and beat – the market. But they were wrong-footed, and their fund collapsed, when the rules of the financial trading game changed after a slump in Russian capital values in 1998.
The steps to computerisation
The process of transferring some aspect of human labour to computing takes several steps. First the aspect has to be identified as a bottleneck: in business applications this is where consultants come in, regularly reviewing the labour process to seek out such bottlenecks. Then the software has to be created for the application. This often involves analysing what the labour process is and transferring it into software, frequently with the collaboration of the very human labourers who will eventually be replaced by the application, if it is successful.
There are always limits to what the software can do. In the AP application the stories are routinely generated from standardised data contained in company financial reports. These quarterly reports from publicly listed companies are accompanied by a long document which the US Securities and Exchange Commission requires for every quarterly announcement: the 10-Q form. It includes the structured data of the financial report but also long passages of text which describe and analyse the context behind the financial results. Software cannot yet drill down into these passages to get exclusive stories: it can only do the mundane task of writing a pyramid story from the numbers. For some years to come, it will take humans to do the analysis.
We can begin to see, therefore, where journalists as human workers can fit into the new pattern of “content generation.” Software can do the mundane task of generating stories from highly structured data. Humans must continue to do the analysis. For now.
We have seen that the application of computing is not a single event but a process, depending on the ability of humans to identify bottlenecks and exploit the technology then available. What was not previously possible to transfer to computers eventually becomes possible. For example, artificial intelligence (AI) applications can now mimic the reasoning of humans, providing mechanisms for taking human reasoning into the world of computing. According to Wired magazine, some financial traders are already using AI to make all of their trades, without human intervention. This is an area where speed is essential to catch the price at the right level during the trading period. As early as 2010, researchers at New York University estimated that over 50 percent of all trades on US stock markets were driven by software (Quant Congress).
Associated Press (AP) developed its quarterly reporting system because speed was essential. The analytical side of the process of reporting, currently left to humans, is not yet so time-critical. There will come a time when it is, and then it will be a candidate for the application of AI to what remains of the human involvement in financial reporting. If the many prophets of AI are right then all human intelligence could, at some time in the future with the continued development of computing power, be transferred to algorithms. These could then be captured in software and then run on hardware.
Does this mean that human journalism is just a long rearguard action of retreat in the face of the computerisation of journalism?
Forty-seven percent of employment at risk
In 2013 Frey and Osborne, two Oxford academics, plotted the likelihood of jobs being computerised. Using the US labour market classification of 702 job titles, they applied a novel methodology to estimate the probability of each being computerised. According to their estimates, 47% of total US employment is at risk. Here is an extract from their full table, focusing on the jobs which are in, or close to, journalism.
Jobs in or close to journalism with their probability of computerisation
| Rank | Probability of computerisation | Description |
| --- | --- | --- |
| 66 | 0.015 | PR and fund-raising managers |
| 123 | 0.038 | Writers and authors |
| 151 | 0.067 | Broadcast news analysts |
| 171 | 0.1 | Radio and TV announcers |
| 177 | 0.11 | Reporters and correspondents |
| 244 | 0.31 | Film and video editors |
| 442 | 0.81 | Word processors and typists |
| 481 | 0.84 | Proofreaders and copy markers |
| 544 | 0.91 | Gaming and sports book writers and runners |
Source: Frey and Osborne (2013)
In this table the roles least likely to be computerised are PR and fund-raising managers. The probability of the editor’s job being computerised is 0.055. Reporters and correspondents are at 0.11. Court reporters are at 0.5. Outside journalism, the least likely to be computerised are recreational therapists, with a probability of 0.0028. The most likely, say Frey and Osborne, are telemarketers, with a probability of 0.99.
Frey and Osborne refer to further research by Goos and Manning (2007), entitled ‘Lousy and Lovely Jobs’, because it “captures the essence of the current trend towards labour market polarization, with growing employment in high-income cognitive jobs and low-income manual occupations, accompanied by a hollowing-out of middle-income routine jobs.”
In other words, don’t be in the middle where computerisation is more likely to take over the job.
A major factor which stems the march of computerisation is the social intelligence of human employees, Frey and Osborne observe. They do not define social intelligence but give some examples: creativity, perception and manipulation, negotiation skills and persuasion skills. Social intelligence relies on our human understanding of each other, our empathy and our ability to use what we call “common sense”. In other words, on being human in a human society.
From all of this we can tease out where the humans may be able to flourish or at least survive in journalism, even in the context of the long march of software.
We can flourish where we have the social intelligence which software currently lacks: where we can empathise, to interview people in different situations; where we can negotiate access to people and data to make our stories; where we understand the complexity of the world and are able to capture the vital parts of that complexity and present them to our audiences.
Where we can be
We should avoid being in the middle of any labour process: either be an editor or do a job which software cannot deal with because it means dealing with the complexity of people.
Avoid the routine: the more the job of journalists is a routine, one of creating content from the content given to them, the more vulnerable they are.
Be analytical: some developing forms of AI can do this, but they depend on vast amounts of data and a large amount of computing power, even at today’s speeds and prices.
We can anticipate the future of journalism produced by humans in two ways: by looking at the processes of journalism, on the one hand; and on the other hand, by looking at the types of journalism.
With regard to editorial processes, we can review the skills (the things we can do), the knowledge (the things we know), the attitudes (how we normally see things), and the habits (how we regularly do things), of the job of the human journalists, and see where our strengths and weaknesses are.
We have the skills of writing. Computers can write too, as long as the journalism they generate follows a regular pattern. We will have to give up the ground of the pyramid story to the computer. But we have other ways of writing, in long and short form, which will take AI a long time to capture. We have, for example, the drop-lead story with its quirky or anecdotal opening and varied structure. Only a deep knowledge of the human reader allows us to employ these and other forms of ‘non-standard’ journalism.
For the moment, we have a monopoly on the skill of interviewing, although AI systems can engage in conversations with humans and gather information, and they may soon be programmed to ‘look’ for a story just as the human journalist does when interviewing.
Then there is our knowledge. Computers have more information than a human does. But information is not knowledge. Knowledge is information with meaning. At the moment human journalists have the edge here. We have our social intelligence to give information meaning.
AI systems can only acquire attitudes when these are programmed into them. Humans acquire attitudes through the experience of our social lives, formulated into our social intelligence.
In the area of habits the computer seems to beat the human every time. It can repeat the same thing without tiring and in the same way every time. But that does not allow for the mutation of habits which is part of our human development as humans and as individuals. The capacity both to maintain habits and make a judgment on when such habits need to be adjusted – again, our humanity gives us the advantage.
Humans have senses. Computers can “see” through cameras. They can “hear” through microphones. They can “speak” through loudspeakers. But they cannot enjoy the smell of flowers, the taste of a crisp and chilled white wine, or the touch of a lover’s hand. There is neither enjoyment nor sorrow in a computer, only the simulation of them. Here, maybe, is our final advantage. AI is, after all, artificial intelligence, not artificial feeling or emotion. What may seem to be weakness on our part may turn out to be a strength.
Now we can look at some of the types of journalism to see where human action is still, to some extent, required. I cannot yet predict how fast the computerisation of journalism will develop. This depends much more on the economic issues – costs versus rewards – than it does on the availability of the technology. There will be areas where computerisation could be used but the investment is not available. And there will be times when the computerisation of journalism is set back because it produces such absurd stories that its advocates flinch from pressing forward. These setbacks will come, I think, as computerised journalism enters areas where social intelligence is needed to make sense of the story and AI, lacking social intelligence, proves unable to do the job. But the boundaries will not stay the same forever; already some pioneers are seeking to press beyond the existing limitations.
Ten types of journalism
I use 10 types of journalism and judge each to be highly, moderately or hardly susceptible to computerisation. This is based not on a mathematical model like Frey and Osborne’s but on the arguments above about the qualities (and current limitations) of computerised journalism.
It is clear already that the creation of stories from structured data in a pyramid form is highly susceptible to computerisation. Employers will seek to lower costs and increase speed of publication, so they will increasingly target this form of journalism.
Equally the ability of AI to sift through large amounts of data rapidly using different approaches will mean that any investigative journalism requiring the discovery of patterns in large data sets is ripe for computerisation.
These would seem to be the first available targets.
The next big target in journalism may well be the reporting of structured events, i.e. events which are structured in much the same way as data is structured. Any game played by a set of rules is susceptible to computerised journalism, as long as it is reported in a pyramid style. This includes the reporting of sporting events and of court procedures; hence court reporting is high on the list of the types of journalism identified by Frey and Osborne, with a 0.5 probability of being subject to AI.
Photo journalism is already subject to technology: drones are used to gather images. But photo journalists with a camera are able to “see” the best shot and to frame it, using their social intelligence to tell them where the human action is. Drones can use cameras to harvest images, but humans are needed to select the telling image. Photo journalism is thus a medium-term target for computerisation.
Unpredictable events, even if reported in a pyramid style, are harder to computerise: low predictability makes for low risk of computerisation. Conversely, if AI can create patterns which make such events more predictable, then they become more susceptible to computerisation.
A lot of journalism today is the reviewing of products and services. Since this form of journalism is largely dependent on the prior experience which the reviewer deploys as the basis for comparison and evaluation, the possibility of computerisation is low.
There are four areas of journalism which have a very low susceptibility to computerisation: interviewing, investigative journalism based on interviewing, commenting and the journalism of self-expression.
The key skill of interviewing depends on the interaction of the journalist and their human subject. AI systems can mimic the role of humans in conversations, but they only mimic it; they do not have the intelligence to spot the story unfolding in an interview. Yet.
As a result the forms of investigative journalism which depend on interviewing humans also have a low susceptibility to computerisation.
Writing comment in editorial and in analytical pieces is well beyond the power of AI at the moment.
Finally, and here is an irony for many professional journalists, the journalism of self-expression, the journalism of emotion, the journalism which entails the journalist’s own experience, is very low on the radar for computerisation. It is technically possible for AI systems to generate this type of journalism, but it seems that readers, listeners and viewers need the human being behind the expression to give it validation. At the moment.
Ten forms of journalism and their susceptibility to computerisation
| Type of journalism | Susceptibility to computerisation |
| --- | --- |
| Reporting on structured data in a pyramid style | High |
| Writing pyramid-style stories from investigations of large data sets | High |
| Reporting of structured events in a pyramid style | Medium |
| Reporting in a pyramid style of unstructured events | Low |
| Investigative journalism based on interviewing | Very low |
There is an eleventh form of journalism which, I think, has a very low susceptibility to computerisation: headline writing. What software could come up with such beauties as “Shit hits fan” when Eric Cantona drop-kicked a fan who was taunting him? What software could come up with another gem when the Italian designer Versace was killed: “Shoots you, sir!”, a punning reference to a running gag in The Fast Show? The amount of social intelligence needed to make human readers, listeners and viewers smile at a good headline is so vast that it may take a very long time before software can write such headlines. Summary headlines, yes, but gems like these, no.
I asked above, and did not answer, the question: is this to be a long retreat in the face of the computerisation of journalism? Retreat is the wrong military analogy: instead we should see this as a protracted asymmetrical guerrilla war in which we human journalists use the tactics of flexible, scattered forces against a massed army. We will stand where we can, but our labour organisations are so weak that we will be defeated, not by the technology itself, but by our employers and the way they deploy it against us.
Journalists have few allies
We journalists do not have many allies, if any at all, in this struggle. I am not sure that our readers, listeners and viewers, on the whole, mind whether they are consuming the journalism of humans or of software. They do not seem to care that a growing volume of journalism is only content marketing to boost brands. Some readers, listeners and viewers will mind, and they will follow their favourite human journalists. But most will not.
Some people will benefit from these developments. Employers and the shareholders of content generation companies who can exploit these developments will find their profits higher and their productivity soaring as humans are replaced. Areas of publishing currently under threat because they are loss-making may again become profitable. AI development companies will also benefit. So will consultants and those providing the processing power and storage capacity.
So we journalists need to move and keep moving. Move from the areas most threatened at the moment to areas where our humanity counts for more than the speed, low price and accuracy of software journalism. Move to new, as yet unexplored areas where we can innovate in our human practice as journalists. I cannot yet predict exactly where those areas are, but we shall find them and we shall survive, certainly not in the numbers we have now, but with the best values we can have as humans. I do not want the conclusion to this manifesto to be a wishy-washy assertion of human values; instead it should be a project to project our human values into journalism. These values include empathy with the subjects of our journalism and with its object: to communicate with other humans.
We have become used to machines which can lift weights we cannot, to machines which can move faster than we can, to machines which can calculate faster than we can, which can make an image more accurate than we can draw or paint it. Our humanity is none the less for that. We can lift weights to improve our body shapes rather than to carry a sack of coal. We can enjoy the experience of a walk despite the existence of fast cars. We can stretch out our understanding of the world using the speed of calculation afforded by technology. Remember how we shifted to other forms of expression or representation once photography took hold of the portrait.
I think I have given ample evidence of what we human journalists need to do to sustain our role in journalism, even though the encroachment of software journalism is not under our control.
This manifesto is based on senior lecturer Richard Sharpe’s closing comments at the UEL Future of Journalism seminar in May 2016.
Goos, M. and Manning, A. (2007) ‘Lousy and lovely jobs: The rising polarization of work in Britain’, The Review of Economics and Statistics, vol. 89, no. 1, pp. 118–133.
Frey, C. B. and Osborne, M. A. (2013) The Future of Employment: How Susceptible Are Jobs to Computerisation?, available at: http://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf, viewed on 3/7/2016.