MIAMI — The communications industry must focus on audiences rather than media, and outcomes rather than outputs, if it is to demonstrate the true impact of its work, according to a leading academic at the AMEC Global Summit last week.

Jim Macnamara, distinguished professor in the school of communication at the University of Technology Sydney, is a specialist in the evaluation of public communication and the author of several studies on organisational listening within government, corporations and NGOs. He has spent the past three years leading evaluation of the World Health Organization’s communications during the pandemic.

Outlining his experience of the project for delegates at the summit – and the wider learnings for communicators about measurement and evaluation – he said: “We were asked by WHO to assess the effectiveness of Covid communications and other campaigns, as rumours of a highly contagious virus were coming out of China in November 2019. How do you do evaluation when effective public communications can be the difference between life and death? We had to do it fast and we had to do it well – vanity or quasi-metrics and reporting outputs were simply not going to be enough.”

Macnamara said his first task was to reposition measurement and evaluation internally, since there were negative perceptions that it was about “checking up” on the WHO communications team’s work and looking back at what had been “done and gone”, when it is too late to do anything about it. He said he wanted to create a more constructive approach, showing that evaluating against benchmarks and KPIs, and using insights from analysis to refine strategies and adjust approaches, could be helpful rather than punitive, not least by being more predictive: “It’s important in planning and designing the future, not just measuring the past.”

The next step was to examine WHO’s measurement and evaluation processes and shift to a more “logical” approach: “We can gather evidence and use it badly, so in addition to evidence you need a rational and logical approach that applies data at the right points. Best practice is running program logic models: from resources, to inputs, to activities, to outputs, to outcomes, to impact. Communications is a cost centre all the way along, until it’s a value centre – when it gets beyond outputs.”

Initially, Macnamara said, his team relied on traditional and social media analysis, including messaging, sentiment, share of voice and website traffic, but was short of reliable data. This was not least because of a number of “false logic and fallacies” in the tools used by WHO’s two media analysis companies at the time: including the volume of media mentions and audience reach in an impact report – “No social scientist would say that was a reliable measure of impact” – as well as making assumptions about credible sources, and even still promoting AVEs, complete with a formula also based on assumptions.

“AMEC has done a very good job in challenging AVEs, but unfortunately the practice is still prevalent in our industry,” said Macnamara. “Media metrics are significant sources of information, but mediacentricity is holding the industry back. Media are just part of the puzzle that is public communication and to equate media content directly with impact is fallacious. People ignore what’s in media, they don’t trust what’s in media, they forget what they read, and access information from multiple other sources.

“Mediacentricity results in a narrow range of data used in many evaluations and contributes to false logics. Media take us part of the way from activities and outputs towards outcomes and impact, but we have to go further to get reliable data in relation to outcomes and impact. It’s an example of substitution error – a metric gathered at one level of analysis used to show an outcome at a higher level of analysis; it’s not a lack of data, it’s using data in the wrong way. Many professionals regularly get outputs confused with outcomes.”

In helping the WHO team understand how to provide the right data in the right place, Macnamara’s team came up with two simple tests to show the difference between outputs and outcomes. The first was the “doer test” – who is doing the thing the metric records? – and the second was the “site test” – where is it occurring?

He explained: “Inputs, activities and outputs are what practitioners do. Outcomes are what the audience does in response, and impact is what happens downstream as a result, in business, society and the economy. Inputs occur in the organisation as part of the planning and production process. Outputs occur in media, not necessarily in the audience’s minds. Impressions is not the number of people who are impressed.

“We need to be audience-centric, not media-centric – communications is about what arrives with our audiences, not what we send out. We need to listen more to people, not just media. There’s only one group of people that matter: the audience, always.”

Macnamara said one of the challenges for the communications industry in understanding measurement and evaluation was the background of most practitioners: “We come from creative arts backgrounds and we are not necessarily trained social scientists – we can’t all be data analysts but we do have to be social scientists, and understand what makes people tick and what makes behaviour change. In the generations we’re bringing on, I tell my students to study social science, not public relations.”

The WHO has now evolved its measurement and evaluation of communications to use nine data sets: traditional media analysis; social media analysis; website statistics; reports from technical staff on the ground; interviews with key informants; surveys with partners and employees; inserting questions about WHO into the Edelman Trust Barometer and Pew Research; public health databases; and behavioural insights.

Internal surveys have shown that adoption of measurement, evaluation and learning has risen from zero in 2019 to 84% of the communications team using the new model in 2022, and trust in the WHO has risen in 15 of the 28 countries surveyed by Edelman.

In conclusion, Macnamara said: “Impact is always difficult and causality is complex. It’s always downstream, but the secret is this: if we can prove clear outcomes – positive attitude, awareness and trust – there’s usually a clear line of contribution to impact.”