PR evaluation deserves a more critical eye: 7 learnings from the AMEC Summit 2018

Placed on Wednesday, 27 June 2018 by Publistat

This article was also published by Adformatie, one of the major websites for communication professionals in the Netherlands.

Measuring the impact of communication still focuses too much on PR and on measuring output. A more integrated measurement of data, including opinion and behaviour, is needed. Artificial Intelligence offers many opportunities (more speed, less work), but the role of the analyst/communication professional remains crucial in gaining good insights. This was the main message of the AMEC Global Summit 2018 in Barcelona.

Closer to home with more Dutch presence
The annual summit brought together leading agencies, tool providers and communication professionals from all over the world to discuss research on the impact of communication. Last year in Bangkok, the Dutch delegation was limited to three representatives from Publistat. This year, closer to home, the summit proved more popular: a positive development, with two speakers from the Netherlands (Madelon Engels from Achmea International and Jeroen Scholten from Publistat).

Returning to the city of the Barcelona Principles
In Bangkok the emphasis had been primarily on the use of dashboards and fast figures (see Jeroen Scholten’s blog: Beyond the dashboard hype and beware of fiddling statistics). This year, the AMEC community returned to the place where the Barcelona Principles, the seven basic principles for good communication research, had been launched in 2010. The key message of the Barcelona Principles is to look beyond media reporting: to measure outcomes (including opinions) and impact (e.g. on sales). Eight years on, considerable attention was once again paid to this topic, under the theme Measurement & the three i’s: Insights, Innovation and Integration driving the future.

7 key summit learnings:

1. Evaluations limit themselves too much to PR
“Lots of talk of integration and ‘communication’ at #AmecSummit but debate still narrows to ‘PR’. Comms evaluation needs to think more broadly.” This was tweeted by Professor Jim Macnamara (University of Technology Sydney) during the summit, after he had said it aloud earlier. Together with Madelon Engels from Achmea International, he underlined the importance of listening and zooming in, for example by making follow-up calls to critics in NPS measurements, or by mapping the customer journey.

2. The AMEC Integrated Evaluation Framework is a tool that gets tailored
The AMEC Integrated Evaluation Framework, designed as a tool to help with the planning and measurement of campaigns, is widely used but often appears to be too complex. Numerous simplified variants come and go, especially in presentations of studies into short-term or PR campaigns.

Alex Aiken, Executive Director of Government Communication in the UK, presented the new Framework 2.0, which sets the standard for evaluations in the UK public sector. Available here, it is both easy to use and contains a summary of the main metrics involved.

3. The changing role of the analyst: Artificial Intelligence increasingly present
At the start of the summit, industry leaders, including top executives from Cision, Isentia and Ketchum, gave their views on the past and the future. David Rockland (Ketchum Global Research & Analytics) compared the state of affairs ten years ago to the age of the dinosaurs; coincidentally, that was when I was just starting out as a media analyst, in 2008. According to Rockland and other industry leaders, it was a time of newspaper cuttings, advertising value equivalents and post-measurement.

John Croll of Isentia, a major service provider in the field of media monitoring and media analysis in South East Asia, emphasised the important role that analysts play in his company and in the analysis itself. Due to technological developments, the role of the analyst will become more specialised.

Live dashboards were barely mentioned at this summit; the speakers mostly showed static dashboards with, for example, monthly KPI scores.

4. Much research still focuses on campaigns and PR/marketing
During the summit many good examples of campaign measurement were discussed. Many research projects, like the nominees for the AMEC Awards, are aimed at measuring the success of a single campaign rather than at long-term research into the impact of, for example, reputation management. Other factors that may play a role in the success of a campaign or product also seem to be left out of account (for example, whether taking part in a World Cup affects a retailer's results).

5. Define the target group and measure specifically
The Government Communication Service uses the OASIS model (Objectives – Audience Insight – Strategy – Implementation – Scoring/Evaluation) for evaluations. In order to gain insight into the target group and to pinpoint it, an advanced tool full of selection options has been developed for the UK. Very useful for organising and evaluating a campaign efficiently.

6. Failing is good: from proving to improving
Many evaluations are stuck in demonstrating success. Various speakers (including Alex Aiken, Jamin Spitzer from Microsoft and Jeroen Scholten from Publistat) emphasised that evaluation should focus more on what can be done better.

7. Battle against AVEs is not yet over
A survey of AMEC members suggests that the use of advertising values as an indicator is decreasing. It is, however, still present, and the elimination of advertising value equivalents remains a spearhead of AMEC. The PRCA, which represents the PR industry in the UK, warned during the summit that advertising value equivalents are still being used, mainly outside the UK and North America.

Live artist Linda Saukko-Rauta captured each session in a drawing, often enlightening and sometimes funny; see below for her drawing of the industry leaders session. Drawings from the other sessions can be found here.