
This is the first in a series of blog posts for Measurement Month.

With the summer Olympics now only a memory, the eyes of the world are shifting to another international competition, this one involving Chief Communication Officers (CCOs). Which fit and well-trained CCO will top the podiums of AMEC, the GA, PRSA, CIPR, DPRG, PRIA, CPRS, IABC and other public relations/strategic communication associations?

Which performance-minded CCO will be first across the finish line with a winning evaluation case study – one that applies AMEC’s new Integrated Evaluation Framework (IEF)?

The International Association for the Measurement and Evaluation of Communication (AMEC) unveiled the IEF in June. Developed by an expert working group chaired by Richard Bagnall and composed primarily of senior public relations agency consultants and research providers, with support from the AMEC Academic Advisory Group led by Jim Macnamara, the IEF has four parts: model; interactive application; taxonomy; and bibliography. Many evaluation models exist, including AMEC’s previous Valid Metrics model, but few are supported by as complete a theoretical framework. The IEF’s interactive application combines communication planning/objective setting and evaluation/measurement processes, taking the communication researcher/planner/evaluator through a seven-stage process.

The IEF provides the opportunity to measure communication effectiveness at four levels: (1-output) the production and channel distribution of the product/message; (2-outtake) audience message receipt, attention and engagement; (3-outcome) audience change, particularly behavioral action; and (4-impact) organizational change. For years, academic and professional measurement experts have been clear on one point: measurements solely at levels one and two are not sufficient determinants of communication effectiveness. Levels three and four are needed. It will be interesting to see if CCOs employ all four levels.


As with any of the existing multi-stage evaluation models, there are questions about which multi-stage process is applicable to each of the following: any one-off communication activity (or product); a complementary set of activities as part of a behavior-changing communication campaign; or sets of activities and campaigns as part of an overall stakeholder relationship program. Is the IEF only for the evaluation of communication campaigns? Or can its seven-stage process be used for a single message activity in any paid, earned, shared or owned (PESO) media channel? Does the seven-stage process apply equally to a comprehensive and dialogic stakeholder program? These are questions that only CCOs can answer, by first piloting and then by adopting – or adapting – the model for everyday practice.

Inherent difficulties abound, though. For example, anecdotal evidence suggests that a typical CCO may have a ‘program’ for each primary stakeholder, but that program may be neither orderly nor strategic, and thus difficult to evaluate. We should also note that a conveyor belt of one-off communication activity is likely the norm in the average communication department, more so than the development and execution of a collection of full-on campaigns.

The inclusion of the IEF in association award program entries will determine its uptake and utility over the next two or three years – award entries being the most visible way that the CCO marketplace will judge the framework’s usefulness. Looking forward to the upcoming association awards, will the IEF format become a prerequisite for an award? Judging by the entries submitted to the 2016 AMEC awards program, setting level three measurable outcome and level four impact objectives, and then evaluating communication effectiveness against those objectives, may prove a challenge. Obviously, for some, the IEF is a stretch.

Certainly, AMEC will place resources behind the promotion of IEF knowledge transfer, starting with the current Measurement Month. The Task Force on Standardization of Communication Planning/Objective-Setting and Evaluation/Measurement Models[1], which I manage and which is sponsored by the Institute for Public Relations, is developing a number of research projects that will address questions about the utility of these models, the barriers to measurement and the ability to standardize. The Task Force sees these projects, based on qualitative research methods, as a complement to the quantitative surveys of CCO attitudes toward measurement. The IPR Measurement Commission will play its expert part as well.

I should make one final point. The purpose of the IEF is to evaluate the ‘goods’ that a communication department produces: activities (messages/channels); campaigns; and programs. If applied properly, the results of the performance of those goods can be shown to affect the operational goals formulated by business units or other functions within an organization, or the high-level corporate goals of the organization itself.

On the other hand, the IEF doesn’t explicitly evaluate the other aspect of department performance management: ‘services’ – the immaterial exchange of value that a communication department provides.

Communication department services are intangible, perishable and time-dependent, and involve interaction with an internal client. While communication department ‘goods’ are consumed at the strategy execution phase of an organization or business unit strategic management process, communication department ‘services’ affect both the initial formulation of formal strategy and its reformation as emergent strategy – besides the actual execution of those strategies.

Much of the ‘intangible’ performance value added by a communication department derives from delivering strategic counsel: advice where and when organizational and business/functional-level strategizing occurs, on matters such as ongoing risk avoidance; strategic constituencies mapping; stakeholder positioning identification; competitor scenario analysis; internal coalition engagement and collaboration; the possible side effects of organizational decision selection; the ethics and legitimization of choices; stakeholder engagement leading to behavioral change enactment; and the why and how of utilizing communication effects to attain organizational goals.

The academic domains of ‘strategic public relations management’ (J. Grunig, 2006; de Bussy, 2013), ‘strategic communication management’ (de Beer, Steyn, & Rensburg, 2013) and ‘strategic communication’ (Hallahan et al., 2007; Holtzhausen & Zerfass, 2013) consider the construct of the communication department strategist within the realm of organizational strategy. In each case, what remain to be unpacked are the services that a strategist delivers, not simply the strategic role enacted. Concepts such as strategic thinking or counsel are presented as desirable, but these foundations of strategic services are not further explicated. Nor have academia and the profession examined in detail the evaluation of communication department services.

While good work has been done, and is being done, on the evaluation of the goods a communication department produces, to date less attention has been given to the services provided. Perhaps it is at the intersection of ‘strategic services’ that the strategic public relations management, strategic communication management, strategic communication and evaluation/measurement communities of scholars and professionals can find common ground.

Until then, CCOs can explore the IEF as the ‘goods’ template for their upcoming awards program entries.

References:
de Beer, E., Steyn, B., & Rensburg, R. (2013). The Pretoria School of Thought: From strategy to governance and sustainability. In Sriramesh, K., Zerfass, A., & Kim, J.-N. (Eds.), Public relations and communication management: Current trends and emerging topics (pp. 303-323). New York: Routledge.

de Bussy, N. M. (2013). Refurnishing the Grunig edifice: Strategic public relations management, strategic communication and organizational leadership. In Sriramesh, K., Zerfass, A., & Kim, J.-N. (Eds.), Public relations and communication management: Current trends and emerging topics (pp. 79-92). New York: Routledge.

Grunig, J. E. (2006). Furnishing the edifice: Ongoing research on public relations as a strategic management function. Journal of Public Relations Research, 18, 151–176.

Hallahan, K., Holtzhausen, D. R., van Ruler, B., Vercic, D., & Sriramesh, K. (2007). Defining strategic communication. International Journal of Strategic Communication, 1(1), 3–35.

Holtzhausen, D. R., & Zerfass, A. (2013). Strategic communication: Pillars and perspectives of an alternative paradigm. In Sriramesh, K., Zerfass, A., & Kim, J.-N. (Eds.), Public relations and communication management: Current trends and emerging topics (pp. 283-302). New York: Routledge.


[1] Members of the international Task Force are: Mike Ziviani, Jim Macnamara, Tim Marklein, Forrest Anderson, Rebecca Swenson, Nathan Gilkerson, David Geddes, Allyson Hugley, Mark-Steffen Buchele, Sophia Volk and Alex Buhmann.

Fraser Likely is President of Likely Communication Strategies and a Lecturer at the University of Ottawa. He is a member of AMEC and the IPR Measurement Commission. He is the Manager of the Task Force on the Standardization of Evaluation/Measurement Models.
