
Parallel universes: What academics and consultants can learn from each other

Management consultants and academics studying management share a common topic.  In principle, the insights of each should be helpful to the other.  There is, however, less exchange of ideas than one might expect. 

Some senior academics in business schools have active consultancy practices.  Elsewhere in the university environment and in the consulting world there is less interchange between the two groups.  Many consultants, working long hours under pressure, do not make keeping up with the latest academic findings a priority, although they may be familiar with a few classic papers.  Many academics, engaged in their own rat-race for publications and tenured positions, are dismissive of anything not published in peer-reviewed journals.  Both groups struggle with the demand for novelty in an applied discipline that depends to a considerable extent on unchanging human nature, and each can be critical of the other's approaches.  They would, however, do well to listen to each other.

What management consultants can learn from academics

Academics have studied the weaknesses that affect management decision making.  Consultants, for all their keen appreciation of how their clients think, often share the same flaws.  They should guard against them by learning from the researchers' analysis.

Evidence-based approaches are preferable to the alternatives

Academics specialising in management seek to build a body of knowledge based on evidence gathered through research.  Managers, however, are typically under pressure to take decisions in the face of incomplete information and often do not take decisions on the basis of the best available evidence.  Jeffrey Pfeffer and Robert Sutton, professors at Stanford University, list the substitutes that managers use instead.  They include:

  • personal experience, which inevitably is limited;
  • obsolete knowledge acquired earlier in their careers;
  • marketing information and hype;
  • ideology, for example the belief that financial reward is an effective motivational tool in all circumstances;
  • the uncritical imitation of top performers, copying the superficial elements of a successful solution without understanding the underlying mechanisms. 

Consultants share the time pressures of their clients and may have additional schedule and budget pressures of their own.  They may be tempted to take the same short cuts, which can lead to suboptimal solutions.  Digital technologies and big data are facilitating the use of evidence-based approaches, but there are many domains of management activity they have not yet penetrated.

Individual case studies do not prove anything unless supported by other evidence

Consultants love case studies.  A well-told story in the form of a case study can be far more convincing than dry analysis and is an excellent way of bringing to life a conclusion derived from other evidence.  However a case study by itself is just an anecdote and can be seriously misleading.  A single example does not prove a more general rule, and research shows that people’s memories of what contributed to the success of a project are highly unreliable. 

Consultants should evaluate the effectiveness of their interventions

According to Pfeffer and Sutton, “consultants … are always rewarded for getting work, only sometimes rewarded for doing good work, and hardly ever rewarded for evaluating whether they have actually improved things.”  (Emphases in the original.)  In the thirteen years since those words were written, there has been talk in the consultancy market about moving to value-based pricing, which of necessity requires the benefits of an intervention to be assessed, but in practice little has changed.  There are genuine practical difficulties in assessing the effectiveness of interventions and measuring value, and in any case price competition from consultancies offering similar services usually makes value-based pricing impracticable.  However the lack of evaluation makes the consultancy market vulnerable to fads and fashions.

“Most claims of originality are testimony to ignorance”

Leading consultancies aim for ‘thought leadership’ and compete to come up with the latest ‘big idea’, but the criteria for success are those of marketing and PR, encouraging the repackaging of old ideas or exaggerated claims for new ones rather than genuine innovation.  Evidence-based research typically progresses in small incremental steps, and most findings only apply in specific contexts.  In the words of management theorist James March, “Most claims of originality are testimony to ignorance”.

What academics can learn from management consultants

The best academic papers are valued by consultants as well as researchers, but not all management research reaches the same standards.

A good understanding of management and business is critical

Consultants reading academic publications may be disappointed for several reasons.  Some authors display a degree of naivety in their understanding of management and business, and of the motivations of participants, which undermines their credibility.  The pressures of making a career in academia appear to leave little opportunity for gaining practical experience, an unfortunate situation in an applied discipline such as management.

Data quality is important

A major problem is the inadequate quality of the data used in many studies.  Good-quality data relating to a researcher’s chosen problem is often hard to come by.  Many succumb to the temptation to use whatever data is available, for example from published sources, which overlaps with or relates to the information required but does not correspond to it precisely. 

An alternative is to use survey data, for example from telephone surveys asking interviewees to score items on rating scales.  This suffers from the disadvantages of subjectivity and the vagaries of human memory, compounded in some cases by low response rates and the impatience of interviewees with surveys that are too long or poorly designed.  Consultants learn to be sensitive to the quality of the data they use, if only to avoid criticism from their clients.

Methodology is no substitute for content

Complex statistical analysis is fashionable in management research.  Regression analysis and structural equation modelling are particularly popular.  Unfortunately the complex analysis often leads to rather simplistic conclusions.  This may be unavoidable: most such studies must collect data from a large number of organisations for the statistical techniques to be applicable and, often as a consequence, can gather only a small number of data points on each organisation.

Complex statistical techniques reduce the accessibility of a paper for consultants and other readers who are not statisticians.  Those with some knowledge of statistics will be aware that such techniques yield correlations but provide no information on causality.  Nevertheless many authors cannot resist the temptation to imply or claim causality.  Researchers aware of the shortcomings of their data sometimes use additional statistical techniques to try to compensate for them, raising further questions about the validity of the conclusions.  Complex statistical methodologies, in particular when combined with weak data, mean that many research studies lack face validity for practitioners. 
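The correlation-without-causality point can be illustrated with a small simulation (a hypothetical sketch in Python, not drawn from any study mentioned here): two variables driven by a common unobserved factor correlate strongly, yet intervening on one of them leaves the other unchanged.

```python
# Hypothetical illustration: a hidden confounder produces a strong
# correlation between two variables that have no causal link.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Unobserved common cause, e.g. overall firm health.
confounder = rng.normal(size=n)
training_spend = confounder + 0.3 * rng.normal(size=n)   # driven by confounder
profitability = confounder + 0.3 * rng.normal(size=n)    # also driven by confounder

# A naive analysis finds a strong association between spend and profit...
r = np.corrcoef(training_spend, profitability)[0, 1]
print(f"observed correlation: {r:.2f}")                  # strong positive

# ...but setting the spend by fiat, independently of the confounder,
# leaves profitability untouched: the association vanishes.
forced_spend = rng.normal(size=n)                        # intervention
r_intervened = np.corrcoef(forced_spend, profitability)[0, 1]
print(f"correlation under intervention: {r_intervened:.2f}")  # near zero
```

A regression of profitability on training spend in the simulated data would look impressive, which is exactly why claims of causality need support beyond the fitted model itself.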


The academic and consulting worlds would both benefit from forums that bring the two together and allow them to share perspectives, ideas and experience.  The Centre for Management Consulting Excellence aspires to offer such forums.


Karol Szlichcinski



Tuesday 1st October 2019