Many government consultations are more about meeting legal requirements than listening

Consultations are often a legal requirement for government departments – but this sometimes means they are formulaic and ineffective. In an extract from his report, Creating a democracy for everyone: strategies for increasing listening and engagement by government, Jim Macnamara (University of Technology Sydney / LSE) looks at some of the failings of government consultation, and at the problems with one NHS consultation in particular.


A doctor attempting to alert a deaf farmer using an ear trumpet as to the plight of his horses and cart. Image: Wellcome Library, London via a CC BY 4.0 licence

Under open government and open policy making strategies, the UK government – like many others – has made a major commitment to consultation. Indeed, it could be said that consultation is one of the central platforms for citizen engagement and participation, occurring much more frequently than elections and affording opportunities for detailed comments and feedback.

The development of a single official website for announcing and reporting consultations (gov.uk) is an important step that was recommended in several previous reviews and studies, such as the UK Power of Information Task Force report and the Digital Dialogues report.

However, the consultation site does not provide a full-service consultation function. It serves as a central location to:

  • Announce consultations;
  • Provide a description and details of consultations (e.g., background information, terms of reference, and sometimes questions for response); and
  • Post summary reports of consultations.

Typically, consultations announced and described on gov.uk link to specialist web consultation applications such as Citizen Space, which is widely used by UK government departments and agencies (e.g., by Highways England for an August 2016 consultation on managing freight vehicles through Kent, and by the Department for Education for a consultation on funding for early years education), or Crowdicity (used by the Ministry of Defence). Even these specialised tools need additional applications and plug-ins to be effective. For example, Citizen Space, developed by Delib, is best used in conjunction with Dialogue, a complementary application that allows participants in Citizen Space consultations to rate suggestions and ideas using a peer rating system to produce what Delib calls an “ideas lab”.

Furthermore, experienced consultation staff in the UK government note that public consultations need to be actively promoted and explained, beyond what is possible on the website, to make stakeholders and citizens aware of them and to encourage participation. One approach is to publish a blog specifically devoted to publicising and discussing issues relevant to the consultation (e.g., using WordPress). Email to known stakeholders, such as representative organisations, is also used.

These lessons highlight that effective consultation requires considerable skill among government policy and communication staff, as well as a number of specialised tools. Such skills, and the use of tools such as those noted above, are patchy across government. For example, in planning a consultation relating to disability, some of the policy and communication staff involved admitted unfamiliarity with consultation methods and tools.

From observation, interviews, examination of consultation reports, and analysis of consultation submissions, the following 10 failings in consultation were identified:

  1. Many consultations are framed narrowly, with specific questions written by government department or agency staff that limit discussion to the government’s agenda;
  2. A number use technical and official language, even when addressing the ‘general public’;
  3. In most cases, submissions to consultations are not acknowledged;
  4. Many have short time frames for comments, which may be practical for major industry and professional organisations with expert resources to prepare submissions, but which disadvantage or preclude many citizens and small groups from participating;
  5. The preceding limitations arise largely from a one-size-fits-all approach to consultation. Some consultations are aimed at experts and industry, and some legitimately have a very specific and limited scope. Others, however, seek (or should seek) views from a wide cross-section of the public. Yet there is no clear distinction between the different types and levels of consultation in terms of language, accessibility, time frame, etc.;
  6. Most consultations attract and are dominated by the ‘usual suspects’ – i.e., major organisations and even professional lobbyists. The following point exacerbates this bias, but also suggests solutions;
  7. Consultation lacks outreach. All consultations studied involved a passive approach in which the government calls for and then waits for submissions to be made. This ignores the reality that some groups and individuals affected by a policy or issue under consultation are unlikely to initiate a submission. This particularly applies to those with low socioeconomic status and/or low education levels, and those who are not easily able to articulate their views. Consultation can be productively enhanced through outreach to affected groups, such as:
      • Visiting affected areas to talk to local organisations, leaders, and individuals;
      • Interviewing in local communities, such as ‘button hole’ interviews in shopping malls or community centres in relevant areas;
      • Even door knocking in key affected areas; and
      • Establishing relationships with a wider range of organisations (i.e., beyond the ‘usual suspects’), including community groups, social movements, and activist organisations. For example, in the UK, groups such as Fixers work with marginalised people, particularly youth, but such groups are seldom recognised or contacted in consultations and debate on relevant policies and programmes;
  8. There is a lack of analysis of consultation submissions. The focus is predominantly on collecting inputs, and often little planning and scant resources are devoted to how submissions will be analysed to produce outputs and outcomes. Many departments and agencies also lack the tools to analyse large volumes of unstructured data (i.e., text);
  9. The findings of consultations are not shared even when the content is relevant to other government departments and agencies;
  10. There is a lack of reporting back following consultations. Reports of consultations are posted on gov.uk; however, while major stakeholder organisations that ‘understand the system’ might readily access these reports, citizens are unlikely to search for the results of a consultation. Proactive reporting to relevant stakeholders and citizens should be undertaken. This can be easily managed today with technology such as auto-generated emails when email addresses are provided, or simply by publishing reports and summaries in relevant media such as local newspapers, trade journals, and specialist publications (e.g., organisation newsletters). Research shows that acknowledgement and reporting back substantially increase trust in the process. (Studies of the 2008 Obama presidential campaign show that short acknowledgement emails sent to all donors, supporters, and general inquiries created wide public satisfaction and support.)
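The auto-generated acknowledgement emails mentioned above are technically trivial to set up. The sketch below is purely illustrative – the sender address, wording, and the `send_acknowledgements` helper are assumptions for this example, not any department's actual system:

```python
from email.message import EmailMessage
import smtplib

def build_acknowledgement(to_addr: str, consultation: str) -> EmailMessage:
    """Build a short acknowledgement email for one consultation submission.

    The sender address and wording are hypothetical placeholders.
    """
    msg = EmailMessage()
    msg["From"] = "consultations@example.gov.uk"  # hypothetical address
    msg["To"] = to_addr
    msg["Subject"] = f"Thank you for your submission: {consultation}"
    msg.set_content(
        f"Thank you for responding to the '{consultation}' consultation.\n"
        "A summary report of all submissions will be published, and a link "
        "will be sent to this address when the analysis is complete."
    )
    return msg

def send_acknowledgements(addresses, consultation, host="localhost", dry_run=True):
    """Send one acknowledgement per address; dry_run builds but does not send."""
    messages = [build_acknowledgement(a, consultation) for a in addresses]
    if not dry_run:
        with smtplib.SMTP(host) as smtp:
            for m in messages:
                smtp.send_message(m)
    return messages
```

The same pattern would cover reporting back after a consultation closes: the second email simply carries a link to the published summary report.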

Overall, many consultations are more about meeting legal requirements than listening. With consultation a legislated requirement in many circumstances, focus is often on meeting the specified criteria, which results in formulaic and minimalist approaches.

For example, the NHS Mandate public consultation conducted in October 2015 to develop the mandate for the NHS for 2016–2017 illustrates both the under-utilisation, through lack of data analysis, of public feedback received by the UK government, and the opportunities for improvement.

Typically, such consultations attract around 300 submissions. In 2015, the NHS Mandate consultation attracted 127,400 submissions. In addition to 140 organisations that responded, individual public responses included:

  • 114,000 that were attributed to a campaign by 38 Degrees, a membership organisation which campaigns on a range of issues (this was identified through the appearance of common terms and phrases closely linked to 38 Degrees policies, suggesting use of a form letter or template);
  • 470 that were attributed to a campaign by the National Autistic Society;
  • 270 that were attributed to ‘Our NHS’, a campaign to promote a fully nationalised, comprehensive health service;
  • 170 that were attributed to the Wheelchair Leadership Alliance;
  • 12,500 that were unique replies from individuals expressing personal views. This included 8,880 responses submitted via Citizen Space.
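Attributing submissions to campaigns through common terms and phrases, as above, can be approximated with a simple shared-phrase check: submissions that reproduce the same long word sequence are likely to derive from a form letter or template. A minimal sketch, with invented function names and thresholds rather than NHS England's actual method:

```python
from collections import Counter
import re

def word_ngrams(text, n=5):
    """Return the set of n-word phrases in a text (lower-cased)."""
    words = re.findall(r"[a-z']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def flag_campaign_submissions(submissions, n=5, min_share=3):
    """Group submissions sharing a frequent n-word phrase.

    Returns {phrase: [submission indices]} for phrases appearing in at
    least `min_share` distinct submissions - a rough form-letter signal.
    """
    grams = [word_ngrams(s, n) for s in submissions]
    counts = Counter(g for gs in grams for g in gs)
    return {
        phrase: [i for i, gs in enumerate(grams) if phrase in gs]
        for phrase, c in counts.items()
        if c >= min_share
    }
```

In practice a campaign-detection tool would also need to handle paraphrased templates, but even this crude version separates bulk template responses from unique individual replies.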

NHS England staff manually analysed the large volume of submissions, identifying the above factors and a number of findings including:

  • Opposition to further private sector involvement in the NHS;
  • Concern that there is insufficient funding to achieve the aims of the mandate and the NHS’s Five Year Forward View;
  • Concern that the mandate does not mention staff issues such as safe staffing levels and calls for improved pay and conditions;
  • Concern about seven-day services, which were being introduced at the time of this study;
  • Strong support for improving mental health services but concern about lack of funding;
  • Strong support for focus on prevention of ill-health, but concerns that public health, community and social care funding is insufficient to achieve aims;
  • Concern about the shortness of the consultation period and the lack of publicity to make people aware of the consultation.

Civil service staff are to be commended for the analysis they did without any specialised text or data analysis tools. However, the additional findings gained from analysis conducted as part of this research project illustrate the importance of in-depth data analysis and the tools and skills for conducting such analysis.
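To illustrate the kind of tooling involved, even a rudimentary keyword-to-theme tally goes well beyond fully manual reading of 127,000 submissions. The theme lexicon below is an invented example for illustration, not the categories or keywords NHS England actually used, and simple substring matching like this would need refinement in a real analysis:

```python
from collections import Counter

# Illustrative theme lexicon - keywords are assumptions for this sketch.
THEMES = {
    "funding": ["funding", "underfunded", "budget"],
    "privatisation": ["private", "privatisation", "outsourcing"],
    "staffing": ["staff", "staffing", "pay", "conditions"],
    "mental health": ["mental health"],
}

def count_themes(submissions):
    """Count how many submissions mention each theme at least once.

    Uses naive substring matching, so each submission is counted at most
    once per theme regardless of how often a keyword appears.
    """
    counts = Counter()
    for text in submissions:
        lower = text.lower()
        for theme, keywords in THEMES.items():
            if any(k in lower for k in keywords):
                counts[theme] += 1
    return counts
```

Dedicated text-analysis tools add stemming, phrase detection, and clustering on top of this basic idea, which is why investment in such tools and skills pays off for large consultations.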

This is an extract from Creating a democracy for everyone: strategies for increasing listening and engagement by government, published by the University of Technology Sydney and the LSE.

Jim Macnamara PhD, FAMEC, FAMI, CPM, FPRIA is Professor of Public Communication at the University of Technology Sydney (UTS) and a Visiting Professor at The London School of Economics and Political Science (LSE), Media and Communications Department.
