Research
Improving the consultation process
About this report
Good policy-making necessarily involves taking account of the views of those most affected by what is being proposed, the wider community and those with experience and knowledge of the issue. If this is not done, the likely effects are that the policy fails to achieve its objectives and that those affected by it come to distrust the Government.
The policy-making process in Jersey is deficient in a number of respects, of which practice on consultation is just one. This paper analyses the consultation process and suggests how it can usefully be improved.
About the author
Sir Mark Boleat has held a number of leading positions in the public, private and voluntary sectors in the UK and Jersey. He was the Political Leader of the City of London from 2012 to 2017, having previously headed major national trade associations and chaired or been a director of a number of listed and private companies and charities. He is currently Chair of Link, which runs the UK’s cash dispenser network, and of the International Business and Diplomatic Exchange. In Jersey he has been Chair of the Jersey Competition and Regulatory Authority, the Jersey Development Company and Andium Homes and has undertaken a number of consultancy projects for the Government. He is Senior Adviser to the Policy Centre Jersey.
About the Policy Centre Jersey
The Policy Centre Jersey aims to improve knowledge about and civic engagement in Jersey by –
- Undertaking authoritative research on policy issues.
- Publishing briefings on key policy issues with easy access to relevant papers and research.
- Providing a forum for discussion on policy issues.
- Building a Jersey Knowledge Centre comprising brief up-to-date papers on all aspects of Jersey (history, economy, constitution, geography etc) aimed particularly at schools.
Introduction
Good policy-making necessarily involves taking account of the views of those most affected by what is being proposed, the wider community and those with experience and knowledge of the issue. If this is not done, the likely effects are that the policy fails to achieve its objectives and that those affected by it come to distrust the Government.
The policy-making process in Jersey is deficient in a number of respects, of which practice on consultation is just one. This paper analyses the consultation process and suggests how it can usefully be improved.
Summary
- There is a short Code of Practice for consultations by the Jersey government and a more general Government Engagement Framework.
- There is significant experience in respect of best practice on consultations from the UK and internationally that Jersey can usefully draw on.
- The current Code of Practice and the Government Engagement Framework are not complied with.
- The major deficiency is a failure to have meaningful consultation with relevant organisations.
- Closed-question online surveys are the preferred means of responding to consultations. This is not sufficient: they often do not cover key issues and run the risk of questions being biased.
- Respondents are asked to give little information about themselves so there is no way of assessing how representative respondents are.
- The Government’s responses to consultation exercises are often non-existent or poor, in particular failing to indicate how policy has been influenced.
- The quality of consultation exercises will be enhanced primarily by meaningful engagement with relevant organisations or people, who will be different for each consultation.
- There is also a need for administrative improvements, which should be built into the Code of Practice -
- Where an online questionnaire is the preferred method of response then the questions should also be published in the consultation document and the option given to make a standalone response.
- Respondents should be asked to give basic information (e.g. age, ethnic origin, sex, employment status, income band) so that an assessment can be made of the representativeness of respondents and results analysed by category of respondent.
- Where substantive responses are sought, contact details should be requested and there should also be a statement that responses will be deemed to be public unless the respondent requests otherwise and gives the reason for so doing.
- Questionnaires should follow a standard format devised by the Chief Statistician, should ensure that key issues are covered and that questions are not slanted, and should be cleared in advance by the Chief Statistician. The Code of Practice for Statistics includes the following: “Methods and processes should be based on international good practice, scientific principles, and established professional consensus”. As surveys produce statistics this should apply to surveys.
- An analysis of responses should never be a matter of counting votes. It is appropriate to mention, and to include extracts from, individual responses either where the respondent has a relevant interest or where they have made a particularly significant point.
It is probably the case that the current deficiencies are not because of decisions not to follow good practice but rather because of a lack of awareness of what good practice is. This points to the need for training and a central point for advice. Guernsey’s experience of having a dedicated consultation and survey support officer should usefully be examined.
Code of Practice on consultations
There is a Code of Practice for consultations carried out by the Jersey Government. It is brief -
- Every consultation is unique. Consultations should be designed to seek views on a specific topic or set of issues in a way that is effective, accessible and inclusive. This means that not all consultations will look the same.
- Only consult if change is possible. Public participation includes the expectation that people's contribution will influence the outcome. So only consult if there is scope for change.
- Know your stakeholders. Spend time identifying all the potential groups who may have an interest in the issue. Different groups of stakeholders may be involved to different degrees, consulted on different policy areas or in different ways. Whilst some consultations will involve the general public, others, such as draft legislation of a technical nature, may be better suited for targeted stakeholder consultation only.
- Make it accessible. An accessible consultation benefits everyone and increases the response rate. Use accessible methods, formats and words that are suitable for the intended audiences. Work towards removing barriers to access and making your consultation as inclusive as possible.
- Don't delay because you don't have all the answers. Consultation may provide some of these answers. Furthermore, there may already be speculation about the project or policy and engaging early gives people more opportunity to influence the outcome.
- Tell it like it is, be clear and transparent. It is good practice to be forthcoming with information unless there's a compelling reason not to. Be open about areas where decisions have already been taken.
- Give enough time for meaningful engagement. People need to be able to give sufficient consideration to the proposals. Significant public consultations should normally last for at least 8 weeks (excluding major holiday periods).
- Consider Data Protection obligations. Ensure that your consultation is run in accordance with the Data Protection (Jersey) Law 2018, and make sure that the departmental data governance officer and the Government of Jersey central data protection unit have been consulted.
- Objective analysis. Information and views gathered during consultation must be assessed objectively.
- Publish the data and the outcome. Publish data from the consultation online, in accordance with your privacy notice, as soon as possible. Within a reasonable timeframe, give clear feedback on the consultation outcome and how the consultative process affected the policy decision.
The Government Engagement Framework
In 2022 the Government published the Engagement and Information Improvement Report, written by two officers - Ian Cope, Chief Statistician and Director of Statistics and Analytics, and Dirk Danino-Forsyth, Director of Communications. This included a section headed “Sufficiency and Presentation of Government Policy Engagement”. This explained current practice and pointed to some areas of good practice. It identified a number of areas for improvement –
- Consistency of approach. Between June 2018 and April 2022, 114 consultations were published on gov.je across all areas of Government policy, but crucially there were marked differences in how these were undertaken:
- Some consultations asked for Islanders’ views to be provided by a closed-question online survey, whereas others asked for long-form written views via email.
- Some consultations invited any interested Islanders to present their views face-to-face in workshops or meetings, whereas others selected representative samples and focus groups to elicit views.
- Most consultations were published only in English.
- Most consultations did not collect data to monitor the diversity of respondents.
- Accessibility – mainly in respect of people whose first language is not English.
- Diversity – “the Government currently does not undertake diversity monitoring systematically across all engagement activities which it undertakes”.
- Observing public opinion – the continuous monitoring of public opinion through mass surveying.
- More effective use of deliberative bodies.
A series of recommendations was made. An update, published in May 2023, stated that of the 44 detailed recommendations:
- 16 were Green - completed
- 27 were Amber - action outstanding, but on track
- 1 was Red - actions outstanding
In addition to an odd use of “amber” there is scope to debate whether this is a fair analysis.
A specific recommendation was that -
The Government should adopt a more cohesive, structured approach to engagement to ensure that it engages more effectively with the public to develop policy solutions which are more responsive to Islanders’ concerns. It is recommended that this is called the Policy Inclusion Framework which should include guidance on:
- Scope - clear internal and external guidance on when and how to involve Islanders in the development of policy.
- Structure - who to involve from within those groups, the establishment of standing groups such as the Old Person’s Living Forum and the development of criteria for whether to set up standing groups in the future.
- Processes - a consistent set of methodologies that are easy to understand and to implement.
- Products - the use of simple, short documents written in plain English; and when it would be appropriate to use non-English languages, as well as products for people with disabilities.
This report was followed by the publication on 16 November 2022 of a consultation document, The Policy Inclusion Framework. This was based on a longer document, the Proposals for the Jersey Policy Inclusion Framework. This framework comprises engagement good practice guidance and a policy engagement toolkit, which are based on an engagement pyramid structure. Comments were sought by 6 February 2023. In March 2023 the Government Engagement Framework was published. This set out 10 good practice principles for engagement and consultation. In summary these are –
A. Accessible – particularly for people with disabilities.
B. Purposeful – engagement should begin as early in the policy development process as possible; consult where change is possible and be clear about what aspects of the proposals people can change.
C. Targeted – identify and involve stakeholders before you consult.
D. Informative – provide participants with sufficient information about the proposal or topic.
E. Flexible – adapt the engagement approach to stakeholders’ needs and preferences wherever possible.
F. Publicised – for a consultation or engagement activity to be effective, it will need to be publicised to some extent.
G. Recorded properly – records of a consultation, such as public responses, should be kept for as long as they are required to fulfil the purpose for which they were collected.
H. Involve feedback loops – responses to the views received should be made within a reasonable time frame. As a guideline, this could be within 3 months of the contribution or the consultation finishing. Where possible, what has been decided in response to the feedback should be emphasised.
I. Published online – a webpage needs to be created on www.gov.je so that people can find out more information about the issue and how to give feedback.
J. Finished with a consultation report – which should set out the number of people who took part in the consultation, the types of engagement that took place, the feedback received and the Government’s response to the feedback.
Best practice
Consultation is an established part of the policy-making process in democratic countries. It follows that any jurisdiction can draw on the experience of others in developing its own policies and practices.
The UK Cabinet Office has published Consultation Principles 2018 which is broadly similar to the Jersey code. The Consultation Institute publishes relevant papers and offers training courses. Some local authorities have developed and abide by effective codes of practice – a good example being Cambridge City Council.
Specific aspects of consultation are also covered by codes of practice and guidance. Particularly relevant to this paper is practice on opinion surveys. The Market Research Society Code of Conduct is a relevant document and the Questionnaire Design Guidance published by the Office for National Statistics is the definitive document in respect of designing questionnaires. The section headed “ask balanced questions” states –
You should avoid questions that lead respondents towards a particular response. For example, you should avoid questions like:
- “Did everyone enjoy the course? We always get really good feedback about it” — you should not make assumptions or encourage respondents towards a positive response.
- “Do you support the government policy on education, which many people are against?” — you should not encourage respondents towards a negative response.
- “How many days did you work last week?” — some respondents will not be able to answer this question as it assumes they worked the previous week.
Jersey Business has published a useful Guide to Conducting Market Research.
Adequacy of and compliance with policy
The Code of Practice is not complied with, nor is it sufficient to ensure good-quality consultation.
The Government Engagement Framework ignored its own guidance in that there was no mention of the consultation, let alone how it had influenced policy. The Framework is long and detailed, and as will be seen subsequently is not well observed, for example in respect of diversity monitoring. The maxim “the best is the enemy of the good” may well be relevant here.
Consultations are often not meaningful
Much of the framework described in a previous section is concerned with administrative points, such as accessibility and diversity. However, there is anecdotal evidence that there are rather more fundamental problems with the consultation process which centre around two key points –
- Consultations are not always meaningful; rather, they go through the motions with the objective of claiming that consultation has been undertaken and that there is broad agreement with the proposals. There are reports of charities or businesses being invited to events labelled as consultations but which in practice comprise a PowerPoint presentation with little opportunity for discussion.
- Little notice is taken of points made in the consultation, and there is often no meaningful analysis of consultation responses or indication of how the consultation exercise has influenced the outcome.
These comments are based on anecdotal evidence, although the critical comments on Government processes in the Barriers to Business Report do rather lend support to them: “84% of respondents stated the Government does not understand the impact of regulation on businesses and 79% stated that the Government does not consult well with businesses before creating any new regulation or making changes to existing regulation”.
It is important to recognise that policy-making is not easy, particularly in a small island which for most practical purposes is a sovereign state. Officers have to handle a wide range of diverse issues on which they cannot expect to be expert, and the policy-making process itself requires considerable expertise separate from subject knowledge.
Best practice in handling a particular issue is to use a process along the following lines –
- Identification of the problem and consideration of policy options in conjunction with relevant people and organisations - who will be different for each issue. This in turn requires knowledge of who the people and organisations are, which cannot simply be a mechanical operation. Ideally, a charity or business association is expert in its field, understands the political process and is not simply concerned with a vested interest, but this is not always the case. A central resource can identify the best interlocutors, and there is also scope to help businesses and charities be more effective in their representative work, in particular through the use of evidence and presenting their views in a way that most helps the policy-making process.
- Working through policy options, which in some cases will require expert consultancy help but normally should be capable of being done by officials in conjunction with relevant people and organisations.
- Publication of a formal consultation document followed by a programme of engagement specific to the subject.
- Publication of the final policy document which among other things should spell out the policy-making process and how the final consultation has influenced the outcome.
It may be that this process is followed in some cases but there is certainly an impression that in some areas the work is largely done within the government machinery, perhaps helped by favoured consultants, with relevant people and organisations then being presented with something of a fait accompli.
The remainder of this paper is largely concerned with various administrative points, all of which are important but not nearly as important as the issues covered in this section.
The use of closed-question online surveys
Allowing responses to be made only by an online questionnaire now seems to be common practice. This makes life easy for policy officers, in that collating the results is largely an administrative process. However, as a method of consultation it is inadequate. From a technical point of view, completing a questionnaire is easier if the whole questionnaire can be seen; otherwise there is a danger of covering issues before the relevant question is reached or, worse still, of coming to the end of the questionnaire without having been able to make an important point.
But the more important point is that the questions are selected, presumably by officials, and it is possible only to answer the set questions, even if they do not cover key issues. There is also a risk of questions being slanted, if only inadvertently.
The consultation on the wind farm proposal is a good example of using an online questionnaire in an unsatisfactory way. There were two introductory questions on whether respondents could access and understand accompanying information – which incidentally was only published two weeks after the consultation began, another example of bad practice. Six questions were then asked -
3. Developing offshore wind presents several possible benefits for Jersey. How important are the following to you?
- More energy security because we would be able to agree prices over the longer-term
- Creating enough energy to export, which could help grow Jersey’s economy and create good jobs
- Additional income for the public purse, such as new tax revenues
- Securing our access to low carbon energy and helping other countries reduce their fossil fuel use
4. Do you have any particular opinions about how a wind farm would be funded that are important to you?
5. If there are things that particularly excite you about the proposed development of offshore wind, please let us know here.
6. If there are things that particularly concern you about the proposed development of offshore wind, please let us know here.
7. At this initial stage, in one sentence, how would you summarise your current opinion about developing a wind farm in the south west of Jersey’s waters? (20-word limit)
8. Is there anything else you wish to add?
The questionnaire asked for the age of the respondent and the parish in which they lived, or whether they lived elsewhere.
There are three major deficiencies with this questionnaire –
- Question 3 is loaded. The four points are all phrased such that a wind farm could have only beneficial effects. In any discussion on energy policy the “trilemma” is recognised, that is the trade-offs between price, security and sustainability; but this consultation implied that there were no trade-offs.
- The fourth bullet point in question 3 is also misleading as it implies that Jersey does not currently have access to low carbon energy. However, Jersey’s electricity supply is already low carbon.
- It does not allow for a detailed response, unless one counts question 8, “Is there anything else you wish to add?” And it will be noted that just 20 words are allowed for a summary of current opinion.
Assessing opinions through the use of surveys is easy to do badly and requires expertise to do well. Just framing a question in different ways can give very different results. For example, these four questions about the Chief Minister of Jersey would lead to very different answers.
- Do you know that Lyndon Farnham is Chief Minister of Jersey?
- Is Lyndon Farnham Chief Minister of Jersey?
- Who is Chief Minister of Jersey – Kristina Moore, Lyndon Farnham, Sam Mézec, Ian Gorst?
- Who is the Chief Minister of Jersey?
The first question could lead to the conclusion that 80% of the population know that Lyndon Farnham is Chief Minister of Jersey, the second 70%, the third 60% and the fourth 40%.
It is relevant to note this short news item in The Independent on 22 March 2024 concerning impressions of leading Labour politicians –
Fiona Wilson was known to 47% of respondents to the survey, carried out by Portland Communications for Times Radio, with 15% feeling “favourable” towards her. The only issue is, Fiona Wilson does not exist. Invented by the survey, she was better known than real-life shadow culture secretary Thangam Debbonaire and shadow Cabinet Office minister Nick Thomas-Symonds.
It seems that there are no agreed procedures or standards for conducting surveys, which has resulted in poor-quality surveys and, in turn, poor-quality policy-making. Guernsey provides a useful contrast in this respect. There is a process by which all consultations are run past the Data and Analysis, Communications and Data Protection teams. Two weeks’ notice is required. The Data and Analysis Team checks that questions are not leading, are in plain English, are in a logical order and are relevant to the research subject. The Team discourages people from running a survey if the data can be obtained from elsewhere, or if they are simply repeating a previous survey, perhaps in the hope of getting a different result. It also checks that someone is lined up to do data cleaning and analysis. A member of the Team has specific responsibility for supporting surveys and consultations, including advice on the use of online forms.
Collecting data on and identifying respondents
It seems to be standard practice for consultations simply to ask for the age and parish of respondents. (Indeed, some questionnaires, e.g. on population policy, have not allowed responses from people not resident in Jersey, as a parish had to be given before the questionnaire could be completed.) This is not sufficient. Questions should also be asked on sex, ethnic origin and either income or employment status.
It is good practice generally for consultations that seek substantive responses, as opposed to ticking boxes, to ask for the name of the respondent and contact details and indicate that unless there is a good reason to the contrary all responses will be made public. It is important that the identity and where appropriate affiliation of the respondents is known. For example, in the case of the wind farm it is important to know if the respondent could benefit directly or indirectly if a wind farm is built. Sometimes a response will be such that policy makers will wish to ask for further comments or even to meet the respondent to discuss the comments.
The Engagement and Information Improvement Report recommended that “robust diversity monitoring should be introduced as standard across all Government engagement exercises.” The Government Engagement Framework duly has the following section -
To monitor diversity, information is collected about individuals’ protected characteristics under the Discrimination (Jersey) Law 2013. These include:
- Race
- Sex
- Sexual orientation
- Gender reassignment
- Pregnancy
- Age
- Disability
This information is provided on a voluntary basis. By answering the diversity monitoring questionnaire, you are opting in to providing the Government with this information. All responses are anonymous and will be used to produce high-level statistical reports. These reports will be internal but may be published on www.gov.je. More information about how this data is collected and used can be found in the privacy notice.
The standard diversity monitoring questionnaire can be found in the resources section of this document. When producing an online survey for a consultation, please inform policyengagement@gov.je. The relevant officer can then link the standard diversity monitoring questionnaire to your survey. This officer will securely collect the results from the questionnaire and produce high-level diversity statistics.
This is clear, but it is also ignored. A quick survey of recent consultations indicated that none followed the requirement of using the standard diversity monitoring questionnaire. Taking recent consultations -
- Women’s health and wellbeing - only asks for sex and gender reassignment. (In passing this consultation has some excellent examples of best practice – providing the full survey for downloading as well as the online questionnaire, and asking sensible questions in a user-friendly way about the characteristics of those responding. But with 28 pages containing 68 detailed questions it is very long.)
- Long Beach playground equipment - asks for no data.
- Contraceptive services survey - seeks comprehensive information on respondents, although not on pregnancy or disability.
- Single use vapes - asks for no data.
- Youth vaping – asks for no data.
- Sustainable finance – asks for no data.
- Cyber Law Jersey – asks for no data.
That clear requirements on this aspect of consultation are ignored rather suggests that the same applies to other aspects.
It is not clear whether the intention here is to produce a separate analysis of respondents or rather to allow the results to be analysed by category of respondent. By having a separate questionnaire, following the practice used in recruitment, it seems that the purpose is to have a specific report on the diversity of respondents, rather than an analysis of responses that identifies differences of view between different categories of respondents. The latter is essential in a meaningful consultation.
Analysing responses to consultations
Points 9 and 10 of the Code of Practice are clear –
- Objective analysis. Information and views gathered during consultation must be assessed objectively.
- Publish the data and the outcome. Publish data from the consultation online, in accordance with your privacy notice, as soon as possible. Within a reasonable timeframe, give clear feedback on the consultation outcome and how the consultative process affected the policy decision.
The Government Engagement Framework suggests that as a guideline the consultation report should be published within three months of the contribution or the consultation finishing.
Building on the point in the previous section, not all responses should count equally. A response from someone directly affected by the proposal or who has relevant expertise should be weighted accordingly, and legitimately can be quoted in the analysis of responses. This requires expertise in assessing the value of responses. Sometimes a response by an individual with no seemingly relevant qualification will be of such a high quality that it should feature prominently in the analysis of responses while a response from say the relevant trade body may be poor quality, making assertions with no evidence. All this means that analysing responses is not a mechanical operation but rather should be the responsibility of the official responsible for the policy.
Publishing an analysis of the responses, with a list of respondents and links to their responses, is good practice and facilitates an informed debate, and obviously it is important that the final policy document indicates how responses have influenced the policy decision. In practice, performance in this respect is poor. In my comments on the policy inclusion framework in 2023 I examined 11 consultations, all of which had closure dates before the end of April 2022, in respect of one specific point – analysis of and response to the consultation responses –
- For seven of the consultations there was no analysis of or response to the responses that was easily findable on the government website.
- For two of the consultations there were good responses to the responses.
- The response to the responses to the consultation on proposed new conservation areas in Jersey was very detailed but was undated, provided no context and gave no indication of the number or characteristics of respondents.
- A revised Code of Practice on safe use of rider operated lift trucks simply stated that suggestions in the consultation had been taken into account.
So out of the 11 exercises considered, there was an appropriate analysis of the consultation and a proper government response in only two cases; in another case the very detailed response was spoilt by the absence of basic information about who responded to the exercise.
The exercise has been repeated in respect of 20 consultation exercises with closing dates between 1 April and 30 September 2023. The results were –
- No response and no indication of how the consultation has influenced policy could be found for 7 consultations.
- Three planning consultations had feedback such as -
- This guidance was prepared following engagement with the Housing and Regeneration Team in the Cabinet Office and published for consultation in March 2023. This generated much constructive feedback and the Minister has carefully considered the responses received and made changes to the draft guidance.
- One response on shipping law and inshore harbours regulations briefly summarised responses but then misused statistics –
- Of the 14 responses, only one respondent did not agree with the draft Shipping Law amendment (7.14%), and two did not agree with the draft Harbour Inshore Safety law amendment (16.67%).
- It is good practice to give the number of responses but not to present 1 in 14 to two decimal places.
- The response on money laundering summarised responses but gave no indication as to how policy has been influenced –
- Government received 16 responses to the consultation. Since then, Government has considered all the feedback received as part of its policy formation. The responses to the consultation are summarised below and Government has stated its position in relation to the questions posed in the consultation.
One consultation exercise stands out for being a model of good practice - the Employment Forum’s Report and Recommendations for the minimum wage from 1 January 2024. This quoted from consultation responses so it was clear which organisation expressed a particular view. For example –
The majority of employees in the agriculture sector are paid above the minimum wage (the total wage bill is estimated by the sector at £15 million a year). The Jersey Farmers’ Union (JFU) suggested to the Forum that the minimum wage as a concept is becoming increasingly irrelevant.
and
Unite the Union told the Forum that some employers had found it difficult to achieve pay increases close to an RPI figure of (then) 12.7%, so the union, in attempting to try to negotiate a balanced package for its members, had focused on increasing the value of contractual annual leave and other employment benefits as well.
Other consultations which had good feedback and indicated how policy had been influenced were those on sport in Jersey, competition law and speed limits.
Conclusions and recommendations
There is anecdotal evidence that there are fundamental concerns with the consultation process which centre around two key points –
- Consultations are not always meaningful.
- Little notice is taken of points made in the consultation, and there is often no meaningful analysis of consultation responses or indication of how the consultation exercise has influenced the outcome.
The quality of consultation exercises will be enhanced primarily by meaningful engagement with relevant organisations or people, who will be different for each consultation.
There is also a need for administrative improvements, which should be built into the Code of Practice -
- Where an online questionnaire is the preferred method of response then the questions should also be published in the consultation document and the option given to make a standalone response.
- Respondents should be asked to give basic information (e.g. age, ethnic origin, sex, employment status, income band) so that an assessment can be made of the representativeness of respondents and results analysed by category of respondent.
- Where substantive responses are sought, contact details should be requested and there should also be a statement that responses will be deemed to be public unless the respondent requests otherwise and gives the reason for so doing.
- Questionnaires should follow a standard format devised by the Chief Statistician, should ensure that key issues are covered and that questions are not slanted, and should be cleared in advance by the Chief Statistician. The Code of Practice for Statistics includes the following: “Methods and processes should be based on international good practice, scientific principles, and established professional consensus”. As surveys produce statistics this should apply to surveys.
- An analysis of responses should never be a matter of counting votes. It is appropriate to mention, and to include extracts from, individual responses either where the respondent has a relevant interest or where they have made a particularly significant point.
In addition, someone in Government should have responsibility for reviewing all consultation exercises on a regular basis so as to ensure that the Code of Practice and Engagement Framework are being followed. Guernsey’s experience of having a dedicated consultation and survey support officer should usefully be examined.
It is probably the case that the current deficiencies are not because of decisions not to follow good practice but rather because of a lack of awareness of what good practice is. This points to the need for training and a central point for advice.