Children, Schools and Families Committee Inquiry into the DCSF-commissioned review of elective home education
Memorandum by Professor Bruce Stafford
Summary
This memorandum considers the conduct of the Review of Elective Home Education in England (hereafter the Review) by applying three criteria:
Impartiality
• The Review displays some impartiality. However, the membership of the Review team did not reflect the range of expertise needed; the questionnaires used to collect data are poorly designed; the tentative nature of the estimates of home educated children ‘known to social care’ is not highlighted; and survey findings and other associated documentation should have been reported in more detail or published alongside the Review.
Honesty
• The published Review includes three instances of highly selective quoting that do not provide a full and fair representation of the evidence submitted.
Objectivity
• The objectivity of the Review is compromised by the extent to which it lacks impartiality and honesty. As a consequence the Review fails to make a strong case for its recommendations.
1 Introduction
1.1 How the Review was conducted is important because any shortcomings are inevitably reflected in its recommendations.
1.2 This submission uses three criteria to assess the conduct of the Review.
1.3 In drafting the memorandum the author draws upon nearly 30 years’ experience of applied public and social policy research and nearly three years of sharing responsibility for home educating the youngest of his four children.
2 Conduct of the Review of Elective Home Education
2.1 There appear to be few mechanisms for scrutinising the conduct of an ‘independent’ Review; for instance, it falls outside the remit of the Parliamentary and Health Service Ombudsman and the UK Statistics Authority. However, there are several public service codes of conduct (for the Civil Service, official statistics and public life) that could be used to derive criteria to assess the conduct of the Review. The proposed criteria to be applied to the Review are:
• Impartial – whether it presents the full range of argument and evidence on the subject, including counterexamples.
• Honest – whether it gives a full account of people’s views and experiences.
• Objective – whether the review weighs the arguments and evidence in presenting its case and recommendations. This is not possible unless it has been impartial and honest.
3 Criterion 1: Impartiality
3.1 The Review displays some impartiality. It does acknowledge, for example, the ‘passion and commitment’ of many home educating parents (para 1.2), that there is ‘exemplary practice’ in home education (para 3.1), and that ‘… local authorities were much criticised by home educators …, for their perceived lack of understanding.’ (para 5.1). However, it devotes considerably more space to views that raise concerns about child safety than to the strengths and benefits of home education or the case for no change.
3.2 Specific issues that raise doubts about the Review’s impartiality are discussed below.
3.2.1 The membership of the Review’s Expert Reference Group did not represent the range of expertise needed. No home educating parents, representatives of organisations directly representing home educators, or academics who had conducted research on home education were included. There was an over-representation of experts with knowledge of early years.
3.2.2 Relevant annexes/working papers and the literature review were not initially published. Some of the shortcomings with the data underpinning the Review have been revealed through a series of Freedom of Information requests. A more impartial report would have included a copy of a second questionnaire sent to local authorities and of a statistical annex/working paper.
3.2.3 The Review process included the administration of an on-line public questionnaire and two questionnaires to local authorities. However, the surveys are not sufficiently robust, the tentative nature of estimates is not made explicit and the findings were not reported in full in the published report.
Data collection
3.2.4 The on-line public questionnaire used to gather the views of home educators and others was badly designed, involving leading and poorly constructed questions. For example, question 3 is a ‘double-barrelled’ question: it asks for views about the role of Government and local authorities in achieving the five Every Child Matters outcomes, but a respondent might believe only one of these two bodies should have any obligation, or that, say, councils should have a duty for some but not all five outcomes. Two separate questions are required, each allowing responses for every child outcome.
3.2.5 A supplementary questionnaire was administered to the 90 local authorities that replied to the Review’s original questionnaire (which had been sent to 150 councils), but it was not included or even mentioned in the report. This questionnaire appears to address a serious shortcoming of the initial questionnaire by seeking to collect statistical data on elective home education and safeguarding concerns. It is important because, together with data from the first local authority questionnaire, it seems to be the only statistical basis for subsequent claims about home education and possible child abuse (but see below). However, only 25 local authorities responded to this second questionnaire.
3.2.6 This questionnaire is also poorly designed. The first question asks what number and proportion of the local authority’s current elective home education caseload is ‘known to social care’. The data sought include both open and closed cases of ‘known to social care’. But the data will not be comparable because of differences in the composition of the elective home education caseload in each local area. The questionnaire ought to have collected more data on the make-up of the population so that it could be weighted to make it comparable between local authorities.
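To illustrate the kind of adjustment that would be needed, the minimal sketch below (in Python) applies direct standardisation to two entirely hypothetical authorities: each authority’s subgroup rates are weighted by a common caseload mix before the headline percentages are compared. All of the subgroups and figures are invented for the purpose of the example; none are drawn from the Review’s data.

    # Hypothetical illustration of direct standardisation. The crude percentages
    # differ partly because the two invented authorities have different caseload
    # mixes; weighting by a common mix removes that source of difference.
    # All figures below are invented; none come from the Review's data.

    standard_mix = {"primary": 0.4, "secondary": 0.6}   # assumed common caseload composition

    # (caseload, number 'known to social care') by subgroup - invented numbers
    authority_a = {"primary": (80, 2), "secondary": (20, 2)}
    authority_b = {"primary": (20, 1), "secondary": (80, 8)}

    def crude_rate(authority):
        caseload = sum(n for n, _ in authority.values())
        known = sum(k for _, k in authority.values())
        return known / caseload

    def standardised_rate(authority):
        return sum(standard_mix[group] * (k / n) for group, (n, k) in authority.items())

    for name, authority in (("A", authority_a), ("B", authority_b)):
        print(name, round(crude_rate(authority), 3), round(standardised_rate(authority), 3))
        # crude rates: 0.04 and 0.09; standardised rates: 0.07 and 0.08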
3.2.7 The inclusion of closed cases also means that the data may include some inappropriate referrals to social care. For instance, the questionnaire collected data on Section 47 cases, which include child protection enquiries but also referrals to social care irrespective of whether or not child abuse is subsequently established. It is likely that home educators are, wrongly, over-represented amongst Section 47 referrals because (concerned) third parties, unaware of the legal right to educate at home, mistakenly contact social services. The Section 47 figures therefore over-estimate the number of ‘at risk’ cases amongst the home educating community. These difficulties with interpreting ‘known to social care’ statistics are not mentioned in either the main report or the working paper.
3.2.8 The second question asked: ‘What proportion of your current caseload do you estimate have safeguarding implications?’ However, as the Departmental working paper acknowledges, the data on safeguarding concerns are ‘less reliable’ than the ‘known to social care’ data. Local authorities may well have interpreted ‘safeguarding implications’ very differently. Moreover, the extent to which cases reported under ‘known to social care’ (Question 1) were also reported under ‘safeguarding implications’ (Question 2) is unknown, and hence the two sets of data cannot be meaningfully combined.
3.2.9 None of the questionnaires asked for comparable data for children educated at school which severely limits the usefulness of the data. For example, 95 per cent of respondents to the on-line public questionnaire thought that home educated children are able to achieve the ‘be healthy’ Every Child Matters outcome, but without a comparable figure for school educated children this finding is difficult to interpret – is this a ‘good’ or a ‘poor’ finding relative to other children?
Estimates
3.2.10 The Review document states at paragraph 8.12 that ‘….the number of children known to children’s social care in some local authorities is disproportionately high relative to the size of their home educating population…’. This is the only substantive statistical claim in the report suggesting that there is a policy ‘problem’ to be addressed. The actual estimates of the proportion and number of home educated children known to social care are (again) not presented in the published report.
3.2.11 An initial Freedom of Information request for the data behind paragraph 8.12 produced an extract, headed Annex, from the working paper mentioned above. The implication was that the claim in the report was based on data from the two local authority surveys: the first to estimate the size of the home educated population and the second the number ‘known to social care’. However, the Freedom of Information release of the working paper (mentioned above) includes an explanatory note claiming that paragraph 8.12 of the Report ‘… is based on the raw data returns from LAs, rather than directly from the information contained in this working paper.’ This is despite the fact that an extract from the working paper was originally released to explain the claim made in paragraph 8.12. At the time of writing this memorandum the statistical basis for paragraph 8.12 remains unclear: what ‘raw data’ over and above that reported in the annex/working paper was used? The credibility and robustness of the assertion made in paragraph 8.12 are therefore open to question.
3.2.12 There are also concerns about the estimation method used that undermine the validity and accuracy of the claim made in the report. The tentative nature of the ‘known to social care’ estimate ought to have been highlighted in the report and in the annex/working paper.
3.2.13 The annex/working paper’s estimated number of registered home educated children known to social care (‘around 1350’) appears to be obtained by applying the median (or mid-point) percentage (6.75 per cent) for the 25 local authorities to the estimated total number of registered home educated children. However, there are a number of problems with this calculation.
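Before turning to those problems, the arithmetic itself is worth setting out. The minimal sketch below (in Python) reconstructs what appears to be the calculation, using only the figures quoted in this memorandum; the variable names are the author’s own, not the Department’s.

    # A sketch of the calculation the annex/working paper appears to use:
    # apply the sub-sample's median percentage to the estimated national total
    # of registered home educated children. Figures are as quoted in this memorandum.

    estimated_national_total = 20000      # the Review's rounded estimate (see 3.2.14 below)
    median_known_to_social_care = 0.0675  # 6.75 per cent, from the 25-authority sub-sample

    estimate = estimated_national_total * median_known_to_social_care
    print(round(estimate))                # 1350 - the 'around 1350' figure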
3.2.14 Firstly, the estimate assumes a median of 139 registered home educated children per local authority, giving 20,850 (139 × 150) nationally. The Freedom of Information response rounds this up to 21,000 whilst the Review report rounds it down to 20,000. The median value of 139 comes from the survey of 90 local authorities. Whether it is a typical value for all local authorities is unknown, because the representativeness of the 90 councils is unknown. No attempt appears to have been made to correct for sample response bias.
3.2.15 Secondly, the median value (6.75 per cent) is taken from the sub-sample of 25 local authorities. The representativeness of these 25 local authorities (17 per cent of all English local authorities) is also unknown, although the estimated mean number of children ‘known to social care’ for these authorities (19, compared to 9 for the country as a whole) suggests that they are atypical and have an above-average number of cases. For such an unrepresentative sample, even the median percentage is likely to overstate the proportion that should be used in national estimates. Thus grossing-up using 6.75 per cent is likely to produce an invalid over-estimate of the number of home educated children ‘known to social care’.
3.2.16 Both the Review report and the annex/working paper ought to have included a range of estimates of the proportion of home educated children known to social care.
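To indicate how much such a range could matter: with a base of roughly 20,000 registered home educated children, each percentage point corresponds to about 200 children. The short sketch below (in Python) is purely illustrative; only the 6.75 per cent figure comes from the Review’s sub-sample, and the other proportions are hypothetical values chosen simply to show the spread.

    # Illustrative only: how the national estimate moves with the assumed proportion
    # 'known to social care'. Only 6.75 per cent comes from the Review's sub-sample;
    # the other proportions are hypothetical.

    estimated_population = 20000   # the Review's rounded national figure

    for proportion in (0.03, 0.05, 0.0675, 0.09):
        print(f"{proportion:.2%} of {estimated_population} = {round(estimated_population * proportion)}")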
Reporting
3.2.17 The Review report lacks impartiality in failing to summarise the results of the surveys. For example, the results of the on-line public questionnaire are only cursorily summarised in paragraph 4.2 of the published report. Paragraph 4.2 gives an indication of the range of responses, but not the magnitude of opposition to further reform, because it fails to quote percentages from the survey. Headline findings include:
• 80 per cent think the current system for safeguarding children who are educated at home is adequate;
• between 91 per cent and 95 per cent of respondents believe that home educated children are able to achieve the five Every Child Matters outcomes; and
• 64 per cent think there should not be any changes made to the current system for monitoring home educating families.
3.2.18 A breakdown of these findings by client group suggests that the overwhelming majority of home educating parents and children believed that there was no need for policy reform and that the Every Child Matters outcomes were being met. The on-line survey suggests that only local authorities believed that legislative change was required. That there may be an element of ‘rent-seeking behaviour’ motivating these responses is not considered in the Review.
4 Criterion 2: Honesty
4.1 The published Review includes three instances of highly selective quoting that do not provide a full and fair representation of the evidence submitted. Firstly, the report contains a quote from a home educator that is less than complimentary about local authority staff:
‘… no one from the LA [local authority] would in my opinion be on my child’s intellectual level or they wouldn’t be working for the LA.’ (para 4.3)
Leaving aside the questionable motives for the inclusion of this quote, the Report fails to give the apparent context to the observation:
‘It was in response to a question about whether a scientifically gifted child would benefit from having a science teacher from the LA come and give them tuition. It was to point out that scientists at the top of their profession are rarely working for the LA, so anyone sent out would not be on the same intellectual level as the scientifically gifted child.’
4.2 Secondly, the Review selectively quotes from a submission from the Education Division of the Church of England. The report includes a fairly lengthy extract that expresses their concerns about home education. However, the Review does not quote the Church’s overall conclusion:
‘10 We have seen no evidence to show that the majority of home educated children do not achieve the five Every Child Matters outcomes, and are therefore not convinced of the need to change the current system of monitoring the standard of home education. Where there are particular concerns about the children in a (sic) home-educating this should be a matter for Children’s Services.’
The report omits the Church’s view that they are not convinced of the need for further reform, yet it does quote their concerns. An honest account would have included both their concerns and their reservations about the need for further reform, as this would have ‘set out the facts and relevant issues truthfully’.
4.3 Although the Church gave permission for the quote to be included in the report, an email from a Church representative says that at the time they were unaware of the report’s content and are now ‘not comfortable’ with the selective use of their evidence.
4.4 Thirdly, there is selective quoting from the UN Convention on the Rights of the Child. The Report quotes paragraph 1 of Article 12 which requires Governments to ‘… assure to the child who is capable of forming his or her own views the right to express those views freely in all matters affecting the child, the views of the child being given due weight in accordance with the age and maturity of the child.’ This is then used to help justify giving local authorities a right of access to determine the child’s views without the parent(s) being present.
4.5 However, the Report does not quote paragraph 2 which would imply that the child’s views could be presented by someone other than a local authority representative:
‘For this purpose, the child shall in particular be provided the opportunity to be heard in any judicial and administrative proceedings affecting the child, either directly, or through a representative or an appropriate body, in a manner consistent with the procedural rules of national law.’
Yet the Review makes no mention of this possibility. (The Report also fails to address the situation where the child refuses to see the local authority’s representative, a right the child should have under Articles 15 and 16.)
4.6 Whilst the use of quotations is never ‘neutral’ (they serve to highlight certain views merely by their inclusion), the way in which they are used in the Review’s report arguably does not meet the standards a ‘reasonable’ person might expect in terms of providing an overview of individuals’ and organisations’ points of view. Moreover, that this has occurred with at least three pieces of evidence cannot simply be dismissed as accidental; rather, it appears to be a systematic attempt to present evidence selectively.
5 Criterion 3: Objectivity
5.1 The objectivity of the Review is compromised by the extent to which it lacks impartiality and honesty. As a consequence, the Review fails to make a strong case for its recommendations. Little of the argument is supported by evidence. Where evidence is presented, there is an absence of critical analysis. This might help to explain why the published report is long on assertion and short on evidence and argument, with the author simply stating ‘I believe’ 16 times.
5.2 If policy in this area is to be based on ‘what works’, a more objective base for policy formation is required than is provided by the Review.