
Food and You 2: Technical Report: Appendices

Appendices for Food and You 2 technical report.

List of appendices

The online questionnaire for each wave is provided as a separate PDF alongside each key findings report, and a table of methodological differences between waves is presented below.

For each wave, the following documentation will be uploaded to the UK Data Archive:

  • Food and You 2 online questionnaire
  • Food and You 2 postal questionnaires
  • Food and You 2 invitation and reminder letters:
      • Invitation letters (sent to main sample addresses)
      • First reminder letters (sent to main sample addresses)
      • Second reminder letters (sent to main sample addresses only)
      • Final reminder letters (sent to main sample addresses only)
  • Food and You 2 full SPSS data
  • Food and You 2 SPSS user guide
  • Food and You 2 full data tables (and user guide) for England, Wales and Northern Ireland combined 
  • Food and You 2 individual country data tables (and user guide) for England, Wales and Northern Ireland

Methodological differences 

The largest issued sample size was in Wave 1 (21,053 addresses); this dropped to 13,922 in Wave 2 and rose slightly to 14,115 for Waves 3 and 4. For Wave 5, the inclusion of the reserve addresses meant that the sample was larger than in Waves 2-4, at 16,115 addresses. The issued sample size for Waves 6 and 7 was increased to 14,500 to minimise the need to issue the reserve sample. As up to two adults per household can participate, the number of individual returns is always higher than the number of participating households. The number of individual returns, household returns and issued addresses for each wave can be found in the accompanying tables.

Table 2: Differences between Waves 1 to 7 in postal questionnaire versions, fieldwork period and final mailing coverage

  • Wave 1: two versions (Version A and Version B). Fieldwork: 29 July to 6 October 2020 (about 10 weeks). Final mailing sent to 100% of non-responding households in Wales and Northern Ireland, and to 50% of non-responding households in England.
  • Wave 2: two versions (‘Eating Out’ and ‘Eating at Home’). Fieldwork: 20 November 2020 to 21 January 2021 (about 9 weeks). Final mailing sent to 100% of non-responding households across the sample.
  • Wave 3: two versions (one in Northern Ireland and one in England and Wales). Fieldwork: 28 April to 25 June 2021 (about 8 weeks). Final mailing sent to 66.6% of non-responding households across the sample.
  • Wave 4: two versions (‘Eating Out’ and ‘Eating at Home’). Fieldwork: 18 October 2021 to 10 January 2022 (about 12 weeks). Final mailing sent to 100% of non-responding households across the sample.
  • Wave 5: two versions (Version A and Version B). Fieldwork: 26 April to 24 July 2022 (about 13 weeks). Final mailing sent to 100% of non-responding households across the main sample; 0% of households from the reserve sample.
  • Wave 6: two versions (‘Eating Out’ and ‘Eating at Home’). Fieldwork: 12 October 2022 to 10 January 2023 (about 13 weeks). Final mailing sent to 100% of non-responding households across the sample.
  • Wave 7: two versions (one in Northern Ireland and one in England and Wales). Fieldwork: 28 April to 10 July 2023 (about 10 weeks). Final mailing sent to 100% of non-responding households across the sample.

Questionnaire development

In Wave 1, a prolonged period of questionnaire development took place, involving an extensive review of questions from previous FSA surveys (Food and You and the Public Attitudes Tracker). Once all relevant questions had been compiled, a workshop was held with the Food and You 2 advisory group to discuss key priorities for the questionnaire. This was followed by a second workshop with key internal stakeholders to discuss their priorities and give Ipsos direction on questionnaire content.

Following this, draft questionnaire modules were compiled based on questions from previous FSA surveys. Numerous alterations were made to the wording, ordering, format and content of the questions in line with survey design best practice, and additional questions were designed to meet stakeholder needs. The questionnaire development stages for subsequent waves were much shorter, as the core questions and materials had been developed in Wave 1.

Cognitive testing

Ahead of Waves 1-4 and 6, cognitive testing was conducted to examine participant comprehension of new or potentially challenging questions. Participants for cognitive testing were recruited from Ipsos MORI’s iOmnibus recontact database and, from Wave 2 onwards, via an external Ipsos-approved supplier. No cognitive interviews were conducted in waves with minimal new content.

The number of cognitive interviews conducted each wave can be found in the accompanying tables.

Usability testing

Prior to Wave 1 fieldwork, usability testing was also undertaken to identify where improvements could be made to the form and format of the online survey questions across commonly used devices (for example, mobile phone, tablet and computer). Interviews were conducted over online video-conferencing software, with interviewers observing participants’ journeys through the online questionnaire (using screen-sharing technology) and asking questions where relevant. Eleven interviews were undertaken at this stage. These helped identify formatting and layout issues with the online questionnaire, which were amended ahead of the pilot survey. Usability testing was not conducted again for subsequent waves, as the online questionnaire took the same format as the Wave 1 questionnaire.

Pilot

Prior to the main stage fieldwork for Wave 1, a pilot of the full questionnaire was conducted to measure how long participants took to complete the questionnaire and each individual module within it. The questionnaire was tested over four days with 390 members of Ipsos MORI’s online access panel. Completion took participants 26 minutes and 48 seconds on average, so no alterations to the length of the questionnaire were judged necessary for it to fall within the desired 30 minutes. Pilots were not conducted in subsequent waves, as the expected completion time could be estimated from earlier waves.

Differences in the questionnaire

Due to the modular design of Food and You 2, some questions (core modules) are asked in every wave, whereas other questions are only included in certain waves. For some questions, the base will vary between waves due to changes in the questions available for filtering and/or their inclusion in the postal questionnaire. Please see the Wave 6 Tables User Guide for details.

The table below notes which modules were present in each wave of the survey, though note the content of each module varied somewhat between waves, as outlined above. 

Table 3: Questionnaire module content of each survey wave

  • About you and your household: Waves 1-7
  • Food Concerns (core): Waves 1-7
  • Food you can trust (core): Waves 1-7
  • Household Food Security (core): Waves 1-7
  • Eating at Home (core questions): Waves 2, 4, 6
  • Eating at Home (full module): Waves 1, 5
  • Food Shopping: Waves 1, 3, 5, 7
  • Defra Questions: Waves 1, 3, 5, 7
  • Eating Out: Waves 2, 4, 6
  • Online Food Platforms: Waves 3, 5, 7
  • Food Hypersensitivities (core questions): Waves 1, 3-7
  • Food Hypersensitivities (full module): Waves 2, 6
  • Healthy Eating (Northern Ireland only): Waves 3, 7
  • Emerging Issues: Wave 4

Differences in fieldwork 

Fieldwork dates

The Food and You 2 survey is intended to take place every six months; however, the length of the initial questionnaire development led to a later start in its first year. Fieldwork length has varied between 9 and 13 weeks across waves. Fieldwork dates can be found in the accompanying tables.

Sample sizes

Just over 21,000 addresses were issued in Wave 1, leading to 9,319 individual returns. Since this was much higher than the target of 5,600 individual returns, only 13,922 addresses were issued in Wave 2, and 14,115 in Waves 3 and 4. For Wave 5, 16,115 addresses were issued (14,115 from the main sample and 2,000 from the reserve sample). For Waves 6 and 7, the issued sample size was increased to 14,500 to minimise the need to issue the reserve sample.

Vouchers 

As an experiment, each adult who completed the questionnaire in Wave 1 received either a £15 online voucher, a £10 online or paper voucher, or a £5 online or paper voucher. Based on the results, respondents in subsequent waves received only the £10 voucher. The experiment and its results were summarised in an article published on the Social Research Association (SRA) website, Volume 11, Summer 2021.

Postal questionnaires

For the majority of waves, when postal questionnaires were sent out, the version was assigned to person one and person two in the household on a quasi-random basis: half contained questions from one module and the rest contained questions from another. However, in Waves 3 and 7, one of the modules was only relevant to residents of Northern Ireland, so the content of the postal questionnaires varied by country rather than randomly.

Reminders

In Wave 1, the final reminder was sent to all outstanding non-responding households in Wales and Northern Ireland, and to 50% of those in England. In Wave 3, the response rate was high enough after Mailing 3 for the final reminder to be sent to just two-thirds of the non-responding sample. In Wave 5, main sample cases received up to three reminders whereas reserve sample cases received up to two; the final reminder was not issued to reserve sample addresses due to an increase in response. In all other waves, all non-responding households received a final reminder.

Topline checks

For all waves, topline checks were carried out within the first two weeks of fieldwork, once sufficient responses to the online survey had been received in each country.

Results of the topline checks and refinements to the data checking syntax revealed an inconsistency for HSVOUCH (receipt of Healthy Start vouchers) in Waves 1-4 and Wave 6 (HSVOUCH was not asked in Wave 5). HSVOUCH should be asked of participants who were pregnant or had a child aged 0-4 in the household. Checks showed that the online script was treating those who answered ‘Prefer not to say’ at CHILDAGE as having at least one child aged 0 and routing them to HSVOUCH. This meant more participants were asked HSVOUCH than intended.

Numbers affected:

  • W1 – 55
  • W2 – 33
  • W3 – 51
  • W4 – 42
  • W5 – n/a
  • W6 – 17

However, the Wave 1 and 2 data and tables had been corrected before publication; corrections were therefore only needed for the tables in Waves 3 and 4.

Changing the routing during fieldwork would have introduced inconsistencies within the Wave 6 data. Therefore, this was addressed by:
(a)    applying a back edit to HSVOUCH at the data processing stage of the Wave 6 data
(b)    changing the routing of this question if it is used in future waves.
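The corrected routing condition can be sketched as follows. This is a minimal illustration in Python: the question names HSVOUCH and CHILDAGE come from the report, but the data representation, response coding and function are hypothetical.

```python
PREFER_NOT_TO_SAY = "Prefer not to say"

def should_route_to_hsvouch(pregnant, child_ages):
    """Return True only for participants who should be asked HSVOUCH:
    those who are pregnant or have a child aged 0-4 in the household.

    The original online script treated a 'Prefer not to say' answer at
    CHILDAGE as a child aged 0, over-routing participants to HSVOUCH;
    here such answers are excluded before the age check.
    """
    known_ages = [age for age in child_ages if age != PREFER_NOT_TO_SAY]
    return pregnant or any(0 <= age <= 4 for age in known_ages)
```

Under this logic, a participant whose only CHILDAGE response is ‘Prefer not to say’ is no longer routed to HSVOUCH.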
No errors were found in the topline checks for Wave 7.

Differences in weighting

Overall, the same weighting approach was taken in all waves of the survey. However, in each wave, some additional weights are needed for those questions that are not asked of all postal respondents. These additional weights will vary between waves depending on which questions are included.

From Wave 4 onwards there are “Welsh and Welsh-England” weights, which allow Welsh respondents to be compared easily against an English population calibrated to have similar demographic characteristics. In Waves 1 to 3, the weights for the calibrated English population were called “Welsh England” weights, and the corresponding weights for Welsh respondents were formerly part of the individual country-level weights.

Differences in data validation and management

In Waves 1 and 2, the tables were created from the underlying data independently of the SPSS dataset. From Wave 3 onwards, syntax produced the derived variables in SPSS, and this was used to produce the tables in Quantum. As part of this change, the data validation procedures were reviewed, and the following improvements made: 

  • In all waves, back editing and forward editing were applied to inconsistencies in the postal data, with a smaller amount of back editing applied from Wave 3 onwards. Back editing meant that if a filtered question was answered but the filter origin question contradicted that answer (blank or different), the origin question was changed to the answer that routes to the filtered question. Forward editing meant that if a participant answered a question but did not follow the routing to answer the next filtered question, they were assigned a code of -99 “Not stated” for that question.
  • In Waves 1 and 2, if a question was incorrectly answered as a multi-code question when only one answer should have been selected, a digit from the participant ID was used to randomly select one of the answers given. From Wave 3 onwards, such responses were instead set to -99 “Not stated”.
  • From Wave 3 onwards, an edit was introduced to correct the number of adults when participants from a multiple response household answered that only one adult lived in that household.
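The back and forward editing rules above can be sketched as follows. This is an illustrative Python sketch: the record layout, question codes and routing answer are hypothetical, while the -99 “Not stated” code follows the report.

```python
NOT_STATED = -99  # "Not stated" code used in the published data

def back_edit(record, origin_q, filtered_q, routing_answer):
    """Back editing: the filtered question was answered, but the filter
    origin question is blank or contradicts that answer, so the origin
    question is changed to the answer that routes to the filtered one."""
    answered = record.get(filtered_q) not in (None, NOT_STATED)
    if answered and record.get(origin_q) != routing_answer:
        record[origin_q] = routing_answer
    return record

def forward_edit(record, origin_q, filtered_q, routing_answer):
    """Forward editing: the participant gave the routing answer at the
    origin question but skipped the filtered question, so the filtered
    question is set to -99 'Not stated'."""
    if record.get(origin_q) == routing_answer and record.get(filtered_q) is None:
        record[filtered_q] = NOT_STATED
    return record
```

For example, a record answering a filtered frequency question while leaving the origin yes/no question blank would have the origin set to the routing answer by the back edit; a record answering the origin question but skipping the follow-up would have the follow-up coded -99 by the forward edit.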

Ipsos standards and accreditations

Ipsos’s standards and accreditations provide our clients with the peace of mind that they can always depend on us to deliver reliable, sustainable findings. Moreover, our focus on quality and continuous improvement means we have embedded a ‘right first time’ approach throughout our organisation.

Market Research ISO 20252 Certificate

This is the international market research specific standard that supersedes BS 7911/MRQSA and incorporates IQCS (Interviewer Quality Control Scheme). It covers the five stages of a Market Research project. Ipsos UK was the first company in the world to gain this accreditation. 

Information Security ISO 27001 Certificate

This is the international standard for information security designed to ensure the selection of adequate and proportionate security controls. Ipsos UK was the first research company in the UK to be awarded this in August 2008.

Company Quality ISO 9001 Certificate

This is the international general company standard with a focus on continual improvement through quality management systems. In 1994, we became one of the early adopters of the ISO 9001 business standard.

Market Research Society (MRS) Company Partners Certificate

As an MRS Company Partner, Ipsos endorses and supports the core MRS brand values of professionalism, research excellence and business effectiveness, and commits to comply with the MRS Code of Conduct throughout the organisation. Ipsos was the first company to sign up to the requirements and self-regulation of the MRS Code; more than 350 companies have followed.

The UK General Data Protection Regulation (UK GDPR) and the UK Data Protection Act 2018 (DPA)

Ipsos UK is required to comply with the UK General Data Protection Regulation and the UK Data Protection Act 2018; these cover the processing of personal data and the protection of privacy.

Cyber Essentials Certificate

Cyber Essentials is a government-backed scheme and a key deliverable of the UK’s National Cyber Security Programme. Ipsos UK was first assessed and validated for certification in 2016, and the certification is renewed annually. Cyber Essentials defines a set of controls which, when properly implemented, provide organisations with basic protection from the most prevalent forms of internet-borne threat.

Fair Data 

Ipsos UK is signed up as a ‘Fair Data’ Company by agreeing to adhere to ten core principles. The principles support and complement other standards such as ISOs, and the requirements of Data Protection legislation.

 
