Marketing Research: An Applied Approach, Fifth Edition
Naresh K. Malhotra • Daniel Nunan • David F. Birks

At Pearson, we have a simple mission: to help people make more of their lives through learning. We combine innovative learning technology with trusted content and educational expertise to provide engaging and effective learning experiences that serve people wherever and whenever they are learning. From classroom to boardroom, our curriculum materials, digital learning tools and testing programmes help to educate millions of people worldwide – more than any other private enterprise. Every day our work helps learning flourish, and wherever learning flourishes, so do people. To learn more, please visit us at www.pearson.com/uk.

Harlow, England • London • New York • Boston • San Francisco • Toronto • Sydney • Dubai • Singapore • Hong Kong • Tokyo • Seoul • Taipei • New Delhi • Cape Town • São Paulo • Mexico City • Madrid • Amsterdam • Munich • Paris • Milan

Pearson Education Limited
Edinburgh Gate, Harlow CM20 2JE, United Kingdom
Tel: +44 (0)1279 623623
Web: www.pearson.com/uk

Original 6th edition entitled Marketing Research: An Applied Orientation published by Prentice Hall Inc., a Pearson Education company. Copyright Prentice Hall Inc.

First edition published 2000 (print). Second edition published 2003 (print). Third edition published 2007 (print). Fourth edition published 2012 (print). Fifth edition published 2017 (print and electronic).

© Pearson Education Limited 2000, 2003, 2007, 2012 (print)
© Pearson Education Limited 2017 (print and electronic)

The rights of Naresh K. Malhotra, Daniel Nunan, David F. Birks and Peter Wills to be identified as authors of this work have been asserted by them in accordance with the Copyright, Designs and Patents
Act 1988.

The print publication is protected by copyright. Prior to any prohibited reproduction, storage in a retrieval system, distribution or transmission in any form or by any means, electronic, mechanical, recording or otherwise, permission should be obtained from the publisher or, where applicable, a licence permitting restricted copying in the United Kingdom should be obtained from the Copyright Licensing Agency Ltd, Barnard's Inn, 86 Fetter Lane, London EC4A 1EN.

The ePublication is protected by copyright and must not be copied, reproduced, transferred, distributed, leased, licensed or publicly performed or used in any way except as specifically permitted in writing by the publishers, as allowed under the terms and conditions under which it was purchased, or as strictly permitted by applicable copyright law. Any unauthorised distribution or use of this text may be a direct infringement of the authors' and the publisher's rights, and those responsible may be liable in law accordingly.

All trademarks used herein are the property of their respective owners. The use of any trademark in this text does not vest in the author or publisher any trademark ownership rights in such trademarks, nor does the use of such trademarks imply any affiliation with or endorsement of this book by such owners. Pearson Education is not responsible for the content of third-party internet sites.

ISBN: 978-1-292-10312-9 (print), 978-1-292-10315-0 (PDF), 978-1-292-21132-9 (ePub)

British Library Cataloguing-in-Publication Data: a catalogue record for the print edition is available from the British Library.

Library of Congress Cataloging-in-Publication Data
Names: Malhotra, Naresh K., author | Nunan, Daniel, author | Birks, David F., author
Title: Marketing research: an applied approach / Naresh K. Malhotra, Daniel Nunan, David F. Birks
Description: Fifth edition | New York: Pearson, [2017] | Revised edition of Marketing research, 2012 | Includes bibliographical references and index
Identifiers: LCCN
2017007654 | ISBN 9781292103129
Subjects: LCSH: Marketing research | Marketing research—Methodology
Classification: LCC HF5415.2 M29 2017 | DDC 658.8/3—dc23
LC record available at https://lccn.loc.gov/2017007654

Print edition typeset in 10/12 pt Times LT Pro by Aptara. Printed in Slovakia by Neografia.

NOTE THAT ANY PAGE CROSS REFERENCES REFER TO THE PRINT EDITION

Brief contents

Preface
Publisher's acknowledgements
About the authors
1 Introduction to marketing research
2 Defining the marketing research problem and developing a research approach
3 Research design
4 Secondary data collection and analysis
5 Internal secondary data and analytics
6 Qualitative research: its nature and approaches
7 Qualitative research: focus group discussions
8 Qualitative research: in-depth interviewing and projective techniques
9 Qualitative research: data analysis
10 Survey and quantitative observation techniques
11 Causal research design: experimentation
12 Measurement and scaling: fundamentals, comparative and non-comparative scaling
13 Questionnaire design
14 Sampling: design and procedures
15 Sampling: determining sample size
16 Survey fieldwork
17 Social media research
18 Mobile research
19 Data integrity
20 Frequency distribution, cross-tabulation and hypothesis testing
21 Analysis of variance and covariance
22 Correlation and regression
23 Discriminant and logit analysis
24 Factor analysis
25 Cluster analysis
26 Multidimensional scaling and conjoint analysis
27 Structural equation modelling and path analysis
28 Communicating research findings
29 Business-to-business (b2b) marketing research
30 Research ethics
Glossary
Subject index
Name index
Company index

Contents

Preface
Publisher's acknowledgements
About the authors

1 Introduction to marketing
research
Objectives; Overview; What does 'marketing research' mean?; A brief history of marketing research; Definition of marketing research; The marketing research process; A classification of marketing research; The global marketing research industry; Justifying the investment in marketing research; The future – addressing the marketing research skills gap; Summary; Questions; Exercises; Notes

2 Defining the marketing research problem and developing a research approach
Objectives; Overview; Importance of defining the problem; The marketing research brief; Components of the marketing research brief; The marketing research proposal; The process of defining the problem and developing a research approach; Environmental context of the problem; Discussions with decision makers; Interviews with industry experts; Initial secondary data analyses; Marketing decision problem and marketing research problem; Defining the marketing research problem; Components of the research approach; Objective/theoretical framework; Analytical model; Research questions; Hypothesis; Summary; Questions; Exercises; Notes

3 Research design
Objectives; Overview; Research design definition; Research design from the decision makers' perspective; Research design from the participants' perspective; Research design classification; Descriptive research; Causal research; Relationships between exploratory, descriptive and causal research; Potential sources of error in research designs; Summary; Questions; Exercises; Notes

4 Secondary data collection and analysis
Objectives; Overview; Defining primary data, secondary data and marketing intelligence; Advantages and uses of secondary data; Disadvantages of secondary data; Criteria for evaluating secondary data; Classification of secondary data; Published external secondary sources; Databases; Classification of online databases; Syndicated sources of secondary data;
Syndicated data from households; Syndicated data from institutions; Summary; Questions; Exercises; Notes

5 Internal secondary data and analytics
Objectives; Overview; Internal secondary data; Geodemographic data analyses; Customer relationship management; Big data; Web analytics; Linking different types of data; Summary; Questions; Exercises; Notes

6 Qualitative research: its nature and approaches
Objectives; Overview; Primary data: qualitative versus quantitative research; Rationale for using qualitative research; Philosophy and qualitative research; Ethnographic research; Grounded theory; Action research; Summary; Questions; Exercises; Notes

7 Qualitative research: focus group discussions
Objectives; Overview; Classifying qualitative research techniques; Focus group discussion; Planning and conducting focus groups; The moderator; Other variations of focus groups; Other types of qualitative group discussions; Misconceptions about focus groups; Online focus groups; Advantages of online focus groups; Disadvantages of online focus groups; Summary; Questions; Exercises; Notes

8 Qualitative research: in-depth interviewing and projective techniques
Objectives; Overview; In-depth interviews; Projective techniques; Comparison between qualitative techniques; Summary; Questions; Exercises; Notes

9 Qualitative research: data analysis
Objectives; Overview; The qualitative researcher; The process of qualitative data analysis; Grounded theory; Content analysis; Semiotics; Qualitative data analysis software; Summary; Questions; Exercises; Notes

10 Survey and quantitative observation techniques
Objectives; Overview; Survey methods; Online surveys; Telephone surveys; Face-to-face surveys; A comparative evaluation of survey methods;
Other survey methods; Mixed-mode surveys; Observation techniques; Observation techniques classified by mode of administration; A comparative evaluation of the observation techniques; Advantages and disadvantages of observation techniques; Summary; Questions; Exercises; Notes

11 Causal research design: experimentation
Objectives; Overview; Concept of causality; Conditions for causality; Definitions and concepts; Definition of symbols; Validity in experimentation; Extraneous variables; Controlling extraneous variables; A classification of experimental designs; Pre-experimental designs; True experimental designs; Quasi-experimental designs; Statistical designs; Laboratory versus field experiments; Experimental versus non-experimental designs; Application: test marketing; Summary; Questions; Exercises; Notes

12 Measurement and scaling: fundamentals, comparative and non-comparative scaling
Objectives; Overview; Measurement and scaling; Scale characteristics and levels of measurement; Primary scales of measurement; A comparison of scaling techniques; Comparative scaling techniques; Non-comparative scaling techniques; Itemised rating scales; Itemised rating scale decisions; Multi-item scales; Scale evaluation; Choosing a scaling technique; Mathematically derived scales; Summary; Questions; Exercises; Notes

13 Questionnaire design
Objectives; Overview; Questionnaire definition; Questionnaire design process; Specify the information needed; Specify the type of interviewing method; Determine the content of individual questions; Overcoming the participant's inability and unwillingness to answer; Choose question structure; Choose question wording; Arrange the questions in proper order; Identify the form and layout; Reproduce the questionnaire; Eliminate problems by
pilot-testing; Summarising the questionnaire design process; Designing surveys across cultures and countries; Summary; Questions; Exercises; Notes

14 Sampling: design and procedures
Objectives; Overview; Sample or census; The sampling design process; A classification of sampling techniques; Non-probability sampling techniques; Probability sampling techniques; Choosing non-probability versus probability sampling; Summary of sampling techniques; Issues in sampling across countries and cultures; Summary; Questions; Exercises; Notes

15 Sampling: determining sample size
Objectives; Overview; Definitions and symbols; The sampling distribution; Statistical approaches to determining sample size; The confidence interval approach; Multiple characteristics and parameters; Other probability sampling techniques; Adjusting the statistically determined sample size; Calculation of response rates; Non-response issues in sampling; Summary; Questions; Exercises; Appendix: The normal distribution; Notes

Chapter 13 Questionnaire design

Use positive and negative statements

Many questions, particularly those measuring attitudes and lifestyles, are presented as statements with which participants indicate their degree of agreement or disagreement. Evidence indicates that the response obtained is influenced by the directionality of the statements: whether they are stated positively or negatively. In these cases, it is better to use dual statements, some of which are positive and others negative. Two different questionnaires could be prepared. One questionnaire would contain half-negative and half-positive statements in an interspersed way. The direction of these statements would be reversed in the other questionnaire. (An example of dual statements was provided in the summated Likert scale in Chapter 12, designed to measure attitudes towards the Odeon cinema.)
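The dual-statement advice implies that negatively worded items must be reverse-coded before a summated Likert score is computed. A minimal sketch of that scoring logic (the cinema statements and the 1–5 scale are illustrative assumptions, not taken from the text):

```python
SCALE_MAX = 5  # 1 = strongly disagree ... 5 = strongly agree

# (statement, is_positive): True when agreement indicates a favourable attitude.
# Half the items are worded positively and half negatively, interspersed.
STATEMENTS = [
    ("The cinema offers good value for money", True),
    ("The seating at the cinema is uncomfortable", False),
    ("Cinema staff are friendly and helpful", True),
    ("Buying a ticket takes too long", False),
]

def item_score(response, is_positive):
    """Reverse-code negatively worded items: 5 becomes 1, 4 becomes 2, etc."""
    return response if is_positive else SCALE_MAX + 1 - response

def summated_score(responses):
    """Summated Likert score for one participant; a high score is favourable."""
    return sum(item_score(r, pos) for r, (_, pos) in zip(responses, STATEMENTS))

# A consistently favourable participant agrees (5) with positive items and
# disagrees (1) with negative ones, so every item contributes 5.
print(summated_score([5, 1, 5, 1]))  # 20
```

In the second questionnaire version the wording (and hence the `is_positive` flag) of each statement would be reversed, leaving the scoring logic unchanged.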
Arrange the questions in proper order

The order of questions is of equal importance to the wording used in the questions. As noted in the previous section, questions communicate and set participants in a particular frame of mind. This frame of mind is set at the start of the questioning process and can change as each question is posed and responded to. It affects how participants perceive individual questions and respond to those questions. As well as understanding the characteristics of language in target participants, questionnaire designers must be aware of the logical connections between questions, as perceived by target participants. The following issues help to determine the order of questions.

Opening questions

The opening questions can be crucial in gaining the confidence and cooperation of participants. These questions should be interesting, simple and non-threatening. Questions that ask participants for their opinions can be good opening questions, because most people like to express their opinions. Sometimes such questions are asked although they are unrelated to the research problem and their responses are not analysed.41 Though classification questions seem simple to start a questionnaire, issues such as age, gender and income can be seen as very sensitive. Opening a questionnaire with these questions tends to make participants concerned about the purpose of these questions and, indeed, the whole survey. They can also give the questionnaire the feel of an 'official form' to be completed (like a national census or a tax form), rather than a positive engagement and experience with a particular topic. However, in some instances it is necessary to qualify participants to determine whether they are eligible to participate in the interview. In this case the qualifying questions serve as the opening questions, and they may have to be classification questions such as the age of the participant.

Type of information

Classification information: socio-economic and
demographic characteristics used to classify participants. Identification information: a type of information obtained in a questionnaire that includes name, address and phone number.

The type of information obtained in a questionnaire may be classified as (1) basic information, (2) classification information and (3) identification information. Basic information relates directly to the research problem. Classification information, consisting of socio-economic and demographic characteristics, is used to classify the participants, understand the results and validate the sample (see Chapter 14). Identification information includes name, postal address, email address and telephone number. Identification information may be obtained for a variety of purposes, including verifying that the participants listed were actually interviewed and sending promised incentives or prizes. As a general guideline, basic information should be obtained first, followed by classification and finally identification information. The basic information is of greatest importance to the research project and should be obtained first, before risking alienation of the participants by asking a series of personal questions.

Difficult questions

Difficult questions, or questions that are sensitive, embarrassing, complex or dull, should be placed late in the sequence. After rapport has been established and the participants become involved, they are less likely to object to these questions. Thus, in the S:Comm Leisure Time study (see Chapter 12), where we focused upon cinema visits, information about the nature of film merchandise that has been purchased should be asked at the end of the section on basic information. Had participants perceived (incorrectly) that the survey was being used as a means to sell them merchandise, their trust in the survey and the nature of their subsequent responses could have been impaired. Likewise, income should be the last question
in the classification section (if it is to be used at all).

Effect on subsequent questions

Questions asked early in a sequence can influence the responses to subsequent questions. As a rule of thumb, general questions should precede specific questions. This prevents specific questions from biasing responses to the general questions. Consider the following sequence of questions:

Q1: What considerations are important to you in selecting a boutique?
Q2: In selecting a boutique, how important is convenience of its location?

Funnel approach: a strategy for ordering questions in a questionnaire in which the sequence starts with the general questions, which are followed by progressively specific questions, to prevent specific questions from biasing general questions.

Note that the first question is general whereas the second is specific. If these questions were asked in the reverse order, participants would be clued in about convenience of location and would be more likely to give this response to the general question. Going from general to specific is called the funnel approach. The funnel approach is particularly useful when information has to be obtained about participants' general choice behaviour and their evaluations of specific products.42 Sometimes the inverted funnel approach may be useful. In this approach, questioning starts with specific questions and concludes with the general questions. The participants are compelled to provide specific information before making general evaluations. This approach is useful when participants have no strong feelings or have not formulated a point of view.

Logical order

Branching question: a question used to guide an interviewer (or participant) through a survey by directing the interviewer (or participant) to different spots on the questionnaire depending on the answers given.

Questions should be asked in a logical order. This may seem a simple rule, but as the researcher takes time to understand participants and how they use language, the
researcher should also take time to understand their logic, i.e. what 'logical order' means to target participants. All questions that deal with a particular topic should be asked before beginning a new topic. When switching topics, brief transitional phrases should be used to help participants switch their train of thought.

Branching questions should be designed with attention to logic, making the questionnaire experience more relevant to individual participants.43 Branching questions direct participants to different places in the questionnaire based on how they respond to the question at hand. These questions ensure that all possible contingencies are covered. They also help reduce interviewer and participant error and encourage complete responses. Skip patterns based on the branching questions can become quite complex. A simple way to account for all contingencies is to prepare a flow chart of the logical possibilities and then develop branching questions and instructions based on it. A flow chart used to assess the use of electronic payments in online clothes purchases is shown in Figure 13.4.

[Figure 13.4 Flow chart for questionnaire design: introduction; ownership of credit and debit cards; bought clothes over the internet during the last month (yes/no); if yes, how was the payment made (cheque; electronic – PayPal, credit card or debit card; other); if no, ever purchased via the internet?; intentions to use cheque, PayPal, credit and debit cards.]

Placement of branching questions is important and the following guidelines should be followed: (1) the question being branched (the one to which the participants are being directed) should be placed as close as possible to the question causing the branching; and (2) the branching questions should be ordered so that the participants cannot anticipate what additional information will be required. Otherwise, the participants may discover that they can avoid detailed questions by giving certain answers to branching questions. For example, the participants should first be asked if they have seen any of the listed advertisements before they are asked to evaluate advertisements. Otherwise, the participants will quickly discover that stating that they have seen an advertisement leads to detailed questions about that advertisement and that they can avoid detailed questions by stating that they have not seen the advertisement.

Identify the form and layout

Pre-coding: in questionnaire design, assigning a code to every conceivable response before data collection.

The format, spacing and positioning of questions can have a significant effect on the results, particularly in self-administered questionnaires.44 It is good practice to divide a questionnaire into several parts. Several parts may be needed for questions pertaining to the basic information. The questions in each part should be numbered, particularly when branching questions are used. Numbering of questions also makes the coding of responses easier. In addition, if the survey is conducted by post, the questionnaires should preferably be pre-coded. In pre-coding, the codes to enter in the computer are printed on the questionnaire. Note that when conducting online, CATI and CAPI surveys, pre-coding of the questionnaire is built into the questionnaire design software. (Coding of questionnaires is explained in more detail in Chapter 19 on data preparation.)
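Branching questions and their skip patterns can be represented as a routing table derived from the flow chart, with pre-coded answers driving the routing. A hypothetical sketch (the question IDs, wording and codes are illustrative, loosely following the payment flow of Figure 13.4):

```python
# Each pre-coded answer routes the participant to the next question,
# so every contingency is covered explicitly.
QUESTIONS = {
    "Q1": {"text": "Did you buy clothes over the internet during the last month?",
           "codes": {1: "Yes", 2: "No"},
           "route": {1: "Q2", 2: "Q3"}},
    "Q2": {"text": "How was the payment made?",
           "codes": {1: "Cheque", 2: "Electronic", 3: "Other"},
           "route": {1: "END", 2: "END", 3: "END"}},
    "Q3": {"text": "Have you ever purchased via the internet?",
           "codes": {1: "Yes", 2: "No"},
           "route": {1: "Q2", 2: "END"}},
}

def administer(answers, start="Q1"):
    """Walk the skip pattern for one participant, given pre-coded answers
    keyed by question ID; returns the questions actually asked, in order."""
    asked, current = [], start
    while current != "END":
        question = QUESTIONS[current]
        code = answers[current]
        if code not in question["codes"]:
            raise ValueError(f"{current}: {code} is not a pre-coded response")
        asked.append(current)
        current = question["route"][code]
    return asked

# A participant who bought no clothes last month but has purchased online
# before is routed Q1 -> Q3 -> Q2.
print(administer({"Q1": 2, "Q3": 1, "Q2": 2}))  # ['Q1', 'Q3', 'Q2']
```

Preparing such a table from the flow chart also makes it easy to check guideline (1): each branched-to question should sit as close as possible to the question that causes the branch.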
In surveys where there are hard copies of questionnaires, they should be numbered serially. This facilitates the control of questionnaires in the field, as well as the coding and analysis. Numbering makes it easy to account for the questionnaires and to determine whether any have been lost. A possible exception to this rule is postal questionnaires. If these are numbered, participants assume that a given number identifies a particular participant. Some participants may refuse to participate or may answer differently under these conditions.

With the majority of questionnaires being administered online, researchers should not think of the form and structure of a questionnaire in terms of designing a paper or postal survey experience. Such thinking can lead to a dull and monotonous survey experience for online participants. With many people experiencing rich, varied and exciting websites, to move them into a flat, text-based questionnaire experience can be most off-putting. An analogy may be drawn with games technology. Imagine a participant used to a highly interactive, perhaps 3D, games experience being expected to engage with 'Pong' (one of the earliest arcade videogames, a tennis sports game featuring simple two-dimensional graphics). There may be a moment of nostalgia for such an experience, but it would be quickly dismissed as boring and irrelevant.

Reproduce the questionnaire

The arguments about the form and layout of an online, SMS or CAPI survey relate to reproducing the questionnaire. In the design of an online questionnaire, variations of language, branching, graphics and visuals and the survey experience can almost be tailored to individual participants. Time and money that in the past may have been devoted to the printing of paper-based questionnaires can now be saved and invested in designing the form, layout and look to give participants the most engaging experience. In surveys where there are
hard copies of questionnaires (or even in multi-mode surveys where participants have a choice of survey type), how a questionnaire is reproduced for administration can influence the results. For example, if the questionnaire is reproduced on poor-quality paper or is otherwise shabby in appearance, participants will think that the project is unimportant and the quality of response will be adversely affected. Therefore, the questionnaire should be reproduced on good-quality paper and have a professional appearance.

In face-to-face interviews and postal surveys, when a printed questionnaire runs to several pages, it should take the form of a booklet rather than a number of sheets of paper clipped or stapled together. Booklets are easier for the interviewer and the participants to handle and do not easily come apart with use. They allow the use of a double-page format for questions and look more professional. Each question should be reproduced on a single page (or double-page spread). Researchers should avoid splitting a question, including its response categories. Split questions can mislead the interviewer or the participant into thinking that the question has ended at the end of a page. This will result in answers based on incomplete questions.

Vertical response columns should be used for individual questions. It is easier for interviewers and participants to read down a single column than to read sideways across several columns. Sideways formatting and splitting, done frequently to conserve space, should be avoided.

The tendency to crowd questions together to make the questionnaire look shorter should be avoided. Overcrowded questions with little blank space between them can lead to errors in data collection and yield shorter and less informative replies. Moreover, crowding gives the impression that the questionnaire is complex and can result in lower cooperation and completion rates. Although shorter questionnaires are more desirable than longer ones, the reduction in size should
not be obtained at the expense of crowding.

Directions or instructions for individual questions should be placed as close to the questions as possible. Instructions relating to how the question should be administered or answered by the participant should be placed just before the question. Instructions concerning how the answer should be recorded or how the probing should be done should be placed after the question (for more information on probing and other interviewing procedures, see Chapter 16). It is common practice to distinguish instructions from questions by using different typefaces (such as capital, italic or boldfaced letters).

Although colour does not generally influence response rates to questionnaires, it can be employed advantageously in many respects. Colour coding is useful for branching questions. The next question to which the participant is directed is printed in a colour that matches the space in which the answer to the branching question was recorded. The questionnaire should be reproduced in such a way that it is easy to read and answer. The type should be large and clear. Reading the questionnaire should not impose a strain.

Eliminate problems by pilot-testing

Pilot-testing: testing the questionnaire on a small sample of participants for the purpose of improving the questionnaire by identifying and eliminating potential problems.

Pilot-testing refers to testing the questionnaire on a small sample of participants to identify and eliminate potential problems, as illustrated in the following example.45

Real research: Re-inventing long-haul travel46

Air New Zealand worked with Synovate (www.synovate.com) to conduct a survey on all areas of designing the flying experience: seats, in-flight entertainment, food and beverage specification, service flow and kitchen design. The researchers also helped realise other ideas that had not been considered before, such as on-board events. The actual questionnaire had to be relatively
straightforward in structure, but it was critical that the attitudinal statements at the core of the questionnaire were sufficiently discriminatory to separate out what might be quite soft attitudes and feelings. The researchers set up a series of statements for the questionnaire that combined the findings from a study conducted by the design consultancy IDEO (www.ideo.com) with their learning from previous Air New Zealand research. Lengthy discussions between Air New Zealand and IDEO, as well as the inclusion of the Qualitative Research Director, Grant Storry (www.grantstorry.com), led to the final draft.

An online pilot questionnaire was tested on 40 long-haul travellers, who were clearly informed that they were testing the questionnaire. The test questionnaire contained only the attitudinal statements. Pilot participants were asked to pause after each block of attitudinal questions and reflect on whether these statements were clear and easy to understand, and whether they captured the concept of the wider need that Air New Zealand was interested in. Their open-ended responses were reviewed by the researchers and used to refine the statements. In addition, the pilot participants' actual ratings were analysed to determine whether each statement was sufficiently discriminative across the sample to be useful for segmentation purposes. The final questionnaire included a total of 50 attitudinal statements across the whole long-haul experience.

Even the best questionnaire can be improved by pilot-testing. As a general rule, a questionnaire should not be used in the field survey without adequate pilot-testing. A pilot-test should be extensive. All aspects of the questionnaire should be tested, including question content, wording, sequence, form and layout, question difficulty and instructions. The participants in the pilot-test should be similar to those who will be included in the actual survey in terms of
background characteristics, familiarity with the topic and attitudes and behaviours of interest.47 In other words, participants for the pilot-test and for the actual survey should be drawn from the same population. Pilot-tests are best done by face-to-face interviews, even if the actual survey is to be conducted by online, postal or telephone methods, because interviewers can observe participants' reactions and attitudes. After the necessary changes have been made, another pilot-test could be conducted by online, postal or telephone methods if those methods are to be used in the actual survey. The latter pilot-tests should reveal problems peculiar to the interviewing method. To the extent possible, a pilot-test should involve administering the questionnaire in an environment and context similar to that of the actual survey.

A variety of interviewers should be used for pilot-tests. The project director, the researcher who developed the questionnaire and other key members of the research team should conduct some pilot-test interviews. This will give them a good feel for potential problems and the nature of the expected data. If the survey involves face-to-face interviews, pilot-tests should be conducted by regular interviewers. It is good practice to employ both experienced and novice interviewers. Experienced interviewers can easily perceive uneasiness, confusion and resistance in the participants, and novice interviewers can help the researcher identify interviewer-related problems.

The sample size of the pilot-test is typically small, varying from 15 to 30 participants for the initial testing, depending on the heterogeneity (e.g. a wide array of education levels) of the target population. The sample size can increase substantially if the pilot-testing involves several stages or waves.

Protocol analysis and debriefing are two commonly used procedures in pilot-testing. In protocol analysis, the participant is asked to 'think aloud' while answering the questionnaire (as explained
in Chapter 12). Typically, the participant’s remarks are recorded and analysed to determine the reactions invoked by different parts of the questionnaire. Debriefing occurs after the questionnaire has been completed. Participants are told that the questionnaire they just completed was a pilot-test, and the objectives of pilot-testing are described to them. They are then asked to describe the meaning of each question, to explain their answers and to state any problems they encountered while answering the questionnaire. Editing involves correcting the questionnaire for the problems identified during pilot-testing. After each significant revision of the questionnaire, another pilot-test should be conducted using a different sample of participants. Sound pilot-testing involves several stages. One pilot-test is a bare minimum. Ideally, pilot-testing should be continued until no further changes are needed. Finally, the responses obtained from the pilot-test should be analysed. The analysis of pilot-test responses can serve as a check on the adequacy of the problem definition and the data and analysis required to obtain the necessary information. The dummy tables prepared before developing the questionnaire will point to the need for the various sets of data. If the response to a question cannot be related to one of the pre-planned dummy tables, either those data are superfluous or some relevant analysis has not been foreseen. If part of a dummy table remains empty, a necessary question may have been omitted. Analysis of pilot-test data helps to ensure that all data collected will be utilised and that the questionnaire will obtain all the necessary data.48

Summarising the questionnaire design process

Table 13.1 summarises the questionnaire design process in the form of a checklist.

Table 13.1 Questionnaire design checklist

Step 1: Specify the information needed
1. Ensure that the information obtained fully addresses all the components of the
problem. Review components of the problem and the approach, particularly the research questions, hypotheses and characteristics that influence the research design.
2. Prepare a set of dummy tables.
3. Have a clear idea of the characteristics and motivations of the target participants.

Step 2: Specify the type of interviewing method
1. Review the type of interviewing method determined based on considerations discussed in Chapter 10.

Step 3: Determine the content of individual questions
1. Is the question necessary?
2. Are several questions needed instead of one to obtain the required information in an unambiguous manner?
3. Do not use double-barrelled questions.

Step 4: Overcome the participant’s inability and unwillingness to answer
1. Is the participant informed?
2. If the participant is not likely to be informed, filter questions that measure familiarity, product use and past experience should be asked before questions about the topics themselves.
3. Can the participant remember?
4. Avoid errors of omission, telescoping and creation.
5. Questions that do not provide the participant with cues can underestimate the actual occurrence of an event.
6. Can the participant articulate?
7. Minimise the effort required of the participant.
8. Is the context in which the questions are asked appropriate?
9. Make the request for information seem legitimate.
10. If the information is sensitive:
(a) Place sensitive topics at the end of the questionnaire.
(b) Preface the question with a statement that the behaviour of interest is common.
(c) Ask the question using the third-person technique.
(d) Hide the question in a group of other questions that participants are willing to answer.
(e) Provide response categories rather than asking for specific figures.
(f) Use randomised techniques, if appropriate.

Step 5: Choose question structure
1. Open-ended questions are useful in exploratory research and as closing questions.
2. Use structured questions whenever possible.
3. In multiple-choice questions, the response alternatives should include the set of all possible choices and should be mutually exclusive.
4. In a dichotomous question, if a substantial proportion of the participants can be expected to be neutral, include a neutral alternative.
5. Consider the use of the split-ballot technique to reduce order bias in dichotomous and multiple-choice questions.
6. If the response alternatives are numerous, consider using more than one question to reduce the information-processing demands on the participants.

Step 6: Choose question wording
1. Define the issue in terms of ‘who’, ‘what’, ‘when’ and ‘where’.
2. Use ordinary words. Words should match the vocabulary level of the participants.
3. Avoid ambiguous words such as ‘usually’, ‘normally’, ‘frequently’, ‘often’, ‘regularly’, ‘occasionally’, ‘sometimes’, etc.
4. Avoid leading or biasing questions that cue the participant to what the answer should be.
5. Avoid implicit alternatives that are not explicitly expressed in the options.
6. Avoid implicit assumptions.
7. Participants should not have to make generalisations or compute estimates.
8. Use positive and negative statements.

Step 7: Arrange the questions in proper order
1. The opening questions should be interesting, simple and
non-threatening.
2. Qualifying questions should serve as the opening questions.
3. Basic information should be obtained first, followed by classification and finally identification information.
4. Difficult, sensitive or complex questions should be placed late in the sequence.
5. General questions should precede specific questions.
6. Questions should be asked in a logical order.
7. Branching questions should be designed carefully to cover all possible contingencies.
8. The question being branched should be placed as close as possible to the question causing the branching, and the branching questions should be ordered so that the participants cannot anticipate what additional information will be required.

Step 8: Design the form and layout
1. Divide a questionnaire into several parts.
2. Questions in each part should be numbered.
3. If hard copies of the questionnaires are used, coding should be printed on the forms to facilitate manual data entry.

Step 9: Publish the questionnaire
1. The questionnaire should be designed to be visually engaging.
2. Vertical response columns should be used.
3. Grids are useful when there are a number of related questions that use the same set of response categories.
4. The tendency to crowd questions to make the questionnaire look shorter should be avoided.
5. Directions or instructions for individual questions should be placed as close to the questions as possible.
6. If hard copies of the questionnaires are used, a booklet format should be used for long questionnaires; each question should be reproduced on a single page (or double-page spread).

Step 10: Eliminate problems by pilot-testing
1. Pilot-testing should always be done.
2. All aspects of the questionnaire should be tested, including question content, wording, sequence, form and layout, question difficulty, instructions and rewards for taking part in the survey.
3. The participants in the pilot-test should be similar to those who will be included in the actual survey.
4. Begin the pilot-test by using face-to-face
interviews.
5. The pilot-test should also be conducted by online, postal or telephone methods if those methods are to be used in the actual survey.
6. A variety of interviewers should be used for pilot-tests.
7. The pilot-test sample size should be small, varying from 15 to 30 participants for the initial testing.
8. Use protocol analysis and debriefing to identify problems.
9. After each significant revision of the questionnaire, another pilot-test should be conducted using a different sample of participants.
10. The responses obtained from the pilot-test should be analysed to check the set-up of tables and charts.
11. The responses obtained from the pilot-test should not be aggregated with responses from the final survey.

Designing surveys across cultures and countries

Questionnaire design should be adapted to specific cultural environments, and all efforts made to avoid bias in terms of any one culture. This requires careful attention to each step of the questionnaire design process. The information needed should be clearly specified and form the focus of the questionnaire design. This should be balanced by taking into account any participant differences in terms of underlying consumer behaviour, decision-making processes, psychographics, lifestyles and demographic variables. In the context of demographic characteristics, information on marital status, education, household size, occupation, income and dwelling unit may have to be specified differently for different countries, as these variables may not be directly comparable across countries. For example, household definition and size vary greatly, given the extended family structure in some countries and the practice of two or even three families living under the same roof. Although online surveys may dominate as a survey method in Western countries, different survey methods may be favoured or more prevalent in different countries for a variety of reasons. Hence, the questionnaire may have
to be suitable for administration by more than one mode. Using different survey modes can be readily facilitated by the use of online research communities. The following example illustrates a study on the use of online research communities in different countries and how online research is used in different countries. Even if there is a global trend for questionnaires to be designed to be administered in online surveys, there still remains a need for cultural adaptation of questionnaires. Examples of such adaptation can include the challenges of comprehension and translation. It is desirable to have two or more simple questions rather than a single complex question. In overcoming the inability to answer, the variability in the extent to which participants in different cultures are informed about the subject matter of the survey should be taken into account. Participants in some parts of the world may not be as well informed on many issues as people in Europe (and vice versa, of course!). The use of unstructured or open-ended questions may be desirable if the researcher lacks knowledge about the determinants of response in other countries. Because they do not impose any response alternatives, unstructured questions also reduce cultural bias, but they are more affected by differences in educational levels than structured questions. They should be used with caution in countries with low literacy and, indeed, low readership levels. The questionnaire may have to be translated for administration in different cultures, and the researcher must ensure that the questionnaires in different languages are equivalent. Pilot-testing the questionnaire is complicated in international research because linguistic equivalence must be pilot-tested. Two sets of pilot-tests are recommended. The translated questionnaire should be pilot-tested on monolingual subjects in their native language, and the original and translated versions should also be administered to bilingual subjects. The pilot-test data
from administering the questionnaire in different countries or cultures should be analysed, and the pattern of responses compared to detect any cultural biases.

Real research
Online research communities: a cross-cultural review of participants’ perceptions49

Online research communities have been described as the fastest-growing sector of marketing research. Among the benefits credited to online research communities are phrases such as ‘authentic voice of the consumer’ and ‘increased engagement’. These benefits were unsupported by data, raising questions about whether participants really felt better about online research communities. A study examined this issue and compared views of different survey modes across five countries: Australia, Canada, China, Indonesia and the USA. The study represented a collaboration of four companies: Colmar Brunton (www.colmarbrunton.com.au) in Australia, Social Data Research in Canada, SSI (www.surveysampling.com) in China and the USA, and Nielsen (www.nielsen.com) in Indonesia. An online survey looked at the perceptions of interviews conducted face to face, by telephone, via online access panels, focus groups and online research communities. In Indonesia and China, research could be conducted online, but infrastructure issues and access problems for some socio-economic groups continued to make the challenges for online research formidable. There were considerable similarities between the countries. This suggested that once a citizen of any of the five countries became a member of an online access panel, the citizen started to be like members of panels in other countries, at least in terms of views about the different ways to conduct research. It was possible to draw a distinction between the three more-developed research markets (Australia, Canada and the USA) and the two less-developed research markets (Indonesia and China), with China being part-way between Indonesia and the three more-developed markets. The two less-developed research markets showed higher degrees of participants being members of online access panels and also being participants for other survey modes, suggesting that they represented a highly researched minority. This probably reflected the fact that in Australia, Canada and the USA, research started by using random sampling approaches based on face-to-face interviewing, migrating to telephone and, more recently, to online access panels. In China, and even more so in Indonesia, marketing research had jumped the earlier phases and arrived online much earlier (and with the complexity of telephone research being identified with mobile phones from the outset). Chinese participants were the least positive about face-to-face research, and Indonesian participants were the most positive about telephone research. Across all countries, participants reported lower levels of satisfaction with telephone surveys than with any of the other survey modes. While online research communities were seen as having several strengths, they needed to be more engaging or they might turn participants off. One key area for improvement for online research communities was the need to ensure that participants felt the return they received was worth their effort. This might require both improvements to incentives and processes that increased other types of rewards, including recognition, feedback and the provision of information. One major caveat that should be borne in mind was that this study focused on online participants and their views of all modes. In Australia, Canada and the USA, online was the single largest mode. However, in China, and even more so in Indonesia, online was a relatively smaller mode, but one that was growing rapidly.

Summary

A questionnaire has three objectives. It must translate the information needed into a set of specific questions that the participants can and will answer. It must motivate participants to complete the interview. It must also minimise response error.
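One way of minimising response error on sensitive questions, listed in Table 13.1 under Step 4, is to use randomised techniques. A minimal sketch of the unrelated-question randomised response design is shown below; the function names, parameter values and coin-flip mechanism are illustrative assumptions, not details from the text:

```python
import random

# Unrelated-question randomised response design: with probability p_truth the
# participant answers the sensitive question truthfully; otherwise they answer
# an innocuous question (e.g. a private coin flip) whose 'yes' probability
# q_innocuous is known. No individual 'yes' reveals the sensitive trait.

def simulate_answers(true_rate, n, p_truth=0.7, q_innocuous=0.5, seed=42):
    """Simulate n randomised answers (True = 'yes')."""
    rng = random.Random(seed)
    answers = []
    for _ in range(n):
        holds_sensitive_trait = rng.random() < true_rate
        if rng.random() < p_truth:
            answers.append(holds_sensitive_trait)        # truthful answer
        else:
            answers.append(rng.random() < q_innocuous)   # innocuous answer
    return answers

def estimate_prevalence(answers, p_truth=0.7, q_innocuous=0.5):
    """Recover the sensitive-trait rate from the observed 'yes' rate:
    lambda = p*pi + (1-p)*q  =>  pi = (lambda - (1-p)*q) / p."""
    lam = sum(answers) / len(answers)
    return (lam - (1 - p_truth) * q_innocuous) / p_truth

answers = simulate_answers(true_rate=0.30, n=20_000)
print(f"estimated prevalence: {estimate_prevalence(answers):.3f}")
```

Because the interviewer never knows which question an individual answered, participants can respond honestly; the researcher recovers only the aggregate prevalence of the sensitive behaviour.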
Designing a questionnaire is more of a craft than a science. This is primarily due to the interrelationship of stages and the trade-offs that questionnaire designers make in balancing the source of ideas, question purposes, actual questions and question analyses. The steps involved in the questionnaire design process are:

1. Specifying the information needed. Understanding what information decision makers need and the priorities they face.
2. Specifying the type of interviewing method. Understanding which means of eliciting the information will work best, given the research design constraints that the researcher has to work with.
3. Determining the content of individual questions. Understanding the purpose of each question and working out how a posed question may fulfil that purpose.
4. Overcoming the participants’ inability and unwillingness to answer questions. Understanding the process of approaching and questioning participants – from their perspective. Knowing what benefits they get from taking part in the survey process. Knowing what they find engaging and/or boring as a questionnaire experience.
5. Choosing the question structure. Understanding how individual questions help to elicit information from participants and help them to express their feelings.
6. Choosing the question wording. Understanding the meaning of words from the perspective of the participant.
7. Arranging the questions in a proper order. Understanding what ‘proper’ means from the perspective of the participant. Recognising that, as each question is posed to a participant and the participant thinks about the response, he or she changes. Information is not only drawn out of participants, it is communicated to them, and they change as each question is tackled.
8. Identifying the form and layout of the questionnaire. Understanding how, in a self-completion scenario, the form and layout motivate and help the participant to answer the questions in an honest and reflective
manner. Understanding, when needed, how the form and layout help the interviewer to conduct and record the interview.
9. Publishing the questionnaire. Understanding how the professional appearance of a questionnaire affects the perceived credibility and professional ability of researchers.
10. Eliminating problems by pilot-testing. Understanding that, no matter how much experience the researcher has in designing questionnaires, the issues, participant characteristics and context of questioning make each survey unique – pilot-testing is vital.

Questions

1. What is the purpose of a questionnaire?
2. What expectations does the researcher have of potential questionnaire participants – in terms of how they will react to the experience of completing a questionnaire?
3. What does the researcher have to offer potential questionnaire participants? Why should this question be considered?
4. How would you determine whether a specific question should be included in a questionnaire?
5. What are the reasons why participants may be (a) unable to answer and (b) unwilling to answer the question asked?
6. Explain the errors of omission, telescoping and creation. What can be done to reduce such errors?
7. Explain the concepts of aided and unaided recall.
8. What can a researcher do to make the request for information seem legitimate?
9. What are the advantages and disadvantages of unstructured questions?
10. What are the issues involved in designing multiple-choice questions?
11. What are the guidelines available for deciding on question wording?
12. What is a leading question? Give an example.
13. What is the proper order for questions intended to obtain basic, classification and identification information?
14. What guidelines are available for deciding on the form and layout of a questionnaire?
15. Describe the issues involved in pilot-testing a questionnaire.

Exercises

1. Heineken beer would like to conduct a survey of 18–25-year-old Europeans to determine the characteristics of its corporate image. Design a full questionnaire using survey software such as Google Forms, Qualtrics or SurveyMonkey. Administer this questionnaire in a mode of your choice to 25 fellow students. Write a short report based upon your experience of using the software, the findings you have generated and any limitations you see in the whole process (i.e. how would you do this differently if you were to repeat it?).
2. Develop a questionnaire for determining household preferences for popular brands of cold breakfast cereals. Administer the questionnaire to five adult females, five adult males and five children. How would you modify the questionnaire if it was to be administered by telephone? What changes would be necessary if it was to be administered online? Are there distinctive characteristics of your different participant types that could affect your questionnaire design?
3. You have been hired as a management trainee by a firm that manufactures major household appliances. Your boss has asked you to develop a questionnaire to determine how households plan, purchase and use major appliances. This questionnaire is to be used in five European countries. However, you feel that you do not have the expertise or the experience to construct such a complex questionnaire. Present your case to your boss.
4. In a small group, discuss the following issues: ‘Because questionnaire design is a craft, it is useless to follow a rigid set of guidelines. Rather, the process should be left entirely to the creativity and ingenuity of the researcher.’ and ‘Asking sensitive classification questions such as age or income at the start of a questionnaire upsets the sensibilities only of older participants; young participants are not concerned about where these questions are asked.’

Notes

1. Flavián, C. and Gurrea, R., ‘Digital versus traditional newspapers: Influences on perceived substitutability’, International Journal of Market Research 51 (5) (2009), 635–57.
2. Delattre, E. and Colovic, A., ‘Memory and perception of brand mentions and placement of brands in songs’, International Journal of Advertising 28 (5) (2009), 807–42.
3. Livin, J., ‘Improving response rates in web surveys with default setting: The effects of default on web survey participation and permission’, International Journal of Market Research 53 (1) (2011), 75–94; Balabanis, G., Mitchell, V.W. and Heinonen-Mavrovouniotis, S., ‘SMS-based surveys: Strategies to improve participation’, International Journal of Advertising 26 (3) (2007), 369–85.
4. The founding reference to this subject is Payne, S.L., The Art of Asking Questions (Princeton, NJ: Princeton University Press, 1951). See also Lietz, P., ‘Research into questionnaire design: A summary of the literature’, International Journal of Market Research 52 (2) (2010), 249–72; Schrage, M., ‘Survey says’, Adweek Magazines’ Technology
Marketing 22 (1) (January 2002), 11; Gillham, B., Developing a Questionnaire (New York: Continuum International, 2000).
5. Woodnutt, T. and Owen, R., ‘The research industry needs to embrace radical change in order to thrive and survive in the digital era’, Market Research Society: Annual Conference (2010).
6. These guidelines are drawn from several books on questionnaire design: Dillman, D.A., Smyth, J.D. and Melani, L., Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 3rd edn (Hoboken, NJ: Wiley, 2008); Bradburn, N.M., Sudman, S. and Wansink, B., Asking Questions: The Definitive Guide to Questionnaire Design – For Market Research, Political Polls, and Social and Health Questionnaires (San Francisco: Jossey-Bass, 2004); Gillham, B., Developing a Questionnaire (New York: Continuum International, 2000); Peterson, R.A., Constructing Effective Questionnaires (Thousand Oaks, CA: Sage, 2000); Schuman, H. and Presser, S., Questions and Answers in Attitude Surveys (Thousand Oaks, CA: Sage, 1996); Fink, A., How to Ask Survey Questions (Thousand Oaks, CA: Sage, 1995); Sudman, S. and Bradburn, N.M., Asking Questions (San Francisco: Jossey-Bass, 1983).
7. Cierpicki, S., Davis, C., Eddy, C., Lorch, J., Phillips, K., Poynter, R., York, S. and Zuo, B., ‘From clipboards to online research communities: A cross-cultural review of respondents’ perceptions’, ESOMAR Congress Odyssey, Athens (September 2010).
8. Biering, P., Becker, H., Calvin, A. and Grobe, S.J., ‘Casting light on the concept of patient satisfaction by studying the construct validity and the sensitivity of a questionnaire’, International Journal of Health Care Quality Assurance 19 (3) (2006), 246–58; Clark, B.H., ‘Bad examples’, Marketing Management 12 (2) (2003), 34–8; Bordeaux, D.B., ‘Interviewing – part II: Getting the most out of interview questions’, Motor Age 121 (2) (February 2002), 38–40.
9. Bressette, K., ‘Deeply understanding the mind to unmask the inner human’, ESOMAR Qualitative, Marrakech (November 2009);
Reynolds, T.J., ‘Methodological and strategy development implications of decision segmentation’, Journal of Advertising Research 46 (4) (December 2006), 445–61; Healey, B., Macpherson, T. and Kuijten, B., ‘An empirical evaluation of three web survey design principles’, Marketing Bulletin 16 (May 2005), 1–9; Hess, J., ‘The effects of person-level versus household-level questionnaire design on survey estimates and data quality’, Public Opinion Quarterly 65 (4) (Winter 2001), 574–84.
10. Fairfield, A., ‘Doing the right thing is a brand communicator’s imperative’, Admap 504 (April 2009), 32–5.
11. Alioto, M.F. and Parrett, M., ‘The use of “respondent-based intelligent” surveys in cross-national research’, ESOMAR Latin American Conference, São Paulo (May 2002), 157–220; Knauper, B., ‘Filter questions and question interpretation: Presuppositions at work’, Public Opinion Quarterly 62 (1) (Spring 1998), 70–8; Stapel, J., ‘Observations: A brief observation about likeability and interestingness of advertising’, Journal of Advertising Research 34 (2) (March/April 1994), 79–80.
12. Graeff, T.R., ‘Reducing uninformed responses: The effects of product class familiarity and measuring brand knowledge on surveys’, Psychology and Marketing 24 (8) (August 2007), 681–702; Graeff, T.R., ‘Uninformed response bias in telephone surveys’, Journal of Business Research 55 (3) (March 2002), 251.
13. Braunsberger, K., Gates, R. and Ortinau, D.J., ‘Prospective respondent integrity behaviour in replying to direct mail questionnaires: A contributor in overestimating nonresponse rates’, Journal of Business Research 58 (3) (March 2005), 260–7; Lee, E., Hu, M.Y. and Toh, R.S., ‘Are consumer survey results distorted?
Systematic impact of behavioural frequency and duration on survey response errors’, Journal of Marketing Research 37 (1) (February 2000), 125–33.
14. Wilson, E.J. and Woodside, A.G., ‘Respondent inaccuracy’, Journal of Advertising Research 42 (5) (September/October 2002), 7–18; Gaskell, G.D., ‘Telescoping of landmark events: Implications for survey research’, Public Opinion Quarterly 64 (1) (Spring 2000), 77–89; Menon, G., Raghubir, P. and Schwarz, N., ‘Behavioural frequency judgments: An accessibility-diagnosticity framework’, Journal of Consumer Research 22 (2) (September 1995), 212–28; Cook, W.A., ‘Telescoping and memory’s other tricks’, Journal of Advertising Research (February–March 1987), 5–8.
15. Goodrich, K., ‘What’s up? Exploring upper and lower visual field advertising effects’, Journal of Advertising Research 50 (1) (2010), 91–106.
16. Bednall, D.H.B., Adam, S. and Plocinski, K., ‘Ethics in practice: Using compliance techniques to boost telephone response rates’, International Journal of Market Research 52 (2) (2010), 155–68.
17. Nancarrow, C. and Brace, I., ‘Let’s get ethical: Dealing with socially desirable responding online’, Market Research Society: Annual Conference (2008); Tourangeau, R. and Yan, T., ‘Sensitive questions in surveys’, Psychological Bulletin 133 (5) (September 2007), 859–83; France, M., ‘Why privacy notices are a sham’, Business Week (18 June 2001), 82.
18. Manfreda, K.L., Bosnjak, M., Berzelak, J., Haas, I. and Vehovar, V., ‘Web surveys versus other survey modes: A meta-analysis comparing response rates’, International Journal of Market Research 50 (1) (2008), 79–104; Hanrahan, P., ‘Mine your own business’, Target Marketing (February 2000), 32; Tourangeau, R. and Smith, T.W., ‘Asking sensitive questions: The impact of data collection mode, question format, and question context’, Public Opinion Quarterly 60 (20) (Summer 1996), 275–304.
19. Maehle, N. and Supphellen, M., ‘In search of the sources of brand personality’, International Journal of Market
Research 53 (1) (2011), 95–114; Peterson, R.A., ‘Asking the age question: A research note’, Public Opinion Quarterly (Spring 1984), 379–83.
20. Nancarrow, C. and Brace, I., ‘Let’s get ethical: Dealing with socially desirable responding online’, Market Research Society: Annual Conference (2008); Larkins, E.R., Hume, E.C. and Garcha, B.S., ‘The validity of the randomized response method in tax ethics research’, Journal of Applied Business Research 13 (3) (Summer 1997), 25–32; Mukhopadhyay, P., ‘A note on UMVU estimation under randomized-response model’, Communications in Statistics – Theory and Methods 26 (10) (1997), 2415–20.
21. Millican, P. and Kolb, C., ‘Connecting with Elizabeth: Using artificial intelligence as a data collection aid’, Market Research Society: Annual Conference (2006); Patten, M.L., Questionnaire Research: A Practical Guide (Los Angeles: Pyrczak, 2001); Newman, L.M., ‘That’s a good question’, American Demographics (Marketing Tools) (June 1995), 10–13.
22. Esuli, A. and Sebastiani, F., ‘Machines that learn how to code open-ended survey data’, International Journal of Market Research 52 (6) (2010), 775–800; Popping, R., Computer-Assisted Text Analysis (Thousand Oaks, CA: Sage, 2000); Luyens, S., ‘Coding verbatims by computers’, Marketing Research: A Magazine of Management and Applications (2) (Spring 1995), 20–5.
23. Verhaeghe, A., De Ruyck, T. and Schillewaert, N., ‘Join the research – participant-led open-ended questions’, International Journal of Market Research 50 (5) (2008), 655–78; Pothas, A.M., ‘Customer satisfaction: Keeping tabs on the issues that matter’, Total Quality Management 12 (1) (January 2001), 83.
24. Vicente, P., Reis, E. and Santos, M., ‘Using mobile phones for survey research: A comparison with fixed phones’, International Journal of Market Research 51 (5) (2009), 613–34.
25. Bellman, S., Schweda, A. and Varan, D., ‘The importance of social motives for watching and interacting with digital
television’, International Journal of Market Research 52 (1) (2010), 67–87.
26. Russell, M., Fischer, M.J., Fischer, C.M. and Premo, K., ‘Exam question sequencing effects on marketing and management sciences student performance’, Journal for Advancement of Marketing Education (Summer 2003), 1–10; Javeline, D., ‘Response effects in polite cultures’, Public Opinion Quarterly 63 (1) (Spring 1999), 1–27; Krosnick, J.A. and Alwin, D.F., ‘An evaluation of a cognitive theory of response order effects in survey measurement’, Public Opinion Quarterly (Summer 1987), 201–19; Blunch, N.J., ‘Position bias in multiple-choice questions’, Journal of Marketing Research 21 (May 1984), 216–20, has argued that position bias in multiple-choice questions cannot be eliminated by rotating the order of the alternatives. This viewpoint is contrary to common practice.
27. ‘IKEA – love where you live’, The Communications Council, Bronze, Australian Effie Awards (2009).
28. DeMoranville, C.W. and Bienstock, C.C., ‘Question order effects in measuring service quality’, International Journal of Research in Marketing 20 (3) (2003), 457–66; Singer, E., ‘Experiments with incentives in telephone surveys’, Public Opinion Quarterly 64 (2) (Summer 2000), 171–88; Schuman, H. and Presser, S., Questions and Answers in Attitude Surveys (Thousand Oaks, CA: Sage, 1996).
29. Dolnicar, S., Grün, B. and Leisch, F., ‘Quick, simple and reliable: Forced binary survey questions’, International Journal of Market Research 53 (2) (2011), 233–54; Blumenschein, K., ‘Hypothetical versus real willingness to pay in the health care sector: Results from a field experiment’, Journal of Health Economics 20 (3) (May 2001), 441; Herriges, J.A. and Shogren, J.F., ‘Starting point bias in dichotomous choice valuation with follow-up questioning’, Journal of Environmental Economics and Management 30 (1) (January 1996), 112–31.
30. Kalton, G. and Schuman, H., ‘The effect of the question on survey responses: A review’, Journal of the Royal Statistical
Society Series A 145, Part (1982), 44–5.
31. Albaum, G., Roster, C., Yu, J.H. and Rogers, R.D., ‘Simple rating scale formats: Exploring extreme response’, International Journal of Market Research 49 (5) (2007), 1–17; Vriends, M., Wedel, M. and Sandor, Z., ‘Split-questionnaire design’, Marketing Research 13 (2) (2001), 14–19; Conrad, F.G., ‘Clarifying question meaning in a household telephone survey’, Public Opinion Quarterly 64 (1) (Spring 2000), 1–27; McBurnett, M., ‘Wording of questions affects responses to gun control issue’, Marketing News 31 (1) (6 January 1997), 12.
32. Cape, P., Lorch, J. and Piekarski, L., ‘A tale of two questionnaires’, ESOMAR Panel Research, Orlando (October 2007); Colombo, R., ‘A model for diagnosing and reducing nonresponse bias’, Journal of Advertising Research 40 (1/2) (January/April 2000), 85–93; Etter, J.F. and Perneger, T.V., ‘Analysis of nonresponse bias in a mailed health survey’, Journal of Clinical Epidemiology 50 (10) (25 October 1997), 1123–8; Omura, G.S., ‘Correlates of item non-response’, Journal of the Market Research Society (October 1983), 321–30.
33. Manfreda, K.L., Bosnjak, M., Berzelak, J., Haas, I. and Vehovar, V., ‘Web surveys versus other survey modes: A meta-analysis comparing response rates’, International Journal of Market Research 50 (1) (2008), 79–104; Bollinger, C.R., ‘Estimation with response error and nonresponse: Food-stamp participation in the SIPP’, Journal of Business and Economic Statistics 19 (2) (April 2001), 129–41.
34. Gillham, B., Developing a Questionnaire (New York: Continuum International, 2000); Saltz, L.C., ‘How to get your news release published’, Journal of Accountancy 182 (5) (November 1996), 89–91.
35. Reid, J., Morden, M. and Reid, A., ‘Maximising respondent engagement: The use of rich media’, ESOMAR Annual Congress, Berlin (September 2007); Couper, M.P., ‘Web surveys: A review of issues and approaches’, Public Opinion Quarterly 64 (4) (Winter 2000), 464–94; Edmondson, B., ‘How to spot a bogus poll’,
American Demographics (10) (October 1996), 10–15; O'Brien, J., 'How do market researchers ask questions?', Journal of the Market Research Society 26 (April 1984), 93–107.
36 Cohen, J., 'Reading and writing: The forgotten 12 million', Market Research Society: Annual Conference (2006).
37 Snowden, D and Stienstra, J., 'Stop asking questions: Understanding how consumers make sense of it all', ESOMAR Annual Congress, Berlin (September 2007); Chisnall, P.M., 'Marketing research: State of the art perspectives', International Journal of Marketing Research 44 (1) (First Quarter 2002), 122–5; Abramson, P.R and Ostrom, C.W., 'Question wording and partisanship', Public Opinion Quarterly 58 (1) (Spring 1994), 21–48.
38 Charney, C., 'Top ten ways to get misleading poll results', Campaigns and Elections 28 (7) (July 2007), 66–7; Dubelaar, C and Woodside, A.G., 'Increasing quality in measuring advertising effectiveness: A meta analysis of question framing in conversion studies', Journal of Advertising Research 43 (1) (March 2003), 78–85; Becker, B., 'Take direct route when data gathering', Marketing News 33 (20) (27 September 1999), 29–30.
39 Brinkmann, S., 'Could interviews be epistemic?
An alternative to qualitative opinion polling', Qualitative Inquiry 13 (8) (December 2007), 1116–38; Gillham, B., Developing a Questionnaire (New York: Continuum International, 2000); Adamek, R.J., 'Public opinion and Roe v Wade: Measurement difficulties', Public Opinion Quarterly 58 (3) (Fall 1994), 409–18.
40 Chen, S., Poland, B and Skinner, H.A., 'Youth voices: Evaluation of participatory action research', Canadian Journal of Program Evaluation 22 (1) (March 2007), 125; Ouyand, M., 'Estimating marketing persistence on sales of consumer durables in China', Journal of Business Research 55 (4) (April 2002), 337; Jacoby, J and Szybillo, G.J., 'Consumer research in FTC versus Kraft (1991): A case of heads we win, tails you lose?', Journal of Public Policy and Marketing 14 (1) (Spring 1995), 1–14.
41 Phillips, S and Hamburger, S., 'A quest for answers: The campaign against Why', ESOMAR Annual Congress, Berlin (September 2007); Glassman, N.A and Glassman, M., 'Screening questions', Marketing Research 10 (3) (1998), 25–31; Schuman, H and Presser, S., Questions and Answers in Attitude Surveys (Thousand Oaks, CA: Sage, 1996).
42 Rating a brand on specific attributes early in a survey may affect responses to a later overall brand evaluation. For example, see Gendall, P and Hoek, J., 'David takes on Goliath: An overview of survey evidence in a trademark dispute', International Journal of Market Research 45 (1) (2003), 99–122; Bartels, L.M., 'Question order and declining faith in elections', Public Opinion Quarterly 66 (1) (Spring 2002), 67–79. See also McAllister, I and Wattenberg, M.P., 'Measuring levels of party identification: Does question order matter?', Public Opinion Quarterly 59 (2) (Summer 1995), 259–68.
43 Watson, P.D., 'Adolescents' perceptions of a health survey using multimedia computer-assisted self-administered interview', Australian & New Zealand Journal of Public Health 25 (6) (December 2001), 520; Bethlehem, J., 'The routing structure of questionnaires',
International Journal of Market Research 42 (1) (2000), 95–110; Willits, F.K and Ke, B., 'Part-whole question order effects: Views of rurality', Public Opinion Quarterly 59 (3) (Fall 1995), 392–403.
44 Puleston, J and Sleep, D., 'Measuring the value of respondent engagement – innovative techniques to improve panel quality', ESOMAR Panel Research, Dublin (October 2008).
45 Schlegelmilch, B.B., Diamantopoulos, A and Reynolds, N., 'Pre-testing in questionnaire design: A review of the literature and suggestions for further research', International Journal of Market Research 35 (2) (1993), 171–82.
46 Feldhaeuser, H and Smales, H., 'Flying with the Simpsons: An award-winning research paper that helped Air New Zealand reinvent long-haul air travel', ESOMAR Asia Pacific, Melbourne (2011).
47 Blair, J and Srinath, K.P., 'A note on sample size for behaviour coding pretests', Field Methods 85 (11) (February 2008), 20; Conrad, F.G., 'Clarifying question meaning in a household telephone survey', Public Opinion Quarterly 64 (1) (Spring 2000), 1–27; Diamantopoulos, A., Schlegelmilch, B.B and Reynolds, N., 'Pre-testing in questionnaire design: The impact of participant characteristics on error detection', Journal of the Market Research Society 36 (October 1994), 295–314.
48 Meir, D., 'The seven stages of effective survey research', American Marketing Association (2002); Gillham, B., Developing a Questionnaire (New York: Continuum International, 2000).
49 Cierpicki, S., Davis, C., Eddy, C., Lorch, J., Phillips, K., Poynter, R., York, S and Zuo, B., 'From clipboards to online research communities: A cross-cultural review of participants' perceptions', ESOMAR Congress Odyssey, Athens (September 2010).