The ugly truth on conducting DIY research campaigns

Jane Hales is managing partner at Sapio Research.

Today’s data-driven economy has theoretically transformed access to information about customers, prospects and competitors. But drill down into the data provided by DIY research tools and the quality raises serious concerns about the validity of much research activity.

Help yourself

The plethora of self-service data collection platforms has provided organisations of every size with access to unprecedented depths of information. From apparently simple scripting to the wide choice of sample groups, DIY surveys promise fast, low cost access to customers, prospects and markets globally. But how many of these research projects are delivering benefit? How much trust does the business have in the information and can this insight confidently be used to support strategic planning or to drive high profile marketing campaigns?

The fact is that upwards of 50% of DIY surveys are fundamentally flawed, in no small part because individuals simply do not understand what is needed to deliver accurate, trusted and usable insight. There are four key areas to consider:

The wrong audience

Online DIY survey providers make it incredibly easy to request a basic audience of, for example, 1,000 people. But how relevant is that audience? How well balanced? If an organisation asks for nothing more than 1,000 responses, DIY platforms will simply provide the first 1,000 responses, even if they are all women or mainly people aged between 25 and 35. This unbalanced sample will clearly distort the overall results.
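The effect of an unbalanced sample is easy to demonstrate. The sketch below uses invented numbers (not data from any real survey) to show how taking the first 1,000 responses can inflate a headline figure, and how post-stratification weighting, one of the techniques agencies routinely apply, pulls it back in line:

```python
# Illustrative sketch with invented numbers: how an unbalanced sample
# skews a headline figure, and how weighting groups back to their
# population share corrects it.

# Suppose 40% of men and 60% of women favour a product, and the
# population is split 50/50 by gender, so a balanced survey should find 50%.
true_rate = 0.5 * 0.40 + 0.5 * 0.60  # 0.50

# A DIY panel that simply takes the first 1,000 responses might return
# 800 women and only 200 men.
sample = {"women": {"n": 800, "favour": 480},   # 60% of women favour
          "men":   {"n": 200, "favour": 80}}    # 40% of men favour

total_n = sum(g["n"] for g in sample.values())
raw_rate = sum(g["favour"] for g in sample.values()) / total_n  # 0.56: inflated

# Weight each group so it contributes its true population share (50/50).
population_share = {"women": 0.5, "men": 0.5}
weighted_rate = sum(population_share[g] * (d["favour"] / d["n"])
                    for g, d in sample.items())  # back to 0.50

print(f"unweighted: {raw_rate:.2f}, weighted: {weighted_rate:.2f}")
```

Here the raw figure overstates support by six points simply because women are over-represented; weighting recovers the true 50%, but only if the researcher knows the population proportions and knows to apply them.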

Failure to select the correct respondents will completely skew the findings. Indeed, inexperienced organisations are wasting a disproportionate amount of time targeting the wrong audiences and as a result producing irrelevant responses that are simply not fit to support any business strategy. A recent survey about Ramadan, for example, would have delivered meaningless results if applied to the entire UK population; to be relevant to the specific questions being asked, the survey had to be targeted exclusively at Muslim respondents. This isn’t something that could be done using IP address location or ‘inferred’ demographics classifications.

This is one critical area of DIY research that organisations fail to address, in part because the more selection criteria an organisation opts for, the more expensive the DIY model becomes. Indeed, in many cases, by the time a truly representative and relevant survey panel has been selected, the cost differential between DIY and agency research is hard to spot. This is particularly relevant in B2B research, where you will often need more than the four screening questions allowed by Google Surveys, and where feasibility falls below 5%.


Poor scripting

Inexperience in scripting leads organisations to make basic mistakes: asking two questions in one, asking unclear questions that respondents cannot understand, or expecting a multiple response to a question but coding it to allow only a single answer. There is real skill in asking questions in the right way to deliver data that can be used to support the underpinning objectives or desired headlines.

Data analysis

Would-be researchers often lack the data manipulation knowledge necessary to analyse the results, and can make assumptions about sample sizes that simply are not statistically reliable. Basic percentage results can be interesting, but there are many other techniques, including regression analysis, weighting and indexing, that can provide a different insight into the data. Without this understanding, DIY researchers who fail to get the expected results will simply stop, missing out on any value from the investment. A credible market research agency will always have a plan B and find a way to interrogate the data for an alternative story that can support, for example, a marketing campaign.
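One concrete example of the sample-size assumptions mentioned above is the margin of error on a reported percentage. The short sketch below (standard textbook formula, illustrative figures) shows why a result that looks solid at the full-sample level becomes unusable once it is sliced into small subgroups:

```python
# Hedged sketch: the 95% confidence margin of error for a sample
# proportion, the check that reveals when a subgroup is too small
# to support a conclusion.
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for proportion p from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# A 50% result from 1,000 respondents is accurate to roughly +/-3 points...
print(f"n=1000: +/-{margin_of_error(0.5, 1000) * 100:.1f} points")
# ...but the same 50% from a 50-person subgroup swings by around
# +/-14 points, far too wide to support a strategic decision.
print(f"n=50:   +/-{margin_of_error(0.5, 50) * 100:.1f} points")
```

The formula is the standard normal approximation for a proportion; an agency statistician would also account for design effects and weighting, which widen these intervals further.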

The rise of the bot

Taking a far more proactive approach to defining the audience will also help address the growing problem of fake information and bots skewing research results. Diligent, ethical research organisations constantly use a variety of techniques to confirm the veracity of survey responses. Is an individual simply providing dummy answers in order to move on to the next stage? Is he or she actually in the claimed country of location? There is also growing evidence that AI and robots are entering the sample creation marketplace, and with organisations that actively check for this problem suggesting as many as 20% of responses need to be deleted, the DIY approach once again raises significant concerns about data relevance and quality.
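To make the idea of "dummy answers" concrete, the sketch below shows two of the simplest quality checks of the kind described above: flagging "straight-liners" who give an identical rating to every question, and "speeders" who complete the survey implausibly fast. This is an illustrative heuristic, not the actual method of any platform or agency, and the names and thresholds are invented for the example:

```python
# Illustrative sketch (not any vendor's real algorithm): flag responses
# that straight-line every rating question or finish far faster than
# the typical respondent.

def flag_suspect(response, median_seconds, min_ratio=0.33):
    """Return True if a response looks like dummy data."""
    ratings = response["ratings"]
    # Straight-lining: one identical answer across five or more ratings.
    straight_lined = len(ratings) >= 5 and len(set(ratings)) == 1
    # Speeding: completed in under a third of the median time (assumed cutoff).
    speeding = response["seconds"] < median_seconds * min_ratio
    return straight_lined or speeding

responses = [
    {"id": 1, "ratings": [3, 3, 3, 3, 3, 3], "seconds": 40},    # straight-liner
    {"id": 2, "ratings": [4, 2, 5, 3, 4, 1], "seconds": 120},   # looks genuine
    {"id": 3, "ratings": [4, 3, 5, 2, 4, 3], "seconds": 20},    # speeder
]
median_seconds = 180
suspects = [r["id"] for r in responses if flag_suspect(r, median_seconds)]
print(suspects)  # [1, 3]
```

Real panels layer many more signals on top of this, such as geolocation consistency, duplicate-device detection and open-text plausibility checks, which is exactly the expertise a DIY user rarely has.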


Online research has clearly become an incredibly effective component of the overall market research model. But good process, quality controls, a detailed knowledge of analysis techniques and an understanding of what constitutes a statistically significant sample are critical if the research is to be both credible and useful. Going it alone may appear far cheaper than outsourcing to a third-party agency, but cost is relative: how much is the business paying for this random data? How much would the wrong decision or message cost? Market research experts can sanity check the project’s objectives and ensure the script is correctly coded, the audience expertly selected and the data correctly analysed to meet the company’s objectives. And that is a great deal more valuable than any DIY approach.
