The PDPLA has several members who, as part of their academic training, have been taught how to conduct market research – yet whenever we create a survey, conduct research and report results, there is an understandable ‘you would conclude that, you are biased’ response from many. And they are right: however balanced and accurate we try to be, our sample will always be drawn from our membership (who tend to be among the best landlords in the area, given their higher level of interest and training) and, to be frank, we are not going to try to prove we are all ‘bad guys’ – why would we?
Based on our knowledge and experience, here are some questions to ask when reading about a survey:
- Look at who published the survey and ask what they stand to gain from promoting its conclusions.
- Question whether those who responded (the sample) are representative of the audience about whom conclusions are being drawn.
- Challenge the logic of the conclusions and their veracity.
- And obviously, check that the people reporting the research know what they are talking about.
Let’s take an example. ’24 Housing’ is a publication aimed at housing managers, local authorities and those responsible for social housing. It has just published some research under the heading and tagline: “Survey shows landlords ‘struggling’ to keep up with reforms: 30% didn’t understand HMO regulations, 28% in the dark over the abolition of Section 21, and 27% unaware of letting fees ban.” When we dig a little deeper, we see the survey was conducted by Market Financial Solutions (MFS), a provider of short-term (bridging) finance. We may be jumping to unfair conclusions, but more often than not you can assume the sample surveyed is ‘anyone on their distribution list who responded’. So in this case, the 400 who answered the survey are probably members of MFS’s email contact list, which means most are primarily property developers – and if they let any of their properties at all, they probably do so through a letting agent, as they are too busy ‘developing’ to be bothered with the hassles of letting to tenants directly. You would thus conclude that most of them do not need to understand, for example, HMO regulations.
You could go further and ask whether ’30% didn’t understand HMO regulations’ is a good or a bad thing. You can do a lot with statistics. If MFS had 400 respondents and 30% said they did not understand HMO regulations, then 280 did and 120 didn’t. Now, suppose this was representative of Portsmouth – just as an example. We have around 88,000 properties, of which around a quarter are rented – so let’s say 22,000. We also know that around 4,000 of those are HMOs (depending on which of the many available HMO definitions you choose), so roughly 18-20% of Portsmouth’s rented homes are HMOs. Assuming, for the sake of this example, that most landlords don’t mix their portfolios, that would mean around 20% of landlords NEED to understand HMO regulations and 80% do not. Given that the research effectively says 70% of their (non-representative) sample DO KNOW about HMO regulations, over three times more landlords know about HMO regulations than need to! See what I mean about being able to do a lot with statistics? These are exactly the same facts, but there is a lot of difference between ‘30% don’t understand HMO regulations’ and ‘3x more know about HMO regulations than need to do so’.
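The back-of-the-envelope arithmetic above can be sketched in a few lines of Python. The Portsmouth figures are the rough, illustrative estimates quoted in the text, not official statistics, and the simplifying assumption (landlords don’t mix HMO and non-HMO portfolios) is carried over as-is:

```python
# Reframing the same survey result two ways, using the
# illustrative estimates from the text (not official figures).

respondents = 400
dont_understand_share = 0.30          # MFS survey: 30% didn't understand HMO rules

understand = round(respondents * (1 - dont_understand_share))  # those who did
dont = respondents - understand                                # those who didn't

# Rough Portsmouth estimates quoted in the text
total_properties = 88_000
rented = total_properties // 4        # about a quarter rented, ~22,000
hmos = 4_000                          # depending on the HMO definition used

# Share of rented homes that are HMOs, i.e. the share of landlords
# who (under the text's simplifying assumption) NEED to know the rules
need_share = hmos / rented            # ~0.18

know_share = 1 - dont_understand_share            # 70% claim to know
ratio = know_share / need_share                   # how many more know than need to

print(f"{understand} understand, {dont} don't")
print(f"~{need_share:.0%} of rented homes are HMOs")
print(f"{ratio:.1f}x more landlords know than arguably need to")
```

Run as written, this prints a ratio of about 3.9 – the same facts as ‘30% don’t understand’, reframed.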
The other problem we all face is that regulations change so frequently that even the most committed have trouble keeping up. In the example given, the journalist reported ‘28% in the dark over the abolition of Section 21’ – mildly amusing, as Section 21 has not been abolished, so you have to ask what the other 72% of respondents (288, if the sample was 400) were thinking. The headline could equally have been ‘72% of landlords have no idea’, which would have been just as accurate based on this research… The real point here is that you rarely see the questions and have to try to guess from the summary results given, and more often than not the questions are badly structured, lead the respondent or misguide them completely – and sometimes they are based on untruths.
So what is our advice? If you see ‘survey results’, at the very least take them with a pinch of salt. All too often we are guilty of confirmation bias – the tendency to search for, interpret, favour and recall information in a way that confirms one’s pre-existing beliefs or hypotheses – and we need to guard against it. If we do rely on research, we should read the original source and assess how accurate it is likely to be, based on the questioner, the sample and the questions. Anything summarised in the press (PDPLA news excepted) is likely to be inaccurate or just plain wrong.