Reflecting on data collection

March 9, 2017

It has now been a month since we finished our main phase of data collection (using the VisionsLive discussion board), which involved asking people, in some detail, about their home life and experiences of living with renewable heating technology. We had 30 participants from two countries simultaneously using an online platform over a period of 7 days – quite a logistical feat! Luckily we didn’t develop the software, only the content, and likewise we got some help with the recruitment. However, it is only now, as we recover and reflect, that we are able to write up this experience in a coherent way. Fortunately we kept field diaries, and have a [very!] long chain of email exchanges between ourselves about changes we’d made to the content or questions we were asking, and the approaches we took when responding. For our own benefit, and because some of you may be interested given the novelty of our approach, we decided to try to blog about our reflections on data collection, so here goes…

First of all, we want to talk about our participants. All social scientists quite rightly give credit to their participants and we are no different. The detail they provided us with was simply astonishing, and the time and effort they gave to their contributions were often overwhelming. We were amazed by the diversity of experience and the openness of our participants. The richness and quality of the data we got were better than we expected, which we attribute in part to the nature of collection (more about this below) and in part to the fact that participants perhaps felt relatively anonymous. In fact, on average participants responded to 97% of questions, many of which were quite sensitive (e.g. household income, sensitive household activities) [we have statistics on completion rates thanks to the software]. Of course, we recruited participants who had technology installed [purposive sampling] and hence would be prepared to talk about it, but we were still often taken aback by the level of detail they provided and their willingness to upload and share photos. That we didn’t take on all of the recruitment ourselves was a significant advantage, particularly since we had a very specific sampling framework and were working across two countries. It also helped us to get participants who would all contribute in the same 7 days.

The nature of the data we received varied according to the type of question and when it was asked. For instance, we found that participants found it much easier to give detail about their everyday life and routines than about the technology they had (although some did give a lot of information about their technology). This surprised us slightly, as we thought participants might wonder why we were so interested in their life and routines when the study was about renewable energy. It may be that these topics were simply easier to talk about, or that the questions about technology came later in the week, when participants were suffering slightly from question fatigue. Of course, what people don’t say can be just as interesting as what they do say! Despite this, on the whole we were pleased with the amount and extent of the answers we got, especially for those questions that required participants to do some ‘homework’ (e.g. look up their EPC rating or find out other information about their homes/technologies).

Compared to other ways of collecting data that we’ve used in the past (primarily interviews and house tours), this format allowed participants to engage at moments in the day or week that suited them, and to return to points over the course of the week. The method therefore offered greater flexibility for participants than traditional offline methods, not least because they could use a smartphone app (26% of responses came from a smartphone). We were also able to provide prompts and ask for further information when required (during the 7 days we prompted/posted 328 times: Louise 258, Katherine 70). We found the work of prompting less natural and instinctive to begin with, and we did struggle to keep on top of it towards the end of the week, when we had lots of data and so many participants were actively contributing to the board at the same time. Sometimes these prompts were fairly generic and we used the same one for a number of participants (e.g. ‘please explain why you think this is’), but more often they were specific to the individual and the post. We also found that when we thanked participants for their contributions after they posted (particularly if it was a photograph, video or a link), they were much more likely to make additional and fuller contributions or add another photograph. Moreover, the software allowed us either to view responses to a particular question or, more interestingly, to look at all the entries by participant. This flexibility was great: it allowed us to grasp the consensus view (if there was one), see what other participants had said, and use this in our prompting. Looking at a single participant’s entries also let us understand each answer in the context of that person’s story, which made it easier to make connections between their different answers and to ask more probing questions.

Managing the contributions and responding to participants when they added new material was challenging and more work than we had perhaps expected. Although we had anticipated it being a busy week, and adjusted our diaries accordingly, we were probably a bit naive about the time and effort it would take to stay on top of things. Having additional moderators would certainly have helped, because as we were reading and responding to comments the number left to read just kept growing. For instance, we could log on at 8pm to over 500 responses and, even after 2 hours of reading and responding to participants, upon exiting the feedback page we would still have over 300 new posts – including follow-ups to our recent prompts. Indeed, we were very aware that we had to make the most of the 7-day period and so spent a lot of time going over previous entries and following up points with participants. As we were doing this, we were also taking time to revise, reword or change the following day’s questions according to the feedback we were getting (sections of the board were released a day at a time so as not to overwhelm participants). Although it was a lot of work, it was enjoyably addictive too: we both found ourselves spending hours online checking entries and interacting with participants.

Our other reflection on the software concerns the extent to which answers were visible to all participants. We did experiment with making some answers open to everyone, so that all participants could see them and comment on them, hoping that participants would interact with each other and initiate conversations we perhaps wouldn’t have thought of. In reality, however, this made people more hesitant to respond, or they would just say ‘same as above’ and not add further detail. This was quite disappointing and, as a result, we didn’t use this function as much as we had expected to.

Despite piloting our boards, there are things that, knowing what we know now, we would do differently in the future. Instead of having all participants together on one board, we would probably split them up according to the type of technology they had. We would also take more care to understand exactly what information the recruiter gave participants (although we did provide detailed guidance), and give the recruiter more information to help distinguish between participants (e.g. there was some confusion between solar thermal and solar PV, and between biomass boilers and wood-burning stoves). We also made some mistakes and left out some fairly fundamental questions, presuming that the recruiter would capture this information; thankfully we were able to get it during the 7-day period.

Overall, even though we’ve yet to really immerse ourselves in the data analysis, we are pleased with how it went and feel we have some excellent data. The ability to prompt and interact with participants, whether immediately when we were online or after taking the time to consider what we wanted the prompt to be, was incredibly useful and we would certainly use this method again.

Next stop… analysis.
