Thinking through the challenges of building an evidence base for edtech
Most Sunday afternoons are spent in limbo as we reminisce about the too-short weekend and think of everything that needs to be done in the coming week. Not this particular week, though! In the run-up to the Reimagine Conference, Imperial Business School was buzzing with experts and enthusiasts from across the education sector, there to attend exciting workshops run by leading experts in their fields.
The EDUCATE workshop was designed from the outset to truly learn from the attendees, each with their unique experiences and hailing from all parts of the world. It is not often that you have such an opportunity! In advance, we had constructed groups of six people representing different sectors of the educational space: higher education, edtech founders, policy-makers and educational researchers. This mixture of experiences and perspectives was the catalyst for the discussions and ideas generated.
With some shuffling done, and groups equipped with their arsenals of sharpies and post-it notes, they were ready to shed light on the workshop’s central question: what do you understand “an evidence base for Edtech” to be? Ruminating on those group discussions and revisiting those sheets of A3 filled with (mostly legible) post-it notes, we thought we would share some of the outcomes with the wider industry by organising the ideas thematically.
“Trust(ed) evidence?”
The first thing we often think of when we are given some sort of evidence is the source – who has produced it? It is hard to trust any evidence (or its implications) if we do not ultimately trust its source. As some of our attendees noted, the highest level of authority one can have in the academic context is for the evidence to have been published in a “peer-reviewed research journal”. But consider this: after a long day at work, two teachers are talking, and one is telling the other about the latest edtech tool they have implemented in their classroom and the positive results they have begun to see – their “self-reported beliefs, perspectives, etc”. Any published piece of research evidence may well be a valuable asset to the edtech company and its existing users, but how effective is it in helping new schools decide whether to adopt the technology, compared with word-of-mouth between teachers in a local area? Often, we trust the advice and thoughts of those in relatively similar contexts. This poses a challenge to edtech companies concerning the sort of research they should conduct and how it is best communicated so that it may be trusted by the people who should see it.

The options are numerous and varied, as was demonstrated by the group discussions. Other sources of evidence cited by our participants were: conference presentations; government websites, reports and presentations; school websites built by faculty members; reviews from NGOs, non-profits and national organisations; social media posts; publicly available sales figures; simple usability studies; and small-scale pilot studies carried out by the edtech companies themselves.
“Edtech products need back-up to sell”
At the end of the day, edtech companies are, well… companies! To survive, they need to be able to sustain themselves and grow by selling their product or service, or by receiving funding from investors. When setting out to build evidence for the effectiveness of their product, they have to justify the time and expense to the many stakeholders involved. And so, the evidence is often considered a “back-up” to increase sales. But there is much more to it. Education can truly transform people’s lives, and companies therefore have an ethical duty to, as one group put it, “pro[ve] that it ‘does no harm’”. However, the exercise of conducting research is tainted when the evidence collected is treated solely as material for marketing. As one group put it, “what counts as evidence depends on the stated objectives”, both of the party generating the evidence and of the audience reviewing it. The challenge? How can we be honest about the way we conduct our research and shift from the business mindset of “selling more” to making sure we are delivering the best possible service?
“If we’re talking ‘ed-tech’ then at least a part of the evidence must relate to learning”
Collectively, we know that much of the time we produce research and look for evidence to satisfy the many stakeholders involved. The tough thing with edtech, and education in general, is that very often the learner is not the one who pays for the service. Too often, we are interested in numbers like “engagement” and “users”, which frankly are linked to business growth rather than improvement in one’s education. A question raised in the discussion was “data at [the] level of individual vs. scaleable interventions”. Should we be more focussed on reaching a huge sample size for RCTs, or should we instead spend much more time researching at the level of the individual, critically assessing the associated learning process and the context in which it is occurring? As it was rightly said, “evidence for edtech should come from users” – it is imperative to bring the learner’s voice into it. Context is important here too. One group highlighted the need for a product to have “been tested by a diverse group of students & teachers”, enabling potential users not only to know if it ‘works’, but whether it is likely to work in their context. So the challenge is this: as edtech entrepreneurs try to satisfy the needs and wants of the stakeholders around them, too often the most important person, the learner for whom the product is being developed, is forgotten!
So what next?
At the heart of the discussions was the need for the three communities (educational researchers, the edtech industry, and educators and learners) to have ways to communicate with each other and enhance the quality of the conversation about the evidence base for edtech. The different perspectives bring much-needed rigour and depth to the conversation and, ultimately, to the quality of the research that is conducted. This needs everyone to up their game! Important initiatives such as UCL EDUCATE, which supports edtech companies in learning how to find, interpret and generate their own research evidence, are doing much to help create an evidence base that works for all!