Big tech headlights

Listen (7 mins) or read (5 mins)

Whether it’s non-existent problems, unscalable solutions or a lack of imagination, we need to be careful about what educational technology appears to promise.

I have written before about how easy it is to get dazzled by shiny tech things and, most dangerously, to think that those shiny things will herald an educational sea change. More often than not they don’t. Or if they do, it’s nowhere near the pace often predicted. It is remarkable to look back at the promises interactive whiteboards (IWBs) held, for example. I think I still have a broken Promethean whiteboard pen in a drawer somewhere. I was sceptical from the off that one of the biggest selling points seemed to be something like: “You can get students up to move things around”. I like tech but, as someone teaching 25+ hours per week (how the heck did I do that?), I could immediately see a lot of unnecessary faff. My experience in schools and colleges suggests they are, at best, glorified projectors that rarely fulfil their promise. Research I have seen on impact tends to be muted at best, and studies in HE like this one (Benoit, 2022) suggest potential detrimental impacts. IWBs for me are emblematic of much of what is often wrong with the way ed tech is purchased and used: big companies selling big ideas to people in educational institutions with purchasing power and problems to solve but, crucially, at least one step removed from the teaching coal face. Nevertheless, because of my role at the time (‘ILT programme coordinator’, thank you very much) I did my damnedest to get colleagues using IWBs interactively and at all (I was going to say ‘effectively’) other than as a screen, until I realised that it was a pointless endeavour. For most colleagues the IWB was a solution to a problem that didn’t exist.

Close-up of Oldsmobile headlights in monochrome

A problem that is better articulated concerns the extent of student engagement, coupled with tendencies towards uni-directional teaching and passivity in large classes. One solution is ‘clickers’. These have in fact been kicking around since the 1960s and foreshadowed modern student/audience response systems like Mentimeter, still sometimes referred to as clickers (probably by older generation types like me). Research was able to show improvements in engagement, enjoyment and academic performance, as well as useful intelligence for lecturing staff (see Kay and LeSage, 2009; Keough, 2012; Hedgcock and Rouwenhorst, 2014), but the big problem was scalability. Enthusiasts could secure the necessary hardware, trial use with small groups of students and report positively on impact. I remember the gorgeous aluminium cases our media team held, containing maybe 30 devices each. I also recall the form filling, the traipse to the other campus, the device registering and the laborious question authoring processes. My enthusiasm quickly waned and the shiny cases gathered dust on media room shelves. I expect there are plenty still doing so, and many more containing gadgets and gizmos that looked so cool and full of potential but quickly became redundant. BYOD (bring your own device) and cloud-based alternatives changed all that of course. The key is not whether enthusiasts can get the right kit but whether very busy teachers can, and with dedicated hardware the results-versus-effort balance sheet only ever favours the former. There are of course issues (socio-economic, data, confidentiality and security, to name a few!) with cloud-based BYOD solutions, but the tech is never going to be of the overnight-obsolete variety. This is why I am very nervous about big-ticket kit purchases such as VR headsets or smart glasses, and very sceptical about the claims made about the extent to which education in the near future will be virtual. Second Life’s second life might be a multi-million pound white elephant.

Finally, one of the big buzzes in the kinds of bubbles I live in on Twitter is about the ‘threat’ of AI. On the one hand you have the ‘kid in the sweetshop’ excitement of developers marvelling at AI text authoring and video making, and on the other doom-mongering teachers frothing about what these (currently massively inflated) affordances offer our cheating, conniving, untrustworthy youth. The argument goes that problems of plagiarism, collusion and supervillain levels of academic dishonesty will be massively exacerbated. The ed tech solution: more surveillance! More checking! Plagiarism detection! Remote proctoring! I just think we need to say ‘whoa!’ before committing ourselves to anything and see whether we might imagine things a little differently. Firstly, do existing systems (putting aside major ethical concerns) for, say, plagiarism detection actually do what we imagine them to do? They can pick up poor academic practice, but can they detect ‘intelligent’ reworking? The problem is: how else will we know what someone has written themselves? But where is our global perspective on this? Where is our 21st-century eye? Where is acknowledgement of existing tools used routinely by many? There are many ways to ‘stand on the shoulders of giants’, and different educational traditions value different ways to represent this. Remixes, mashups and sampling are a fundamental part of popular culture and the 20s zeitgeist. Could we not better embrace that reality and way of being? Spellcheckers and grammar checkers do a lot of the work that would have meant lower marks in the past, but we use them now unthinkingly. Is it such a leap to imagine positive and open employment of new tools such as AI? Solutions to collusion in online exams offer more options, it seems: 1. scrap online exams and get them all back in huge halls, or 2. [insert Mr Burns gif] employ remote proctoring. The issues centre on students’ abilities to 1. look things up to make sure they have the correct answer and 2. work together to ensure they have a correct answer. I find it really hard not to see that as a good thing and an essential skill. I want people to have the right answer. If it is essential to find out what any individual student knows, our starting point needs to be re-thinking the way we assess, NOT looking for ed tech solutions so that we can carry on regardless. While we’re thinking about that, we may also want to re-appraise the role new tech does and will likely play in the ways that we access and share information, and do what we can to weave it in positively rather than go all King Canute.

Benoit, A. (2022) ‘Investigating the Impact of Interactive Whiteboards in Higher Education: A Case Study.’ Journal of Learning Spaces

Hedgcock, W. and Rouwenhorst, R. (2014) ‘Clicking their way to success: using student response systems as a tool for feedback.’ Journal for Advancement of Marketing Education

Kay, R. and LeSage, A. (2009) ‘Examining the benefits and challenges of using audience response systems: A review of the literature.’ Computers & Education

Keough, S. (2012) ‘Clickers in the Classroom: A Review and a Replication.’ Journal of Management Education


5 reasons why Mentimeter works so well

[if you have never seen or used Mentimeter then a quick look here may help]

[Listen (11 mins) or read below]

When it comes to tools that will do a teaching and learning job, there is a world of dedicated educational technology and ‘productivity’ tools to choose from. I’m very much an experimenter and a fiddler. If I see someone using or referring to a website or tool that looks interesting in a meeting or at a conference, I am there in seconds, signing up, playing around and making judgements via what has become a staple needs-and-preference-focussed filtration system. In broad terms I like to be able to try things for free, for the tool to be relatively intuitive and straightforward and (most important) fit for either pre-defined or imagined teaching or assessment purposes. I have written more about the how and why of this with my former colleague Dr Timos Almpanis here and in chapter 4 of this collection. I try not to evangelise, and I am very much of the school that would argue that purpose, rather than any given tool, should be the starting point for discussions about integrating digital approaches, but there’s something about what Mentimeter can do and how it does it that means I do sometimes slip into ultra-enthusiast mode. Unlike a lot of tech approaches and tools that pass the initial ‘free, easy, fit for purpose’ test, there’s something about the breadth of purpose that Mentimeter is fit for, and its intuitiveness, that for me makes it a class above other tools. Also see here, where Chris Little at Keele made this point a while back, and here for an evaluation of a number of student response tools, including Mentimeter, from when a colleague and I at Greenwich were tasked with identifying the best institution-wide student response system.

1. It’s not hardware dependent

Like a lot of people similarly enthusiastic about the opportunities for enhancing student interaction and engagement with digital technologies, I spent a lot of time (much of which was ultimately wasted) focussed on hardware. From interactive whiteboards to in-class iPad sets to PDAs and ‘flipcams’, the issues that directly impeded the scaling of use, as well as my own enthusiasm, were related to one or more of the following:

  • Amount of training needed
  • Device security and storage
  • ‘Just in time’ access limits
  • Responsibility for maintenance
  • Rapidity of obsolescence of kit

In my view, all of these factors afflicted ‘clickers’ (voting pods that were handed round in face to face sessions). As revolutionary as they promised to be, they were only ever used by the few, despite the gleaming aluminium cases and the sumptuous foam inserts that the clicker devices sat in. The BYOD reliance on users’ own devices when it comes to cloud-based software alternatives like Mentimeter means that:

  • People usually know how to use their own devices or at least access the internet
  • Device security, maintenance, updating is not an issue
  • They are, by definition, available; turning an oft-cited teacher frustration of mobile device distraction into a potential virtue

‘What if students don’t have a device?’ is a common question but, like many things in this domain, it’s largely about framing. I always make participation optional, making it clear it’s ‘if you have a device on you’ in a face to face setting, or ‘if you have a big enough screen or a separate device nearby’ if online, and I frequently subvert the assumption that responses need to be individual by preceding voting with group-based discussion, with one person per group responding.

I have moved between institutions in the last year and both have invested in a site licence and access to the full suite of tools and functionality Mentimeter offers. This privilege is something that must be acknowledged, so it’s certainly not ‘free’ any more (though I personally pay nothing, of course!), but the free version is still relatively generous. In my view it’s an exemplary freemium set-up: just on the right side of frustrating, in amongst the persuasive.

2. It’s a slide/presentation tool that has many merits in its own right

One thing that is often missed, because the ‘point’ of Mentimeter is interaction, is how well it works as an alternative to PowerPoint as a presentation tool. Even though PowerPoint remains the default across higher education for slide production (even during that weird period when everyone was doing Prezis!), for the most part colleagues seem to struggle to break from the desktop app habit. As a consequence, sharing of slides becomes an upload/download faff or, even if sharing is managed via MS Office cloud storage, there are often restrictions on who can view. Mentimeter generates a link, so the first benefit is that slides can be shared as easily as any website link. Secondly, the participation link enables students to see the slides (including the detail of pictures) on their own devices in real time (as well as, or possibly INSTEAD OF, on the main screen). Thirdly, the author interface is simple, there is a variety of slide types and styles, and the copyright-free image gallery is easy to use, as is the ALT text prompt. Fourthly, the ability to add simple interactions (e.g. thumbs up or down) means that students can be invited to contribute even to content-delivery-type slides by, for example, agreeing or disagreeing with a controversial idea or quotation. The slides have more limited space for text, and this (to some a limitation) is an excellent discipline when preparing slides: it minimises the text and challenges the tendency many of us have to use too much of it.

The editing window of Mentimeter showing a bulleted slide with a copyright free image and also the content slide types available

3. The participation and interaction options are substantial and adaptable

As I described in a previous post, there have been a few occasions where students chose notoriety over maturity and tried to undermine sessions by being abusive in open text questions. This led to something of a knee-jerk response from some colleagues, who questioned whether the tool should be supported or used at all. Much like when (way, way back) access to YouTube was banned for all students AND teaching staff in a college I worked in because ONE student accessed a (seriously) inappropriate video. The sledgehammer/nut response was not the way to address things, not least because Mentimeter’s existing tools and functionality enable users to avoid and tackle such behaviours. So, if open text questions are used there are ways of monitoring and filtering content (including a profanity filter), and of the ten interaction/question types only three are open text. To grasp this, however, does often necessitate more than superficial exploration and experimentation (or coming to one of my hour-long workshops!). One thing I commonly do is encourage colleagues to consider how they might eschew the favoured word cloud and open text formats and find ways of fully exploiting the lesser-used types. In addition, it’s important to think about how the interactions are presented and managed. A well-designed question can be an excellent vehicle for prompting discussion prior to ‘voting’, or a prompt for analysing/rationalising responses that have already been offered.

Mentimeter authoring dashboard showing all the question types available

4. Frequent updates and improvements

There’s no resting on laurels with Mentimeter, and there does seem to be acknowledgement of user requests. For example, the ability to embed YouTube video in slides is a real blessing and, if using Mentimeter as a slide tool as well as for interaction, further minimises shifting between tabs or different software. The recently introduced collaborative authoring of presentations was much requested at UCL and enables more efficient working, in addition to its collaborative potential. A very recent and welcome improvement is the ability to have active hyperlinks (in both participation and presentation modes). The ‘Mentimote’ tool, which allows you to use your smartphone as a slide clicker, moderation tool and presentation embellisher, has also recently switched from beta to ‘fully fledged’ mode and works very well, especially for live in-person events.

5. When Covid came, Mentimeter was equipped to adapt.

The default pace setting in Mentimeter is ‘presenter paced’: the presenter advances the slides and only then can participants see them. This is very much in keeping with how Mentimeter (presumably) was conceived and how many users regard it. However, the non-default option (‘audience paced’) allows slide collections with interactions to be accessed at the audience’s own pace. When lessons switched online almost across the board, it was common for academic colleagues to take the intuitive approach and try to replicate face to face teaching in online environments via Zoom, Teams or Collaborate, often trying to incorporate Mentimeter slides too. Whilst this is do-able, and it is something I routinely do myself, the complexity, and the mental and actual bandwidth this extra layer demanded of already struggling staff and students (struggling with kit, with space, with the implications of Covid), meant that it often felt unsatisfactory. Alongside my and colleagues’ recommendations to rethink how online time could be exploited and optimised, I encouraged colleagues to think about the possibilities of using Mentimeter asynchronously. Encouraging participation ahead of a session and then presenting the results in the session removes much of the faffing and device and screen changing, but students still have buy-in to the content. When I came to my current post it was fascinating to see how colleagues in similar positions to my own, such as Dr Silvia Colaiacomo, were saying the same thing here.

If you want to read more of my thoughts about Mentimeter, see this post and also this collaboration with two former colleagues (Dr Gerhard Kristandl from Greenwich and paramedic extraordinaire Richard Ward, who is at Cumbria).

Here, too, is a video case study I made with a colleague and student from the Division of Psychiatry on academic and student use of Mentimeter.

Colleagues at UCL interested in using Mentimeter can start here: