Building new cultures: A visit to UCL East (Marshgate)

A panoramic photorealistic illustration of the UCL East campus showing Marshgate and Poole Street as well as the Mittel Orbit

I have worked in primary and secondary schools, further education colleges, overseas colleges & universities and at three UK universities. So much of the identity of all those institutions was not only woven into the fabric of the buildings but, in many ways, defined and moulded by the spaces themselves. As an alumnus of Thames Poly (aka University of Greenwich, my former employer), I experienced a rough-round-the-edges, radical, class-conscious undergraduate degree that oozed 60s idealism from the concrete edifice plonked awkwardly on top of a bunch of shops in Woolwich in South East London (this wasn’t in the 60s by the way; I’m not that old). That block is now flats and, with the change in name, the poly has become a uni and has moved the largest part of its operation to the Old Royal Naval College in Greenwich. How much of its current identity is wrapped up in the huge selling point that is those buildings? How much of that cultural capital seeps into the subconscious of the staff and students as well as into the conscious marketing, framing and ongoing aspirations? Frankly, the place I studied at and the place I used to work at couldn’t be more alien to one another.

I have seen new buildings and radical re-fits, from a ‘college without walls’ (disaster!) to a construction hub that was almost all atrium along with its too-small triangular classrooms. When I moved to UCL I loved the post-lockdown opportunities to visit many of its Bloomsbury buildings: the old, the repurposed and the refurbished. Many of the spaces are locked into pre-20th-century architecture or, like the IoE buildings, echoic of late 60s/early 70s modernist design (and protest!). Limitations on space and light in particular are ‘themes’, so I was delighted to be able, with a dozen or so colleagues, to visit the Marshgate building of the new UCL East campus yesterday. Due to open in September 2023, it is paired with the One Poole Street building, which is both study and living space (this one opens in September ’22). On the site of the Olympic Park and a stone’s throw from West Ham United’s new stadium (you can’t have everything I guess), the broader site is welcoming, feels quite safe and is very accessible by public transport and bike.

The helter-skelter-like building in Stratford’s Olympic Park in London is a messy red scaffolding-like structure and it sits on the right of this picture. The left is dominated by a huge concrete structure of eight floors with a wall of narrow windows that is the Marshgate building

UCL East Marshgate building with West Ham stadium and ‘Mittel Orbit’

The UCL site has all the accurate information you could want about location, development, opening and maps by the way, so I won’t bother repeating any of it here!

Our tour didn’t include an inside visit to the ‘One Poole Street’ building as it is very close to completion and will be used from September this year. The twin block is multiple storeys of student accommodation atop teaching spaces, which include a cinema that it is hoped will also be a community venue. The landscaping, shifting of bus stops (the 108 route) and general polishing were very much in evidence. At least one of us was heard to say: “I’d like to live here!”

Poole Street building – this is two towers of accommodation more than 15 storeys high on top of flatter blocks which contain classrooms and other usual university things

One Poole Street building (formerly known as Poole Street West)

Our tour was led by Helen Fisher who is the UCL East operations lead and we were accompanied by the UCL East Director, Professor Paola Lettieri, as well as representatives from the building contractors. Three of my colleagues from Arena were there and, after a ‘short’ delay while one of our sub-party was found some trousers, we trudged to the entrance in our hard hats, boots and protective gloves.

UCL name in relief in front entrance of Marshgate building at UCL East

Entrance to Marshgate building showing UCL logo in relief

front view of as yet unfinished marshgate building at the new UCL East campus

‘Front’ view of Marshgate though over time this may not be the main entrance

Whilst it’s quite hard to visualise how some of the spaces will look, being there really helped me to appreciate the scale and the vision. The thing that I kept coming back to as I listened to Helen and others enthuse about the realisation of the spaces is how much working cultures could be defined by the way the building (and the wider environs) is shaping up. The thought that has gone into the teaching spaces, the design and centrality of inter-disciplinary spaces, the community-focussed spaces and possible activities, the value placed on wellbeing & access and the importance of communal space, light and views for both staff and students all permeated the framing of the tour and the conversations within the group. While debates rage in the wider world about the various pros and cons of working from home and returning to offices, I couldn’t help thinking: “This place would pull me in of a morning”.

A trip up in the goods lift (some might say appropriately enough) saw us to our first stop on the 7th floor and the communal space. The light from the windows coming in on the left in the picture below will be supplemented by the central pool of light from the glass-not-glass roof in the central section, which will supply light within the whole building. It was a shame this was still covered as it will likely change the look and feel incredibly. The glass panes in the windows are designed in such a way as to prevent the summer sauna/winter freezer effect of other buildings many of us are familiar with.

A large space with high ceilings and pillars, still clearly under construction, but with a wall of windows just about visible

7th floor staff communal space – the wall of windows sits in the void at the top of the building that can be seen in the image above

On the 6th floor we walked through shared lab spaces and looked at some teaching rooms. Space and light define the lab, office and communal spaces while the classrooms pull light from the ‘core’.

View from window of part of the Olympic Park – the canal, greenery, buildings and the construction of the London College of Fashion

View from a shared lab (I think!)

Escalators, as yet still being fitted, lead from the second to the fourth floor

Escalators that will take staff and students from the ground to the second floor. Another set will enable swift movement from 2nd to 4th.

As I understood it, along with stairs there are lifts (and toilets) in each corner of the building, but swift access to the student communal spaces and library is facilitated by the floor-skipping escalator system. One thing I very much liked was the ‘picture window’ below. Apparently, this is designed in such a way that it can be opened as a bridge to other buildings as and when they are completed.

Picture windows with view blocked by building works

Picture window at far end of the library and study spaces

We saw standard classrooms, a sort of ‘executive’ conference venue and audio-visual creator spaces as well as spaces for film that had some colleagues drooling! We didn’t see the object-based learning spaces or the in-situ ‘museum’ space but, as a former history teacher, knowing these will be a feature of a planned, authentic pedagogic approach is very exciting. The large lecture hall, with seating designed for whole and small group activity, will I hope contribute towards shifts away from overuse of the uni-directional, ‘classic’ lecturing style.

A lecture hall space that is still being built. No seating yet in this large, wide space. The exposed pipe work and scaffolding make it hard to visualise what the space will look like.

Large lecture hall

As we worked our way down the building we eventually arrived at the main hall. This will have public spaces and commissioned artwork and will double as a gallery space. I wondered as we went round how names for the spaces will settle into common use. No doubt the architects and the UCL East staff will strongly influence things, but I wonder whether names like the often-heard ‘the Mez’ will become well established or whether, as staff and students arrive, new names will emerge.

View of the entrance hall of Marshgate building on the ground floor showing concrete pillars and a huge open space

Entrance hall (currently being fitted with underfloor heating)

We rounded our visit off with a walk around the Olympic Park, the cultural district of which is being branded as the ‘East Bank’. The morning was completed with lunch in the sun with colleagues at ‘The Breakfast Club’ in ‘Here East’, which I only recently discovered has been a UCL presence in Stratford for some time. Maybe it was the sun, the lunch and/or spending more relaxed time with colleagues, but I came away feeling very positive about UCL East. The possibilities are definitely there.

Sharing rationales and managing expectations

On the one hand we hear a lot of talk about co-creation, dialogue and partnership with students in HE. On the other we witness the (in many ways completely understandable) persistence of uni-directional, risk-averse and more conservative approaches to teaching, learning and assessment. In academic development and digital education it is not uncommon for us to hear something along the lines of ‘I tried X, students hated it so I reverted as swiftly as possible to the tried and tested’. The myth of academic autonomy and the marketisation of education, along with both actual and perceived expectations of students, contribute to stifling drives to innovate. Too often, when we do change things up a bit, we miss a simple way of managing the expectations these drivers create: rationalising our approaches. In this short video my colleague, Dr Alex Standen, explains why she thinks this is fundamental.

 

Transcript

Can ‘ungrading’ change the way students engage with feedback and learning?

Dr Eva Mol & Dr Martin Compton – summary of a paper presented at the UCL Education Conference, 6th April 2022

‘Ungrading’ is a broad term for approaches that seek to minimise the centrality of grades in feedback and assessment. The goal is to enable students to focus on feedback purely as a developmental tool and to subvert the hegemony and potentially destructive power of grades. At one end of the scale, ungrading means completely stopping the process of adding grades to student work. A less radical change might be to shift from graded systems to far fewer gradations, such as pass/not yet passed (so-called ‘minimal grading’).

You don’t fatten the pig by weighing it

In addition to the summary offered in this post, we began with the definition above and encouraged colleagues to consider critiques of the existing grading-dominated zeitgeist in terms of reliability, validity and fairness. Grades become a proxy for learning in the minds of both students and lecturers, and a huge distraction from the potential of feedback and of genuine dialogue about the work rather than about the percentage or grade letter appended to it.

Grades can dampen existing intrinsic motivation… enhance fear of failure, reduce interest, decrease enjoyment in class work, increase anxiety, hamper performance on follow-up tasks, stimulate avoidance of challenging tasks and heighten competitiveness (Schinske & Tanner, 2014)

We summarised the range of possibilities for those interested, from simply talking about the threats and potential detrimental effects of grades (as well as their perceived benefits) through to wholesale, systemic change.

  • Scepticism/ discussion/ dialogue
  • Piloting no grades on small or low stakes work
  • ‘Conceal’ grades in feedback
  • Discuss (even negotiate) grades after engagement with feedback
  • Designing out grading
  • Students collaborate on criteria
  • Grade only for final summatives
  • Minimal grading (e.g. Pass/ fail)
  • Remove grades for early modules or years
  • Students self-grade
  • All students graded ‘A’
  • Institutional level – no grade policies

One of my ungrading experiences (Eva Mol)

This is based on teaching I did at Brown University (Providence, Rhode Island), with a classroom of graduate and undergraduate students from archaeology and philosophy. I decided to give them all an A (the highest mark possible) before the class started.

What did I learn about students?

  • Initially it was a shock to get students out of the system of marks! For most it is really their only mode of thinking about progress and learning; some wondered why take a course at all if there was no mark (which I think is very disconcerting).
  • However, this shifted quickly from shock to viewing the class as a few hours of relief from the system, followed by less anxiety, more experimentation, and students thinking freely and critically both about the system, as well as what they wanted to achieve in a course.
  • Much more engagement with the content of the course material and weekly readings
  • Discussions were more lively as there was less performance anxiety; students were more personal as well.
  • They set their own personal goals for the class, and I as instructor helped them achieve these. The goals were a variety of things: speaking at a conference, writing a blog, writing an article. At the end, they realised they got much more out of the course than they ever could with just a mark.

What did I learn as a teacher?

  • It is not less work! I still had to read what my students wrote, respond to emails and give feedback. But it is really different and much more enjoyable work: when not reading in the context of how writing scores against a grading scale, you can allow yourself to appreciate what students accomplish in their writing.
  • Writing feedback comments was much more rewarding because it was not about justifying the mark for the administration but about how you can help students improve; and because there is no mark involved, students actually read the feedback.
  • It made me a more engaged instructor, more flexible, creative, and more relaxed.
  • Because I could be flexible, I was much better equipped to deal with building in equity and inclusion.
  • It also forced me to critically reflect on the relationship between grading and teaching, contextualize how we have normalized the artificial frame of numerical feedback, and look for alternatives aimed at my personal pedagogy.

I felt empowered to question all aspects of the folklore. Why am I assigning a research paper even though it’s always a disappointment? Why do I care whether students use MLA formatting correctly down to the last parenthesis and comma? (I don’t.) Why should I worry about first-year writing as a course meant to prepare students for the rest of college? Why can’t I have autonomy over what I think students should experience? (Warner 2020, 215).

Now is the time

The pandemic showed that we can change if necessary; perhaps now is the right time to reflect on the system. We have an opportunity to shift the way students feel about their own learning and move away from more traditional words associated with grading.

The classroom remains the most radical space of possibility in the academy. (bell hooks 1994)

—————————————————

References and more about ungrading

bell hooks (1994). Teaching to Transgress: Education as the Practice of Freedom, New York: Routledge.

Blum, S. and A. Kohn (eds.), (2020). Ungrading: Why rating students undermines learning (and what to do instead). West Virginia University Press.

Blum, S. (2019). Why Don’t Anthropologists Care about Learning (or Education or School)? An Immodest Proposal for an Integrative Anthropology of Learning Whose Time Has Finally Come. American Anthropologist 121(3): 641–54.

Eyler, J. R. (2018). How Humans Learn: The Science and Stories behind Effective College Teaching. Morgantown: West Virginia University Press.

Inoue, A.B. (2019). Labor-Based Grading Contracts: Building Equity and Inclusion in the Compassionate Writing Classroom. Fort Collins, CO: WAC Clearinghouse and University Press of Colorado. https://wac.colostate.edu/books/perspectives/labor/.

Rust, C. (2007). Towards a scholarship of assessment. Assessment & Evaluation in Higher Education, 32(2), 229-237

Sackstein, S. (2015). Hacking Assessment: 10 Ways to Go Gradeless in a Traditional Grades School. Cleveland, OH: Times 10 Publications

Schinske, J., and K. Tanner (2014). Teaching more by grading less (or differently). CBE – Life Sciences Education, 13(2), 159-166

Stommel J., (2017), Why I don’t Grade. JesseStommel.com https://www.jessestommel.com/why-i-dont-grade/

Warner, J., (2020). Wile E. Coyote, the Hero of Ungrading, in S. Blum, Ungrading: Why rating students undermines learning (and what to do instead). West Virginia University Press, 204-218

Wormeli, R. (2018). Fair Isn’t Always Equal: Assessment and Grading in the Differentiated Classroom. 2nd ed. Portland, ME: Stenhouse.

Team Based Learning revisited

I have been forced to confront a prejudice this week and I’m very glad I have because I have significantly changed my perspective on Team Based Learning (TBL) as a result. When I cook I rarely use a recipe: rough amounts and a ‘bit of this; bit of that’ get me results that wouldn’t win Bake Off but they do the job. I’m a bit anti-authority I suppose and I might, on occasion, be seen as contrary given a tendency to take devil’s advocate positions. As a teacher educator, and unlike many of my colleagues over the years, I tend to advocate a more flexible approach to planning, am most certainly not a stickler for detailed lesson plans and maintain a scepticism (that I think is healthy) about the affordances of learning outcomes and predictably aligned teaching. I think this is why I was put off TBL when I first read about it. Call something TBL and most people would imagine something loose, active, collaborative and dialogic. But TBL purists (and maybe this was another reason I was resistant) would holler: ‘Hang on! TBL is a clearly delineated thing! It has a clear structure and process and language of its own.’ However, after attending a very meta-level session run by my colleague, Dr Pete Fitch, this week, I was embarrassed to realise how thoroughly I’d misunderstood its potential flexibility and adaptability, as well as the potential of aspects I might be sceptical of in other contexts.

Established as a pedagogic approach in medical education in the US in the 1970s, it is now used widely across medical education globally as well as in many other disciplinary areas. In essence, it provides a seemingly rigid structure to a flipped approach that typically looks like this:

  • Individual pre-work – reading, videos etc.
  • Individual readiness assurance test (IRAT) – in-class multiple-choice test
  • Team readiness assurance test (TRAT) – same questions, discussed and agreed; points awarded according to how few errors are made in reaching the correct response
  • Discussion and clarification (and challenge) – opportunities to argue, contest and seek clarification from the tutor
  • Application – opportunity to take core knowledge and apply it
  • Peer evaluation
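The TRAT scoring described above can be sketched as a small function; the point values here are purely illustrative (schemes vary between implementations), but the logic of rewarding fewer errors is the key idea:

```python
def trat_score(attempts_to_correct: int, points=(4, 2, 1, 0)) -> int:
    """Points a team earns for one question, given how many scratch-card
    attempts it took to reveal the correct answer (fewer errors = more points)."""
    if attempts_to_correct < 1:
        raise ValueError("attempts_to_correct must be at least 1")
    # Attempts beyond the defined scale all score the final (lowest) value.
    return points[min(attempts_to_correct, len(points)) - 1]
```

So a team that scratches the correct answer first time earns full points, while a team that needs several attempts still has to engage with the question but earns progressively less.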

This video offers a really clear summary of the stages:

Aside from the rigid structure, my original resistance was rooted in the knowledge-focussed tests and how this would mean sessions started with silent, individual work. However, having been through the process myself (always a good idea before mud slinging!), I realised that this stage could achieve a number of goals as well as the ostensible self-check on understanding. It provides a framing point for students to measure understanding of materials read; it offers – completely anonymously, even to the tutor – an opportunity to gauge understanding within a group; it provides an ipsative opportunity to measure progress week by week and acts additionally as a motivator to actually engage with the pre-session work (increasingly so as the learning culture is established). It turns a typically high stakes, high anxiety activity (an individual test) into a much lower stakes one and provides a platform from which initial arguments can start at the TRAT stage. A further advantage therefore could be that it helps students formatively with their understanding of and approaches to multiple-choice examinations in those programmes that utilise this summative assessment methodology. In this session I changed my mind on three questions during the TRAT, two of which I was quietly (perhaps even smugly) confident I’d got right. A key part of the process is the ‘scratch to reveal if correct’ cards, which Pete had re-imagined with some clever manipulation of Moodle questions. We discussed the importance of the visceral ‘scratching’ commitment in comparison to a digital alternative and I do wonder if this is one of those things that will always work better analogue!

The cards are somewhat like those shown in this short video:

To move beyond knowledge development, it is clear the application stage is fundamental. Across all stages it was evident how much effort is needed in the design stage. Writing meaningful, level appropriate multi-choice questions is hard. Level-appropriate, authentic application activities are similarly challenging to design. But the payoffs can be great and, as Pete said in session, the design lasts more than a single iteration. I can see why TBL lends itself so well to medical education but this session did make me wish I was still running my own programme so I could test this formula in a higher ed or digital education context.

An example of how it works in the School of Medicine in Nanyang Technological University can be seen here:

The final thing spelt out (it should have been obvious) was that the structure and approach can be manipulated. Despite appearances, TBL does enable a flexible approach. I imagine one-off and routine adaptations according to contextual need are commonplace. I think if I were to design a TBL curriculum, I’d certainly want to collaborate on its design. This would in itself be a departure for me, but preparing quality pre-session materials, writing good questions and working up appropriate application activities are all essential and all benefit from collaboration or, at least, a willing ‘sounding board’ colleague. I hope to work with Pete on modelling TBL across some of the sessions we offer in Arena and I really need to get my hands on some of those scratch cards!

 

Using questions in live teaching sessions

The video below is a bitesize summary of a session I was invited to host that came about as a consequence of an earlier post on questioning and a resulting Twitter chat. It’s 4 minutes long. (Transcript of bitesize summary of questioning session).

 

The slides for the session are here. They have similar questions-about-questions as those in the video:

(To advance slides, hover your cursor at the bottom left of the slide screen and use the arrows, or click on the slide once and use the keyboard arrows. To copy the presentation, first have your Mentimeter account open, then click here to open the presentation in a new tab; you should then have the option on screen to copy it to your account)

#LTHEChat – 2nd Feb 2022 – Supporting and humanising behavioural change without the behaviourism: nudges and digital footprints

Join at 20.00h on 2/2/22 (so many 2s!) via https://twitter.com/LTHEchat

Hosted by Ameena Payne, an early career researcher, educator and incoming doctoral student; Sophie Kennedy, a disabled undergraduate psychology student; and me, Martin Compton. Our collaborative tweet chat aims to explore how behavioural change in online higher education can be supported without behaviouristic approaches. Specifically, we will discuss how nudges and digital footprints may be deployed effectively to empower marginalised students – and the potential pitfalls of such data-driven pedagogy.

When students engage in online learning, they leave behind digital footprints: artefacts that trace their activities, such as contributions, page views and communications. Digital learning management systems (LMS) generate data from these footprints that can provide insight into student progress and engagement as it relates to student success. These data are called learning analytics (LAs). LAs encompass the broad data mining, collection, analysis and sharing/reporting/disseminating of students’ digital footprints. LAs are shaping the role of online instruction and student self-regulated learning by promoting ‘actionable intelligence’ (Bayne et al., 2020, p. 71), allowing instructors to orient students and empowering students to orient themselves.

The growing adoption and interest in LAs has supported a strategic commitment to transparency regarding key drivers for improved student engagement, retention and success. At the same time, concerns are increasingly voiced around the extent to which students are informed about, supported (or hindered by), and tracked and surveilled as they engage online. It is important to acknowledge that making pedagogical conclusions based on delimited dimensions creates a context for stereotyping and discrimination, and profiling can result in hindering students’ potential and may hurt self-efficacy.

Nudge theory, coined by behavioural economist Richard Thaler, connects persuasion with design principles (Thaler, 2015). A nudge is an approach that focuses not on punishment and reward (behaviourism) but on encouraging positive choices and decisions; understanding the context is fundamental.

We’d like to share a few assumptions as we engage in this discussion:

  • Academic staff have a responsibility to support our increasingly diverse body of students and need to be open to new tools and techniques such as data generated by our students’ digital footprints and opportunities offered by behavioural psychology.
  • Achievement differentials and attainment gaps exist for marginalised students. Disabled students, or students with executive dysfunction, may struggle with skills vital to independent study and content learning, e.g. initiation, planning and organisation. For disabled students, a product of being under-served by higher education institutions (HEIs) is that they often demonstrate lower levels of engagement, which leads to disproportionate completion rates and, subsequently, employment rates and other outcomes.
  • Behaviouristic approaches (rewards and sanctions) are at the heart of much of what we still do in education but there have been movements and trends challenging manifestations of this – from banning of corporal punishment in schools to rapid growth in interest in ungrading. 
  • LMS data are not indicators of students’ potential and merit. LAs are not impartial; they are creations of human design. By giving a voice to the data, we’re defining their meaning through our interpretations.

It is valuable to build in periodic or persistent nudges of and toward ‘both the goal and its value’ to empower all students to sustain their efforts (CAST, 2018). We advocate the implementation of nudges as something that can be useful for everyone using an LMS, rather than as a tool aimed directly at disabled students, who may feel singled out. We hold that nudging is less an evolution of behaviourism than a challenge to its ubiquity and to the common assumptions about its effectiveness. We propose the employment of empathy, human connection (in contrast with the carrot and stick approaches of education) and understanding to help effect small changes through supportive nudges. Nudging, prompted by LAs, is one way to approach improving achievement, narrowing gaps and offering connection and support for all students.
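To make the idea of an LA-prompted nudge concrete, here is a deliberately simple, entirely hypothetical sketch: the field names and thresholds are invented for illustration and are not drawn from any real LMS or evidence base.

```python
from dataclasses import dataclass

@dataclass
class EngagementRecord:
    """A minimal, hypothetical slice of the digital footprint an LMS might expose."""
    student_id: str
    page_views_last_week: int
    posts_last_week: int

def needs_supportive_nudge(record: EngagementRecord,
                           min_views: int = 5,
                           min_posts: int = 1) -> bool:
    """Flag sustained low engagement so a tutor can send a supportive,
    non-punitive prompt. Thresholds here are illustrative only."""
    return (record.page_views_last_week < min_views
            and record.posts_last_week < min_posts)
```

Crucially, a rule like this would run for every student on a course rather than targeting a particular group, and the resulting nudge would be worded as an invitation back in, not as a warning.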

Q1 – If nudging students is less about coercive practices (punishments and rewards) and more about ‘soft’, small-step connections towards positive change, what examples can you offer from your practice?

Q2 – What role do/could learning analytics (LAs) play in shaping our in-course interactions with students, particularly those from marginalised groups?

Q3 – LAs risk profiling students and driving inequality. How might we address the weaknesses of LAs (such as the cognitive biases we may bring to their interpretation and/or some students being advantaged by extra guidance)?

Q4 – What role might nudging and/ or LAs play in personalising/adaptive learning?

Q5 – Regarding the complex issues in the nexus of student agency & subjectivity, privacy, consent, & vulnerability, how might we differentiate between LAs & surveillance in online HE?

Q6 – Can nudges assist students in overcoming ‘learned helplessness’ especially when breaking through cycles of negative thoughts and self-blame? If so, how might nudges support students in taking control of their educational experiences?

Further reading:

Bayne, S., Evans, P., Ewins, R., Knox, J., Lamb, J., Mcleod, H., et al. (2020). The Manifesto for Teaching Online. Cambridge, MA: MIT Press.

CAST (2018). Universal Design for Learning Guidelines version 2.2. http://udlguidelines.cast.org

Commissioner for Fair Access. (2019). Disabled students at university: discussion paper. Scottish Government. Available at: https://www.gov.scot/publications/commissioner-fair-access-discussion-paper-disabled-students-university/

Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71. https://link.springer.com/content/pdf/10.1007/s11528-014-0822-x.pdf 

Lim, L. A., Gentili, S., Pardo, A., Kovanović, V., Whitelock-Wainwright, A., Gašević, D., & Dawson, S. (2021). What changes, and for whom? A study of the impact of learning analytics-based process feedback in a large course. Learning and Instruction, 72, 101202.

Payne, A. L., Compton, M. & Kennedy, S. (In Progress). ‘Supporting and humanising behavioural change without the behaviorism: nudges and digital footprints.’ Human Data Interaction, Disadvantage and Skills in the Community: Enabling Cross-Sector Environments For Postdigital Inclusion. Springer.

Prinsloo, P. (2016). “Decolonising the Collection, Analyses and Use of Student Data: A Tentative Exploration/Proposal.” Open Distance Teaching and Learning (blog). https://opendistanceteachingandlearning.wordpress.com/2016/11/14/decolonising-the-collection-analyses-and-use-of-student-data-a-tentative-explorationproposal/.

Prinsloo, P., & Slade, S. (2015). Student privacy self-management: implications for learning analytics. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (LAK ’15). Association for Computing Machinery, New York, NY, USA, 83–92. https://doi.org/10.1145/2723576.2723585

Prinsloo, P., & Slade, S. (2016). Student Vulnerability, Agency and Learning Analytics: An Exploration. Journal of Learning Analytics, 3(1), 159–182. https://doi.org/10.18608/jla.2016.31.10

Roberts, L. D., Howell, J. A., Seaman, K., & Gibson, D. C. (2016). Student Attitudes toward Learning Analytics in Higher Education: “The Fitbit Version of the Learning World”. Frontiers in psychology, 7, 1959. https://doi.org/10.3389/fpsyg.2016.01959

Thaler, R. (2015). The Power of Nudges, for Good and Bad. The New York Times. Available at: https://faculty.chicagobooth.edu/-/media/faculty/richard-thaler/assets/files/goodandbad.pdf

Weijers, R.J., de Koning, B.B. & Paas, F. (2021). Nudging in education: from theory towards guidelines for successful implementation. Eur J Psychol Educ 36, 883–902. https://doi.org/10.1007/s10212-020-00495-0

‘Why might students not always act with academic integrity?’ We tried asking them

Guest post from Dr Alex Standen (UCL Arena). I am grateful to my colleague Alex for the post below. I can, I know, be a little bit ‘all guns blazing’ when it comes to issues of plagiarism and academic integrity because I feel that too often we start from a position of distrust and with the expectation of underhandedness. I tend therefore to neglect, or struggle to deal even-handedly with, situations where such things as widespread collusion have clearly occurred. It is, I accept, perhaps a little too easy to just shout ‘poor assessment design!’ without considering the depth of the issues and the cultures that buttress them. This post builds on an intervention developed and overseen by Alex in which students were asked to select the most likely cause for students to do something that might be seen as academically dishonest. The conclusions, linked research and implications for practice are relevant to anyone teaching and assessing in HE.

———————-

In 2021, UCL built on lessons learnt from the emergency pivot online in 2020 and decided to deliver all exams and assessments online. It’s led to staff from across the university engaging with digital assessment and being prompted to reflect on their assessment design in ways we haven’t seen them do before.

However, one less welcome consequence was an increase in reported cases of academic misconduct. We weren't alone here: a paper in the International Journal for Educational Integrity looks at how file-sharing sites claiming to offer 'homework help' were used for help during online exams and digital assessments. And it's easy to see why. As teaching staff and educational developers we try to encourage group and collaborative learning, we expect students to be digitally savvy, and we design open book exams with an extended window to complete them, all of which serve to make the lines around plagiarism and other forms of misconduct a little more blurred.

We worked on a number of responses: colleagues in Registry focused on making the regulations clearer and more in line with the changes brought about by the shift to digital assessment, and in the Arena Centre (UCL’s central academic development unit) we supported the development of a new online course to help students better understand academic integrity and academic misconduct.

The brief we were given was for it to be concise and unequivocal, yet supportive and solutions-focused. Students needed to be able to understand what the various forms of academic misconduct are and what the consequences of cases can be, but also be given support and guidance in how to avoid it in their own work and assessments.

Since September 2021, the course has been accessed by over 1,000 registered users, with 893 students awarded a certificate of completion. It's too early, of course, to understand what impact, if any, it will have had on instances of academic misconduct. What it can already help us think about, however, are our students' perspectives on academic misconduct and how, in turn, we can better support them to avoid it in future.

The course opens with a Menti quiz asking participants for their opinions on academic misconduct, posing the question: Why might students not always act with academic integrity? Here are the results (absolute numbers):

A bar chart showing the results of a poll: it shows what students believe might be reasons for not acting with academic integrity, such as confusion, time pressure, desire to get the best grade, not knowing how to do things, and anxiety

What they are telling us is that it is less about a lack of preparation or feelings of stress and anxiety on their part, and more about a lack of understanding of how to integrate (academic) sources, how to manage their workload and what academic misconduct can even entail.

Our students’ responses are in line with research findings: studies have found a significant degree of confusion and uncertainty regarding the specific nature of plagiarism (Gullifer and Tyson, 2010) and situational and contextual factors such as weaknesses in writing, research and academic skills (Guraya and Guraya 2017) and time management skills (Ayton et al, 2021).

All of which gives us something to work on. The first step is to look at how we plan our assessments over the course of a year so that students aren't impeded by competing deadlines and unnecessary time pressures. The second is to devote more time to working with students on the development of academic skills, and it is key that this isn't exclusively an extra-curricular opportunity. Bringing this into the curriculum will ensure it is accessible to all students, not just those with the time and personal resources to seek it out. Finally, as we move to more digital assessments, it is about really reflecting on their design to ensure they are fit for this new purpose. Perhaps the first question we should all be asking ourselves is: do we really need an exam?

How effective are your questions?

[Listen (10 mins) or read below]

Questions you ask students are at the heart of teaching and assessment, but where and how you ask them, the types of questions you ask and the ways you ask them can sometimes be neglected. This is especially true of the informal (often unplanned) questions you might ask in a live session (whether in-person or online), where a little additional forethought into your rationale, approach and the actual questions themselves could be very rewarding. I was prompted to update this post when reviewing some 'hot questions' from new colleagues about to embark on lecturing roles for the first time. They expressed the very common fears over 'tumbleweed' moments when asking a question, concerns that nerves would show, and worries about a sea of blank screens and how to check students are understanding when teaching online. What I offer below is written with these colleagues in mind and is designed to be practically oriented:

What is your purpose? It sounds obvious, but knowing why you are asking a question and considering some of the possible reasons can be one way to overcome the anxieties many of us have when thinking about teaching. Thinking about why you are asking questions, and what happens when you do, can also be a useful self-analysis tool. Questions aren't always about working out what students know already or have learned in-session. They can be a way of teaching (see the Socratic method overview here; this article has some useful and interesting comments on handling responses to questions), a way of provoking, a way of changing the dynamic and even of managing behaviour. In terms of student understanding: are you diagnosing (i.e. seeing what they know already), encouraging speculation, seeking exemplification or checking comprehension? Very often it is the pure concepts we are teaching that are neglected in questioning. How do we know students are understanding? For a nice worked example, see this example of concept checking.

The worst but most common question (in my view). Before I go any further, I'd like to suggest that there is one question (or question type) that should, for the most part, be avoided. What do you think that question might be? It is a question that will almost always lead to a room full of people nodding away or replying in other positive ways. It makes us feel good about ourselves because of the positive response we usually get, but it can actually be harmful. When we ask it, there are all sorts of reasons why any given student might not give a genuine response: they see others nodding and do not want to lose face, appear daft or go against the flow, so they join in. But how many of those students are just doing the same? How does it feel when everyone else appears to understand something and you don't? Do you know what the question is yet? See the foot of this post to check** (then argue with me in the comments if you disagree).

Start with low stakes questions. Ask questions that invite an opinion or perspective, ask students to make a choice, or even ask something not related to the topic. Get students to respond in different ways (a quick show of hands, an emoji in chat if teaching online, a thumbs up/thumbs down to an e-poll or a controversial quotation; Mentimeter does this well). All these interactions build confidence and ease students into 'ways of being' in any live taught session. Anything that challenges assumptions they may have that teaching 'should' be uni-directional, and that helps avoid disengagement, is likely to foster a safe environment in which exchange, dialogue, discussion and the questions at the heart of those things are comfortably accepted. Caveat: a student at the back who is not contributing will almost certainly have reasons that are NOT to do with indolence or distraction. A student looking at their phone may be anxious about their comprehension and be using a translator, for example. They are there! This is key. Be compassionate and don't force it. Build slowly.

Plan your questions. Another obvious thing, but actually wording questions in advance of a session makes a huge difference. You can plan for questions beyond the opinion and fact-checking types (the easiest to come up with on the fly). Perhaps use something like the Conversational Framework or Bloom's Taxonomy to write questions for different purposes or of different types. Think about the verbal questions you asked in your last teaching session. How many presented a real challenge? How many required analysis, synthesis or evaluation? Contrast that with the number that required (or could only have) a single correct response. The latter are much easier to come up with so, naturally, we ask more of them. If framing higher order questions is tough on the spot, jot a few down ahead of the lecture or seminar. If you use a tool like Mentimeter to design and structure slide content, it has many built-in tools to encourage you to think about questions that enable anonymous contributions from students.

The big question. A session or even a topic could be driven by a single question. Enquiry- and Problem-Based Learning (EBL/PBL) exploit well-designed problems or questions that students are required to resolve. These cannot, of course, be 'Google-able' set-response questions; they require research, evidence gathering, rationalisation and so on. This reflects core components of constructivist learning theory.

The question is your answer. Challenging students to come up with questions based on current areas of study can be a very effective way of gauging the depth to which they have engaged with the topic. What they select and what they avoid is often a way of getting insights into where they are most and least comfortable.

Wait time. Did you know that the average time lapse between a question being asked and a student response is typically one second? In effect, the sharpest students (the 'usual suspects', as you might see them) get in quick. The lack of even momentary additional processing time means that a significant proportion (perhaps the majority) have not had time to mentally articulate a response. Mental articulation goes some way to reducing cognitive overload so, even where people don't get a chance to respond, the thinking time still helps (formatively). There are other benefits to building in wait time too. This finding by Rowe (1974)* is long ago enough for us to have done something about it. It's easy to see why we may not have done, though: I ask a question; I get a satisfyingly quick and correct response; I can move on. But instilling a culture of 'wait time' can have a profound effect on the progress of the whole group. Such a strategy will often need to be accompanied by…

Targeting. One of the things we often notice when observing colleagues 'in action' is that questions are very often thrown out to a whole group. The result is either a response from the lightning-quick usual suspect or, with easier questions, a sort of choral chant. These sorts of questions have their place. They signify the important. They can demarcate one section from another. But are they a genuine measure of comprehension? And what are the consequences of allowing some (or many) students never to have to answer if they don't want to? Many lecturers will baulk at the thought of targeting individuals by name, and this is something I'd counsel against until you have a good working relationship with a group of students. But why not target by section, by row, by table? "Someone from the back row, tell me…". By doing this you can move away from 'the usual suspects' and change your focus. One thing we can inadvertently do is focus eye contact, attention and pace on students who are willing and eager to respond, thereby further disconnecting those who are less confident, comfortable or inclined to 'be' the same.

Tumbleweed. The worry of asking a question and getting nothing in response can be one of those things that leads to uni-directional teaching. A bad experience early on can dissuade us from asking further questions; the whole thing then develops its own momentum and only gets worse. Low stakes questions, embedded wait time and building a community comfortable with (at least minimal) targeting are ways to pre-empt this. My own advice is that the numbers are with you if you can hold your nerve and a relaxed smile. Ask a question, look at the students and wait. Thirty seconds is nothing but feels like an eternity in such a situation. However, there are many more of them than you and one of them will break eventually! Resist re-framing the question or repeating it too soon, but be prepared to ask a lower stakes version and build from there. More advice is available in this easy access article.

Technology as a question, not the answer. Though they may seem gimmicky (and you have to be careful not to subvert your pedagogy for colour and excitement), there are a number of in- or pre-session tools that can be used. Tools like Mentimeter, Poll Everywhere, Socrative, Slido and Kahoot all enable different sorts of questions to be answered, as does the 'Hot Questions' function in Moodle that prompted me to re-post this.


Putting thought into questions, the reasons you are asking them and how you will manage contributions (or lack thereof) is something we might all do a little more of, especially when tasked with teaching new topics, new groups or new modalities.


*Rowe, M. B. (1974). Wait-time and rewards as instructional variables, their influence on language, logic, and fate control: Part one, wait-time. Journal of Research in Science Teaching, 11(2), 81–94. (Though this original study was of elementary teaching situations, the principles are applicable to HE settings.)

 

**Worst question? 'Does everyone understand?' or some variant, such as nodding and smiling at your students whilst asking 'All OK?' or 'Got that?'. Instead, ask a question focussed on a specific point. Additionally, you might routinely invite students to jot their most troubling point on a post-it, or have an open forum in Moodle (or equivalent space) for areas that need clarifying.


[This is an update- actually more of a significant re-working- of my own post,  previously shared here: https://blogs.gre.ac.uk/learning-teaching/2016/11/07/thinking-about-questions/]

Are you not engaged?

Some colleagues and I were tasked with producing a 'toolkit' for other colleagues looking to improve and optimise engagement. The toolkit can be seen in this online version of the 'engagement' toolkit or downloaded from here in Word format. I hope colleagues find it useful.

What struck me most when talking about and reading around this topic is how problematic it is as a concept and how little time is actually given to deconstructing meaning and principles. We throw words like 'engagement', 'employability' and 'wellbeing' (should it be hyphenated?) around without even checking that we have a shared understanding of what we mean. The same could be said for 'learning' and 'teaching' too, I suppose. For a start, engagement in activity is not a proxy for learning, but it is easily confused and conflated as such.

Back when I was a sessional lecturer in several further education colleges, many of the quality assurance processes in every place were informed by the lengthy shadow cast by Ofsted. As I recall, along with 'differentiation strategies' and 'negotiated, individualised outcomes' (I kid you not), we needed to show that we were on top of student engagement. So lesson plans were expected to specify student engagement activities, and observation forms sought ratings of 'successful' engagement. I can imagine people asking 'what's wrong with that?' and I suspect I didn't question it then, to be honest. I was aware, though, that I was constructing an artifice in those plans and observed sessions. What it tended to do (definitely in my case, and certainly later in the case of many of the teachers I observed) was encourage a cynical acknowledgement of this demand. You end up shoe-horning in activities where engagement (read: students being busy) is visible, because the big problem was that only engagement that was in-your-face obvious was likely to count.

This is why I very much like the engagement framework suggested by Redmond et al. (2018), represented below. First, it encourages us to conceptualise types of engagement. Secondly, and crucially in my view, it implores us NOT to measure engagement but to eschew measurement altogether and focus on developing an environment where different types of engagement are valued and fostered.

Blended learning engagement framework showing the five aspects: social, emotional, cognitive, behavioural and collaborative

Online and blended engagement framework with definitions (adapted from Redmond et al., 2018)


Redmond, P., Abawi, L. A., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online learning, 22(1), 183-204.

5 reasons why Mentimeter works so well

[if you have never seen or used Mentimeter then a quick look here may help]

[Listen (11 mins) or read below]

When it comes to tools that will do a teaching and learning job, there is a world of dedicated educational technology and 'productivity' tools to choose from. I'm very much an experimenter and a fiddler. If I see someone using or referring to an interesting-looking website or tool in a meeting or at a conference, I am there in seconds, signing up, playing around and making judgements with what has become a staple needs- and preference-focussed filtration system. In broad terms, I like to be able to try things for free, for a tool to be relatively intuitive and straightforward and (most important) fit for either pre-defined or imagined teaching or assessment purposes. I have written more about the how and why of this with my former colleague Dr Timos Almpanis here and in chapter 4 of this collection. I try not to evangelise, and I am very much of the school that would argue that purpose, rather than any given tool, should be the starting point for discussions about integrating digital approaches. But there's something about what Mentimeter can do, and how it does it, that means I do sometimes slip into ultra-enthusiast mode. Unlike a lot of tech approaches and tools that pass the initial 'free, easy, fit for purpose' test, the breadth of purpose Mentimeter is fit for and its intuitiveness make it, for me, a class above other tools. Also see here, where Chris Little at Keele made this point a while back, and here for an evaluation of a number of student response tools, including Mentimeter, from when a colleague and I at Greenwich were tasked with identifying the best institution-wide student response system.

1. It’s not hardware dependent

Like a lot of people similarly enthusiastic about the opportunities for enhancing student interaction and engagement with digital technologies, I spent a lot of time (much of it ultimately wasted) focussed on hardware. From interactive whiteboards to in-class iPad sets to PDAs and 'flipcams', the issues that directly impeded scaling of use, as well as my own enthusiasm, were related to one or more of the following:

  • Amount of training needed
  • Device security and storage
  • ‘Just in time’ access limits
  • Responsibility for maintenance
  • Rapidity of obsolescence of kit

In my view, all these factors afflicted 'clickers' (voting pods that were handed round in face-to-face sessions). As revolutionary as they promised to be, they were only ever used by the few, despite the gleaming aluminium cases and the sumptuous foam inserts the clicker devices sat in. The BYOD reliance on users' own devices with cloud-based software alternatives like Mentimeter means that:

  • People usually know how to use their own devices or at least access the internet
  • Device security, maintenance, updating is not an issue
  • They are, by definition, available; turning an oft-cited teacher frustration of mobile device distraction into a potential virtue

'What if students don't have a device?' is a common question but, like many things in this domain, it's largely about framing. I always make participation optional and make it clear it's 'if you have a device on you' in a face-to-face setting, or 'if you have a big enough screen or a separate device nearby' if online. I also frequently subvert the assumption that responses need to be individual, preceding voting with group-based discussion and one person per group responding.

I have moved between institutions in the last year and both have invested in a site licence giving access to the full suite of tools and functionality Mentimeter offers. This privilege is something that must be acknowledged, so it's certainly not 'free' any more (though I personally pay nothing, of course!). That said, the free version is still relatively generous. In my view it's an exemplary freemium set-up: just on the right side of frustrating, in amongst the persuasive.

2. It’s a slide/ presentation tool that has many merits in its own right

One thing that is often missed, because the 'point' of Mentimeter is interaction, is how well it works as an alternative to PowerPoint as a presentation tool. Even though PowerPoint remains the default across higher education for slide production (even during that weird period when everyone was doing Prezis!), for the most part colleagues seem to struggle to break the desktop app habit. As a consequence, sharing slides becomes an upload/download faff or, even if sharing is managed via MS Office cloud storage, there are often restrictions on who can view. Mentimeter generates a link, so the first benefit is that slides can be shared as easily as any website link. Secondly, the participation link enables students to see the slides (including the detail of pictures) on their own devices in real time (as well as, or possibly INSTEAD OF, on the main screen). Thirdly, the author interface is simple, there is a variety of slide types and styles, and the copyright-free image gallery is easy to use, as is the ALT text prompt. Fourthly, the ability to add simple interactions (e.g. thumbs up or down) means that students can be invited to contribute even to content-delivery slides by, for example, agreeing or disagreeing with a controversial idea or quotation. The slides have more limited space for text, and this (to some a limitation) is an excellent discipline when preparing slides: it minimises the text and challenges the tendency many of us have to use too much of it.

A screenshot from the editing window of Mentimeter showing a bulleted slide with image and also the content slide types available

The editing window of Mentimeter showing a bulleted slide with a copyright free image and also the content slide types available

3. The participation and interaction options are substantial and adaptable

In a previous post I described a few occasions where students chose notoriety over maturity and tried to undermine sessions by being abusive in open text questions. This led to something of a knee-jerk response from some colleagues, who questioned whether the tool should be supported or used at all. Much like when (way, way back) access to YouTube was banned for all students AND teaching staff in a college I worked in because ONE student accessed a (seriously) inappropriate video. The sledgehammer/nut response was not the way to address things, not least because Mentimeter's existing tools and functionality enable users to avoid and tackle such behaviours. So, if open text questions are used, there are ways of monitoring and filtering content (including a profanity filter), and of the ten interaction/question types only three are open text. To grasp this, however, does often necessitate more than superficial exploration and experimentation (or coming to one of my hour-long workshops!). One thing I commonly do is encourage colleagues to consider how they might eschew the favoured word cloud and open text formats and find ways of fully exploiting the lesser-used types. In addition, it's important to think about how the interactions are presented and managed. A well-designed question can be an excellent vehicle for prompting discussion prior to 'voting', or a prompt for analysing and rationalising responses that have already been offered.

screenshot from Mentimeter authoring dashboard showing all the question types available

Mentimeter authoring dashboard showing all the question types available

4. Frequent updates and improvements

There's no resting on laurels with Mentimeter, and there does seem to be genuine acknowledgement of user requests. For example, the ability to embed YouTube video in slides is a real blessing and, if using Mentimeter as a slide tool as well as for interaction, further minimises shifting between tabs or different software. The recently introduced collaborative authoring of presentations was much requested at UCL and enables more efficient working in addition to its collaborative potential. A very recent and welcome improvement is the ability to have active hyperlinks (in both participation and presentation modes). The 'Mentimote' tool, which allows you to use your smartphone as a slide clicker, moderation tool and presentation embellisher, has also recently switched from beta to 'fully fledged' mode and works very well, especially for live in-person events.

5. When Covid came, Mentimeter was equipped to adapt

The default pace setting in Mentimeter is 'presenter paced': the presenter advances slides and only then can participants see them. This is very much in keeping with how Mentimeter (presumably) was conceived and how many users regard it. However, the non-default option ('audience paced') allows slide collections, interactions included, to be worked through at the audience's own pace. When lessons switched online almost across the board, it was common for academic colleagues to take the intuitive approach and try to replicate face-to-face teaching in online environments via Zoom, Teams or Collaborate. They often tried to incorporate Mentimeter slides too. Whilst this is do-able, and it is something I routinely do myself, the complexity and the mental and actual bandwidth this extra layer demanded of already struggling staff and students (struggling with kit, with space, with the implications of Covid) meant that it often felt unsatisfactory. Alongside my and colleagues' recommendations to rethink how online time could be exploited and optimised, I encouraged colleagues to think about the possibilities of using Mentimeter asynchronously. By encouraging participation ahead of a session and then presenting the results in the session, much faffing and device and screen changing is removed, but students still have a buy-in to the content. When I came to my current post it was fascinating to see colleagues in similar positions to my own, such as Dr Silvia Colaiacomo, saying the same thing here.

If you want to read more on my thoughts about Mentimeter see this post and also this collaboration with two former colleagues (Dr Gerhard Kristandl from Greenwich and Paramedic extraordinaire Richard Ward who is at Cumbria).

Here, too, is a video case study I made with a colleague and student from the Division of Psychiatry on academic and student use of Mentimeter.

Colleagues at UCL interested in using Mentimeter start here: https://blogs.ucl.ac.uk/digital-education/2020/07/09/mentimeter-at-ucl/