RITS Online Courses – Determining tools to use
It was decided early in the development of online courses for Research IT Services (RITS) that these should be open educational resources (OER). As the RITS teams promote open science and the use of open-source tools, are members of the Software Sustainability Institute, and were early deliverers of Carpentries workshops (https://carpentries.org/), it would have felt hypocritical not to create these materials as OER. In addition, the existing classroom resources are currently shared under the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). This immediately excluded the use of our institutional Moodle instance, as a login would be required to access the materials.
Therefore, the learning materials were to be hosted elsewhere, and an alternative tool used for their creation.
A number of e-learning creation tools were identified and reviewed, albeit relatively informally.
The criteria used were:
- the tool had to be well documented and easy to use, as the content owners would be responsible for maintaining the course materials once developed
- the tool had to be free and/or open source; the project had a small budget, so purchasing multiple product licences was not possible, and as the courses teach open-source tools the preference was to create them with open-source tools
- it had to be possible to version control the resources, because the courses have multiple content owners and a method to track and manage changes would be required
Evaluation comments for each of these tools, highlighting pros, cons and gut feelings, were recorded in a project notebook on SharePoint; this forms part of the project documentation.
The chosen development tool was the Morea framework (http://morea-framework.github.io/) from the University of Hawaii. Content is created as Markdown pages that are converted to HTML files by running a build script. The content and development files are stored in GitHub, enabling version control, and the learning content is published and hosted via GitHub Pages.
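The publication pipeline just described (Markdown pages built into HTML and served from GitHub Pages) relies on each page carrying Jekyll-style front matter that Morea uses to assemble modules. As a minimal sketch only: the front-matter field names below (`morea_id`, `morea_type`, `morea_sort_order`) are assumptions based on Morea's conventions and should be checked against the framework's documentation before use.

```python
# Illustrative field names; verify against the Morea framework docs.
FRONT_MATTER = """---
title: "{title}"
published: true
morea_id: {morea_id}
morea_type: reading
morea_sort_order: {order}
---

"""

def make_morea_page(markdown_text: str, title: str, morea_id: str, order: int = 1) -> str:
    """Prepend Jekyll/Morea-style front matter to a Markdown lesson."""
    return FRONT_MATTER.format(title=title, morea_id=morea_id, order=order) + markdown_text

page = make_morea_page("# Intro\nSome lesson text.", "Introduction to Python", "reading-intro-python")
print(page.splitlines()[0])  # → ---
```

Pages written this way can then be committed to the course repository, where the build script and GitHub Pages handle conversion and hosting.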
I like the structure of Morea framework courses because they require the identification of learning objectives and any prerequisite knowledge for each module. As these materials will be self-service rather than facilitated, it is important to explain to course participants what they will be learning and what prior learning is required. In the initial phases of the project, RITS training courses were classified as Foundation, Intermediate or Advanced; the courses build on each other to create a spiral research computing curriculum.
Morea courses are constructed from a series of Modules. Modules themselves consist of three elements: readings, experiences and assessments. This gives a nice structure to the learning materials, especially for those with limited learning design experience. Experiences provide both executable notebooks so that learners can interact with the lesson content and exercises for them to extend their skills and understanding, creating a constructivist learning experience. An additional benefit is that we were able to select the style sheet, enabling a more accessible design. The style sheet used was decided upon in discussion with the IT accessibility officer.
It is not possible to construct interactive content or quizzes within the framework, nor to log learner progress within Morea-developed courses. A combination of H5P and Moodle quizzes was used to resolve some of these issues.
Participants on face-to-face courses have previously asked for certificates of attendance. As we are unable to gauge participants' understanding simply from their access to the course materials, it was decided to create a quiz. A summative multiple-choice quiz (MCQ) was created in Moodle for each course developed, as Moodle can store grades; if the quiz is passed, a certificate can be generated and downloaded.
However, if the course content changes, a new Moodle quiz and corresponding certificate will need to be created to avoid data loss.
For formative assessment during the course, H5P (https://h5p.org/) MCQs were created and added to the assessment sections of the relevant modules. H5P was chosen because it enables the creation of a variety of interactive objects in a user-friendly way, and I thought it would be easier for my colleagues in RITS to learn than Moodle quizzes. Although I have so far only used this tool for quiz creation, it can also be used to make other interactive objects such as interactive videos and games.
The aim of the MCQs was to identify any conceptual misunderstandings. The Moodle quizzes contain explanations as to why an answer is or is not correct.
The H5P content is hosted on a separate server from the Morea framework content, because the required plugin is not available in GitHub Pages and we do not currently have the H5P plugin installed on my institution's Moodle instance; this was more necessity than choice. Unless a new version of the content is created, saved changes are instantly and automatically fed through to the sites where the content is embedded. The benefit is that links do not need to be updated, but versioning can be difficult. To alleviate this issue, a copy of the H5P file for each quiz is stored in GitHub for backup and version control.
The programming courses developed make use of Jupyter notebooks for the practical exercise elements. These have been made available via Anaconda Cloud (https://anaconda.org/about), which enables users to easily view and download notebooks. However, an account is required; this may be a problem for participants who do not wish to share their details with the service, which is based in the United States. The notebooks can also be downloaded as a zipped folder from within the course materials.
As there is no login to the Morea framework created courses, Google Analytics is used to track usage numbers and activity patterns.
Learning Analytics and Learning Design
As part of my focus on learning analytics I have looked at how data from e-learning systems can be used to inform learning design and reflective practice.
In February 2018 I wrote a blog post about the relationship between technology, curriculum and pedagogy (https://blogs.ucl.ac.uk/digital-education/2018/02/13/tpck-data-and-learning-design/), in particular expanding on the notion of pedagogic intent in the use of technology in learning. As a result of this blog post I was interviewed for the EdTech Podcast Episode 109 – Making Data Work for Teachers (https://theedtechpodcast.com/109-making-data-work/).
In this interview I spoke about how technologies such as lecture capture can be used proactively, instead of just passively recording events, to create an engaging experience, and how the data collected through their use can be used by teaching staff to reflect on the learning activity undertaken.
Following on from this, I also hosted a community webinar for Echo 360 titled ‘Data, The Lecture and Me’. (Echo360 webinar presentation)
The focus in these talks was on the data that could be extracted from the systems, and the benefits or problems that might present.
It is possible to export a number of standard reports from Moodle, but it is not always immediately obvious how helpful these reports are to lecturers. They can also present a lot of data, which is initially quite daunting without a focus.
Therefore, I have been promoting the idea of identifying three questions you want answered about student learning or the learning design. From this starting point, potential data sources for answering the questions can be identified. Where needed, traditional analogue techniques for performing a task could be replaced by a digital option to enable data capture.
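As a sketch of this question-driven approach, the snippet below answers one narrow question ("which events did each learner trigger?") from a Moodle activity log export. The column names are assumptions based on a typical standard-log CSV export and will vary between Moodle versions and site configurations.

```python
import csv
import io
from collections import Counter

# Column names ("User full name", "Event name") are assumptions;
# check them against your own Moodle log export before relying on this.
SAMPLE_EXPORT = """User full name,Event context,Event name
Alice,Course: Python,Course module viewed
Alice,Course: Python,Quiz attempt submitted
Bob,Course: Python,Course module viewed
"""

def events_per_user(csv_text: str) -> dict:
    """Answer one focused question: which events did each learner trigger, and how often?"""
    by_user = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        by_user.setdefault(row["User full name"], Counter())[row["Event name"]] += 1
    return by_user

counts = events_per_user(SAMPLE_EXPORT)
print(counts["Alice"]["Quiz attempt submitted"])  # → 1
```

Starting from a specific question keeps the analysis small and interpretable, rather than confronting staff with the full, unfocused export.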
On reflection, it would have been more effective to have obtained development tool requirements from the RITS project before engaging in evaluations. I would also have benefited from doing some preliminary research into evaluation frameworks, and/or seeking guidance from the ALT community.
The tools selected can produce the desired outcome, standalone online equivalents of face-to-face courses; however, the combination of tools is quite complicated, and a lot of manual formatting, including file conversion, is currently required. A review of the tools used has been undertaken and the recommendations reviewed by the new Governance structure. The Governance group would like the materials to remain as OER, but UCL branding will be added so that it is easier to identify that the courses have been created by the institution. Additionally, we are exploring options for more easily keeping the online course materials in sync with the taught classroom materials. At present, the Jupyter notebooks used for the classroom courses are converted to Markdown and then manually formatted to meet the Morea framework requirements and aid readability.
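The notebook-to-Markdown step could in principle be partly automated, since .ipynb files are plain JSON (nbformat v4: a list of cells, each with a cell_type and source). The sketch below uses only the standard library and is an illustration rather than the pipeline actually used; real notebooks would still need the Morea-specific formatting applied afterwards.

```python
import json

def notebook_to_markdown(ipynb_json: str) -> str:
    """Convert a (nbformat v4) Jupyter notebook to a single Markdown string:
    markdown cells pass through unchanged, code cells become fenced blocks."""
    nb = json.loads(ipynb_json)
    parts = []
    for cell in nb.get("cells", []):
        source = "".join(cell.get("source", []))
        if cell["cell_type"] == "markdown":
            parts.append(source)
        elif cell["cell_type"] == "code":
            parts.append(f"```python\n{source}\n```")
    return "\n\n".join(parts)

# A tiny synthetic notebook, for illustration.
nb = json.dumps({"cells": [
    {"cell_type": "markdown", "source": ["# Lesson 1\n", "Welcome."]},
    {"cell_type": "code", "source": ["print('hello')"]},
]})
print(notebook_to_markdown(nb))
```

Automating this first step would leave only the framework-specific formatting to be done by hand, reducing the effort of keeping the two sets of materials in sync.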
Before the online courses project, none of the face-to-face courses had clearly defined lesson outcomes. Content owners have now defined outcomes for each face-to-face course that has had an online equivalent created. There have also been improvements to the notebooks used in the face-to-face courses, as additional explanatory text has been added; this is a side effect of the content being reviewed for the online courses.
The learning analytics work has enabled me to know what data is collected by the centrally supported e-learning technologies such as Moodle and Echo360, and the limitations of the data available for export and re-use including issues of matching data across systems. Before working on the exploring learning analytics project I knew what data was available to course administrators via Moodle reports but I did not know anything about the Moodle database and how some data is context dependent on the value of other fields.
The learning analytics and learning design work is still in its initial phases. At present we have no evidence to support the effectiveness of this approach.