Topics, objectives and outcomes
In line with the traditional definition of a hackathon, the expected outcomes of the event are the identification and initial (concrete, technical) pilot implementations of prototype tools, systems, data, and studies which arise from the synthesis of educational technology, software development, and data science perspectives. As at previous events, the hackathon will generate a repository of code, sample data, screenshots, slides etc. from the activity of participants. An important intangible outcome will be an improved shared understanding, across the different kinds of expertise present, of what is ethical, desirable, and technically feasible. While we welcome research questions, challenges, tools and data from all hackathon participants, we expect to emphasise the following topics, which the organisers feel focus particularly on user-centred learning analytics.
1. Multimodal Learning Analytics
In recent years, the Multimodal Learning Analytics (MMLA) community has been researching the monitoring of learning activities beyond traditional mouse and keyboard interaction. These activities include practical skills training and co-located group interactions, and represent a large set of learning moments taking place across physical and digital spaces, both in the classroom and at the workplace. These moments can be monitored by tracking multiple modalities, including motoric and physiological information, learning context, environment and activity. For the second year in a row, the LAKHackathon hosts MMLAHack, a challenge that focuses on the exchange of best practices, approaches and tools to handle the complexity of multimodal data. This year MMLAHack seeks to complement the conference workshop CrossMMLA (Maldonado et al., 2018) with a space for hands-on examples, prototype demonstrations, code-sharing and solutions to technical issues in the field of MMLA. MMLAHack will focus on tackling typical MMLA challenges, including the collection, storage, annotation, processing and exploitation of multimodal data for supporting learning (Di Mitri, Schneider, Specht & Drachsler, 2018).
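A recurring technical hurdle in the MMLA challenges named above is aligning samples from different modalities that arrive at different rates. As a minimal sketch (the function name, tolerance value, and heart-rate/video example are our own illustrative assumptions, not part of any MMLAHack toolkit), timestamps from one stream can be matched to the nearest sample of another:

```python
from bisect import bisect_left

def align_to_reference(reference_ts, stream, tolerance=0.05):
    """For each reference timestamp, find the nearest sample in `stream`
    (a time-sorted list of (timestamp, value) pairs) within `tolerance`
    seconds. Returns a list of values, with None where nothing is close."""
    times = [t for t, _ in stream]
    aligned = []
    for ts in reference_ts:
        i = bisect_left(times, ts)
        # Candidates: the sample at the insertion point and the one before it.
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(times) and abs(times[j] - ts) <= tolerance:
                if best is None or abs(times[j] - ts) < abs(times[best] - ts):
                    best = j
        aligned.append(stream[best][1] if best is not None else None)
    return aligned

# e.g. align sparse heart-rate samples to 25 fps video frame timestamps
frames = [0.0, 0.04, 0.08]
heart_rate = [(0.01, 72), (0.07, 74)]
print(align_to_reference(frames, heart_rate))  # [72, 72, 74]
```

Real pipelines typically add clock-drift correction and resampling on top of such nearest-neighbour alignment.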
2. Data Interoperability
It should not matter which institution a student attended: their learning data should make sense for a lifetime and in any environment. This becomes even more urgent in the light of GDPR legislation and the right to data portability. LAK16, in Edinburgh, had a challenge focussing upon data interoperability between the xAPI and Caliper specifications, which led to the Edinburgh Statement on Data Interoperability (Edinburgh Statement, 2016) and talks between ADL and IMS to start harmonising work between the standards. At LAK19 we propose to return to this challenge, exploring ways of mapping between emerging xAPI Profiles and published Caliper metric profiles. Bring your xAPI and Caliper data along and help us find a way to develop LA tools that are agnostic about which data standard was used to generate the data. The semantic web and newer options such as GraphQL will be explored as mechanisms for delivering data interoperability.
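To make the mapping problem concrete, the sketch below converts the common core of an xAPI statement into a Caliper-style event dict. The verb-to-action table and the field choices are illustrative assumptions only; the actual mapping between xAPI Profiles and Caliper metric profiles is exactly what the challenge sets out to work through:

```python
# Illustrative only: xAPI Profiles and Caliper metric profiles carry far
# richer semantics. The verb/action pairs below are assumptions for this sketch.
VERB_TO_ACTION = {
    "http://adlnet.gov/expapi/verbs/completed": "Completed",
    "http://adlnet.gov/expapi/verbs/launched": "Started",
}

def xapi_to_caliper(stmt):
    """Map the common core of an xAPI statement onto a Caliper-style event."""
    return {
        "@context": "http://purl.imsglobal.org/ctx/caliper/v1p1",
        "type": "Event",
        "actor": {"id": stmt["actor"].get("mbox", ""), "type": "Person"},
        "action": VERB_TO_ACTION.get(stmt["verb"]["id"], "Unknown"),
        "object": {"id": stmt["object"]["id"], "type": "DigitalResource"},
        "eventTime": stmt.get("timestamp"),
    }

stmt = {
    "actor": {"mbox": "mailto:student@example.edu"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed"},
    "object": {"id": "https://lms.example.edu/quiz/42"},
    "timestamp": "2019-03-04T10:00:00Z",
}
print(xapi_to_caliper(stmt)["action"])  # Completed
```

The hard part, which a lookup table like this glosses over, is that the two specifications model context and entity types differently, so a faithful mapping needs profile-level agreement rather than verb-level string substitution.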
3. Goal setting and analytics
Goal Setting (GS) theory and GS tools and methodologies can potentially enhance the performance of individuals. Furthermore, tracking students’ learning through GS comes with a number of opportunities to gain insights into topics like learning objectives, learning pathways or Self-Regulated Learning. GS is also a tool to connect online and offline learning behaviours. It is therefore surprising that goal setting has rarely been studied in the context of education. Now, with the help of learning analytics, it is possible to connect students’ learning goals with performance and behavioural data coming from digital learning environments. This also creates the opportunity to continuously monitor students’ progress toward their explicit learning goals over time, and to provide individual recommendations. Exploiting GS is a novel and largely unexplored area that leaves a lot of room for creativity and development. We will build on the LAK16 Goal setting workshop (Mol, Kobayashi, Kismihók & Zhao, 2016), available open source applications (including the UvA Goal Setting Dashboard), and a large amount of labour market data (vacancy announcements) to formulate personal goals beyond the frames of formal education.
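The continuous monitoring described above can be reduced to a very small core: a goal with a measurable target, plus behavioural events counted against it. The following sketch is our own illustration (the `Goal` class, the `relevant` flag, and the simple count are assumptions, not part of the UvA Goal Setting Dashboard):

```python
from dataclasses import dataclass

@dataclass
class Goal:
    description: str
    target: float  # e.g. number of practice exercises to complete

def progress(goal, events):
    """Fraction of the goal reached, given behavioural events from a digital
    learning environment. Here events are simply counted; a real system
    would weight event types and check timestamps against deadlines."""
    done = sum(1 for e in events if e.get("relevant"))
    return min(done / goal.target, 1.0)

g = Goal("Complete 10 practice exercises", target=10)
events = [{"relevant": True}] * 4 + [{"relevant": False}]
print(progress(g, events))  # 0.4
```

Recommendations can then be triggered when progress stalls relative to the time remaining, which is where the labour-market data mentioned above could inform which goals are worth suggesting.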
4. Student facing Open APIs
While we are increasingly providing LA solutions for students, a significant opportunity arises to investigate how universities and other open government services can enrich and expose their data stores and analytics, thus fostering the rapid development of student-facing solutions. Institutions are increasingly moving towards an API-based architecture that will add flexibility to their core IT infrastructure, a situation that offers many opportunities for rapid innovation and development of solutions by the people who will use them (rather than by external providers). At the same time, many universities host a pool of highly motivated students with fundamental ideas about how to improve the student experience by offering innovative new IT services that are beyond core business. This challenge is to facilitate such access pathways for systems built using the core infrastructure provided by official university data warehousing. We will investigate security, data formats, mapping and other core properties.
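Two of the core properties named above, security and data format, show up in even the smallest student-facing endpoint: the API must project internal warehouse records onto a safe external shape, and must check the caller's authorisation before doing so. The sketch below is entirely hypothetical (field names, the `read:grades` scope, and the record layout are all invented for illustration):

```python
# All field names and scopes here are hypothetical; a real deployment would
# follow the institution's warehouse schema and its OAuth scope definitions.
WAREHOUSE_RECORD = {
    "student_id": "s1234567",
    "module_code": "INF-101",
    "grade_internal": 7.5,
    "advisor_notes": "(sensitive, staff-only)",
}

STUDENT_FACING_FIELDS = {"student_id", "module_code", "grade_internal"}

def expose(record, granted_scopes):
    """Project a warehouse record onto the student-facing API shape,
    refusing callers whose token lacks the required scope and dropping
    fields that are not whitelisted for student consumption."""
    if "read:grades" not in granted_scopes:
        raise PermissionError("token lacks read:grades scope")
    return {k: v for k, v in record.items() if k in STUDENT_FACING_FIELDS}

print(expose(WAREHOUSE_RECORD, {"read:grades"}))
```

A whitelist (rather than a blacklist) of exposed fields is the safer default here, since new warehouse columns stay hidden until someone deliberately releases them.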
5. Curriculum analytics
A major motivation behind curriculum analytics is to enable the identification of effective or weak parts of the curriculum, leading to the building of better courses. Early efforts at linking learning design and learning analytics include the Australian “Loop” system, which integrates course structures and schedules in its visualisations to help evaluate the effectiveness of the learning activities (Lockyer, Heathcote, & Dawson, 2013), and work at the Open University which assesses the impact of different types of learning and assessment design on various measures of student success (Nguyen, 2017). An approach we intend to explore is the building of learning designs which not only categorise different aspects of the learning process but also specify the data which needs to be captured to show whether the designs are proving effective.
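The last idea, learning designs that declare their own data requirements, can be sketched as a simple coverage check: each activity names the event types needed to evaluate it, and the analytics pipeline reports what is not yet being captured. The activity names and event types below are illustrative assumptions:

```python
# Hypothetical design: each learning activity declares which event types
# must be captured before its effectiveness can be evaluated.
learning_design = {
    "watch-lecture": {"requires": {"video.play", "video.pause"}},
    "peer-review": {"requires": {"review.submitted", "review.received"}},
}

def capture_gaps(design, captured_event_types):
    """Return, per activity, the declared event types not yet captured.
    Activities with no gap are omitted from the result."""
    return {
        activity: sorted(spec["requires"] - captured_event_types)
        for activity, spec in design.items()
        if spec["requires"] - captured_event_types
    }

captured = {"video.play", "video.pause", "review.submitted"}
print(capture_gaps(learning_design, captured))
# {'peer-review': ['review.received']}
```

Run before a course starts, such a check turns the learning design into a contract: if the gap report is non-empty, the design's effectiveness question cannot be answered from the data that will exist.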
6. Trusted Learning Analytics
With the General Data Protection Regulation (GDPR) of the European Union, each learning analytics architecture in place needs to deal with specific privacy concerns. One of these concerns is the right of the user to object to the collection of his or her personal data. If a user objects, we may still be permitted to carry out analytics on parts of the personal data. Therefore, a system with multiple components is necessary. One component provides the privacy settings through a common API. Another component is the Learning Record Store (LRS), which receives and stores the data. The LRS needs to be extended by a privacy guard which is responsible for filtering based on the privacy settings. The purpose of this hackathon project is to design and implement a privacy filter mechanism for a learning analytics framework. Research questions we consider: How can the privacy settings be efficiently synchronised between software components? How can xAPI statements be efficiently filtered based on more or less generic privacy settings?
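A minimal sketch of such a privacy guard is shown below. The settings schema (per-user opt-outs keyed by xAPI verb IRI) is an assumption for illustration; the hackathon project would have to decide what "more or less generic" settings actually look like:

```python
# Sketch of a privacy guard sitting in front of an LRS. The settings
# schema here (per-actor blocked verb IRIs) is an assumption, not a standard.
PRIVACY_SETTINGS = {
    "mailto:student@example.edu": {
        "blocked_verbs": {"http://adlnet.gov/expapi/verbs/experienced"},
    },
}

def privacy_filter(statements, settings):
    """Drop xAPI statements whose actor has objected to the statement's
    verb, before the statements ever reach the Learning Record Store."""
    kept = []
    for stmt in statements:
        actor = stmt["actor"].get("mbox")
        user = settings.get(actor, {})
        if stmt["verb"]["id"] not in user.get("blocked_verbs", set()):
            kept.append(stmt)
    return kept

statements = [
    {"actor": {"mbox": "mailto:student@example.edu"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced"}},
    {"actor": {"mbox": "mailto:student@example.edu"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/completed"}},
]
print(len(privacy_filter(statements, PRIVACY_SETTINGS)))  # 1
```

The synchronisation question then becomes how this guard learns about settings changes from the common privacy API quickly enough that no objected-to statement slips through between an objection and its propagation.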
You have the power: please contribute a challenge to the hackathon and check our Call for Proposals.