Can artificial intelligence enhance accessibility and help disabled students get on track?

Students have a lot to manage besides their academic progress. For disabled students, there is increasing sector-wide recognition of the burdens they face and the need to act to reduce the impact these have on study. In our research at The Open University, we’re exploring how Artificial Intelligence (AI) could change how students explain their needs and how support is delivered. Technologies are rapidly evolving and there’s huge potential to address challenges for students and the sector.

Disabled students spend additional time and energy understanding what support is available, identifying how this could be relevant to them and their study, explaining their needs to staff (often repeatedly), completing applications, gaining evidence, doing assessments, training and more. When things go well, this makes a big difference to their chances of success. But unfortunately, this often becomes laborious, repetitive and stressful, and the results may be delayed or not achieve what was hoped.

 

AI and Accessibility

Conversations about artificial intelligence today tend to end up fixated on the same few concerns: students getting ChatGPT to complete their assignments, robot tutors replacing teachers, or the impacts of bias, misinformation or racism from systems that are increasingly part of the fabric of our lives.

What can get lost in these debates is the ways that AI-based tools are already enhancing people’s lives and study. AI supports people with communication barriers in multiple ways, providing feedback on their writing or allowing them to complete an assignment by speaking rather than typing. Automated captions aren’t always accurate, but they are increasingly useful, and used in the many situations where manual transcription isn’t available. Accessibility and assistive technology have long been a space for innovation born of necessity.

Like most technologies, there are dangers as well as opportunities, and these relate to how we perceive and use these technologies. Some might see AI-based tools as a chance to shirk responsibility for delivering accessible teaching or appropriate support. When access to technology enhances the capabilities of some, others can be left behind, lacking the access or literacies to use it effectively. Also, concerns about plagiarism could lead to restrictions that stop tools being used in reasonable, assistive ways.

 

The potential for AI to address the burden on disabled students

So, can we use AI to reduce the burden placed on disabled students in gaining support, and make experiences better? Findings from our work suggest so. 2020 saw the first live trial of ‘Taylor’, a virtual assistant for disclosing disabilities to the Open University. This was co-designed with students and staff, with support from Microsoft. As students disclosed their disabilities, they were given the option to have a conversation with Taylor as well as completing the usual form for comparison. Taylor’s main advantages at this stage were providing a better experience than form filling, and allowing students to ask questions as they go.

134 students took part in this study as part of the process of disclosing their disabilities, and 65% preferred using Taylor to completing the form. Students appreciated the way the conversation focused on particular areas such as how they use assistive technologies or want to communicate with tutors, finding this less overwhelming. Some students noted that having to talk to a person about their disabilities induced anxiety, and although Taylor was not a replacement for this, they felt more at ease asking questions and describing their disabilities to a bot. 

We also asked students what more they wanted Taylor to do, and the most popular answers were to provide them with suggestions that could help them in their studies, and to allow them to complete other forms and processes. 

We’ve now finished a second pilot of Taylor that ran from late 2022 until mid 2023. We added further integration into the OU’s systems and new features based on feedback. Taylor now gives each student individual suggestions about tools and strategies that should be relevant to them, based on recommendations made by fellow students who describe similar disabilities and barriers. This feature was positively received, which makes sense as it was what they wanted from the system in the previous trial! 

Over 400 students completed conversations with Taylor this time, and again, feedback has been very positive. 88% of participants said they would like to use Taylor again. It’s encouraging to hear comments such as: “I was nervous about using Taylor initially but as soon as I started felt at ease. It was so straightforward to use. I actually ended up enjoying the experience.” and “I really liked the experience and found it a lot easier than filling in a form or emailing. Felt personable but not as anxiety inducing as talking to a person.” 

We’ve also noticed that more participants than in 2020 reported previous experience of using chatbots and virtual assistants elsewhere. It feels like students now expect these services and will make full use of them.

 

What challenges and opportunities are we finding?

With the rapid advances seen in Large Language Models like GPT, there are ever greater opportunities for more ‘intelligent’ conversations and more personally-relevant guidance whenever it is needed. We’ve started another phase of design and prototyping work, exploring the potential for assistance and advice to be available to any student who faces barriers. We’ve also recognised the potential for these technologies to help staff understand how to enhance accessibility in their roles.

Students asked Taylor over 350 questions in the most recent trial. As their queries become more complex or context dependent, there’s a need to draw on multiple sources of information and understand the student’s context further. It’s challenging, but feasible, for a system to use information about the course a student is studying, their student record, and more, to provide better answers. 

We also need wider changes to get the most out of these opportunities. A single conversation with an assistant could replace the effort of completing multiple forms and processes for a student, guide them, and act on their behalf in communicating their needs. But the requirements and practices to support this are not yet in place. 

It’s well recognised that there are instances where AI-based systems present plausible but incorrect information, or ‘hallucinate’. A lot of the focus is now on how to avoid or mitigate these errors. There’s always going to be a need for human experts to guide and monitor the responses made by systems like Taylor. But over time, the benefits are likely to be substantial, allowing staff to make better use of their time. In other settings, students may not have any kind of disability support expertise available to them, so assistants like this could fill a big gap even if they have limitations.

About the author

Dr Tim Coughlan is a Senior Lecturer in the Institute of Educational Technology at The Open University. He has extensive experience of research and teaching on accessibility, online learning, open education, and educational innovations. He has led research projects funded by Microsoft, Jisc and the British Council, and leads or contributes to a range of institutional initiatives to enhance access and develop inclusive approaches to innovation.
