Author: Antonia Huber
ARP – References and Bibliography
The references below are grouped thematically to reflect how they informed my practice.
Pedagogy
Bayne, S. (2015) ‘What’s the matter with “technology-enhanced learning”?’, Learning, Media and Technology, 40(1), pp. 5–20.
Biggs, J. and Tang, C. (2011) Teaching for quality learning at university. 4th edn. Maidenhead: Open University Press.
CAST (2018) Universal Design for Learning Guidelines version 2.2. Wakefield, MA: CAST.
Edwards, R. (2015) ‘Software and the hidden curriculum in digital education’, Pedagogy, Culture & Society.
Meyer, A., Rose, D. H. and Gordon, D. (2014) Universal Design for Learning: theory and practice. Wakefield, MA: CAST.
Nicol, D. (2009) ‘Transforming assessment and feedback: enhancing integration and empowerment in the first year’, Assessment & Evaluation in Higher Education, 34(3), pp. 335–352.
Orr, S. and Shreeve, A. (2017) Art and design pedagogy in higher education: knowledge, values and ambiguity in the creative curriculum. London: Routledge.
Öztok, M. (2019) The hidden curriculum of online learning. London: Routledge.
Selwyn, N. (2014) Distrusting educational technology: Critical questions for changing times. Abingdon: Routledge.
Selwyn, N. (2016) Education and technology: key issues and debates. 2nd edn. London: Bloomsbury Academic.
Action research
Gray, C. and Malins, J. (2004) Visualizing research: a guide to the research process in art and design. Aldershot: Ashgate.
Kemmis, S. and McTaggart, R. (1988) The action research planner. Geelong: Deakin University.
Schön, D. A. (1983) The reflective practitioner: how professionals think in action. New York: Basic Books.
White, P. (2009) Developing research questions. Basingstoke: Palgrave Macmillan.
Data collection
Bowen, G. A. (2009) ‘Document analysis as a qualitative research method’, Qualitative Research Journal, 9(2), pp. 27–40.
Cohen, L., Manion, L. and Morrison, K. (2018) Research methods in education. 8th edn. London: Routledge.
Denscombe, M. (2010) The good research guide: for small-scale social research projects. 4th edn. Maidenhead: Open University Press.
Kvale, S. and Brinkmann, S. (2009) InterViews: learning the craft of qualitative research interviewing. 2nd edn. London: Sage.
Analysis
Braun, V. and Clarke, V. (2006) ‘Using thematic analysis in psychology’, Qualitative Research in Psychology, 3(2), pp. 77–101.
Braun, V. and Clarke, V. (2019) ‘Reflecting on reflexive thematic analysis’, Qualitative Research in Sport, Exercise and Health, 11(4), pp. 589–597.
Braun, V. and Clarke, V. (2021) Thematic analysis: a practical guide. London: Sage.
International Journal of Qualitative Methods (2023) ‘Advances in thematic analysis’, available at: https://journals.sagepub.com/doi/10.1177/16094069231205789 (Accessed: 19 December 2025).
Martin, B. and Hanington, B. (2012) Universal methods of design: 100 ways to research complex problems, develop innovative ideas, and design effective solutions. Beverly, MA: Rockport.
Nielsen Norman Group (2024) Thematic analysis, available at: https://www.nngroup.com/articles/thematic-analysis/ (Accessed: 19 December 2025).
Ethics
Banks, S. (2016) Everyday ethics in professional life. London: Palgrave Macmillan.
BERA (2024) Ethical guidelines for educational research. 5th edn. London: BERA.
ARP – Use of AI
Throughout this action research project, I made intentional use of ChatGPT by OpenAI, a generative artificial intelligence tool, to support specific stages of the research and writing process. The tool was not used to generate research findings or make analytical decisions, but functioned as a reflective and technical aid that supported clarity, structure, and confidence.
- At the early stage of the project, AI was used to sense-check the interview and questionnaire questions, both before and after tutor and peer feedback. This included reviewing the order, logic, and number of questions to ensure they were coherent and accessible.
- AI was also used to support the refinement of my research focus, particularly when narrowing and dividing the final research question in response to tutor and peer feedback.
- As a non-native English speaker, I used the technology to copy-edit and spell-check my final blog posts. AI did not generate academic content, but assisted in improving clarity and flow.
- During the artefact review, AI was used to support the organisation and comparison of observations across platforms, helping translate qualitative notes into indicative groupings for visual comparison. Interpretive coding and analytic decisions remained researcher-led.
- Similarly, AI was used to sense-check the thematic analysis during a later stage, supporting reflection on the relative weighting and prominence of themes once they had already been manually identified through reflexive thematic analysis.
- AI was also used to check Harvard referencing, supporting consistency and formatting accuracy. All sources were selected independently and drawn from approved academic readings.
Reflecting forward: closing one cycle and opening the next
The final stage of this action research project involved reflecting on how insights from my enquiry informed and will continue to inform my pedagogical actions, and how these actions, in turn, raised further questions for practice.
Findings from the thematic and artefact analysis highlighted recurring issues around orientation, coherence, and the way learning was experienced as fragmented within the Virtual Learning Environment. These insights were reinforced by recent student feedback from Unit 1 of the MA Graphic Design (Online), which echoed similar concerns. Students described difficulty understanding how weekly tasks related to the wider aims of the unit, despite engaging carefully with individual activities. This tension reflects a wider challenge in online education: while learning design often prioritises focused, task-based engagement, students also need reassurance that these moments of detailed work sit within a considered and meaningful educational direction. Biggs and Tang (2011) argue that constructive alignment is not only about aligning outcomes, teaching, and assessment, but about helping learners recognise how these elements connect. When this sense of connection is unclear, students may complete tasks without trusting that their overall learning journey is being held in view.
In response, two targeted changes were implemented collaboratively with the course leader. The idea of introducing a visual learning arc first emerged during one of the semi-structured interviews, where a participant described the need for students to better understand how weekly teaching connects to the wider unit trajectory. When this concept was shared with the wider UAL Online teaching team, it resonated strongly and was agreed as a meaningful way to open live sessions. Positioned at the start of teaching, the learning arc communicates the learning journey to date, clarifies where students are currently situated, and signals the intended direction for the remainder of the unit. This approach helps counter Moodle’s tendency to function primarily as a “now-focused” environment. By zooming out momentarily, the intention is to reassure students that focused, detailed tasks sit within a considered and purposeful learning journey rather than isolated requirements. This aligns with Orr and Shreeve’s (2017) discussion of studio pedagogy as a culture of shared understanding, where learning is shaped not only through tasks but through awareness of trajectory, dialogue, and purpose. In online contexts, where physical cues and informal studio rhythms are absent, such narrative scaffolding becomes increasingly important.
Second, a weekly overview activity was positioned at the very start of each learning block, explicitly outlining learning intentions, expected engagement, and deliverables. Interestingly, this overview structure had already been developed by one of the learning designers for a single unit, but had not been communicated across the wider course team. Its uneven adoption highlighted the fragmented and siloed nature of course development within UAL Online. Through collective discussion, the overview was recognised as a strong response to the issues identified in both the research findings and student feedback. While it is currently being implemented within one unit, the longer-term intention is to extend this approach across all nine units to support greater coherence. Nicol (2009) suggests that clarity around purpose supports learner self-regulation and helps build trust between students and educators. When learners understand why they are being asked to do something, they are better able to manage uncertainty, workload, and motivation. These interventions were deliberately modest, operating within existing institutional and platform constraints. Their value lies not in redesigning the system, but in reframing how learning is signposted. From a Universal Design for Learning perspective, making purpose and structure explicit supports cognitive accessibility by reducing unnecessary interpretive effort (Meyer, Rose and Gordon, 2014).
However, this stage of action also exposed ongoing tensions. While early feedback suggests improved orientation, these changes remain provisional. Action research requires that interventions be revisited and re-evaluated rather than assumed effective. The next cycle of enquiry will therefore focus on whether these strategies continue to support learning coherence over time, and how they might evolve in response to further student experience. Reflecting on the process overall, I have come to understand accessibility not solely as clarity of content, but as clarity of intent. Supporting students to “zoom out” periodically is not a distraction from focused learning, but a condition for trust: signalling that behind individual tasks sits a thoughtful, connected educational purpose.
References
Biggs, J. and Tang, C. (2011) Teaching for quality learning at university. 4th edn. Maidenhead: Open University Press.
Meyer, A., Rose, D. H. and Gordon, D. (2014) Universal Design for Learning: theory and practice. Wakefield, MA: CAST.
Nicol, D. (2009) ‘Transforming assessment and feedback: enhancing integration and empowerment in the first year’, Assessment & Evaluation in Higher Education, 34(3), pp. 335–352.
Orr, S. and Shreeve, A. (2017) Art and design pedagogy in higher education: knowledge, values and ambiguity in the creative curriculum. London: Routledge.

Figure 1: Screenshot of an anonymised Miro board visualising a full unit structure through an arc-based diagram, mapping weekly activities, teaching phases, and learning progression to support student orientation and overview.

Figure 2: Detail view of the anonymised Miro arc diagram illustrating the relationship between exploratory research, reflection, synthesis, and concept development across the unit timeline.

Figure 3: Screenshot of an anonymised Moodle unit page from the MA Graphic Design (Online), illustrating the use of a “Prepare for the week” activity to foreground key learning activities, expected outputs, and session preparation in order to support student orientation and reduce cognitive load.
ARP – Blog Post 4: OBSERVING
Analysing practice: thematic and artefact-based interpretation
The analytical stage of this action research project required a significant shift in my thinking. While data collection felt relational and dialogic, analysis initially introduced a sense of uncertainty. I was unsure how to move from rich qualitative material toward meaningful interpretation without reducing participants’ voices or oversimplifying complexity. This discomfort prompted reflection on my own analytical habits and disciplinary background, and became an important part of the research process itself.
The analysis followed a reflexive thematic analysis approach (Braun and Clarke, 2006; 2021), in which themes are understood not as truths embedded within the data, but as interpretive patterns constructed through sustained engagement, reflexivity, and theoretical positioning. Because I was a practitioner-researcher working within the same institutional structures under investigation, my interpretations were shaped by professional proximity and shared language. Rather than attempting neutrality, this analysis acknowledges subjectivity as an analytic resource, while remaining attentive to its limitations.
My initial sense-making process drew on affinity mapping, a method familiar from my background in branding and design. Martin and Hanington (2012) describe affinity mapping as a way of organising complexity through visual clustering rather than predetermined categories. Using Miro, I worked with digital post-it notes to group interview excerpts and questionnaire responses, allowing patterns, repetitions, and tensions to surface visually. At this stage, the emphasis was not on defining themes, but on orientation – understanding what was present in the data before deciding what it might come to represent. Short analytic memos were written throughout this process to capture emerging interpretations, questions, and moments of tension. Engaging with Braun and Clarke’s later work (2021) helped reframe my uncertainty. Rather than seeking stability through categorisation, I began to understand ambiguity as intrinsic to qualitative enquiry. Thematic analysis is not linear or cumulative; it is recursive, requiring repeated movement between data, interpretation, and reflection. This perspective allowed me to work with provisional meanings rather than prematurely fixing conclusions.
As analysis developed, codes were reorganised, merged, and at times abandoned entirely. Three interrelated themes became increasingly visible. The first concerned orientation and coherence, which referred not simply to navigational clarity, but to students’ sense of trust in the learning journey and confidence that focused tasks were situated within a meaningful pedagogical arc. The second addressed workarounds as pedagogical labour, highlighting how educators compensate for platform limitations through additional effort. The third revealed ongoing tensions between pedagogical intention and institutional structure. These themes aligned closely with the project’s research questions, addressing both educator practice and digital design constraints. Unlike the codes beneath them, the themes themselves were not repeatedly merged or hierarchically restructured; instead, emphasis shifted through depth of engagement, as certain patterns became increasingly prominent through recurrence, intensity, and relevance to the research questions. One theme in particular – navigation, structure and cognitive load – emerged as having the strongest bearing on the overall enquiry. It extended beyond usability concerns to encompass educators’ observations of student disorientation, fragmented learning journeys, and the emotional labour required to compensate for unclear systems. Its prominence informed the direction of subsequent analysis and became central to understanding why platform design mattered pedagogically rather than merely technically.
To explore this further, I conducted an artefact review of Moodle alongside alternative platforms referenced by participants, including Miro, Padlet, and Notion. The objective was not to evaluate tools comparatively, but to better understand why colleagues consistently pivoted away from Moodle in their teaching practice. Examining navigation depth, hierarchy, spatial organisation, and visual structure helped trace how pedagogical intentions were either supported or constrained by interface design. This artefact-based analysis allowed participant accounts to be examined materially, revealing how platform structure directly shaped cognitive load and teaching strategies.
Following theme development, a light quantitative weighting was applied to indicate relative emphasis across the dataset. This was not intended to produce statistical claims, but to support reflective judgement by visualising which themes carried the greatest analytic weight within the project. Importantly, this stage also revealed absence. While the project was motivated by concerns around student experience, learner voice remained indirect, mediated through educator accounts. Recognising this limitation became part of the analysis itself, reinforcing the ethical and institutional boundaries shaping the enquiry.
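The indicative weighting described above was carried out manually in Miro and Excel; purely to illustrate the underlying logic, the sketch below tallies theme codes per data source into a simple matrix. All theme labels, source names, and counts here are invented for illustration and are not drawn from the project data.

```python
from collections import defaultdict

# Hypothetical, illustrative coding data: each source lists the themes
# tagged against its excerpts. Labels echo the project's themes; the
# counts are invented and carry no analytic claim.
coded_excerpts = {
    "interview_1": ["orientation_coherence", "workarounds", "orientation_coherence"],
    "interview_2": ["navigation_cognitive_load", "workarounds", "navigation_cognitive_load"],
    "questionnaire": ["orientation_coherence", "navigation_cognitive_load"],
}

def weighting_matrix(data):
    """Tally theme occurrences per source: indicative emphasis, not statistics."""
    matrix = defaultdict(lambda: defaultdict(int))
    for source, themes in data.items():
        for theme in themes:
            matrix[theme][source] += 1
    return {theme: dict(sources) for theme, sources in matrix.items()}

matrix = weighting_matrix(coded_excerpts)

# Dot weighting rather than raw numbers signals relative prominence only.
for theme, sources in sorted(matrix.items()):
    print(f"{theme:30s} {'●' * sum(sources.values())}")
```

The point of the sketch is the design choice, not the arithmetic: rendering tallies as dots rather than figures keeps the output readable as relative emphasis and resists the temptation to treat small qualitative counts as statistics.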
Through this process, I came to understand analysis in action research not as the pursuit of certainty, but as the development of confident, situated judgement. Allowing interpretation to remain reflexive and provisional strengthened both the findings and my understanding of how pedagogical meaning is constructed within digital learning environments.
References
Braun, V. and Clarke, V. (2006) ‘Using thematic analysis in psychology’, Qualitative Research in Psychology, 3(2), pp. 77–101.
Braun, V. and Clarke, V. (2019) ‘Reflecting on reflexive thematic analysis’, Qualitative Research in Sport, Exercise and Health, 11(4), pp. 589–597.
Braun, V. and Clarke, V. (2021) Thematic analysis: a practical guide. London: Sage.
Martin, B. and Hanington, B. (2012) Universal methods of design: 100 ways to research complex problems, develop innovative ideas, and design effective solutions. Beverly, MA: Rockport.

Figure 1: Photo of analogue analysis of anonymised questionnaire responses. Printed survey data were manually annotated using colour coding and handwritten notes to support close reading and early pattern recognition. This process helped surface differences between concise written responses and richer interview narratives, informing subsequent thematic comparison (Huber, 2025).

Figure 2: Photo of printed semi-structured interview transcript 1. Pages were annotated through colour highlighting, underlining, and margin notes to support detailed engagement with participant language. This stage functioned as an initial sense-making process prior to digital thematic clustering (Huber, 2025).

Figure 3: Photo of annotated semi-structured interview transcript 2. Working physically with the transcript supported reflexive interpretation and helped prepare the data for later thematic grouping and indicative weighting (Huber, 2025).

Figure 4: Screenshot of digital affinity mapping conducted in Miro, showing clustered themes derived from interview transcripts and questionnaire responses. Colour-coded post-it notes were used to support visual sense-making and the identification of recurring patterns (Huber, 2025).

Figure 4 (detail): Detail view of the digital affinity map developed in Miro, focusing on the theme navigation, structure and cognitive load. Post-it notes were rearranged to support closer inspection of recurring issues related to platform navigation, visual hierarchy, and students’ ability to orient themselves within the learning journey (Huber, 2025).

Figure 5: Screenshot of artefact analysis comparing Moodle with alternative digital platforms (Miro, Padlet and Notion). The visual mapping highlights how differences in navigation structure, search functionality, and spatial organisation relate to educators’ reported use of supplementary tools (Huber, 2025).

Figure 6: Screenshot of a colour-coded interview transcript supporting the quantification of qualitative themes. Transcript 1 was annotated with the help of AI using colour markers aligned to established thematic categories. This stage functioned as a preparatory step for comparative weighting rather than thematic development (Huber, 2025).

Figure 7: Screenshot of an indicative thematic weighting matrix across interviews, questionnaire responses, and artefact analysis using AI. Dot weighting represents relative prominence of themes and depth of engagement rather than numerical frequency (Huber, 2025).
ARP – Blog Post 3: ACTING
Research methods and data collection
This stage of the action research project prompted deeper reflection on how knowledge is produced through research. Rather than viewing data collection as a neutral or technical phase, I became increasingly aware that the framing of questions, the structure of conversations, and my own position as a practitioner-researcher actively shaped what could be said – and what could remain unsaid. Kvale and Brinkmann (2009) describe qualitative interviewing as a craft, where knowledge is not discovered but co-constructed through dialogue. Similarly, Alvesson (2012) cautions against treating interviews as transparent windows into experience, arguing that interview data are always shaped by interaction, context, and institutional positioning. This understanding became central to how I approached both the semi-structured interviews and the written questionnaire. Rather than attempting to extract information, the aim was to create conditions in which participants could reflect on their own practices, assumptions, and constraints.
This perspective aligns with Denscombe’s (2014) view that practitioner research methods must be responsive to context, relationships, and purpose rather than relying on standardised procedure. A set of five open-ended questions formed the basis of the enquiry. These were developed iteratively through group tutorials and refined through peer and tutor feedback, supporting clarity and focus while retaining openness. As a second-language English speaker, I also used digital tools to sense-check phrasing, ensuring accessibility without altering the intent of the questions. This process heightened my awareness that linguistic clarity is not only an ethical consideration, but also a methodological one: poorly framed questions risk excluding nuance before analysis even begins.
The decision to keep the questions open-ended was deliberate. Kvale and Brinkmann (2009) emphasise that open questioning allows participants to construct meaning in their own terms, while Cohen, Manion and Morrison (2018) argue that semi-structured interviews offer a balance between direction and flexibility. In practice, this approach shifted the balance of control within the interviews. Participants frequently reframed questions, introduced new concerns, and reflected on tensions between pedagogical intention and institutional constraint. Irvine et al. (2012) note that such moments – where participants negotiate the adequacy of their responses – are not interruptions to the research process, but central to how meaning is collaboratively produced. These interactions became significant sites of insight, often revealing issues that had not been anticipated during the planning stage.
The same question set was adapted into an open-ended questionnaire to enable asynchronous participation. While this approach increased accessibility and respected colleagues’ time constraints, it also revealed clear methodological limitations. Written responses tended to be shorter, less reflective, and more descriptive than interview data. This contrast highlighted how depth in qualitative research is not simply a function of questioning, but of relational space. Without conversational cues, prompts, or opportunities for clarification, participants appeared more likely to prioritise efficiency over reflection. This observation reinforced the idea that questionnaires, while useful for breadth, are less effective for accessing layered pedagogical reasoning in small-scale practitioner research.
Alongside participant-generated data, I conducted a visual artefact review of Moodle and alternative platforms including Miro, Padlet, and Notion. This review was prompted by a recurring pattern across interviews: colleagues frequently described “pivoting away” from Moodle when attempting to enact their pedagogy. The artefact analysis therefore sought to understand why these shifts occurred, not at the level of preference, but through interface structure. Drawing on Bowen’s (2009) conception of document analysis, digital platforms were treated as institutional texts, embedding assumptions about hierarchy, sequencing, and learner behaviour. Reviewing these environments made visible a distinction repeatedly referenced by participants: the difference between verbal cognitive load (instructions, explanations, written guidance) and visual cognitive load (navigation depth, spatial organisation, and information density). This contrast clarified how learning friction often emerged not from content itself, but from the effort required to locate, interpret, and contextualise it.
Examining artefacts alongside interview and questionnaire data supported methodological triangulation, while also revealing productive tensions between what educators articulated verbally and what platforms materially afforded. This process underscored that pedagogy is not only expressed through teaching intention, but enacted — and at times constrained – through interface design.
Throughout this stage, reflexivity remained central. As a practitioner researching within my own institution, shared language and professional proximity enabled openness, yet also risked normalising constraint. Recognising this tension became part of the enquiry itself. Rather than striving for detachment, I came to understand my situated perspective as shaping both the questions asked and the issues that felt most urgent. This stage therefore generated more than data for analysis: it deepened my understanding of qualitative research as a relational, situated practice and shaped how I approached the subsequent phase of thematic and artefact-based interpretation.
References
Alvesson, M. (2012) ‘Views on interviews: a skeptical review’, Qualitative Research in Organizations and Management, 8(1), pp. 23–44.
Bowen, G. A. (2009) ‘Document analysis as a qualitative research method’, Qualitative Research Journal, 9(2), pp. 27–40.
Cohen, L., Manion, L. and Morrison, K. (2018) Research methods in education. 8th edn. London: Routledge.
Denscombe, M. (2014) The good research guide: for small-scale social research projects. 5th edn. Maidenhead: Open University Press.
Irvine, A., Drew, P. and Sainsbury, R. (2012) ‘“Am I not answering your questions properly?” Clarification, adequacy and responsiveness in semi-structured telephone and face-to-face interviews’, Qualitative Research, 13(1), pp. 87–106.
Kvale, S. and Brinkmann, S. (2009) InterViews: learning the craft of qualitative research interviewing. 2nd edn. London: Sage.

Figure 1: Visual summary of the iterative development of semi-structured interview questions, showing the progression from initial self-generated prompts through tutor and peer feedback to final refinement, including AI-supported sense-checking to improve clarity, focus, and accessibility (Huber, 2025).

Figure 2: Screenshot of anonymised interview transcript (Transcript 1), showing cleaned formatting with line breaks applied to support readability and subsequent thematic analysis (Huber, 2025).

Figure 3: Screenshot of anonymised interview transcript (Transcript 2), prepared for analysis through formatting refinement and improved visual legibility (Huber, 2025).

Figure 4: Screenshot of Microsoft Forms questionnaire design used to collect qualitative staff responses for this action research project (Huber, 2025).

Figure 5: Screenshot of Microsoft Forms interface displaying anonymised questionnaire responses collected during the enquiry phase (Huber, 2025).

Figure 6: Screenshot of Excel spreadsheet summarising qualitative questionnaire responses to support thematic analysis (Huber, 2025).

Figure 7: Screenshot of an anonymised Miro board from a residential art and design course, showing a full unit mapped within a single shared visual workspace to support overview, continuity, and non-linear learning pathways (Huber, 2025).

Figure 7 (detail): Close-up view of the same anonymised Miro board, illustrating clustered activities, iterative making, and reflective stages presented simultaneously within one visual field (Huber, 2025).

Figure 8: Screenshot of an anonymised Notion workspace used to reorganise all learning materials from a single unit into an alternative interface, demonstrating how content can be presented within a continuous white-background environment to support readability, visual coherence, and navigational clarity (Huber, 2025).
ARP – Blog Post 2: PLANNING → ACTING
Defining the research focus: questions, constraints and ethical positioning
Following the identification of cognitive accessibility as a pedagogical and social justice concern in the previous blog post, the next stage of the action research project focused on developing research questions that were both purposeful and feasible within my teaching context. Moving from a broad professional concern to a structured enquiry required careful reflection on how research questions function within practice-based research.
Gray and Malins (2004) describe research questions as devices for focusing enquiry rather than statements of intent or prediction. They argue that effective questions should remain open, exploratory, and capable of evolving as understanding develops. This perspective was particularly relevant to my project, as my initial concerns about online learning environments were grounded in lived teaching experience rather than clearly bounded problems.
Engaging with this literature helped me recognise the need to narrow the enquiry without closing it down. Early iterations of my questions risked becoming either too broad (attempting to address accessibility at an institutional level) or too solution-focused, implying improvement before sufficient understanding had been developed. Gray and Malins caution against research questions that seek outcomes too early, instead advocating for enquiry that enables discovery rather than confirmation. This understanding was reinforced by White (2009), who frames research questioning as a form of disciplined curiosity. She argues that uncertainty plays a productive role in research, allowing understanding to emerge gradually rather than being fixed at the outset. This framing helped me recognise that refining the questions was part of the enquiry itself, not a preliminary task to be completed before learning could begin. Rather than seeking generalisable findings, the enquiry needed to support reflective learning about my own practice and inform future development of my teaching.

Figure 1: Visual representation of the iterative development of the research question, showing the progression from an initial broad enquiry toward more focused and cognitively accessible research questions, informed by tutor and peer feedback.
At the same time, the institutional context influenced what forms of enquiry were possible. While Unit 1 of the MA Graphic Design (Online) was live at the time of the project, access to the current student cohort was not available, and feedback on the VLE was discouraged from director level at this stage of delivery. This limited the scope of participant involvement and required careful reconsideration of the research design. In response, the project pivoted towards engaging colleagues as participants, recognising their role in shaping learning environments, platform structures, and pedagogical decisions. This adaptation reinforced Gray and Malins’ view of research as situated practice, shaped by real conditions rather than idealised models.
As a result, the enquiry was articulated through two related research questions. Together, these questions allowed my enquiry to address the two elements that co-exist in the online teaching context (professional practice and digital design structures) while remaining open-ended and appropriate to an action research framework.
Pedagogical approach
How do educators understand and approach cognitive accessibility when designing online learning experiences?
Platform design
How do the design structures of Virtual Learning Environments and alternative platforms support or constrain clarity, navigation, and cognitive accessibility?
Despite this refinement, I remain aware that research questions continue to evolve as understanding develops (Gray and Malins, 2004; White, 2009). Approaching the enquiry in this way enabled flexibility, responsiveness, and critical reflection, establishing a strong foundation for the data collection stage that followed. Ethical guidance from BERA (2024) and Banks (2016) informed decisions around participation, consent, and anonymity, particularly given my position as an insider researcher. These considerations supported the transition from planning into action within the action research cycle and shaped the methodological choices that followed.
References
Banks, S. (2016) Everyday ethics in professional life. London: Palgrave Macmillan.
BERA (2024) Ethical guidelines for educational research. 5th edn. London: BERA.
Gray, C. and Malins, J. (2004) Visualizing research: a guide to the research process in art and design. Aldershot: Ashgate.
Kemmis, S. and McTaggart, R. (1988) The action research planner. Geelong: Deakin University.
White, P. (2009) Developing research questions. Basingstoke: Palgrave Macmillan.
ARP – Blog Post 1: PLANNING
Situating my enquiry: teaching context, rationale & social justice
This action research project is grounded in my current role as a lecturer working fully within an online MA programme at the University of the Arts London. As all teaching, communication, and learning in this new fully online postgraduate programme take place through digital platforms, the Virtual Learning Environment (VLE) has become the primary site through which students encounter the course. This shift has required a fundamental rethinking of how learning space is designed, experienced, and sustained.
In residential art and design education, the studio has traditionally functioned as a central pedagogical space. Orr and Shreeve (2017) describe studio learning as a signature pedagogy, shaped by immersion, dialogue, critique, and shared participation. Entering the studio offers a cognitive and social transition: students move into a shared physical environment that supports focus, experimentation, absorption in creative practice, and collective learning. In contrast, online learning environments are inherently blended. Students often engage from bedrooms, kitchens, cafés, or shared spaces, where learning competes with everyday life. In this context, the studio becomes less a physical place and more a mindset that must be actively supported through structure, rhythm, and clarity. Without clear framing, opportunities for focus, experimentation, and belonging can be weakened.
This shift has significant implications for cognitive accessibility. When learning takes place across fragmented environments, the demands on attention, memory, and organisation increase. If digital course spaces are unclear, overly text-based, or poorly signposted, students must invest additional cognitive effort simply to orient themselves. Over time, this contributes to digital disadvantage: one not limited to specific marginalised groups, but embedded within the structure of online education itself.
While students experience online learning differently depending on language background, neurodiversity, or personal circumstance, the issue is not solely individual. Digital platforms operate as a form of hidden curriculum, embedding assumptions about how learning should occur and who is able to navigate complexity with ease (Edwards, 2015; Öztok, 2019). When platforms prioritise linear navigation, dense information, and implicit logic, they tend to reward students with prior academic confidence while disadvantaging others. In this sense, digital disadvantage becomes systemic rather than exceptional.
Universal Design for Learning (UDL) offers a framework for addressing these challenges by encouraging educators to design learning environments that anticipate learner variability rather than respond to difficulty after it arises (CAST, 2018). Principles such as clarity, consistency, and multiple modes of representation are particularly important in online contexts, where students cannot rely on physical cues, informal peer interaction, or shared studio presence. However, institutional VLEs such as Moodle also shape what is possible. While designed to support learning, they often prioritise administrative logic over learner experience, limiting how far pedagogical intentions can be realised. This tension between educational values and platform structures sits at the centre of my enquiry.
From an ethical perspective, this raises my key question: who carries responsibility for sustaining the studio mindset in online education?
When clarity is not embedded within course design, students must compensate through additional effort, peer-led workarounds, or repeated clarification. As Banks (2016) argues, ethical practice emerges through everyday professional decisions that shape care and fairness. This project represents the planning stage of my action research cycle (Kemmis and McTaggart, 1988). By examining cognitive accessibility as both a pedagogical and social justice concern, the enquiry seeks to support online learning environments that do not simply replicate residential models, but actively reimagine inclusive studio learning in digital form.
References
Banks, S. (2016) Everyday ethics in professional life. London: Palgrave Macmillan.
CAST (2018) Universal Design for Learning Guidelines version 2.2. Wakefield, MA: CAST.
Edwards, R. (2015) ‘Software and the hidden curriculum in digital education’, Pedagogy, Culture & Society, 23(2), pp. 265–279.
Kemmis, S. and McTaggart, R. (1988) The action research planner. Geelong: Deakin University.
Orr, S. and Shreeve, A. (2017) Art and design pedagogy in higher education: knowledge, values and ambiguity in the creative curriculum. London: Routledge.
Öztok, M. (2019) The hidden curriculum of online learning. London: Routledge.
Introduction
This report outlines a planned intervention aimed at improving the cognitive accessibility of the Virtual Learning Environment within a fully online Graphic Design course at the University of the Arts London (UAL). As a lecturer who has taught across BA and MA levels in residential settings for the last ten years, my positionality is shaped by a commitment to equity and the everyday observation of barriers that students face in accessing and engaging with academic content outside the taught live sessions. I want to test how course material for a solely online course can be written and structured within Moodle—moving from complex, jargon-heavy texts to content that is inclusive, comprehensible, and purposefully designed for diverse learners. This intersects with my academic practice by directly influencing the pedagogical strategies I employ in online environments, where design and clarity are crucial to the student learning experience.
Context
The intervention will take place within the context of developing teaching content for two units of a fully online Graphic Design course at UAL, namely Unit 3 – Critical Perspectives and Unit 6 – Systems Thinking and Society. Online learning environments, while offering flexibility, often place greater cognitive load on students due to increased textual content and reduced real-time interaction (Cinquin, Guitton and Sauzéon, 2019). This course is being designed from the ground up, giving me the opportunity to embed accessibility principles at the structural level. The intervention will consist of piloting a plain-language and content structure framework by redesigning one week's worth of course content. The goal is to make content more accessible to students who are neurodivergent, have learning differences, or speak English as an additional language—although the benefit extends to all students. The long-term utility is the development of a toolkit or checklist to apply this framework consistently across the course and potentially beyond.
Inclusive Learning
Inclusion is central to contemporary design education. Graphic Design as a discipline increasingly demands critical awareness of audience, accessibility, and ethical communication. If we expect students to design inclusively, we must model inclusive practices in our own teaching. The rationale for this intervention draws on the framework of Universal Design for Learning (UDL), which encourages proactive design that accommodates variability in how students learn (CAST, 2018). Principles such as providing multiple means of representation and minimising unnecessary complexity align with the intervention’s goals. Additionally, plain language principles have been shown to improve comprehension and reduce cognitive overload (Redish, 2010). Seymour (2024) highlights how applying UDL in online research methods teaching significantly enhanced engagement, especially for learners with access needs. Similarly, Cinquin, Guitton and Sauzéon (2020) argue for integrating cognitive accessibility features into MOOCs to improve participation and learning outcomes.
At the same time, there is a balance to be struck. Within Graphic Design education, the use of discipline-specific terminology plays an important role in enabling precise and critical discussion. Rather than removing such terminology entirely, the intervention promotes the use of accessible definitions and contextualised and interactive glossaries to support student understanding. This approach aims to retain the richness of academic discourse while removing unnecessary barriers to entry. Applying these combined strategies in course design supports a learning environment that is both equitable and intellectually rigorous.
Reflection
My thinking has been shaped by both pedagogical theory and lived teaching experience. Conversations with colleagues on the PgCert in Academic Practice affirmed the need for clearer content and highlighted potential for broader application. Starting with a single week of content will allow me to test the intervention without overextending its initial scope.
An early version of the intervention, shared through a peer presentation (Figure 1), helped clarify my goals and sparked valuable feedback. It also helped me articulate my design intentions more clearly and allowed me to gather useful visual references from the Moodle pages of another unit, which is still in development but indicative of how my content will look within the VLE (Huber, 2025).
Fig. 1
I also drew inspiration from colleagues working around Moodle’s limitations. A compelling example was Adrian Allen’s Miro-based visual overview of a course (Figure 2) which offers an intuitive content map to replace Moodle’s rigid structure. This approach helped me reimagine navigation not just as technical infrastructure, but as pedagogy (Allen, 2025).

Fig. 2
Looking beyond UAL, the interface of the University of Warwick’s “Literature and Mental Health” course on FutureLearn (Figure 3) provided an example of strong UX principles in practice—especially its consistent pacing, visual hierarchy, and accessible layout (FutureLearn, 2025). It reinforced the idea that digital environments are not neutral but deeply pedagogical.

Fig. 3
Each of these examples shaped how I approached my intervention: the importance of clarity (Figure 1), spatial and narrative logic (Figure 2), and inclusive interface design (Figure 3). Together, they affirmed that visual design is not decorative, but foundational to accessible and effective online learning.
I began my intervention design by looking primarily at plain language and content structure. However, after being prompted by my peers and tutors, I am now interested in expanding the intervention to consider technical tools and APIs that could enhance accessibility—for instance, the ability to adjust type size, toggle colour modes for contrast sensitivity, or integrate plugins for sign language or screen readers. While these go beyond the immediate scope of the unit redesign, they point to a wider ecosystem of accessible practices that could transform how we approach online learning.
There is, however, significant institutional resistance. There are licensing constraints, technical limitations, and competing priorities that impede more ambitious innovation. This project is not intended as a critique of past or current design decisions—which have necessarily been shaped by the limits of time, technology, budget, and institutional style guides. Rather, the intervention is intended to imagine what the future of accessible online learning could look like, and explore UAL’s potential to lead in this area.
Action
To deepen the intervention’s relevance and usability, I plan to involve a small group of students in a participatory design process. Drawing on “deep data” methods (peer feedback, 2025), this process will surface detailed insights from a diverse group—some with access needs, others representing more typical learners. This balanced group can identify pain points in the current content design and test the revised materials. As the EDI Champion of the LCC Design School, I also have access to UAL’s EDI student forum, a potential recruitment space for this pilot cohort.
This kind of inclusive co-design aligns with well-established UX and inclusive design principles and would allow me to move from assumptions about accessibility to co-created solutions. It would also enable the development of clear case studies and redesign examples—evidence that can be shared with colleagues and advocates to support wider institutional change. I hope the toolkit would then ultimately inform internal training or form part of a practical accessibility guide, complementing broader initiatives like UAL's digital accessibility campaigns (UAL, 2025) or the ALT's ethical learning technology framework (ALT, 2024).
Initially, I will pilot the intervention by redesigning one week within a unit in the MA online Graphic Design course, but I could envision the framework becoming a shared resource across UAL's Design School—something accessible to other course teams seeking to embed accessibility from the outset. At its core is a framework for plain language and structured content that I will refine and document as a practical toolkit. This intervention would ideally not remain a one-off redesign of course content, but could instead form the basis for a sustainable shift in how online learning materials are produced.
Evaluation
This process has already illuminated several key lessons. First, I've learned that accessibility interventions require institutional negotiation. While individual course leaders may advocate for inclusive design, the limits of platforms like Moodle (e.g. inflexible navigation, poor visual affordance, lack of responsive design) can undermine these intentions. Feedback from peers underscored that many of these issues are shared across courses: several course teams have pivoted to other platforms to compensate for Moodle's shortcomings, and several colleagues offered examples of local Moodle redesigns.
Second, the process has helped me clarify what kinds of evidence are most impactful. Quantitative student satisfaction data may not capture the nuances of exclusion. By contrast, qualitative insights—user journeys, quotes, screen recordings—can make visible the frustrations and cognitive effort required to navigate poorly designed systems. These “deep data” methods provide a compelling case for change.
If implemented, I would measure success in three ways: (1) through student feedback on usability and clarity of the revised content, (2) through changes in student engagement (e.g. completion rates, interaction with materials), and (3) through feedback from academic support specialists or EDI stakeholders. Over time, if other course teams began adapting the toolkit or requesting guidance, that would be a further marker of success.
Conclusion
This project has brought into focus how my positionality is shifting – from a face-to-face lecturer to an online course designer, and from individual practitioner to potential advocate for structural change. Teaching online alters the terms of engagement; without the nuance of classroom interaction, the written word, visual design, and navigation become primary pedagogical tools. This creates both a challenge and an opportunity; course materials can either be a barrier or a bridge.
The move toward digital education has intensified questions around access and equity. Students who are neurodivergent, have learning differences, or come from linguistically diverse backgrounds are disproportionately affected by poor content design (Cinquin, Guitton and Sauzéon, 2019; Seymour, 2024). Yet these same students often lack the power to shape how content is delivered. My intervention is a small but strategic step toward rebalancing that dynamic.
The feedback I received was affirming but also critical in the best sense – challenging me to deepen the participatory element, consider wider institutional applicability, and balance the needs of “extreme” and “average” users. I now see this not just as a personal practice shift but as a potential catalyst for a larger conversation around inclusive digital pedagogy. As such, my goal is not perfection but transformation: creating a culture where cognitive accessibility is a shared, embedded practice rather than an afterthought.
References
Allen, A. (2025) Course map for Central Saint Martins programme [Miro board], 27 June. Available at: https://miro.com/app/board/uXjVLZHWM8Y=/ (Accessed: 25 July 2025).
ALT (2024) New Resources: ALT’s Framework for Ethical Learning Technology. Association for Learning Technology. Available at: https://www.alt.ac.uk/news/all_news/new-resources-alts-framework-ethical-learning-technology (Accessed: 25 July 2025).
CAST (2018) Universal Design for Learning Guidelines Version 2.2. Available at: http://udlguidelines.cast.org (Accessed: 5 June 2025).
Cinquin, P.-A., Guitton, P. and Sauzéon, H. (2019) ‘Online e-learning and cognitive disabilities: A systematic review’, Computers & Education, 130, pp. 152–167. https://doi.org/10.1016/j.compedu.2018.12.004
Cinquin, P.-A., Guitton, P. and Sauzéon, H. (2020) ‘Designing accessible MOOCs to expand educational opportunities for persons with cognitive impairments’, Behaviour & Information Technology, 40(11), pp. 1101–1119. https://doi.org/10.1080/0144929X.2020.1740485
FutureLearn (2025) Literature and Mental Health course interface [online screenshot]. University of Warwick via FutureLearn (Accessed: 25 July 2025).
Huber, A. (2025) Intervention Design [PDF presentation], delivered online, 27 June 2025.
Lasekan, O.A., Pachava, V., Godoy Pena, M.T., Golla, S.K. and Raje, M.S. (2024) ‘Investigating factors influencing students’ engagement in sustainable online education’, Sustainability, 16(2), p. 689. https://doi.org/10.3390/su16020689
Redish, J. (2010) Letting Go of the Words: Writing Web Content that Works, 2nd edn. San Francisco: Morgan Kaufmann.
Seymour, M. (2024) ‘Enhancing the online student experience through the application of Universal Design for Learning (UDL) to research methods learning and teaching’, Education and Information Technologies, 29(3), pp. 2767–2785. https://doi.org/10.1007/s10639-023-12357-z
UAL (2025) Quick Tips to Improve Accessibility in Miro. UAL Teaching and Learning Exchange Blog, 15 May. Available at: https://support.myblog.arts.ac.uk/2025/05/15/quick-tips-to-improve-acces