
ARP – Blog Post 3: ACTING


Research methods and data collection

This stage of the action research project prompted deeper reflection on how knowledge is produced through research. Rather than viewing data collection as a neutral or technical phase, I became increasingly aware that the framing of questions, the structure of conversations, and my own position as a practitioner-researcher actively shaped what could be said – and what could remain unsaid. Kvale and Brinkmann (2009) describe qualitative interviewing as a craft, where knowledge is not discovered but co-constructed through dialogue. Similarly, Alvesson (2012) cautions against treating interviews as transparent windows into experience, arguing that interview data are always shaped by interaction, context, and institutional positioning. This understanding became central to how I approached both the semi-structured interviews and the written questionnaire. Rather than attempting to extract information, I aimed to create conditions in which participants could reflect on their own practices, assumptions, and constraints.

This perspective aligns with Denscombe’s (2014) view that practitioner research methods must be responsive to context, relationships, and purpose rather than relying on standardised procedure. A set of five open-ended questions formed the basis of the enquiry. These were developed iteratively through group tutorials and refined through peer and tutor feedback, supporting clarity and focus while retaining openness. As a second-language English speaker, I also used digital tools to sense-check phrasing, ensuring accessibility without altering the intent of the questions. This process heightened my awareness that linguistic clarity is not only an ethical consideration, but also a methodological one: poorly framed questions risk excluding nuance before analysis even begins.

The decision to keep the questions open-ended was deliberate. Kvale and Brinkmann (2009) emphasise that open questioning allows participants to construct meaning in their own terms, while Cohen, Manion and Morrison (2018) argue that semi-structured interviews offer a balance between direction and flexibility. In practice, this approach shifted the balance of control within the interviews. Participants frequently reframed questions, introduced new concerns, and reflected on tensions between pedagogical intention and institutional constraint. Irvine et al. (2012) note that such moments – where participants negotiate the adequacy of their responses – are not interruptions to the research process, but central to how meaning is collaboratively produced. These interactions became significant sites of insight, often revealing issues that had not been anticipated during the planning stage.

The same question set was adapted into an open-ended questionnaire to enable asynchronous participation. While this approach increased accessibility and respected colleagues’ time constraints, it also revealed clear methodological limitations. Written responses tended to be shorter, less reflective, and more descriptive than interview data. This contrast highlighted how depth in qualitative research is not simply a function of questioning, but of relational space. Without conversational cues, prompts, or opportunities for clarification, participants appeared more likely to prioritise efficiency over reflection. This observation reinforced the idea that questionnaires, while useful for breadth, are less effective for accessing layered pedagogical reasoning in small-scale practitioner research.

Alongside participant-generated data, I conducted a visual artefact review of Moodle and alternative platforms including Miro, Padlet, and Notion. This review was prompted by a recurring pattern across interviews: colleagues frequently described “pivoting away” from Moodle when attempting to enact their pedagogy. The artefact analysis therefore sought to understand why these shifts occurred, not at the level of preference, but through interface structure. Drawing on Bowen’s (2009) conception of document analysis, digital platforms were treated as institutional texts, embedding assumptions about hierarchy, sequencing, and learner behaviour. Reviewing these environments made visible a distinction repeatedly referenced by participants: the difference between verbal cognitive load (instructions, explanations, written guidance) and visual cognitive load (navigation depth, spatial organisation, and information density). This contrast clarified how learning friction often emerged not from content itself, but from the effort required to locate, interpret, and contextualise it.

Examining artefacts alongside interview and questionnaire data supported methodological triangulation, while also revealing productive tensions between what educators articulated verbally and what platforms materially afforded. This process underscored that pedagogy is not only expressed through teaching intention, but enacted – and at times constrained – through interface design.

Throughout this stage, reflexivity remained central. As a practitioner researching within my own institution, shared language and professional proximity enabled openness, yet also risked normalising constraint. Recognising this tension became part of the enquiry itself. Rather than striving for detachment, I came to understand my situated perspective as shaping both the questions asked and the issues that felt most urgent. This stage therefore generated more than data for analysis: it deepened my understanding of qualitative research as a relational, situated practice and shaped how I approached the subsequent phase of thematic and artefact-based interpretation.

References
Alvesson, M. (2012) ‘Views on interviews: a skeptical review’, Qualitative Research in Organizations and Management, 8(1), pp. 23–44.
Bowen, G. A. (2009) ‘Document analysis as a qualitative research method’, Qualitative Research Journal, 9(2), pp. 27–40.
Cohen, L., Manion, L. and Morrison, K. (2018) Research methods in education. 8th edn. London: Routledge.
Denscombe, M. (2014) The good research guide: for small-scale social research projects. 5th edn. Maidenhead: Open University Press.
Irvine, A., Drew, P. and Sainsbury, R. (2012) ‘“Am I not answering your questions properly?” Clarification, adequacy and responsiveness in semi-structured telephone and face-to-face interviews’, Qualitative Research, 13(1), pp. 87–106.
Kvale, S. and Brinkmann, S. (2009) InterViews: learning the craft of qualitative research interviewing. 2nd edn. London: Sage.


Figure 1: Visual summary of the iterative development of semi-structured interview questions, showing the progression from initial self-generated prompts through tutor and peer feedback to final refinement, including AI-supported sense-checking to improve clarity, focus, and accessibility (Huber, 2025).

Figure 2: Screenshot of anonymised interview transcript (Transcript 1), showing cleaned formatting with line breaks applied to support readability and subsequent thematic analysis (Huber, 2025).

Figure 3: Screenshot of anonymised interview transcript (Transcript 2), prepared for analysis through formatting refinement and improved visual legibility (Huber, 2025).

Figure 4: Screenshot of Microsoft Forms questionnaire design used to collect qualitative staff responses for this action research project (Huber, 2025).

Figure 5: Screenshot of Microsoft Forms interface displaying anonymised questionnaire responses collected during the enquiry phase (Huber, 2025).

Figure 6: Screenshot of Excel spreadsheet summarising qualitative questionnaire responses to support thematic analysis (Huber, 2025).

Figure 7: Screenshot of an anonymised Miro board from a residential art and design course, showing a full unit mapped within a single shared visual workspace to support overview, continuity, and non-linear learning pathways (Huber, 2025).

Figure 7 (detail): Close-up view of the same anonymised Miro board, illustrating clustered activities, iterative making, and reflective stages presented simultaneously within one visual field (Huber, 2025).

Figure 8: Screenshot of an anonymised Notion workspace used to reorganise all learning materials from a single unit into an alternative interface, demonstrating how content can be presented within a continuous white-background environment to support readability, visual coherence, and navigational clarity (Huber, 2025).


ARP – Blog Post 2: PLANNING → ACTING



Defining the research focus: questions, constraints and ethical positioning

Following the identification of cognitive accessibility as a pedagogical and social justice concern in the previous blog post, the next stage of the action research project focused on developing research questions that were both purposeful and feasible within my teaching context. Moving from a broad professional concern to a structured enquiry required careful reflection on how research questions function within practice-based research.

Gray and Malins (2004) describe research questions as devices for focusing enquiry rather than statements of intent or prediction. They argue that effective questions should remain open, exploratory, and capable of evolving as understanding develops. This perspective was particularly relevant to my project, as my initial concerns about online learning environments were grounded in lived teaching experience rather than clearly bounded problems.

Engaging with this literature helped me recognise the need to narrow the enquiry without closing it down. Early iterations of my questions risked becoming either too broad (attempting to address accessibility at an institutional level) or too solution-focused, implying improvement before sufficient understanding had been developed. Gray and Malins caution against research questions that seek outcomes too early, instead advocating for enquiry that enables discovery rather than confirmation. This understanding was reinforced by White (2009), who frames research questioning as a form of disciplined curiosity. She argues that uncertainty plays a productive role in research, allowing understanding to emerge gradually rather than being fixed at the outset. This framing helped me recognise that refining the questions was part of the enquiry itself, not a preliminary task to be completed before learning could begin. Rather than seeking generalisable findings, the enquiry needed to support reflective learning about my own practice and inform future development of my teaching.

Figure 1: Visual representation of the iterative development of the research question, showing the progression from an initial broad enquiry toward more focused and cognitively accessible research questions, informed by tutor and peer feedback.

At the same time, the institutional context influenced what forms of enquiry were possible. While Unit 1 of the MA Graphic Design (Online) was live at the time of the project, access to the current student cohort was not available, and feedback on the VLE was discouraged at director level at this stage of delivery. This limited the scope of participant involvement and required careful reconsideration of the research design. In response, the project pivoted towards engaging colleagues as participants, recognising their role in shaping learning environments, platform structures, and pedagogical decisions. This adaptation reinforced Gray and Malins’ view of research as situated practice, shaped by real conditions rather than idealised models.

As a result, the enquiry was articulated through two related research questions. Together, these questions allowed my enquiry to address both elements that co-exist in the online teaching context (professional practice and digital design structures) while remaining open-ended and appropriate to an action research framework.

Pedagogical approach
How do educators understand and approach cognitive accessibility when designing online learning experiences?

Platform design
How do the design structures of Virtual Learning Environments and alternative platforms support or constrain clarity, navigation, and cognitive accessibility?

Despite this refinement, I remain aware that research questions continue to evolve as understanding develops (Gray and Malins, 2004; White, 2009). Approaching the enquiry in this way enabled flexibility, responsiveness, and critical reflection, establishing a strong foundation for the data collection stage that followed. Ethical guidance from BERA (2024) and Banks (2016) informed decisions around participation, consent, and anonymity, particularly given my position as an insider researcher. These considerations supported the transition from planning into action within the action research cycle and shaped the methodological choices that followed.

References
Banks, S. (2016) Everyday ethics in professional life. London: Palgrave Macmillan.
BERA (2024) Ethical guidelines for educational research. 5th edn. London: BERA.
Gray, C. and Malins, J. (2004) Visualizing research: a guide to the research process in art and design. Aldershot: Ashgate.
Kemmis, S. and McTaggart, R. (1988) The action research planner. Geelong: Deakin University.
White, P. (2009) Developing research questions. Basingstoke: Palgrave Macmillan.