
ARP – Blog Post 3: ACTING


Research methods and data collection

This stage of the action research project prompted deeper reflection on how knowledge is produced through research. Rather than viewing data collection as a neutral or technical phase, I became increasingly aware that the framing of questions, the structure of conversations, and my own position as a practitioner-researcher actively shaped what could be said – and what could remain unsaid. Kvale and Brinkmann (2009) describe qualitative interviewing as a craft in which knowledge is not discovered but co-constructed through dialogue. Similarly, Alvesson (2012) cautions against treating interviews as transparent windows into experience, arguing that interview data are always shaped by interaction, context, and institutional positioning. This understanding became central to how I approached both the semi-structured interviews and the written questionnaire. Rather than attempting to extract information, the aim was to create conditions in which participants could reflect on their own practices, assumptions, and constraints.

This perspective aligns with Denscombe’s (2014) view that practitioner research methods must be responsive to context, relationships, and purpose rather than relying on standardised procedure. A set of five open-ended questions formed the basis of the enquiry. These were developed iteratively through group tutorials and refined through peer and tutor feedback, supporting clarity and focus while retaining openness. As a second-language English speaker, I also used digital tools to sense-check phrasing, ensuring accessibility without altering the intent of the questions. This process heightened my awareness that linguistic clarity is not only an ethical consideration, but also a methodological one: poorly framed questions risk excluding nuance before analysis even begins.

The decision to keep the questions open-ended was deliberate. Kvale and Brinkmann (2009) emphasise that open questioning allows participants to construct meaning in their own terms, while Cohen, Manion and Morrison (2018) argue that semi-structured interviews offer a balance between direction and flexibility. In practice, this approach shifted the balance of control within the interviews. Participants frequently reframed questions, introduced new concerns, and reflected on tensions between pedagogical intention and institutional constraint. Irvine et al. (2012) note that such moments – where participants negotiate the adequacy of their responses – are not interruptions to the research process, but central to how meaning is collaboratively produced. These interactions became significant sites of insight, often revealing issues that had not been anticipated during the planning stage.

The same question set was adapted into an open-ended questionnaire to enable asynchronous participation. While this approach increased accessibility and respected colleagues’ time constraints, it also revealed clear methodological limitations. Written responses tended to be shorter, less reflective, and more descriptive than interview data. This contrast highlighted how depth in qualitative research is not simply a function of questioning, but of relational space. Without conversational cues, prompts, or opportunities for clarification, participants appeared more likely to prioritise efficiency over reflection. This observation reinforced the idea that questionnaires, while useful for breadth, are less effective for accessing layered pedagogical reasoning in small-scale practitioner research.

Alongside participant-generated data, I conducted a visual artefact review of Moodle and alternative platforms including Miro, Padlet, and Notion. This review was prompted by a recurring pattern across interviews: colleagues frequently described “pivoting away” from Moodle when attempting to enact their pedagogy. The artefact analysis therefore sought to understand why these shifts occurred, not at the level of preference, but through interface structure. Drawing on Bowen’s (2009) conception of document analysis, digital platforms were treated as institutional texts, embedding assumptions about hierarchy, sequencing, and learner behaviour. Reviewing these environments made visible a distinction repeatedly referenced by participants: the difference between verbal cognitive load (instructions, explanations, written guidance) and visual cognitive load (navigation depth, spatial organisation, and information density). This contrast clarified how learning friction often emerged not from content itself, but from the effort required to locate, interpret, and contextualise it.

Examining artefacts alongside interview and questionnaire data supported methodological triangulation, while also revealing productive tensions between what educators articulated verbally and what platforms materially afforded. This process underscored that pedagogy is not only expressed through teaching intention, but enacted – and at times constrained – through interface design.

Throughout this stage, reflexivity remained central. As a practitioner researching within my own institution, shared language and professional proximity enabled openness, yet also risked normalising constraint. Recognising this tension became part of the enquiry itself. Rather than striving for detachment, I came to understand my situated perspective as shaping both the questions asked and the issues that felt most urgent. This stage therefore generated more than data for analysis: it deepened my understanding of qualitative research as a relational, situated practice and shaped how I approached the subsequent phase of thematic and artefact-based interpretation.

References
Alvesson, M. (2012) ‘Views on interviews: a skeptical review’, Qualitative Research in Organizations and Management, 8(1), pp. 23–44.
Bowen, G. A. (2009) ‘Document analysis as a qualitative research method’, Qualitative Research Journal, 9(2), pp. 27–40.
Cohen, L., Manion, L. and Morrison, K. (2018) Research methods in education. 8th edn. London: Routledge.
Denscombe, M. (2014) The good research guide: for small-scale social research projects. 5th edn. Maidenhead: Open University Press.
Irvine, A., Drew, P. and Sainsbury, R. (2012) ‘“Am I not answering your questions properly?” Clarification, adequacy and responsiveness in semi-structured telephone and face-to-face interviews’, Qualitative Research, 13(1), pp. 87–106.
Kvale, S. and Brinkmann, S. (2009) InterViews: learning the craft of qualitative research interviewing. 2nd edn. London: Sage.


Figure 1: Visual summary of the iterative development of semi-structured interview questions, showing the progression from initial self-generated prompts through tutor and peer feedback to final refinement, including AI-supported sense-checking to improve clarity, focus, and accessibility (Huber, 2025).

Figure 2: Screenshot of anonymised interview transcript (Transcript 1), showing cleaned formatting with line breaks applied to support readability and subsequent thematic analysis (Huber, 2025).

Figure 3: Screenshot of anonymised interview transcript (Transcript 2), prepared for analysis through formatting refinement and improved visual legibility (Huber, 2025).

Figure 4: Screenshot of Microsoft Forms questionnaire design used to collect qualitative staff responses for this action research project (Huber, 2025).

Figure 5: Screenshot of Microsoft Forms interface displaying anonymised questionnaire responses collected during the enquiry phase (Huber, 2025).

Figure 6: Screenshot of Excel spreadsheet summarising qualitative questionnaire responses to support thematic analysis (Huber, 2025).

Figure 7: Screenshot of an anonymised Miro board from a residential art and design course, showing a full unit mapped within a single shared visual workspace to support overview, continuity, and non-linear learning pathways (Huber, 2025).

Figure 7 (detail): Close-up view of the same anonymised Miro board, illustrating clustered activities, iterative making, and reflective stages presented simultaneously within one visual field (Huber, 2025).

Figure 8: Screenshot of an anonymised Notion workspace used to reorganise all learning materials from a single unit into an alternative interface, demonstrating how content can be presented within a continuous white-background environment to support readability, visual coherence, and navigational clarity (Huber, 2025).
