Methodology

Designing the Intervention

To develop a method of data collection that would address my research question, I mapped where I was in the Action Research Cycle. I had already made observations, both first hand during workshops I had taught previously and through informal conversations with colleagues (Initial Observations). Reflecting on these, alongside the research I had done, I designed an intervention to encourage students to learn through active observation. My research question asks whether the introduction of non-verbal supporting material into my teaching practice will improve students’ learning. How would I be able to measure this?

When considering the methodology for my ARP, I identified the following:

  1. Research population: the students from the undergraduate programs at LCF. This group, mainly first and second year students, would be accessible to me during the time frame of the ARP, as it coincided with the scheduled technical workshops that my team delivers. They would be ideal subjects, as most would be novice learners and more likely to be interested in learning a skill, and they are the group that will benefit from the results of the ARP if those results prove to be positive. Also, as less experienced students, there would be skills with which they were not yet familiar, which I could use to test the intervention.
  2. Sampling method: convenience sampling, a non-probability method, which would be the most practical way of producing data in the short time available. However, being aware of the higher risk of sampling bias when using this technique, I would recruit volunteers from as wide a range of courses and cohorts of students as possible (McLeod, 2023). The skill I chose to teach would result in some self-selection, eliminating students who are already proficient in it, as they would probably choose not to volunteer to take part. This should help to focus the data on the population with the background characteristics I wanted to be testing.
  3. Data collection: both quantitative and qualitative; a mixed methods approach. Collecting data comparing the effectiveness of different methods of lesson delivery presents particular difficulties. It is only possible to teach someone something new once, so I would need to test the different teaching methods on different individuals. However, convenience sampling meant that I would not be working with participants of standardised ability levels, making it difficult to compare identical measurements of each learning outcome. I determined that surveying a large cohort of subjects could mitigate the potential range of differences between participants, establishing a quantitative approach, in the form of a questionnaire, as an appropriate method of data collection. Additionally, to investigate the research question from an Action Research perspective, I considered the social justice element of promoting the participant voice and the co-construction of the learning environment. To collect data about the experience of the participants as they took part in the intervention, a qualitative approach was also necessary, leading to a mixed methodology incorporating open-ended questions on the questionnaire as well as field observations. This would also act to triangulate the data.

Relying on subjective knowledge enables teachers to engage more effectively as researchers in their educational context. Educator-researchers especially rely on this subjective knowledge in educational contexts to modify their data collection methodologies. Subjective knowledge negotiates the traditional research frameworks with the data collection possibilities of their practice, while also considering their unique educational context.

(Clark et al., 2020)

From my experience of working with undergraduate students at LCF, I considered their previous knowledge of sewing, time pressures, interests and attitudes to learning when designing my project. I felt that, to increase the participant response rate, the intervention would need to be tested in a very short, simple workshop, delivered in no more than 10 minutes. I chose a hand sewing technique called a chain stitch loop to demonstrate. This is a skill that looks difficult but, once the technique is understood, can be mastered quite quickly. The demonstration entails a number of steps, and the skill requires the application of tension on the thread, something best understood through doing. After the demonstration, the participants would be given the materials to practice the stitch and could take their sample home with them to add to their technical files if they wished. I anticipated that students would be interested in taking part in the workshop because of the usefulness of the stitch to their practice.

I planned three versions of the workshop to teach the chain stitch:

Control: A live demonstration of how to construct the stitch with step-by-step verbal instructions, then students practice the skill. This would be the control version, as there would be no intervention.

Workshop Materials

Video: First, a video of the demonstration is shown with no verbal accompaniment. After this, the live demonstration with verbal instructions is delivered, then students practice the skill.

Video of how to sew a Chainstitch Belt Loop

PDF: First, a PDF is offered which illustrates the skill in steps without text explanation. After this, the live demonstration with verbal instructions is delivered, then students practice the skill.

Illustrated guide of how to sew a Chainstitch Belt Loop

To measure any increase in ‘noticing’, which I hypothesised would help students in their learning, I identified the learning outcome as the completion of a sample of the chain stitch. I reflected that the more a student noticed, the more they would be able to comprehend the process and thereby independently complete the sample. So the key question to survey would be: after the demonstration, how much additional instruction (teacher assistance) would the participant need to complete the sample? The questionnaire would also pose a number of questions to determine the ability level of the participant, their previous knowledge of the stitch and their understanding of the technique being taught. This would establish some background on each participant’s skill level prior to the intervention and help to contextualise the data collected.

To quantify participants’ responses to questions other than the binary (yes/no) ones, I would use an interval scale of measurement. I designed the scale with 10 grades to give a more nuanced choice of response, because I believed that any increase in competency resulting from the intervention would be small and therefore needed a measurement system sensitive enough to record it.
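
As an illustration of how responses on this 10-grade scale could later be compared across the three workshop versions, here is a minimal sketch. The group names follow the planned workshops, but the scores are invented placeholders rather than data collected in the project.

```python
# Minimal sketch: summarising hypothetical scores from the 10-grade interval scale
# for each workshop version. The scores below are invented placeholders, not the
# data collected during the ARP.
from statistics import mean, stdev

additional_instruction_needed = {
    "Control": [6, 5, 7, 4, 6],  # hypothetical 1-10 ratings of assistance needed
    "Video":   [4, 3, 5, 4, 2],
    "PDF":     [5, 4, 4, 3, 5],
}

for version, scores in additional_instruction_needed.items():
    print(f"{version}: n={len(scores)}, mean={mean(scores):.1f}, sd={stdev(scores):.1f}")
```

A summary of this kind would make any small shift in the amount of assistance needed visible between the control and intervention groups, which is why the finer 10-grade scale mattered.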

I also added some open-ended questions asking for comments on the teaching itself, to create space for participants’ voices, as the experts on their own learning. Multiple methods of data collection aid in triangulating the data during analysis, increasing the validity and understanding of the results (Learning for Action). To increase the response rate, and in keeping with the principles of inclusive practice, I also informed students that they could write in their own language and I would get the responses translated.

Qual enables you to open the world of the ‘small, measurable gain’ [and] see what is happening in there. That may turn out to be another interesting unexpected outcome.

(O’Reilly, 2023)

Testing the Intervention

Having gathered all the necessary equipment and created the supporting material, information sheet and questionnaire (info sheet and questionnaire), I was ready to start surveying. I could not predict the number of participant responses, so I began with the control version of the workshop and, when I had received a good number of responses (22), moved on to testing the video intervention workshops (23 responses) and, finally, the PDF intervention workshops (18 responses).

I asked for voluntary participation by making an announcement about my project at the beginning of a class. The classes consisted of first and second year fashion design students across the menswear, womenswear, pattern cutting, sportswear, and fashion design and development courses. At the end of the class, I gathered the interested students together around a table and delivered the workshop, giving them the forms and questionnaire to complete at the end. Almost all students who took part in the workshop returned a questionnaire.

To help increase response rates, and with an awareness of inclusive practices, I let participants know that if they wanted to write responses in their own language, they would be welcome to and I would have them translated. I did not mention any language in particular so as not to single out any group. In the end, only one answer was returned in Chinese, which a colleague translated for me.

Additionally, taking an ethnographic approach to qualitative data collection and to increase validity through triangulation of the data (Clark et al., 2020), I observed the students as they watched me and as they practiced, giving assistance and answering any of their questions as I would during a technical workshop. I then wrote my notes down from memory after each session was completed (Field Observations). When collecting the completed questionnaires, I kept them grouped by session, enabling me to link my corresponding observations to them during coding.
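
To show how grouping the questionnaires by session supports this triangulation, here is a minimal sketch of pairing each session’s responses with its field observation. The session labels, scores and notes are invented placeholders, not the project’s actual data.

```python
# Minimal sketch of keeping questionnaires grouped by session so that the matching
# field observation can be attached during coding. All values are hypothetical.
sessions = {
    "control_session_1": {
        "questionnaire_scores": [6, 5, 7],  # quantitative responses from that session
        "field_observation": "Hypothetical note: students asked about thread tension.",
    },
    "video_session_1": {
        "questionnaire_scores": [4, 3, 5],
        "field_observation": "Hypothetical note: students recalled steps from the silent video.",
    },
}

# Pairing both data types by session keeps the qualitative context attached to the
# quantitative results, supporting triangulation during analysis.
for label, data in sessions.items():
    print(label, data["questionnaire_scores"], "-", data["field_observation"])
```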

References

McLeod, S. (2023) Sampling Methods in Research: Types, Techniques, & Examples. Simply Psychology.

Clark, J.S. et al. (2020) Action Research. New York: New Prairie Press.

O’Reilly, J. (2023) Email to E-Sinn Soong, 20 October.
