Lecturing Versus Student-Led Research

Brief Description:

Discover how one educator's experiment comparing traditional lectures to student-led research revealed dramatic differences in learning outcomes and student engagement.

Summary:

This episode tackles a question every educator faces: does traditional lecturing actually work, or should we be empowering students to lead their own learning? We explore a fascinating action research study that compared these two approaches in vocational education, uncovering surprising results about student engagement, writing skills, and exam preparation that could change how you think about teaching.

  • Action Research: Lecturing Versus Student-Led Learning

    (0:00 - 0:12)

    Welcome to the Deep Dive. This is part of MKLC Training's research series, where we explore different investigations into teaching and learning. And today we're looking at a really interesting action research project.

    (0:12 - 1:01)

    It's from a student on the Level 5 Diploma in Education and Training. Yeah, and this one tackles something I think a lot of us grapple with, especially if you're in, you know, vocational training. It's that whole question of how you get learners, particularly in practical subjects, to really hold on to the theory. Exactly.

    And not just the theory, but also developing those essential skills, like writing for assessments, which is becoming even more crucial with things like the new T-levels coming in. Absolutely. So this research is all about that.

    The educator felt, well, just lecturing wasn't really doing the job, wasn't embedding the knowledge or getting learners ready for the kinds of assessments they face now. Right. So they decided to try something different, move away from the sort of stand and deliver lecture towards a more student-led approach, getting the learners doing the research themselves.

    (1:02 - 1:39)

    OK, so let's dig into the specifics. The project title itself is, well, it's pretty direct, isn't it? It really is. It's called Lecturing Does Not Seem to Embed Knowledge and Prepare Learners for Endpoint Assessments.

    So we are going to use project-based, student-led research learning to see if the learners have better retention of knowledge and are better prepared for endpoint assessments. Phew. Yeah, leaves little doubt about the focus.

    So the author, we know they're a further education teacher. Yes, teaching electrical installation in a building services department. We don't have their name from the summary, but you can definitely feel that practical focus behind the question they're asking.

    (1:40 - 1:58)

    And the main aim then was basically to see if this shift, this student-led research idea actually worked. Did it, you know, improve knowledge retention? Did it help with writing? Did it make them feel better prepared for assessments, whether that's projects or exams? Precisely. And critically, they wanted to compare.

    (2:00 - 2:28)

    Does this new approach work better specifically for T-level learners compared to learners on a more traditional technical qualification who are still being taught mainly through lectures? That comparison is key. OK, so they used action research to structure this. Can we just quickly touch on what that involves based on this summary? Sure.

    Action research, fundamentally, it's a cycle, isn't it? It's what educators use to reflect on and improve what they're doing. You spot an issue or something you want to change. Right.

    (2:28 - 2:44)

    You plan an intervention like this new teaching method. You put it into practice. You collect data to see what's happening.

    You analyse that data and then you reflect: what worked, what didn't, what next. It's very practice-focused. And the specific method here was that comparison you mentioned earlier.

    (2:44 - 3:00)

    Yeah. One group gets the new research style, the other gets the lectures. Exactly.

    The T-level group got the student-led research approach, and the technical qualification group got the traditional lecturing. So how did they measure the difference? What kind of data were they collecting? They were quite thorough, actually. They used a few different methods.

    (3:01 - 4:09)

    First, written assignments. They looked at the word count learners produced before they started the new approach and then again after the intervention period. OK, so some hard numbers there.

    Quantitative data. Yep. But numbers don't tell the whole story, do they? So they also did observations, watching the learners in class.

    How engaged were they? What were their behaviours like under each teaching style? How were they using their time? That gives you the qualitative side, the how and why. Makes sense. And anything else? Yes.

    They also had formal conversations with the learners, actually talking to them to get their perspective. What did they feel worked? What did they prefer? What challenges did they face? What skills did they think they were gaining? Right. Getting the student voice in there.

    So written work, observations, and student feedback. A good mix. Definitely.

    And they tried hard to make the comparison fair, you know, keeping things consistent between the groups. How so? Well, things like the topics being taught, the classroom environment, the amount of time they had for tasks, the resources available, all kept as similar as possible. And the class sizes were the same too, 15 learners in each group.

    (4:09 - 4:28)

    And crucially, I suppose, thinking about their starting points. Especially with writing. Absolutely.

    The researcher was aware of that. They looked at the learners' prior results, specifically their GCSE English grades, when picking the groups. The idea was to start with two groups who were, broadly speaking, at a similar level in terms of literacy.

    (4:28 - 4:40)

    So any differences they saw later were more likely down to the teaching method, not just because one group started off stronger. That was the intention, yes. To isolate the impact of the teaching style as much as possible.

    (4:40 - 4:50)

    OK, great. So the setup is clear. Comparative action research, T-level versus technical, research-led versus lectures, measured through word counts, observations, and conversations.

    (4:51 - 5:12)

    Let's get to the juicy bit, the findings. What did those written assignments show? The quantitative stuff first. Right.

    This is where it gets quite striking. Initially, before the intervention, the word counts were pretty similar for both groups. Most learners were handing in work that was, say, 100 to 200 words long, up to about 260 words at the most.

    (5:12 - 5:31)

    OK, so starting on a fairly even footing. Exactly. But then, after the period of different teaching, they did another written assessment, and the results were, well, very different.

    The technical group, the ones who had the lectures, they did improve a bit. Their average word count went up by about 100 words. So some progress.

    (5:32 - 5:40)

    Some, yes. But the researcher noted they really struggled with it. They found it hard work, seemed quite disengaged, and apparently weren't keen on writing more than about 300 words.

    (5:41 - 5:46)

    It felt like pushing water uphill, maybe. OK. And the T-level group, the ones doing the research-led learning.

    (5:46 - 5:57)

    Ah, now that was a completely different picture. Their improvement was much, much bigger. The author observed they stayed focused during the research and writing, and actually got better, more efficient at the research part over time.

    (5:57 - 6:40)

    Most of them were producing work that was 700 to 800 words long. Wow, 700 to 800, compared to maybe 300-ish for the other group. Yeah.

    And there was even one learner who started off writing about 370 words, and in the post-assessment, they managed 1,000 words. 1,000 words. That's a huge jump, from under 300, typically, to 700, 800, even 1,000.

    I mean, that quantitative data on its own is pretty powerful, isn't it? Suggests the research approach really unlocked something in terms of their writing output. It's compelling evidence, certainly. But like we said, the observations and the conversations help explain why we might be seeing that difference, what was actually happening in the classroom.

    (6:40 - 7:59)

    Right. What did the observations reveal about how the two groups reacted? Well, it showed a really interesting contrast. The T-level learners, the ones doing the new research style, they found it tough at first.

    The author notes they weren't used to finding information for themselves. They expected the teacher to just give it to them. It apparently took them a couple of weeks to adjust and get into the swing of it.

    That's understandable. It's a different way of working. It is.

    But then the observation was that they started taking more ownership. Initially, yeah, there was maybe a bit of just copying and pasting stuff from the internet, which you might expect. Sure.

    The easy route. But over the weeks, they got better. They started actually using the information they found for their projects, putting it together.

    And importantly, when the tutor prompted them with questions, they could think more deeply. They moved beyond just describing things to actually analysing and evaluating the information. Having PCs available in the theory lessons was noted as being pretty important for this.

    So a definite development towards more independent, higher level thinking. What about the technical group, the ones with the lectures? They settled in quite well at the beginning. It was familiar, like school, you know, they took notes, behaviour was fine.

    But then around week three or four, things started to dip. Observations showed them becoming more disengaged, taking fewer notes. Losing focus.

    (7:59 - 8:20)

    Seems like it. And when the teacher tried to push them a bit, get them to think for themselves or do tasks that required processing information rather than just receiving it, they tended to get distracted, go off topic. The researcher specifically mentioned that when it came to the actual task for data collection, the research and writing bit, this group really struggled to stay on track and produce the work.

    (8:20 - 8:43)

    That's a really stark difference in observed behaviour. Makes you wonder what the learners themselves were thinking. What came out of those formal conversations? Yeah, that adds another layer.

    For the T-level learners, the feedback on the research project style was mostly positive. They said it kept them more engaged. They felt they were covering the content more thoroughly because they were actively involved in finding and discussing it.

    (8:43 - 8:58)

    Did they mention any downsides? Any difficulties with it? They did. They were honest. They pointed out that because they were sometimes researching different bits individually, they relied on sharing info with peers, which meant they might have to go back later and fill in gaps for themselves.

    (8:59 - 9:19)

    Some also admitted feeling a bit overwhelmed sometimes by just how much information was out there. OK, fair enough. But, and this is key, they felt they were developing skills beyond just the course content, like research, putting information together, and they felt this made them much better prepared for the kinds of assessments they knew they were facing on the T-level.

    (9:19 - 9:32)

    Right. And the technical learners, what did they say in their chats? They appreciated getting the core knowledge directly from the tutor in the lectures. They felt confident in that information, but they also said the lessons could be boring if there weren't enough activities.

    (9:32 - 9:39)

    Ah, the engagement factor again. Exactly. And when it came to tasks that required independence, they openly said they struggled.

    (9:40 - 10:16)

    Some actually said they felt they just weren't capable of working in that T-level style, the research and independent writing way. They lacked confidence in finding information themselves or taking responsibility for tasks that weren't clearly laid out by the tutors. Wow.

    So you've got the word count showing a huge difference, the observations showing different levels of engagement and independence, and the learners' own words reflecting feelings of capability, preference, and preparedness. It all seems to point towards the idea that the method really suited the qualification type, doesn't it? That was very much the author's interpretation, yes. A really central finding.

    (10:17 - 10:29)

    The research-led learning style seemed tailor-made, almost, for the T-level employer-set projects. Those assessments demand research, synthesis, detailed writing, applying knowledge. The teaching method was directly developing those skills.

    (10:29 - 10:58)

    And the lecturing? The lecturing, while maybe good for getting across foundational facts needed for recall, seemed to align better with the assessment style of the technical qualification in this instance, which leaned heavily on multiple-choice tests. Good for knowledge transfer, maybe, but not so much for building those independent research and writing skills needed elsewhere. So it's not just about what you teach; how you teach it can actively shape the skills learners develop, and whether they're ready for specific types of assessment.

    (10:58 - 11:13)

    That's a really important point for educators, isn't it? It really is. And another interesting aspect of action research is how it makes the educator reflect on their own practice. The author here found some unexpected plus points for themselves and their team.

    (11:13 - 11:28)

    Oh, really? Like what? Well, they found that using the research-led approach actually made them better at questioning. Instead of just telling, they had to ask more probing questions, encouraging learners to describe, analyse, evaluate the information they found. It pushed their own skills.

    (11:28 - 11:32)

    That's brilliant. So it benefited the teacher's development too. Exactly.

    (11:32 - 11:47)

    And because they weren't just standing at the front talking the whole time, it opened up space during lessons for more one-to-one support with students, which they felt improved relationships. They could be more of a guide, a facilitator. More coaching, less lecturing.

    (11:47 - 11:56)

    Precisely. It also apparently made lesson delivery itself less demanding on the tutor, weirdly enough. Less pressure to perform, maybe.

    (11:56 - 12:08)

    And they also mentioned that creating the resources for this new T-level approach kind of gave the whole teaching team a bit of a refresh, a creative boost. Those are fantastic side benefits. But realistically, there must have been challenges too.

    (12:08 - 12:16)

    What difficulties did they run into? Oh, definitely. They highlighted a few things. We already touched on the fact that it took learners time to adjust to this new way of working.

    (12:17 - 12:25)

    That initial phase wasn't necessarily smooth sailing. And practically, managing a room full of students on computers. Yep, that came up.

    (12:25 - 12:34)

    More need for classroom management to keep everyone on task, minimise the inevitable distractions of the internet. It's just a reality. And the marking.

    (12:35 - 12:41)

    With the T-level group writing so much more. Huge impact. A significantly increased marking load.

    (12:41 - 12:54)

    The author felt real pressure to give timely, useful feedback, especially as it was project-based work. But just the volume of marking made that tough. Getting feedback done quickly before the class moved on was a bottleneck.

    (12:54 - 13:05)

    Very relatable challenges, I suspect. Now, good research acknowledges its limits. Did the author reflect on the limitations of this particular study? Yes, they were quite clear on that.

    (13:05 - 13:19)

    It was small scale, only 15 learners per group. So while the comparison was useful within that context, you can't necessarily say the findings would apply everywhere. They also pointed out it was done with 16-18-year-olds who had pretty similar starting English grades.

    (13:19 - 13:31)

    Right, so it might not work the same way with different groups. Exactly. The author specifically wondered if the results would be different with, say, lower-level learners, maybe those resitting English, who might need more foundational support.

    (13:31 - 13:47)

    Or perhaps adult learners who bring different experiences and motivations to the table, acknowledging that scope is important. Definitely. So having done this cycle of research, what's next? What did the author recommend? Well, the first thing is to keep an eye on the actual summative assessment results for both groups.

    (13:47 - 13:59)

    See how these teaching styles ultimately play out in their final grades and assessment outcomes. That's the crucial long-term data. In terms of teaching practice... They want to explore finding a kind of middle ground.

    (13:59 - 14:10)

    Try blending elements of the research-led style with traditional lecturing. See if they can get the best of both worlds, maybe. Build knowledge recall and those research and writing skills together.

    (14:10 - 14:41)

    Which raises a really interesting research question itself. It does. How do you specifically target the skills that seemed weaker in each approach? For the T-level group, how do you explicitly build exam technique, if they need it, without losing the benefits of the research focus? And for the technical group, how do you effectively weave in writing and research practise without hindering their success in the recall-based exams they face? It really sounds like this isn't the end of the investigation, but just the first step in an ongoing process.

    (14:41 - 14:56)

    That's the heart of action research, isn't it? It's cyclical. The plan is to repeat the cycle, look at the whole year's results when they're final, and use that to further refine the teaching strategies for both types of qualification. Which brings us neatly to a final thought for you, the listener.

    (14:57 - 15:52)

    If you're involved in teaching, training, or professional development, this deep dive really highlights something important: how much teaching methods shape not just knowledge, but those fundamental skills, research, writing, independence, confidence. Yeah, it really makes you ask, doesn't it? Are we teaching mainly to the assessment, designing everything around the final test or project format? Or are we trying to build a wider set of skills, critical thinking, research, that will serve learners longer term, even if they aren't the main thing being tested right now? Based on this research, how much should the specific assessment drive our day-to-day teaching methods? And can we find ways, maybe that middle ground, to blend approaches? Can we develop that broader skill set for all our learners, whatever course they're on, without putting their success in the required assessments at risk? It's a tricky balance, for sure, but this project gives us a really valuable case study to think about in our own work.