Employability in Childcare Curriculum
Brief Description:
Explore how one childcare educator used evidence-based research to dramatically improve student employment outcomes and course progression rates.
Summary:
This episode challenges educators to move beyond assumptions and embrace evidence-based practice. We examine how a childcare tutor used action research to test whether embedding employability skills into their curriculum actually worked - and the results were remarkable. From a 75% increase in student progression to dramatic improvements in job placement rates, discover how systematic inquiry can transform educational outcomes and spark organizational change.
-
[Speaker 2]
Welcome to the Deep Dive, where we sift through the sources to bring you the clearest insights. Today, we're diving into a challenge that's, well, it's pretty common in education, making big changes based on assumptions, maybe ideology or even political wins instead of, you know, solid evidence.
[Speaker 1]
Right. It happens a lot.
[Speaker 2]
Yeah. And it's a big deal. Sir David Bell, who used to head up Ofsted here in the UK, he pointed out how short-term politics can lead to these educational shifts that just, frankly, lack rigorous proof behind them.
[Speaker 1]
The pressure is definitely there.
[Speaker 2]
And it's not just the policymakers, is it? Sometimes educators themselves, right on the front lines, might adopt methods that sound good.
[Speaker 1]
Like learning styles.
[Speaker 2]
Exactly. That was huge for a while, wasn't it? But there wasn't much solid research showing it actually worked the way people thought it did.
We just, we assume things work without the proof sometimes.
[Speaker 1]
And that assumption is exactly what we're digging into today. We're looking at how a certain kind of enquiry, often done by teachers or leaders themselves, can cut through all that guesswork. It gives educators a way to actually test their own methods properly, you know, to see what really works.
[Speaker 2]
Right. So our mission for this deep dive is to unpack a real world example. We're looking at how one educator applied this kind of focused enquiry to improve something really crucial, learner employability in a child care course.
We want to understand how going about it in this targeted, evidence-based way reveals what's effective and, you know, leads to real improvements for the students. Makes sense. Okay.
So let's start there. Why is it so vital to base changes on evidence, not just feelings or what we've always done? Why not just hope for the best?
[Speaker 1]
Well, the thing about this specific way of investigating, this kind of teacher-led enquiry, is that it's fundamentally about looking closely at your own practice. It really pushes educators to think deeply about what's working and what isn't, right there in their own classroom or setting. Its main goal in education is really to explore how to make practice better, shine a light on areas that need a bit of work, and prompt real, thought-through changes.
And it's not just about fixing problems. It helps the educators grow professionally too, become more reflective.
[Speaker 2]
Yeah. And I could see that would feel empowering for the educator. You get to experiment with different approaches.
[Speaker 1]
In a structured way. Yeah.
[Speaker 2]
Right. And you really reflect on what happens. It helps everyone grow, the teacher and the students, and you feel like you own the improvements because you figured it out yourself.
It's not just some top-down thing.
[Speaker 1]
Exactly. And it's not usually just a one-time thing. It's more like a cycle, you know, a continuous process of looking at what you do, thinking about it, and then acting to make it better.
It often gets people working together, sharing ideas and experiences. It helps you clearly see the issues, gives you a kind of step-by-step way to tackle them, and keeps everything professional and fair.
[Speaker 2]
Okay. So that's the why. Let's get into the how.
How did this specific tutor and assessor actually use this approach with their child care learners? They wanted to understand where their Level 2 and Level 3 learners went after the course, right? Their progression routes.
[Speaker 1]
That was the goal. The specific question was, all these employability topics they'd recently started weaving into the course, were they actually working? Were they helping learners get jobs or move on to further study?
[Speaker 2]
A very practical question.
[Speaker 1]
Absolutely. Directly looking at the impact of a change they'd made.
[Speaker 2]
And the group they studied was, what, 26 learners?
[Speaker 1]
Yeah, 26 in total. There was a Monday Level 3 class with 18 learners, and a Thursday Level 2 class with eight. Small enough to really get into the details.
[Speaker 2]
Right. So how did they start?
[Speaker 1]
Okay. So back in November 2023, they gathered the baseline, you know, where everyone was starting from. This meant one-to-one chats, looking at group profiles, checking work placement forms to see who was employed or volunteering at that point.
[Speaker 2]
Setting the stage.
[Speaker 1]
Precisely. They also made sure learners knew their options by emailing out a map of potential progression routes and different choices after the course.
[Speaker 2]
And then came the actual teaching part, the interventions. It sounds like they really embedded this employability stuff throughout the course.
[Speaker 1]
They did. It wasn't just one workshop. They covered CV writing, got learners attending broader employability workshops the organisation offered, discussed CPD, continuous professional development, and reflective practice.
[Speaker 2]
So, practical skills alongside reflection.
[Speaker 1]
Specific course units were adapted to include things like interview skills and how to use online job sites effectively.
[Speaker 2]
Okay.
[Speaker 1]
Learners also did specific online training, like PREVENT, which is about safeguarding. And they talked about their career vision as part of their individual learning plan, their ILP.
[Speaker 2]
That personalised roadmap.
[Speaker 1]
Exactly. And they even gave advice on using UK ENIC, you know, for learners with qualifications from outside the UK to get them recognised here. A really practical touch.
[Speaker 2]
That sounds quite thorough.
[Speaker 1]
Yeah.
[Speaker 2]
What about the ethics? Doing research with your own students needs careful handling.
[Speaker 1]
Definitely. And the researcher was very mindful of this. The priority was balancing the research aims with, you know, doing no harm, keeping things confidential, following all the rules.
They were really clear with the learners about why they were doing the study, emphasising that their input was valuable and that the findings would actually influence decisions. Making them partners, in a way.
[Speaker 2]
That's crucial.
[Speaker 1]
Yeah. And they also anticipated, you know, potential questions, or maybe even a bit of resistance from colleagues about trying new things, being prepared for that.
[Speaker 2]
Right. So they laid the groundwork, ran the interventions. Then came the moment of truth.
The data. How do they actually measure the impact?
[Speaker 1]
They used a mix, which is often a really strong way to do it, both quantitative and qualitative methods.
[Speaker 2]
Numbers and stories.
[Speaker 1]
Basically, yeah. Quantitative data came from online forms, JotForm apparently, focusing on the hard facts: how many were employed, doing voluntary work, or on placements?
That got put into spreadsheets, charts, you know, the numerical side. Then the qualitative side involved one-to-one informal chats, interviews, and just observing things directly. That was about getting the deeper picture.
The learners' motivations, their feelings, their ambitions. The why behind the numbers.
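A minimal sketch, purely for illustration, of what that numerical side could look like: the field names and status labels below are hypothetical, not taken from the actual JotForm survey, but they show how per-learner form responses can be tallied into the kind of counts the study put into spreadsheets and charts.

```python
# Illustrative only: tallying learner destination data into simple counts,
# the way the study describes turning form responses into spreadsheet figures.
# Field names and status labels are hypothetical, not from the real survey.
from collections import Counter

responses = [
    {"learner": "A", "level": 2, "status": "employed"},
    {"learner": "B", "level": 2, "status": "progressing to Level 3"},
    {"learner": "C", "level": 3, "status": "employed"},
    # ... one record per learner, 26 in total in the study
]

counts = Counter((r["level"], r["status"]) for r in responses)
for (level, status), n in sorted(counts.items()):
    print(f"Level {level}: {status} = {n}")
```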
[Speaker 2]
That makes sense. Smart. The numbers give you the objective what, the patterns.
And the qualitative stuff gives you the why, the context, the human element.
[Speaker 1]
Absolutely. It provides a much richer understanding than just one type of data alone.
[Speaker 2]
And the timeline. November 2023 to July 2024.
[Speaker 1]
Yeah. That gave enough time for the changes they made to actually have an effect and to collect meaningful data on the outcomes. Almost a full academic year.
[Speaker 2]
Okay. Let's get to the results then. This is where it gets really interesting.
What did they find when they compared July 2024 to that starting point in November 2023? Let's start with Level 2.
[Speaker 1]
Right. For the Level 2 learners, the change was pretty dramatic. Employment went up by 37.5%. That was from just one learner employed at the start, up to four by July.
[Speaker 2]
Okay. A decent jump in jobs.
[Speaker 1]
But the really big shift was progression onto the Level 3 child care course. That increased by a massive 75%. It went from zero learners planning to progress in November to six learners heading for Level 3 by July.
[Speaker 2]
Wow. 75%. That's huge.
What about volunteering?
[Speaker 1]
That stayed at zero for the Level 2 group.
[Speaker 2]
Interesting. Okay. Now what about the Level 3 learners?
[Speaker 1]
For Level 3, the number who actually secured a job also saw a big increase up by 50%. That went from five learners employed initially to 14.
[Speaker 2]
14 out of 18. That's a strong outcome.
[Speaker 1]
It is. And progression to further education, so Level 4 courses or university, that also went up by 22.2%, from zero learners initially planning this, up to four.
[Speaker 2]
So some progression to higher levels too. And volunteering for this group.
[Speaker 1]
Volunteering actually saw a slight decrease here by about 5.5%. Went from two learners volunteering down to one.
[Speaker 2]
Okay. So lots of movement there. What did the researcher make of these patterns, especially that huge jump from Level 2 to Level 3?
[Speaker 1]
Well, the connection they made, which seems very plausible, is about the reality of the early years job market. Settings often really prefer staff with a Level 3 qualification.
[Speaker 2]
Right. Because it implies more knowledge.
[Speaker 1]
Exactly. More perceived knowledge and experience, especially in really crucial areas like safeguarding children, planning educational activities. So it makes sense that Level 2 learners seeing this would be highly motivated to move on to Level 3 to boost their job prospects significantly.
[Speaker 2]
That really validates the pathway, doesn't it?
[Speaker 1]
Yeah.
[Speaker 2]
But you mentioned something else, a question that came up.
[Speaker 1]
Yes. This was a key insight, almost a finding in itself. While lots of the Level 3 learners got jobs, which is great, why did only a relatively small number, just four, actually progress to higher education, like Level 4 or university?
[Speaker 2]
Ah, interesting point. They got jobs, but didn't necessarily climb the next academic ladder.
[Speaker 1]
Right. And the researcher flagged this straight away. That specific question, they noted, could easily be the focus of a whole new deep dive.
Why that particular pattern?
[Speaker 2]
So even successful research sparks new questions.
[Speaker 1]
Often the best research does. But the immediate sort of bottom line conclusion from the study was clear.
[Speaker 2]
Which was?
[Speaker 1]
That embedding the employability skills and topics directly into the curriculum, it was definitely effective. The data really backed that up.
[Speaker 2]
OK. So it worked. But did it lead to ideas for doing things even better?
[Speaker 1]
Yeah.
[Speaker 2]
Recommendations?
[Speaker 1]
Yes, absolutely. Based on the findings and the learner feedback, the researcher suggested more structured workshops. Things like first aid, food hygiene, very practical for child care, alongside the CV writing and interview skills.
[Speaker 2]
Makes sense. Anything else?
[Speaker 1]
Yeah. A really interesting collaborative idea, actually inviting people from the industry into the classroom sessions, like representatives from recruitment agencies, managers from local nurseries or early years settings, and supervisors.
[Speaker 2]
Ah, to talk about what they really look for.
[Speaker 1]
Exactly. To discuss the specific skills and knowledge they need when hiring. This could directly feed into units like the one on professional development and reflective practice, making it really grounded.
[Speaker 2]
That sounds incredibly valuable. Did these findings and suggestions actually lead to any changes?
[Speaker 1]
They did. This is where you see the real-world impact. The researcher shared the results with the quality assurance team and the employment support team in their organisation.
And because of that evidence, they started offering more workshops. Now, initially there were constraints, you know, funding, finding suitable locations.
[Speaker 2]
The usual practicalities.
[Speaker 1]
Right. But significantly, the organisation also began offering free, standalone employability courses that learners could take alongside their main child care programme. So a direct programmatic change spurred by this research.
[Speaker 2]
That's fantastic. It shows how one person's enquiry can ripple outwards. What about the impact on the researcher themselves?
[Speaker 1]
They described it as eye-opening. They said it really helped them understand better how to make sure learners have very clear ideas about the skills and knowledge they need to progress, whether that's to the next level of study or into a job.
[Speaker 2]
So it clarified things for their own teaching practice.
[Speaker 1]
Definitely. And it also helped them pinpoint specific CPD courses that would be really beneficial for learners, like that free online Prevent training they mentioned. Things that could give learners an edge.
[Speaker 2]
So a personal professional development angle too. What about the bigger picture looking forward?
[Speaker 1]
Well, the researcher saw this as a really valuable first step, but they also recognised its limitations, you know, being focused on just these two groups. They suggested that what's needed next is broader action research involving more groups, maybe across different departments in the centre, to see the overall trends.
[Speaker 2]
A more systemic view?
[Speaker 1]
Exactly. And also tracking ex-learners after they leave, to see the longer-term impact of the skills they gained. Plus, digging into those other factors behind the Level 3 progression puzzle.
[Speaker 2]
The reasons why more didn't go into higher education.
[Speaker 1]
Right. Investigating the social factors, financial pressures, personal circumstances that might play a role. And the key, they suggested, is making this a truly collaborative effort involving all the tutors and staff.
[Speaker 2]
It really underscores that idea that knowledge isn't just abstract, is it? It's most powerful when you connect it to real problems and use it to make things better.
[Speaker 1]
Couldn't agree more.
[Speaker 2]
So just to recap, this one focused deep dive really shone a light on what was working in that curriculum, what could be tweaked, and even uncovered brand new questions to explore next. It's such a clear example of how using evidence, not assumptions, can genuinely improve things in education.
[Speaker 1]
It really is. And it leads to a final thought, perhaps, for you listening. Think about your own field, your own work, or even just an area you're interested in.
What assumptions might be floating around? What things are taken as given, maybe without solid proof?
[Speaker 2]
That's a challenging question.
[Speaker 1]
And then consider, what small, focused investigation, like the one we discussed today, could you maybe undertake? Just to explore one of those assumptions, gather a bit of your own evidence and see what the real story is.
[Speaker 2]
It doesn't have to be a huge project, does it?
[Speaker 1]
Not at all. Just a targeted look. It could potentially reveal some hidden insights and maybe even help you make a real evidence-based difference in your own practice or understanding.