Replies
Eleanor Piper ah - whilst that's not good for you either, I'm glad I'm not the only one! I have also raised a query via the Help Portal but, as you say, I'm not expecting a prompt answer, so thought I'd ask here too, as I've found people to be very knowledgeable and helpful in the past. I'll update here if/when I get a response.
Hi
We are seeing the same thing in our data, so the right action is to raise it with the service desk, which you have done.
I would also advise staying on top of the data by having your own internal reports that mimic these measures, so that you are in control of managing these risks internally and can explain your performance to the ESFA. Many MIS providers now have these built into their systems, so speak to them in the first instance, but the measures can also be derived from your own data with some simple reporting.
Although we do not have a technical document at the level of detail we need, the updated specification linked below gives you most of what you need to do this:
Apprenticeship training provider accountability framework and specification - GOV.UK
Total number of apprentices = all ZPROGs that have been in learning in the current academic year, excluding those that did not qualify as a start.
Past planned end date = all of the above where the ZPROGs are 90-180 days or 180+ days past their planned end date. This includes learners that have completed.
Withdrawals = all ZPROGs (excluding the 42-day leavers that do not qualify as starts) that have withdrawn in the current academic year.
Breaks in learning = a little more difficult, as you need to exclude learners that have returned, so you will need to cross-check against other lines of data (QAR data may do this for you if you have it available in a reportable format). Other than that complication the process is the same: identify which ZPROGs have been on a break for 180-365 days or 365+ days and measure them against your total apprentices number.
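In case it helps anyone building their own version, here's a rough pandas sketch of all four measures. To be clear, this is my own approximation rather than the ESFA's exact logic, and every column name in it (aim_type, qualifying_start, completion_status, restart_indicator and the date fields) is a made-up placeholder you would need to map onto whatever your own MIS/ILR export calls them:

```python
import pandas as pd

# A minimal sketch, NOT the official AAF logic: all column names are
# hypothetical stand-ins for whatever your MIS/ILR export uses.
df = pd.read_csv(
    "learner_export.csv",
    parse_dates=["planned_end_date", "actual_end_date"],
)
today = pd.Timestamp.today().normalize()

# Total apprentices: ZPROG (programme) aims in learning this academic
# year, excluding records that did not qualify as a start.
# (Assumes qualifying_start is a boolean column.)
in_scope = df[(df["aim_type"] == "ZPROG") & df["qualifying_start"]]
total = len(in_scope)

# Past planned end date, banded; completers are deliberately included.
days_past = (today - in_scope["planned_end_date"]).dt.days
pped_90_180 = ((days_past >= 90) & (days_past < 180)).sum()
pped_180_plus = (days_past >= 180).sum()

# Withdrawals this year (42-day non-qualifying leavers are already
# excluded by the qualifying_start filter above).
withdrawn = in_scope["completion_status"].eq("withdrawn").sum()

# Breaks in learning: drop learners who have since returned, flagged
# here by a hypothetical restart indicator on their new row. Break
# length is measured from the actual end date recorded on the BIL row.
returned_ulns = set(df.loc[df["restart_indicator"] == 1, "ULN"])
on_break = in_scope[
    in_scope["completion_status"].eq("break")
    & ~in_scope["ULN"].isin(returned_ulns)
]
break_days = (today - on_break["actual_end_date"]).dt.days
bil_180_365 = ((break_days >= 180) & (break_days < 365)).sum()
bil_365_plus = (break_days >= 365).sum()

print(f"Total apprentices: {total}")
print(f"PPED 90-180 days:  {pped_90_180} ({pped_90_180 / total:.1%})")
print(f"PPED 180+ days:    {pped_180_plus} ({pped_180_plus / total:.1%})")
print(f"Withdrawals:       {withdrawn} ({withdrawn / total:.1%})")
print(f"BIL 180-365 days:  {bil_180_365} ({bil_180_365 / total:.1%})")
print(f"BIL 365+ days:     {bil_365_plus} ({bil_365_plus / total:.1%})")
```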
Hi Paul Taylor,
Thanks for taking the time to respond - that's very helpful (and again, reassuring it's not just us).
I do have my own internal measures, which I run after each month's ILR submission and have been reporting to our Governance board every 6-8 weeks. We moved to a new MIS last year but we still have historic learners not on that system, so it's a bit of a faff combining/checking various data sources.
Since the AAF is now in place for this FY, I wanted to check my manually calculated stats against theirs. They are very close to what's in the learner export from the AAF (and I can now account for the difference, since I realise I hadn't excluded one or two withdrawn learners), but that doesn't match the numbers and percentages reported on the dashboard (which we've also established doesn't match their own data export) - hence the confusion! I'm fairly confident that my calcs are correct - a shame, as their figures are lower!
And yes, I always have much fun with the BILs - I highlight the rows with the re-start indicator, sort by ULN and then exclude the rows not required. Seems to be working ok, but is another step in the process.
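For anyone wanting to do that step outside a spreadsheet, it translates roughly to the below - again with made-up column names (ULN, restart_indicator, completion_status), so treat it as a sketch rather than the exact rule:

```python
import pandas as pd

# Rough equivalent of the manual highlight/sort/exclude step; the
# column names are placeholders for whatever your own export uses.
rows = pd.read_csv("bil_export.csv")

# Any ULN with a restart row has returned from their break...
restarted = set(rows.loc[rows["restart_indicator"] == 1, "ULN"])

# ...so keep only break rows for learners who have not come back,
# sorted by ULN as in the manual process.
still_on_break = rows[
    rows["completion_status"].eq("break") & ~rows["ULN"].isin(restarted)
].sort_values("ULN")

print(still_on_break[["ULN", "completion_status"]])
```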
It's certainly keeping me on my toes.
Charmaine Keeley
AAF PPED dashboard not reconciling with associated data export
Hi all,
I'm going round in circles trying to reconcile the figures displayed on the AAF dashboard for the PPED metric with the associated learner-level data export.
For example, the overall count of current learners PPED on the dashboard is ~60. The report shows more than that, though granted, some of them have actual end dates in the previous FY.
The over-90-days figure also doesn't match - far fewer are shown on the dashboard than are listed on the report.
How do I know which learners they're excluding, and why? And if they're not included, why are they even on the report?!
Hoping someone can shed some light, or point me to some documentation which clarifies in detail the rules and logic used for the metric.
Thanks in advance!