Thursday, 27 April 2017
Over the past couple of years I've been supporting two local schools, and I've gained a lot more experience delivering digital media support and helping improve understanding of the Digital Competence Framework (DCF). I've also gained a better idea of what works, what doesn't, and what's required going forward.
Firstly, to our one mistake. At the start of the first term we jumped in gung-ho. With 100% enthusiasm we threw varied lesson after varied lesson at the kids, and it was a lot of fun. The kids lapped it up, and we kept delivering: making music one week, designing a website the week after, learning what makes a good photograph, a bit of coding, cloud storage, cleaning up the kids' files, followed by Safer Internet Day the week after that. At the end of the term, though, we devised a fun test, a test that revealed a lot more than I'd anticipated. I'll come back to that later.
One of the great things about the DCF is its intention. Although it's quite intricate and detailed, its reason for existence is completely necessary, arguably overdue. The problem is that it fails to take into account individual pupils' capabilities. We got lucky in that respect: partly because of the lack of a coherent plan (by design) at the start of the term, and partly because of class sizes, I ended up teaching pupils from year 4 through to year 6. As it happens, this is just about perfect for a primary school. The DCF details incremental improvements in knowledge from one year to the next, but the variation in knowledge among year 4–6 pupils is vast. For example, some year 4 pupils might already have their own websites, social media accounts (rightly or wrongly) and even YouTube channels, whereas some year 6 pupils might not even have their own computer, laptop, smartphone or tablet.
Initially I designed a paper-based test, which the kids marked themselves before we discussed the answers. It was a lot of fun, but it used a lot of paper and, worse, a lot of time (a resource that appears to be becoming ever scarcer in schools with dwindling teacher numbers). Whilst it was effective, the workload required afterwards, correcting the kids' marking mistakes and analysing the results, was prohibitive. The test was personalised for the children based on what they'd been taught: I came up with the questions I thought they should know the answers to and hoped they'd give us an even spread of results. It seems I got this right, as the results ranged from 3 to 15 out of a possible 19. Maybe I made it a touch too easy?
What the results told me was that across the 60 children there was a chasm of difference in knowledge. The solution was clearly to split the group of 60 children across the three years into four groups of 15 pupils: Eagles, Falcons, Gulls and Hawks (E, F, G, H), with the digital Eagles being the top set. This loose framework, whilst not exactly a coherent plan, would let the digital media trainer know the skill set of the group they're looking after, have an idea of what they ought to spend time working on with that group and, more importantly, pitch the level at which they should be talking to them.
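The grouping itself is simple to automate. Here's a minimal sketch in Python, with made-up pupil names and marks (the real cohort and scores aren't reproduced here): rank the cohort by test score and cut it into four sets of 15, top set first.

```python
import random

# The four set names, ordered from top set down.
GROUP_NAMES = ["Eagles", "Falcons", "Gulls", "Hawks"]

def split_into_sets(scores, group_size=15):
    """scores: dict mapping pupil name -> test mark.
    Returns a dict of set name -> list of pupils, ranked by mark."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return {name: ranked[i * group_size:(i + 1) * group_size]
            for i, name in enumerate(GROUP_NAMES)}

# Hypothetical cohort: 60 pupils with marks between 3 and 15 out of 19,
# matching the spread of results described above.
random.seed(1)
cohort = {f"pupil{i:02d}": random.randint(3, 15) for i in range(60)}
sets = split_into_sets(cohort)
```

With equal-sized slices like this, pupils on a boundary mark land in a set fairly arbitrarily; in practice you'd eyeball the borderline cases rather than trust the cut-off blindly.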
For example, when working on movie creation, are we at the capturing-images, tools and importing-files stage, or at advanced transitions, titles and export formats?
For an external digital media trainer, such as myself, ideally the time required to look after these groups would be based on a daily school plan.
9am–10am: school support and/or planning
10am–noon: Group A (with a defined group)
1pm–3pm: Group B (with a different group)
3pm–4pm: after-school media club (something advanced for the Eagles/Falcons groups)
This would enable the support worker to help the school out with technical assistance in the first hour and maximise their time with the pupils.
After thinking about the time wasted on the paper-based test, I tried using a new feature of Google Forms to create a quiz (pictured), which could make life much easier for teachers. Not only does this new Google Form allow you, as the creator, to review the answers after the test, it can also be integrated into Google Classroom, cutting down on valuable marking time. The problem is with the implementation. Unless a school has a dedicated digital media support lead, the creation of the forms can be time consuming in itself. Not only that, but since the names of the pupils who take the test are only viewable in the linked spreadsheet, detailed analysis of the test results (to define class groups and identify learning opportunities) is only achievable with:
1) comprehensive spreadsheet knowledge (filtering, ordering, etc.), and
2) personalised question design based on previous lesson plans.
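The filtering and ordering itself doesn't strictly need spreadsheet wizardry. As a sketch, assuming the responses have been exported as a CSV with "Name" and "Score" columns (those column names, the pupils and the pass threshold are all made up for illustration, not Google's actual export schema), a few lines of Python can rank the pupils and flag the ones who need extra support:

```python
import csv
import io

# Hypothetical export from the quiz's linked responses spreadsheet.
raw = """Name,Score
Anya,12
Ben,7
Cari,15
Dylan,3
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Order pupils by mark, highest first, ready to be cut into sets.
ranked = sorted(rows, key=lambda r: int(r["Score"]), reverse=True)

# Flag anyone below an (assumed) threshold as a learning opportunity.
PASS_MARK = 8
needs_support = [r["Name"] for r in ranked if int(r["Score"]) < PASS_MARK]
```

For a real file you'd swap the `io.StringIO(raw)` for `open("responses.csv")`; the point is that the analysis the spreadsheet makes laborious is a one-off script once someone technical sets it up.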
While budgets are being strangled and head teachers are under increased pressure to deliver more with less, it's essential that our children's computational education isn't affected. It's one of the most exciting, thriving and lucrative career choices a child could make, and arguably one of the most enjoyable (for the right sort of person).
I've attached a revised version of the quiz below; feel free to have a play (a local head teacher really struggled with some of the answers yesterday), then check out your results at the end.
Rough Cuts KS2 test