In a few weeks' time I'll be running another one of my Assessment in Computing courses, and I have some workshops coming up as well (more details soon). As part of the course I include a hefty booklet containing, amongst other things, a comprehensive list of questions by which to evaluate your own practice, and a list of resources. It's the latter that I wish to talk about.
You'd think that finding and collating lists of assessment-related resources would be fairly straightforward, but in my experience there are a number of challenges.
First, when it comes to assessment in general, there are hundreds, maybe even thousands, of books, articles, and videos on the subject. So the first challenge is to identify the ones that seem most likely to be useful and relevant.
Secondly, I don't like to recommend something just because it has a nice title, or even because it's been written by an expert. The corollary is that I have evaluated everything I do recommend. It's a labour-intensive process, to say the least!
Thirdly, when it comes to assessing Computing in particular, there are relatively few resources. And many of these are not useful at all. For example, when it comes to measuring progress, I've seen assessment tasks that ask the student to make the program say "Hello, world" and then, as a so-called harder task, to make the program say "Hello, world", followed by "What a lovely day it is today!"
Unless that second sentence depends on an input, such as a temperature reading, it's no harder to code than the first one. It's just one more print statement.
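To illustrate the point, here's a minimal sketch in Python. It's my own hypothetical example, not taken from any of the resources I mention, and the 20-degree threshold is arbitrary:

```python
# The 'harder' task as typically set: nothing new is being assessed,
# it's just one more print statement in sequence.
print("Hello, world")
print("What a lovely day it is today!")

# A version that genuinely is harder (a hypothetical rewrite): the second
# line now depends on an input, so the pupil has to read a value, convert
# it to a number, and use selection rather than just sequence.
temperature = float(input("What is the temperature in Celsius? "))
if temperature >= 20:
    print("What a lovely day it is today!")
else:
    print("Not such a lovely day today.")
```

The second version actually gives you something to assess (input handling, type conversion and selection), rather than the pupil's ability to copy a print statement and change the words inside the quotation marks.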
I've also noticed that many people (most people?) cite Bloom's Taxonomy, while inadvertently making it clear that they've never actually read Bloom's Taxonomy. (I have, but had to use the British Library to do so.)
Those are just two examples of the disappointing 'expertise' I've come across.
So sorting the wheat from the chaff is a bit of a task.
I would also suggest that relying on crowd-sourcing, as Project Quantum does to some extent, is no solution. Or at least, not a complete solution. Why not? Because not all of the questions and quizzes that have been uploaded will be good.
However, in my list of resources I do include Project Quantum and other teacher-created resources because even if you don't find one you regard as excellent, you may find one that is good enough, or that you can tweak (if the licence allows it), or that will spark some ideas of your own.
Another reason I include such resources is that while something created by a teacher may not satisfy the assessment 'expert' as far as technicalities such as validity and reliability are concerned, more often than not it will be very useful simply because it actually works.
For instance, take a quiz created for use with Socrative. It's not likely to win any prizes for assessment design, but if it gives you a 'quick and dirty' idea of whether your class understands what a conditional is, who cares?
Because of the sheer number of resources 'out there', I can't possibly scrutinise them all. So what I do when I come across one that looks promising, but which I don't have time to examine in detail, is list it with an asterisk next to it. That means, in effect, "This doesn't look bad at all, but I haven't gone into it enough to be able to recommend it as such -- but do take a look."
A challenge of a completely different nature arises when resources originally released under a freemium model, whereby you could access a certain amount free of charge and then more when you subscribed, no longer offer the free part. That has happened with several of the resources I listed when I first ran the course in 2013. In those cases, I've deleted them from my list -- not because they're not free, but because I can no longer test them to make sure they're OK. (In many cases you can't ask for a free trial unless you can prove you're a school.)
A similar issue arises when a resource I looked at and recommended a year ago has since been subsumed into another resource or another company, or has completely changed its nature even though the name and website remain unchanged.
(To give you an idea of why it's so important to revisit previously recommended resources, 20 years ago a geography website badged by the government as good for schools to use was taken over by a company peddling pornography. But its URL had remained exactly the same.)
Finally, there's the challenge of things being called something they are not. A case in point was the introduction of Assessing Pupils' Progress in England and Wales a few years ago. Billed as 'Assessment for Learning', it had about as much in common with AfL as Vlad the Impaler had with defenders of human rights.
Despite these challenges, my list has grown from 62 resources to 88 -- and possibly 89, as I have just heard about another assessment app that I need to explore.
And no doubt the number will have changed again by the time I actually run the course!
If you're interested in coming on one of my assessment courses (or hiring me to run a workshop or give a talk on the subject), but aren't sure, check out what people have said about them on the Course Testimonials page. If you are sure, then please get in touch.