Assessing “soft” skills

“Do you have anything you’d like to add to the discussion, Terry?”

The scene was a meeting at the Edusummit conference at UNESCO in Paris in 2011. The question came from the Chair.

“Thank you, but no: everything I was going to say has already been said.”

That was my response, because I didn’t see any purpose in repeating points that had not only been made, but also generally agreed upon. In fact, my contributions to many meetings are based on Salvator Rosa’s dictum:

Be silent, unless what you have to say is better than silence.

The question is: does that make me a good collaborator, or not so good? How do we measure such things? And does any of it matter anyway?


How to collaborate with other schools when you're not allowed to

The preference of some Academies for not collaborating with other schools is not only annoying, it is, ultimately, self-defeating. Whether it stems from hubris, aggressively defensive commercialism, or a combination of the two, this practice seems to assume that one school cannot learn from another. Or, at least, that it will learn less than it "gives away". 

11 Reasons to collaborate with other schools in implementing the new Computing Programme of Study

John Donne wrote that no man is an island; he might have said the same thing about schools. Many schools have a mindset perhaps best described as “splendid isolation” – except that there is nothing splendid about it. In fact, in many cases it is just plain daft. Here are my reasons for saying so.

Web 2.0 For Rookies: Working Together

Web 2.0 is about nothing if it isn't about working with other people in some way. It doesn't matter what application we're talking about, working with other people is what it's designed to do. That, in fact, is what Web 2.0 is, hence my very pragmatic definition: Web 2.0 is as Web 2.0 does, as explained in the very first article in this series.

Now, the reason I'm talking about 'working together' rather than 'collaborating' is that it seems to me that 'working together' is more encompassing. Why? Because there are so many ways in which people can work together.

They may indeed collaborate, for example in the development of a mindmap using a program like Bubbl.us or Mindmeister. Or they may contribute a note or a comment which, while possibly insightful, is not as involved, perhaps, as collaboration.

Perhaps this is splitting hairs, but I am thinking in particular of the sort of youngster I had in my Business and Information Technology class 20-odd years ago. Group work was the order of the day, but he preferred to chat with members of a neighbouring group about last night's soccer. Nevertheless, in a feat of multitasking not usually seen in males (sorry to sound sexist, but it's true), he was also able to follow the discussion in his own group.

Thus, every so often he would look back over his shoulder and say, "Well how about a targeted advertising campaign?" or "An overdraft would be better." Invariably, the rest of the group would continue in this new direction, and he would go back to discussing the game.

The interesting thing here is that the rubric supplied by the Examinations Board (now called an Awarding Body) didn't have any provision for that sort of contribution, which meant that my colleagues and I spent ages debating whether he was really good at collaborating, or excruciatingly bad at it – because he wasn't really collaborating at all in the true sense of the term. Had there been a box for "Makes useful contributions" it would have been a non-issue: A for contribution, D minus for collaboration.

I'll deal with assessment issues in a separate article. The point I'm making here is that Web 2.0 facilitates working together in all its guises.

Now, if you think of Web 2.0 from this point of view, life becomes easier if you're not allowed to use Web 2.0 applications in your school, because there are alternatives to some of them. For example, some Learning Platforms and Virtual Learning Environments include a forum feature that takes the form of an area on which people can post 'stickies'. So, if you can't use Wallwisher you may have something like that instead. It has the limitation that nobody can view it without logging in to the VLE, but many schools would see that as an advantage anyway.

And at the risk of causing you to shudder, even a program like Word, or the OpenOffice version of it, has a review facility whereby people can make suggested changes and leave named comments. OK, it's not something you can use in real-time, and you can end up with the most terrible problems of version control if you're not careful, but if push comes to shove you can use it instead of a wiki or, say, Google Docs.

Now, I have to be honest with you and say that, in my opinion, non-Web 2.0 applications do not have the same level of excitement as proper Web 2.0 ones. They don't have the same breadth of collaborative features as a rule, and working in real-time, or near real-time, is exciting in itself. Most of all, though, there is the tremendous buzz that everyone gets from working with someone thousands of miles away without having to email documents back and forth.

However, if you have been unable to convince the powers-that-be of the need for access to Web 2.0 applications, it is not the end of the world.

The important thing, I think, is not to think in terms of applications but in terms of the activity and the learning. If what you'd like pupils to do is work on a story together, and your school has a VLE, then it probably has a built-in word processor designed for collaborative working. If you would like them to be able to upload and share photos, there's almost certainly a facility for that. I've already mentioned the post-it notes approach to discussion.

When pupils have completed their work, they may still be able to show it off to the world by uploading it to the server to enable it to be embedded in the school's blog or website, as described in the article about embedding.
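
By way of illustration, here is a minimal sketch of the kind of embed code involved, on the assumption that the finished work has been uploaded somewhere web-accessible. The school URL, file name and dimensions are all invented for the example; in practice, your server, VLE or blogging platform may well generate a snippet like this for you.

```html
<!-- A minimal, hypothetical example of embedding pupils' uploaded work
     in a school blog or website. Replace the URL and dimensions with
     whatever your own server or VLE provides after the file is uploaded. -->
<iframe src="https://www.example-school.sch.uk/pupil-work/our-story.html"
        width="600" height="400" title="Our collaborative story">
  <!-- Fallback link for blog themes or browsers that strip out iframes -->
  <a href="https://www.example-school.sch.uk/pupil-work/our-story.html">View our collaborative story</a>
</iframe>
```

Most blogging platforms will accept a snippet like this if you paste it into the HTML (rather than the visual) view of a post, so that the work appears within the page itself rather than as a bare link.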

Getting back to the idea of working together, what lies at the heart of it is a particular philosophy of education and an underlying theory of how people learn. If you think that the teacher is the expert, and that people learn best by keeping quiet and taking notes, then Web 2.0 is not the approach for you. If you feel that everyone has, or should have, an equal voice, and that people learn best by discussing things and working together, a Web 2.0 suite of applications will be on your list of 'must-haves'.

In reality, these approaches are not mutually exclusive, but are dependent on circumstances. For example, if I am going abroad, I would like someone to tell me what sort of plug adapter I need. I don't want a discussion about it, or someone's opinion; I want an expert to say to me: you need X.

In an ideal situation, the teacher will have access to a whole range of types of application, and a wide classroom repertoire.

And the knowledge and skills to use them effectively. 

Have you seen the other articles in the Web 2.0 for Rookies series? Feel free to comment, and to recommend them to your colleagues and students.

Collaborative Approaches To Learning: Always A Good Thing?

Collaborative approaches to learning certainly have their place -- but not at the expense of the facts!

This is an updated version of an article which first appeared on Wed, 7 Sep 2005. That sounds like a long time ago, but I think the issues I was describing then are still relevant today. But I'd value your opinion on this matter. It's a longish article: go grab yourself a cup of tea.

In March 1923, in an interview with The New York Times, the British mountaineer George Leigh Mallory was asked why he wanted to climb Mount Everest, and replied, 'Because it's there'. That seems to be exactly the attitude of some educationalists when it comes to recent developments such as blogging, podcasting and wikis. That is to say, they use them purely and simply because they are there.

I'm all in favour of pioneering and trailblazing, but the downside is that evangelistic fervour can sometimes outweigh, or cloud over, any objective judgement. In my view, what we educationalists should be aiming for is not to get our students and colleagues to use technology, but to use appropriate technology appropriately. Unfortunately, that message sometimes seems to get lost in the hubbub.

I am thinking in particular of the apparently increasing adulation of, and reliance on, collaborative tools for the purpose of research, especially blogs, podcasts and wikis (the best-known of the last being, of course, Wikipedia). In case you are new to all this: blogs are online journals; podcasts are recordings, usually in MP3 format; and wikis are web pages which can be edited live on the internet, either by anybody or by people who have subscribed to the group concerned. Wikipedia is an online encyclopaedia whose articles can be published, then edited and counter-edited.

Is ‘truth’ relative or absolute?

Wikipedia in particular is often hailed as a fantastic resource, and one which has grown through collaboration by ordinary people. It is, if you will, a perfect example of democracy in action -- apparently, at least. The question we need to ask, however, is whether this and similar enterprises are actually useful.

For most people, and societies, the ultimate goal is absolute truth, not relativism. This isn't only a religious quest: in the field of finance, one of the main attributes of money is that it should be a measure of value which does not, in itself, change value. Hence, in modern societies, the attempts to fix a currency's value by pegging it to gold or to another, more stable, currency. Trying to measure the value of something if the value of money is constantly changing is like trying to measure the length of something with a ruler whose length keeps changing.

If relativism is not OK in our religious or economic lives, why should it be OK in our intellectual life? We all know that knowledge and understanding are constantly evolving, and that the self-evident "truths" of yesteryear are sometimes found to be wrong in the light of new evidence. That is disconcerting, to say the least, but at least it's a process that happens over years rather than overnight.

It's also a process that happens with the involvement of experts in their field. Now, I am not so naive as to fail to understand that viewpoints which do not fit into the conventional wisdom of the age are unlikely to be heard. You only have to look at the experiences of Freud, Darwin and, in our own age, homeopaths and others to realise that. And the economist J M Keynes, on performing badly in the Economics paper of his Civil Service examination, reportedly said that it was because he knew more about Economics than his examiners.

Nevertheless, you can't have an article published in a scientific journal or the Encyclopaedia Britannica unless it has been scrutinised and vetted by another expert. This is in contrast to wikis, where, for the most part, anybody can come along and change an article without knowing the first thing about the subject area.

Two cheers for democracy*

Now, this may seem like a very anti-democratic point of view, and that's because it is -- in this context. If that sounds arrogant, consider this: if you are the world's leading expert in a particular area, do you really want some virtual passer-by to "improve" your work by chopping bits out or adding bits in? Of course not! But even if you are an ordinary expert, as distinct from a world-leading one, you will still not want someone correcting you. At least, not in that way. You might enjoy a good debate, be open to having your views challenged, and may even change your views through that process, but that, I would contend, is a very different situation.

Even more important, though, is the potential confusion it creates for students. Imagine finding a great fact to put in an essay, and then double-checking it the next day, only to find that it has disappeared. Does that mean it was incorrect, or that someone didn't like it? The only thing the student can do is seek verification from another source. That's good practice, but the question is: what kind of source?

When I asked Limor Garcia, the inventor of Cellphedia** (a kind of mobile phone version of Wikipedia), how she would advise students to check the truth of the information they find, she said that people would be able to correct each other's answers, but also that they could check the answer in Google. That seems to me to raise two questions: (a) if you are going to check the answer in Google, why use Cellphedia? and (b) how would you know if the information you found in Google is correct?

The Library of Babel

Interestingly, this kind of paradox is not new. In a story called "The Library of Babel", written in 1941, the Argentinean writer Jorge Luis Borges describes a vast library in which there is not only a copy of every book ever written, but every book which could be written. There is, for example, a library catalogue, and an infinite number of variations of it. There is a marvellous passage in which he describes the quest for the "master" book:

"In some shelf of some hexagon, men reasoned, there must exist a book which is the cipher and perfect compendium of all the rest: some librarian has perused it, and it is analogous to a god. Vestiges of the worship of that remote functionary still persists in the language of this zone. Many pilgrimages have sought Him out. For a century they trod the most diverse routes in vain. How to locate the secret hexagon which harboured it? Someone proposed a regressive approach: in order to locate book A, first consult book B which will indicate the location of book A; in order to locate book B, first consult book C, and so on ad infinitum."

(J L Borges, The Library of Babel, in "Fictions", which is featured on our Amazon page)

The worrying development for me is not the invention and expansion of tools such as Wikipedia and Cellphedia. I actually think they have vast potential and are, in fact, tremendously exciting. From the point of view of the learning process, taking part in such collaboration is bound to engage or re-engage a lot of learners.

What I am more concerned about is the often uncritical stance of some educationalists in relation to these tools. For example, I have read articles which favourably compare Wikipedia to traditional encyclopaedias on the basis of weight, its ability to constantly change, its democratic ethos, and other characteristics. Surely the most important yardstick is accuracy? And a couple of months ago I met the Head of ICT at an independent secondary school who said, quite seriously, "We don't need to teach kids how to search the internet; they use Google and Wikipedia all the time at home."

Essential skills for users of ICT in education

We need to teach our students a number of skills or approaches when it comes to verifying information:

  • a questioning approach rather than a willingness to accept things at face value;
  • triangulation, which is the cross-checking of supposed facts with other sources of information;
  • in triangulation, the use of different types or sources of evidence; for example, there is no sense in cross-checking the accuracy of the comments I've made here by looking at other comments I've made: you should look at other sources; otherwise, it all becomes self-referential.

Above all, we educationalists should not fall into the trap of using a new technology in every situation just because it is there.

Conclusions

So what does this mean in terms of the educational benefits of services like Wikipedia, Cellphedia and, in a wider context, blogs and podcasts? Does it mean we should reject them entirely? The answer is that we need to treat them in the same way as we would encourage our students to treat any other source of information: with caution and, as stated above, by cross-checking the information found in them.

We should also recognise that these new tools have some distinct advantages: they are fresh, and they allow "breaking news" in academic fields to be published with a lower burden of proof, meaning that a debate can be entered into at an earlier stage and by more people. They also enable the ordinary person and the maverick to have their say. Finally, they can have profound benefits in a social context, especially mobile phone-based services like Cellphedia (the need for which has, I would suggest, been superseded by the wonderful mobile phone apps that are available these days): imagine being able to go to a new area and find out where other people would recommend eating or staying (there are apps for exactly this).

Taking part in such projects can also be very useful for students, because it involves the skills of research, writing, collaboration and editing. It is easy enough to set up your own blog, podcast or wiki, as you will know if you've looked at the Web 2.0 Projects book.

In conclusion, we need to tread a fine line between using something in all situations, regardless of how appropriate it is, and rejecting it out of hand. I'm sure that the line is a wavy one as we continue to grapple with and debate these issues.

Postscript: The Demise of Wikipedia?

According to the London Evening Standard, editors are leaving Wikipedia in droves. Apparently, they don’t like the recently changed rules which, supposedly, make it harder to get away with writing rubbish or deleting good stuff. Read the comments on the article too. Kate, for example, got fed up with her expert postings being deleted by some nameless and faceless person who decided that she hadn’t cited enough references. That sounds reasonable, but for me, having your work commented upon and rejected by someone who won’t or can’t even give you their name is unacceptable.

* Apologies to E M Forster.

** Unfortunately, at the time of writing the Cellphedia website seems to be unavailable.