Friday 31 January 2014

E-assessment harder to deliver than e-learning?

I have another reflection on the learning technologies conference, this time comparing the creation of e-learning with the development of e-assessment. The e-assessment I am referring to here is not simple testing but the more complex process of capturing evidence of competence and then having multiple people interact with that evidence to confirm its validity.

On the eve of the conference City & Guilds/Kineo announced the launch of the Adapt Learning Open Source Framework. At face value it appears to be another ‘open source kit’ on the market and a generous offer from Kineo, because in effect it empowers people to build their own e-learning.

At the conference itself I watched a presentation from Epic, like Kineo another successful e-learning company, which showed how they use Moodle, SharePoint and Drupal to create e-learning content for a range of customers. All this software is open source, so Epic’s skill lies in adapting it and in creating plug-ins to build solutions that meet the specific requirements of their customers. Their income comes from maintenance contracts and the adaptations of the software, and whilst their customers are not tied into licensing specific software, they are ‘tied’ to those capable of adapting the open source software.

In both cases I doubt that any of the e-learning is underpinned by the sort of comprehensive and complex database that is required to deliver e-assessment, particularly where there is a highly complex rule base for the assessment.

To illustrate the difference, consider two ways to ensure compliance using electronic methods. First, take one of the examples of e-learning used by Mark, the presenter from Epic. If, for example, you are a member of the customer services team working on Jet2, one way your understanding of the duties you are required to perform can be tested is by asking questions about a number of scenarios and then tracking your responses. Another way is for someone to actually record how they saw you perform, to cross-reference this against specific standards, and then to have their judgements checked twice to make sure they were fair and equitable.
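The second process can be sketched as a simple data model. This is a minimal, hypothetical illustration of why the underlying database is more complex than a question-and-answer tracker; all names and standard codes here are my own invention, not taken from any real e-portfolio system:

```python
from dataclasses import dataclass

@dataclass
class EvidenceRecord:
    """One piece of observed evidence in a hypothetical e-assessment workflow."""
    candidate: str
    observation: str                  # what the assessor saw the candidate do
    standards_met: list[str]          # cross-referenced standard codes, e.g. "CS-2.3"
    assessor_judgement: bool = False  # the assessor's judgement of competence
    first_check: bool = False         # first independent check of that judgement
    second_check: bool = False        # second independent check

    def is_valid(self) -> bool:
        # Evidence only counts once it is cross-referenced against standards,
        # judged, and that judgement has been checked twice.
        return (bool(self.standards_met)
                and self.assessor_judgement
                and self.first_check
                and self.second_check)

record = EvidenceRecord(
    candidate="A. Learner",
    observation="Handled a delayed-flight enquiry calmly and accurately",
    standards_met=["CS-1.1", "CS-2.3"],
)
record.assessor_judgement = True
record.first_check = True
print(record.is_valid())   # False: the second check is still outstanding
record.second_check = True
print(record.is_valid())   # True: the evidence is now fully validated
```

Even this toy version shows three people touching one record; a real system must also track who performed each check, when, and against which version of the standards.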

I am not implying that either of these two processes is better or worse. What should be self-evident, however, is that one of them is considerably more complex.

The failure to understand the complexities involved in assessing competence electronically is partly down to the industry itself, which still sometimes places the ‘recommendations’ provided by LinkedIn on the same level as thorough assessment that requires rigorous evaluation and cross-checking.

Nevertheless, those who have tried to build an e-portfolio for genuine, rigorous assessment know how complex the process is and how sophisticated a database it requires. It is not something that can easily be built from a ‘do-it-yourself’ kit.
