11 November 2008
Last Thursday, I attended a Learning Lab seminar, entitled Personal Response Systems: Learning through ‘ask the audience’ at the University of Wolverhampton, Telford.
After a short introduction from the seminar facilitator, Steve Draper, a lecturer based at the University of Glasgow, led a session based on his use of voting technology to support his (and others') lectures in the Department of Psychology. After giving the seminar a brief introduction to Audience Response Systems (ARS), Steve structured his talk around the following five levels for using an ARS in teaching:
- Generic educational aims
- Specific learning aims or objectives for a particular course, including "Just In Time Teaching" (JITT)
- A learning activity, e.g. Mazur's peer instruction course [I'll come back to this]
- Question design, question-set design
- The technology, e.g. EVS/PRS/ARS, one minute papers…
Steve's excellent website [http://www.psy.gla.ac.uk/~steve/ilig/] covers much of the material from the seminar, so I won't repeat it all here. I will, however, pick up on one of the things mentioned, namely Mazur's Peer Instruction course. The reference is below:
Crouch, C. H. and Mazur, E. (2001), "Peer Instruction: Ten years of experience and results", American Journal of Physics, vol. 69, no. 9, pp. 970–977. [Empirical evidence of improved exam pass rates.]
Mazur's Peer Instruction course, whilst not necessarily dependent on a teaching method that uses an ARS, is grounded in the psychology of how peers aid learning. In addition, Mazur's approach addressed a long-researched, principal weakness of his particular subject matter, Mechanics. Following a straightforward sequence in which students are encouraged to work individually and then with peers to find a particular answer, students are assessed twice and given feedback several times in a given sequence.
Action: As I move towards a more pedagogical (rather than functional) approach to the use of the ARS at the University of Bath when running Staff Development workshops, I hope to convey such approaches for colleagues to consider. The same would apply to some of the approaches to question design (assertion-reason questions, brain-teaser questions, diagnostic trees, class tests) that Steve mentioned.
The second seminar was led by Maureen Haldane, a Senior Learning and Teaching Fellow at Manchester Metropolitan University (MMU), who talked about working closely with a commercial organisation, Promethean, for mutual benefit. Based within the Transformative Learning Centre (previously, the Promethean Centre of Excellence), Maureen gave an insight into some of the benefits and challenges of working within such a structure.
The seminar then moved on to the role of electronic response devices as one response to MMU's Challenging Assessment initiative, which aims to provide "an opportunity for all faculties, departments and programme teams to get involved in a process that will be focussed on achieving lasting change in MMU's assessment culture". The 'kit' being used during this project is Activexpression, which is rather different (and bulkier!) hardware from the TurningPoint system we use at the University of Bath.
Action: I noted Maureen's comments on making the handheld devices more accessible to students, in particular the possibility of producing a Braille version of the handheld device for visually impaired users. I'm keen to get in touch with Turning Technologies to see if they have done any work on this.
Finally, Diana Bannister and Andrew Hutchinson from the University of Wolverhampton led a seminar detailing their work on the REVEAL Project, which reviewed the use of electronic voting in schools and evaluated uses of this technology within assessment and learning. Whilst much of their work revolved around the use of the Promethean Activexpression hardware, Diana and Andrew did put forward the REVEAL team's "Learner Response Pyramid Model", elements of which could arguably be adapted and applied to teaching contexts at HE level.
Action: Explore the “Learner Response Pyramid Model” in more depth over the coming weeks, and attempt to contextualise the approach and advice for HE level. Watch the REVEAL project DVD!
Whilst the use of Audience Response Systems in HE remains a bit of a niche, I was pleased to see that institutions are beginning to move away from a technology-led approach towards one where pedagogical principles and theory are considered (especially relating to question design), more than they perhaps were in the past.
The day provided an excellent opportunity to begin to develop relationships with other HE institutions. In fact, I've already started a conversation with a colleague at another institution, who also uses the TurningPoint ARS, about how we can work together (exchanging good practice and training materials) for mutual benefit.