Crunching the numbers
- EducationInvestor, September 2012
Learning analytics could revolutionise education. But the technology isn't quite ready yet
For as long as there have been teachers, there has been some form of analysis of students' progress. Whether through a viva, exams or continual assessment, teachers and schools have wanted to know how students are performing, either to help them with areas they find difficult or to fine-tune the course and its materials. But with the advent of modern ICT, distance learning and online learning have come the first tentative steps towards something more: 'learning analytics', a term that encompasses a variety of technologies designed to understand student behaviour to a far greater extent than before and to act on that information, potentially automatically.
The aim is that by analysing student behaviour, both online and offline, it will be possible to know how they're really faring on a course. With many organisations seeing non-trivial drop-out rates, or living or dying on the ratings of their students, the idea of knowing when students are having problems, even if the students haven't mentioned them - and, equally, knowing what to do to help them overcome those problems - could mean serious financial rewards, marketing opportunities and a better experience for students. Indeed, EDUCAUSE in the US has announced a major programme in partnership with the Gates Foundation, the Hewlett Foundation and others that identifies learning analytics as one of five key areas for development, and the New Media Consortium of education technology experts predicts widespread adoption of learning analytics within four to five years.
However, as yet, most forms of learning analytics fall well short of these ambitions. Virtual learning environments such as Moodle and Blackboard ship with dashboards offering web-analytics-style views of pages seen, links clicked and so on, but these remain rudimentary compared with the field's aspirations.
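To make concrete how basic such dashboards are, here is a minimal sketch of the kind of log aggregation they perform. The log rows and function name are invented for illustration; real VLEs store far richer event records, but the dashboard output is essentially counts like these.

```python
from collections import Counter

# Hypothetical clickstream rows of the kind a VLE logs for each request:
# (student_id, page, action)
log = [
    ("s1", "week1-notes", "view"),
    ("s1", "forum", "post"),
    ("s2", "week1-notes", "view"),
    ("s2", "week1-notes", "view"),
    ("s3", "quiz1", "attempt"),
]

def dashboard_counts(rows):
    """Aggregate raw log rows into per-student page-view counts -
    an overview of logs and clicks, with no deeper insight."""
    views = Counter()
    for student, page, action in rows:
        if action == "view":
            views[student] += 1
    return dict(views)

print(dashboard_counts(log))  # {'s1': 1, 's2': 2}
```

As Buckingham Shum's comment below suggests, this tackles "the plumbing": it tells an educator who clicked what, but nothing about why or how well they are learning.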
"They tackle the basic problems of the plumbing, but they're really just overviews for educators to look at of logs and clicks - they don't grant any real insight," says Simon Buckingham Shum, associate director of the Open University's Knowledge Media Institute and a member of the Society for Learning Analytics Research (SoLAR), a collaboration between researchers and universities around the world.
Mark McCusker, whose company TextHelp is working on incorporating learning analytics technology into its own software, Fluency Tutor, agrees. "It's very early stages as a field - I tend to think of it as still in the labs." With 50,000 tests taken through Fluency Tutor last year, for example, McCusker says that TextHelp can use learning analytics to identify general trends in student performance and predict which students will progress to the next level in the next few months. But anything more is still at the testing stage and will only appear in the "generation after the next one".
Indeed, even organisations that should be able to benefit from learning analytics are balking at what they perceive as its relatively basic levels of insight. TLC Live!, a tutoring company that has now moved into online tutoring, has 6,000 hours of bespoke content that can be viewed by students. Being able to track how students progress through its site or how they respond to the content should give TLC the ability to fine-tune its courses. But, says the company's founder Simon Barnes, even though "online is where we expect the future to be", TLC is looking to maintain the human element as much as possible because it finds current learning analytics too clumsy.
"We have online assessments derived from our centre-based assessments. From these assessments, we can build up learning programmes - series of questions to identify which areas students have understood." With just a few pupils in each class, teachers have the time and resources to build individual learning plans based on those assessments, rather than to rely on computers.
More automated tailoring for larger class sizes, as well as more sophisticated manual tailoring, may be possible with ePace's online assessment tool. Rather than tracking progress during a course, it allows both learners and teachers to discover students' abilities and preferred learning styles at the start: do they respond better to the written word, audio, video or games? How well can they focus? How strong is their reading? Teachers, whether working online or offline, can then segment their classes so that each group gets the kind of content it responds to best. "When you get 30 children through the door, it's hard to look into their minds," says Mary Blake, founder of ePace. "So if you can access the ePace class profile, look at the auditory memory profile and find there are six kids who find that difficult, you can give them what they need."
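The segmentation Blake describes amounts to filtering a class on one assessed dimension. A hypothetical sketch, with invented scores and names (ePace's actual data model is not described in the article):

```python
# Invented per-child profile scores (0-100) across modalities,
# standing in for an ePace-style class profile.
profiles = {
    "amy":  {"auditory": 40, "visual": 85, "reading": 70},
    "ben":  {"auditory": 90, "visual": 60, "reading": 75},
    "cara": {"auditory": 35, "visual": 80, "reading": 60},
}

def segment_by_weakness(cls, skill, cutoff=50):
    """Children scoring below `cutoff` on a skill - the group that
    'finds that difficult' and needs tailored material."""
    return sorted(name for name, p in cls.items() if p[skill] < cutoff)

print(segment_by_weakness(profiles, "auditory"))  # ['amy', 'cara']
```

The point is that once a profile exists, grouping is trivial; the hard, unsolved part is the assessment that produces reliable profiles in the first place.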
As of yet, the tool is still standalone, but the company is now working with Learning Platforms to incorporate it into its online learning platform, as well as with further education institutions to incorporate it into Moodle, so that customers can automatically serve course content to each student that is appropriate to the student's abilities.
Classroom Monitor, which started out "essentially as an ICT-based markbook", according to Chris Scarth, the company's commercial director, is also working to tailor content using learning analytics. "Teachers can link into things like TDS or the BBC, or paid-for content, and as they highlight and record progress against the National Curriculum, they create a bank of teaching resources, some for the classroom and some for the students to work on independently." The company is partnering with universities to include content that can be used by the system. "It links number crunching and data to softer information. It's all about joining up all those different bits into one central hub that links content, assessment, data, tracking and everything else in one usable process."
However, the challenges of learning analytics may be such that only the largest of companies - or universities - can push it to the next level. The OU has been using some form of learning analytics since its earliest days, building predictive models of its students to see which are the most likely to need assistance. Buckingham Shum says the university uses these not to decide who should be accepted and who rejected: "It gives tutors the heads-up alert of who might need support." But the university has moved beyond that to more advanced forms of analytics: what students are doing online, how they gather and make sense of information, which sources they turn to, how they discuss information online and who makes contributions.
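The article doesn't say how the OU's predictive models work, but a common approach to this kind of at-risk flagging is a logistic score over engagement features. A minimal sketch with invented students, features and hand-set weights (a real model would fit its weights from historical outcomes):

```python
import math

# Hypothetical per-student features: (logins per week, forum posts,
# average assignment score 0-1). Names and numbers are invented.
students = {
    "alice": (5, 3, 0.82),
    "bob":   (1, 0, 0.40),
    "carol": (3, 1, 0.65),
}

# Illustrative hand-set weights: less engagement and lower scores
# push the risk up.
WEIGHTS = (-0.4, -0.3, -3.0)
BIAS = 3.0

def risk_of_needing_support(logins, posts, avg_score):
    """Logistic score in (0, 1); higher means more likely to struggle."""
    z = BIAS + WEIGHTS[0] * logins + WEIGHTS[1] * posts + WEIGHTS[2] * avg_score
    return 1 / (1 + math.exp(-z))

# Rank the cohort so a tutor sees the likeliest-to-struggle students first.
flagged = sorted(students,
                 key=lambda s: risk_of_needing_support(*students[s]),
                 reverse=True)
print(flagged[0])  # 'bob' - low engagement, low scores
```

Crucially, as Buckingham Shum notes, such scores are used as a heads-up for tutors, not as an admissions filter.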
To do this, the OU has had to develop its own tools - as have other major universities such as MIT and Harvard: "There are really no good analytics tools, especially for learning. We want to give tools to students and that's where most of our R&D is: how you build reputation online as a learner or an expert, how to support the development of higher order thinking skills. It's one thing to track who's edited a wiki, that kind of thing, which is what you currently get in a dashboard. I'm interested in whether we can tell if someone is demonstrating more critical thinking, creativity or resilience when things get tough."
Alongside this need for R&D to fill the gap in the market, TextHelp's McCusker says there are further challenges to overcome. "In an ideal world, learning analytics goes way beyond test scores and into social networks, peers in the classroom, learning on mobile phones and so on. But to link all those things together, we need universal data standards and there's a long way to go before then." Even within its own technology, that lack of data standards makes it hard for the company to analyse its information effectively, let alone share data, with all the potential ethical problems that might expose. Nevertheless, it's collaborations between organisations that will open up learning analytics' full potential, McCusker believes. "The thing that grows a market is the introduction of standards."
Although such collaborations are rare at the moment, they are happening. Gareth Davies, MD of Frog, is involved in one such collaboration to add more advanced learning analytics to the company's learning platform. "The next step we see for education is an Amazon-style recommendation service for resources, based on a student's unique profile. This will allow a level of personalisation never seen before in education. Through partnering with organisations such as Google, Microsoft, the Khan Academy, over 10 different MIS systems, Education City, I Am Learning and many more, we will have the data and resource to make all of this happen. By profiling a student based on what they do inside and outside the school, a teacher will know what makes a student tick, what motivates them, how they prefer to learn and how they process information, so they can understand their students better, and teach them in a way they want to be taught."
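An "Amazon-style" recommender can take many forms; in its simplest, it ranks resources by how well their attributes overlap with a student's profile. A hypothetical content-matching sketch - the tags, resource names and scoring are invented, not Frog's actual design:

```python
def jaccard(a, b):
    """Overlap between two tag sets, 0.0 (disjoint) to 1.0 (identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Invented catalogue: each resource described by modality and topic tags.
resources = {
    "fractions-video":     {"video", "fractions"},
    "fractions-worksheet": {"text", "fractions"},
    "algebra-game":        {"game", "algebra"},
}

def recommend(profile, catalogue):
    """Rank resources by tag overlap with the student's profile."""
    return sorted(catalogue,
                  key=lambda r: jaccard(profile, catalogue[r]),
                  reverse=True)

# A student who learns best from video and is studying fractions:
print(recommend({"video", "fractions"}, resources)[0])  # fractions-video
```

Production systems would add collaborative filtering - "students like you found this helpful" - which is where the cross-organisation data Davies describes becomes essential.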
Academia is also looking at ways to bring about learning analytics standards that are independent of any particular company. "Open Learning Analytics" is a standard mooted by SoLAR that would incorporate a learning analytics engine, an adaptive content engine, an 'intervention engine' providing recommendations and automated support for students, as well as a dashboard and reporting and visualisation tools - all based on open standards. It's very early days for the project, but it is already looking for industry partners to work with on these systems.
Despite the ambitions for it, learning analytics is still in its infancy. Many companies are dipping their toes into the water, offering systems that provide far more insight into student performance and behaviour than most teachers could have gained even a few years ago. But demand among teaching organisations for the full promise of the technology, particularly as online and distance learning grow and student-teacher ratios increase, is only going to intensify - and there are companies already working to meet it.
Case study: Plymouth University
Plymouth University's BEd course is about to take part in a trial of a new version of Classroom Monitor's student monitoring and assessment tool that is intended for further and higher education institutions. According to Peter Yeomans, a lecturer in education at the university, the institution has been using a form of learning analytics, but "Up until now, it has been quite mechanistic, computerised testing that gives information back, but that judgement is mediated by the machine. Students can do tests and receive feedback, but those are pre-determined judgments by computer, matching where they are to what they need."
But from September, Plymouth is going to trial Classroom Monitor with the 200 students on placement on its BEd course. "The aim is to be able to reach into the system of the cohort to find out which parts of the competencies they have to develop have been met by less than 75% and build adjustments into the course for the next year." Traditionally, that part of the analytics has been paper-based, and if students have been away from school on placement, the university has only known there's been a problem when teachers have visited them. "They've had to flag it and you have to be quite mean to do that and nobody wanted to do that. But with the system, we can call the report in each week and see what progress they've made, wherever they are in the country."
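The cohort-level check Yeomans describes - find competencies met by less than 75% of students - is a simple threshold report once the data is in the system rather than on paper. A sketch with invented competencies and placement records:

```python
# Hypothetical cohort records: competency -> one boolean per student,
# recording whether that student met the competency on placement.
cohort = {
    "behaviour-management": [True, True, False, True],
    "lesson-planning":      [True, False, False, True],
    "assessment":           [True, True, True, True],
}

def underperforming(records, threshold=0.75):
    """Competencies met by less than `threshold` of the cohort - the
    ones the course would build adjustments around next year."""
    return [comp for comp, met in records.items()
            if sum(met) / len(met) < threshold]

print(underperforming(cohort))  # ['lesson-planning']
```

The analytical step is trivial; the value, as the case study makes clear, is that weekly electronic reporting makes the data available at all while students are dispersed on placement.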
In particular, the system will allow teachers at placement schools to enter information about students and how they are progressing, and access information about them as well. With their progress in various areas understood, the system can then recommend additional online learning content to the student. "The student can have a look at what 'good' looks like through videos on the site, extra paperwork and so on, and how they can reach that. It's hard to do this consistently on a weekly basis in terms of manpower." The university has also deployed web analytics technology in the online learning part of the schools' system so that it can see what students are accessing and what they're not. It's hoping in future to use that information to identify what students need and find helpful.
Yeomans predicts that once the pilot is over and the system is live, it will more than pay for itself each year: "With 1,000 students a year on our books, that's £20,000 a year on paper and photocopying savings alone, before you look at the savings on travel with tutors no longer having to visit students on placement. Then there's the National Student Survey - a student who comes to a degree course expects that experience to be great and that will help."
Financially, Plymouth also gains from a 20% share of revenue with Classroom Monitor, having co-developed the new system. Yeomans believes that it can be useful to any institution that trains professionals who must demonstrate competencies. That might be universities teaching ophthalmologists, for example, or it could be other kinds of organisations. "If I was a health authority, this would provide me with the kind of courses I needed to lay on and offer them direct to the right people through the Classroom Monitor tool. If the nurses aren't too clear about a protocol, we could embed a link in it through Classroom Monitor so they could have something pop up saying 'Here's a course you could go on.'"
