Tony Karrer's eLearning Blog

Tuesday, November 11, 2008

Per Question Reports

I received a good question from a reader:
Our school district uses Blackboard. We have recently been exploring putting more coursework online and have experimented with the Articulate line of products. Blackboard has done well for us although I am not pleased with their support of the SCORM standards.

I would like to be able to export SCORM test results by user/by question, but all Blackboard seems to support is general pass/fail records. Now I can drill down to the individual results via the Bb gradebook, but that just won't work for exams that are being given to hundreds of students.

I have spent many hours researching this issue related to SCORM compliance, and more specifically how to get the test results exported out of my LMS for more in-depth analysis in Excel or perhaps some type of statistics package.

Would you be able to point me in the right direction of a resource to help me get this done, or please let me know if I am wasting my time. Perhaps the world has moved on to something else, as most of the conversations I can find online regarding SCORM are 2-4 years old, which in Net time, as you know, may as well be decades.
This brings up a bunch of initial thoughts. First, having worked on custom LMS, LCMS, and authoring projects, and being pretty deeply familiar with SCORM, I know it's easy to blame the LMS for poor per-question reporting, but it's very hard for an LMS to create generic reports that work well across all the variations you run into. So, if the statement "I'm not happy with their support for SCORM" stems only from reporting issues, that's a bit unfair.

I'd be curious what they are going to do with question-level reporting. I've had a lot of clients initially say they want it, and I always drill down on why. The most common answer is that they want to know if there are particular questions that lots of people are missing. But in many cases (in corporate eLearning) there will be no attempt to go back and fix the course or update the test. So, if you really aren't going to do anything with the data, don't bother. It's a lot of work to get the data, look at it, decipher what it means, and act on it.

So let's get to the more helpful pieces of information. First, it appears that they are getting question-level data in the LMS. That's good, and not always the case. Many people implement a SCORM course and/or LMS that handles only SCO/module-level reporting, or even just course-level reporting. But in this case, they have the data. Good job, Articulate and Blackboard!

So, the easiest thing to do is to get a data extract from the LMS into a CSV that you can then manipulate in Excel. You should absolutely start with this to figure out what you really want to do with the data. You can do this in Blackboard by using "Download Results" with the "By Question and User" option. Here's a good page showing this:
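
Once you have that CSV, even a small script can answer the "which questions are lots of people missing?" question before you invest in custom reports. Here's a minimal Python sketch; the column names ('Username', 'Question', 'Correct') are assumptions for illustration, so adjust them to whatever the actual Blackboard export uses:

```python
import csv
from collections import defaultdict

def question_miss_rates(csv_path):
    """Fraction of wrong answers per question from a by-question/by-user export.

    Assumes columns named 'Username', 'Question', and 'Correct' -- real LMS
    exports will differ, so map the names to match your file.
    """
    totals = defaultdict(int)
    misses = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            q = row["Question"]
            totals[q] += 1
            # Treat anything that isn't an obvious "correct" marker as a miss.
            if row["Correct"].strip().lower() not in ("true", "1", "yes", "correct"):
                misses[q] += 1
    return {q: misses[q] / totals[q] for q in totals}
```

Sorting the result by miss rate puts the problem questions at the top, which is usually all the "analysis" people actually need at this stage.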

Once you've done this a few times and have figured out what you want your reports to do, then you can define a custom report using Crystal. Sometimes it's a bit hard to get at the data on a per-question basis, but I'm pretty sure there's a good view in Blackboard.

Let me say again, do not bother to try to define your custom report until you've done this manually across several courses/tests that will expose all the different ways that the data may come out. Then you will likely begin to appreciate the complexity. You may end up deciding to just look at the data via CSVs.

With all that, I'd love to hear your thoughts/ideas around:

* Do you use question level data / reports? If so, what do you do with it?
* What generic LMS reports have you found useful for reporting at the question level?


V Yonkers said...

I have an additional question. If you do want this type of information, is it better to go outside the LMS when you are designing the course (and link the site to an LMS) for testing, to make it easier to manipulate data from the "tests"?

I am finding that for some functions it is easier just to link to outside applications, as the LMS simply does not have the flexibility as conditions change.

Anonymous said...

Tony, I try to avoid SCORM whenever possible (because the medication's so expensive). However, I wanted to chime in on a highly valuable use for question-level reporting: as part of the development stage.

*** soapbox mode on ***
I think you're right that "in many cases (in corporate eLearning) there will be no attempt to go back and fix the course or update the test." Or, "fixing" the course means "changing the test question so more people pass."

That stems in no small part from the multiple-choice mentality that an LMS tends to encourage and that senior management believes has some connection to learning.
*** soapbox mode off ***

If you're going to the trouble of creating an online learning course, then as soon as someone can take it -- even with placeholders for some graphics -- it's extremely useful to have typical learners take the course with answer recording on.

Here's what an intelligent system can bring to you:

Unexpected acceptable answers -- the ones nobody thought of except the people in the job.

Unexpected wrong answers -- ones for which you can then design more specific feedback or arrange specific consequences.

Patterns of expected wrong answers -- which, as they increase, are a suggestion that you are doing something wrong. Maybe the question is just plain wrong (you've coded for the wrong answer). Maybe the way the question is phrased is... suboptimal, let's say. Maybe the wrong-answer pattern is highlighting a topic that's more difficult for learners than you thought.

(I've seen more than one course where dates without a leading zero were flagged as wrong, while the actual system being taught did not require leading zeroes.)

About the only non-useful information is a pattern of expected correct answers, because then you don't know whether people learned from the course or knew the material before they got here.
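
The tallying behind those four categories can be sketched in a few lines of Python. The (question_id, answer) pair shape here is made up for illustration; feed it from whatever export your LMS gives you:

```python
from collections import Counter, defaultdict

def answer_distributions(responses):
    """Tally how often each answer was given per question.

    `responses` is an iterable of (question_id, answer) pairs -- an assumed
    shape, not any LMS's real format. Normalizing case/whitespace keeps
    'Paris' and 'paris ' from counting as different answers.
    """
    dist = defaultdict(Counter)
    for question_id, answer in responses:
        dist[question_id][answer.strip().lower()] += 1
    return dist

def unexpected_answers(dist, expected):
    """Answers that were actually given but that nobody anticipated.

    `expected` maps question_id -> set of answers the authors coded for.
    What comes back is the review pile: unexpected acceptable answers and
    unexpected wrong answers, to be sorted by a human.
    """
    return {q: [a for a in counts if a not in expected.get(q, set())]
            for q, counts in dist.items()}
```

Anything this flags with a high count is worth a look: either the question has a legitimate answer the authors missed, or the wording is steering people somewhere strange.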

Anonymous said...

It seems that she is receiving the data on a per-question basis, which, as you stated, is a good first step. Getting the data out should be relatively easy if you have the right technical person on your team. I don't know Blackboard at all, but I know other LMSs/LCMSs pretty well, and there has to be a backend database where all of this data is being saved. Moodle, for example, has a MySQL backend with a table that stores all the tracked SCO information. Writing a report from this table is simple and can go a long way in shaping your content and future updates to tests/quizzes.
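
To illustrate the idea, here is a rough sketch of that kind of query against a table shaped like Moodle's SCORM tracking table (`mdl_scorm_scoes_track`). The schema below is deliberately simplified, and SQLite stands in for MySQL so the example is self-contained; treat it as a shape to adapt, not Moodle's actual layout:

```python
import sqlite3

# Simplified stand-in for Moodle's mdl_scorm_scoes_track: one row per
# tracked SCORM data-model element per user.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE mdl_scorm_scoes_track (
    userid INTEGER, scormid INTEGER, element TEXT, value TEXT)""")
conn.executemany(
    "INSERT INTO mdl_scorm_scoes_track VALUES (?,?,?,?)",
    [
        (1, 10, "cmi.interactions.0.result", "correct"),
        (1, 10, "cmi.interactions.1.result", "wrong"),
        (2, 10, "cmi.interactions.0.result", "wrong"),
    ],
)

# Per-question wrong counts across all users for one SCORM package:
# SCORM interaction results are stored as cmi.interactions.N.result.
cur = conn.execute("""
    SELECT element, SUM(value != 'correct') AS wrong, COUNT(*) AS total
    FROM mdl_scorm_scoes_track
    WHERE scormid = 10 AND element LIKE 'cmi.interactions.%.result'
    GROUP BY element ORDER BY element""")
report = cur.fetchall()
```

Against a real Moodle install you would run the equivalent SQL directly in MySQL, with whatever joins you need to turn user IDs into names.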

Anonymous said...

I appreciate your posting on the question I originally asked a week ago. To follow up on some questions in your post: my intent in getting the per-question data is to use it to regroup my students in order to make my subsequent instruction of those children more effective. I would like to use this data to figure out who gets it and can move on to new, more challenging content, as well as who has not learned the new content, so that I can go back and find new ways to reach those students in an effort to reteach them and ultimately retest them to measure their learning.
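
Once the per-question data is in hand, that regrouping step itself is easy to script. A small Python sketch, where the 80% mastery cutoff is an arbitrary placeholder rather than anything the district prescribes:

```python
def regroup(per_student_scores, mastery=0.8):
    """Split students into 'advance' and 'reteach' groups by test score.

    `per_student_scores` maps student -> fraction of questions answered
    correctly (computed from the per-question export). The 0.8 cutoff is
    an assumption for illustration; set it to whatever standard you use.
    """
    advance = sorted(s for s, score in per_student_scores.items() if score >= mastery)
    reteach = sorted(s for s, score in per_student_scores.items() if score < mastery)
    return advance, reteach
```

The same per-question data also tells you *which* questions the reteach group missed, which is what actually shapes the re-instruction.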

A typical Blackboard test works exactly as the page you linked to shows, allowing you to download the results to a spreadsheet. It is a useful feature; however, SCORM objects do not offer those same download options. There simply is no front-end interface to pull this SCORM test data out.

I am intrigued by the comment about accessing the backend database that Blackboard sits on and pulling reports from it, but Blackboard is a closed system. Even if I could get at the server, I do not have the expertise to manipulate the database.

Any other thoughts from yourself or your readers would be much appreciated.