Tony Karrer's eLearning Blog

Sunday, January 13, 2008

Test SCORM Courses with an LMS

I was just asked by someone about how they could test a course they were creating against a particular Learning Management System (LMS) (in this case Docent 6.5). This is something we run into fairly often. We are developing a SCORM or AICC compliant course either custom or using an authoring tool. And we run into the same tough situation each time:
  • If there's an integration issue, like the score or completion is not getting set appropriately, we want to be able to make changes to the course to try to fix it.
  • We don't have direct access to a version of the LMS to test on.
  • The client's staff won't give us direct access to the LMS to run tests.
  • The client's staff is too busy to run a series of tests and they have no patience for problems.
  • The client has not set up a version of the LMS just for this purpose.
This happens both to external and internal developers. Likely this sounds familiar to many of you. The question is what to do about it. Here are some ideas -

SCORM Test Suite

One of the best things about SCORM (ADL's Sharable Content Object Reference Model) is that it is pretty easy to test and diagnose problems. In fact, it comes with a do-it-yourself test suite, created to help get implementations to work together better. The problem is that there is no guarantee that two SCORM implementations that pass the self-test suite will integrate okay. The test suite can be downloaded here.

My first piece of advice for anyone developing a course is to make sure it runs with the ADL's test suite, because that gives you cover in case there are issues. If it doesn't work out, then you can say - "Well, it works with the SCORM test suite. We'll have to figure out what this LMS does differently than the SCORM test suite."

SCORM Technical Test Version

At the start of each project where integration with an LMS is required, we create an early technical test version of the course. This is created using whatever authoring tool we will be using and it will contain no content. Rather, it simply allows us to quickly set particular variable values, e.g., test scores. Most often we put this in a SCORM wrapper to be able to diagnose what issues there are. This can show you what works and doesn't work long before you author the course.
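As a rough illustration of what the reporting logic inside such a technical test version might look like, here is a sketch using SCORM 1.2 run-time calls. The `reportScore` helper, the mock API, and the 80% passing threshold are all invented for the example; a real course would obtain the API object from the LMS frame hierarchy.

```javascript
// Sketch of a content-free "technical test" SCO routine (SCORM 1.2 calls).
// Every call's return value is captured so failures can be diagnosed later.
function reportScore(api, rawScore) {
  var results = {};
  results.init = api.LMSInitialize("");
  results.setScore = api.LMSSetValue("cmi.core.score.raw", String(rawScore));
  // The 80% pass mark is an arbitrary example, not part of the spec.
  results.setStatus = api.LMSSetValue(
    "cmi.core.lesson_status",
    rawScore >= 80 ? "passed" : "failed"
  );
  results.commit = api.LMSCommit("");
  results.finish = api.LMSFinish("");
  return results;
}

// A trivial mock LMS API, useful for exercising the helper outside any LMS.
function mockApi(store) {
  return {
    LMSInitialize: function () { return "true"; },
    LMSSetValue: function (key, val) { store[key] = val; return "true"; },
    LMSCommit: function () { return "true"; },
    LMSFinish: function () { return "true"; }
  };
}
```

Running this against a real LMS and inspecting the captured return values is usually enough to see which call the integration is choking on.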

The late Claude Ostyn created one test wrapper and also a diagnostic SCO. There is another called SCORM Test Tracker. We use our own so I can't vouch for these tools.

Note: there have been cases where we needed to go to HTTP stream tracking software to figure out what was wrong. This is less common these days, but it's good to know that there are tools at this level.

LMS Specific Testing

I would love to tell you that there was an easy way to take this technical test version and/or your course itself and load it under the LMS to test it before you hand it to your client. My experience has been that it's rare to have this happen.

Does anyone else have a different experience?

Do any LMS vendors provide test beds for people to test their course?

For the person who wants to test on Docent 6.5, what should they do other than what I suggest above?

35 comments:

Hulk said...

Running the course in the SCORM test suite is good advice - as is creating a "shell" course to test the various features in the client LMS.

I used to have a version of Docent 6.5 set up for just this purpose, but none of our customers use it any more, so I scrapped it - otherwise, I'd volunteer to test the courses for your friend.

Docent is particularly troublesome because it uses server-side JavaScript. That's not bad in and of itself, but people tend to mess with it to make things work - so you can't count on things being the same from one installation to another.

If you have a choice - use SCORM over AICC. AICC seems to have more wiggle room in the spec and since SCORM provides a test suite, people seem to write with some attention to it.

-adam

B.J. Schone said...

I've run into the same issues. There isn't an easy way to build a course that will be guaranteed to function correctly in every LMS. It's sad that the SCORM specification is often interpreted differently by LMS vendors. I've learned to anticipate minor glitches when integrating courses. They generally aren't too difficult to fix, especially when you've got the right tools (e.g., the SCORM test suite). Trident has been a helpful SCORM tool for me, too.

We have two LMS environments set up at work: development and production. We're able to test all new courses in our development environment, and that's also where we test all LMS software upgrades. If everything behaves nicely, we move it to the production environment. This setup has worked really well for us.

-B.J.

Paul said...

If you're only dealing with SCORM 1.2 courses (and many of our clients still use it) then the Reload player is very useful: http://www.reload.ac.uk/scormplayer.html

I don't know if there are any plans to produce a SCORM 2004 version, but hopefully there are.

Mark Aberdour said...

This is not something we seem to suffer from very often but it's an interesting topic nonetheless! As any LMS incompatibilities would result in delaying rollout, I am very surprised that clients would not take it seriously.

A couple of things we do that may help avoid this:
  • We always request direct contact with a client technical representative right at the start of the project and ensure they know what is expected of them.
  • We ensure that the client knows exactly when we will be sending over LMS-ready deliverables, and we add the client technical testing phase to the project schedule. Because the client LMS testing is on the critical path, if they don't take it seriously, the whole project schedule slips. Maybe that is what gains the buy-in?!
  • We make roles and responsibilities very clear at project kickoff, and have a clear escalation process for both sides of the relationship, so if someone at either end is blocking progress, it gets communicated upwards.

We have a very similar system at Epic in that we follow many of the same processes as mentioned, plus a few others we employ from time to time...

1. We have a standard Test SCO which contains a small amount of content and our standard SCORM API Wrapper. An associated test script exercises all of the SCORM tracking data we would normally expect to use. When we have a new client or a client has made changes to their LMS, we send this to the client right at the start of the project and get it uploaded to their LMS. If there are any problems, we can then get them resolved way before LMS Testing proper hits the critical path.

2. We use the SCORM Test Suite; it has its limitations, but it's a vital piece of the process.

3. We test on our own LMS - we have a few in house that we purchased in the past through LMS vendor partnerships. But these partnerships are a complete rip-off in my experience, as are LMS vendor test lab offerings (I remember being aghast at finding out how much Thinq wanted for the use of their conformance testing service). Plus many clients get vendors to bespoke their LMS in some way, so getting hold of the vendor's out-of-the-box product is not always the right answer anyway. With open source it's a doddle; with many clients moving to Moodle, for example, this makes life much easier, although you need to manage your test environment closely to ensure you know which clients use which version...

4. I find that remote access to the client LMS is more often granted when you are in a long-term relationship with lots of projects going through testing. Less likely with single-project relationships.

5. In one case we even had a virtual machine created and given to us on CD-ROM, which we then installed onto a virtual server here. Your client needs to check their license conditions before doing this though!

6. Get remote access in some other way. We have started using WebEx a lot to view the client's own desktop and take control of it for debugging purposes. Very, very useful indeed.

7. Finally, consider sending a developer to the client office and debug/fix on-site.

Anonymous said...

Hi all,

(Hulk, we have also found customers had problems with Docent and moved on to more cost-effective alternatives.)

The company I work for works predominantly with SCORM 1.2 courses and we have developed our own diagnostics tool. When you publish the course to SCORM format, you simply switch debug on or off accordingly. This adds a debug popup window to the course which traces all the SCORM operations and whether or not they have been successful.
Although much simpler, I guess it's pretty similar in concept to the "diagnostic SCO" except it's designed to be built in to existing "production" courses.
(All the customer needs to send us is a screenshot of the debug window and there is no need for accessing the platform, which they usually refuse anyway.)
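In concept, a debug trace like this can be built by wrapping every function on the LMS-provided API object and recording each call with its return value. A hypothetical sketch (the `traceApi` name and log format are invented, not taken from any particular product):

```javascript
// Wrap each function on the LMS-provided API object so that every call and
// its return value is appended to a log the learner can screenshot and send.
function traceApi(api, log) {
  var wrapped = {};
  Object.keys(api).forEach(function (name) {
    if (typeof api[name] !== "function") return;
    wrapped[name] = function () {
      var args = Array.prototype.slice.call(arguments);
      var result = api[name].apply(api, args);
      log.push(name + "(" + args.join(", ") + ") -> " + result);
      return result;
    };
  });
  return wrapped;
}
```

The course then talks to the wrapped object instead of the raw API, and the log array is what gets rendered in the debug popup.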

Otherwise, courses are usually proof-tested on a variety of platforms: Reload ("top favorite"), Moodle, Dokeos, ADL, Olat being our favorites.

Most problems we have encountered are linked to customer-side browser configuration and/or to server-side unzip utility versions.

- Elinor

vyonkers said...

At the risk of sounding ignorant, what are SCORM and AICC?

B.J. Schone said...

SCORM and AICC are the two most common methods used to allow eLearning courses to communicate with learning management systems. Technically, they are APIs (application programming interfaces) that allow very specific information to be sent back and forth between the course and the LMS.
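For SCORM 1.2, that conversation starts with the course locating the API object the LMS exposes in a parent frame. A sketch of the standard discovery walk, modeled on ADL's sample `findAPI` pattern (the attempt cap and the mock window objects in the test are illustrative):

```javascript
// Walk up the frame hierarchy looking for the SCORM 1.2 API object that
// the LMS is required to expose. The attempt cap guards against cycles.
function findAPI(win) {
  var attempts = 0;
  while (!win.API && win.parent && win.parent !== win && attempts < 7) {
    attempts++;
    win = win.parent;
  }
  return win.API || null;
}
```

Once found, the course calls methods like `LMSInitialize` and `LMSSetValue` on that object; that handful of calls is the whole course-to-LMS interface.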

MilesDavisSucks said...

I'd agree with Mark - using WebEx is a very effective way of debugging problems.

Most people are using SCORM and AICC properly these days. There used to be a lot more integration issues related to people not understanding the spec.

If both sides are compliant, though, it becomes a matter of "Well, I'm compliant!" "Well, so am I!" That can devolve into an ugly situation quickly.

I've been on both sides of this issue. I used to work for a company that created an LMS and now I work for a company that creates courseware. Most of the time, problems are the result of environmental variables. The last one that I diagnosed involved launching a hidden frame that resided in a different domain...security problem. One of the other ones I've dealt with lately is a proxy server not relaying SCORM calls.

I used to be a network administrator and that's really helped me spot some of these issues.

In summation, though, I can't overstate the value of being able to see the course launching from the client's perspective. I've seen huge projects derailed by both sides saying they were compliant and not doing anything else.

Mike Rustici said...

Thanks for the great post Tony, here at Rustici Software (www.scorm.com), we deal with similar issues for our clients every day. I'll add a few more thoughts and tips of my own:

-Exercising your content through the aforementioned SCORM Conformance Test Suite is an absolute must before releasing. If you can't pass the Test Suite, you have no business saying that your content is SCORM conformant. As Tony mentioned, it's not the be-all and end-all. It is very easy to pass the Test Suite and implement things in ways that, while technically correct, make no logical sense to a user or that will break in other conformant LMS implementations.

-Testing your content in a "real" LMS before delivery is always a good idea. One free tool that hasn't been mentioned yet is the ADL Sample Run Time Environment.

-We created the SCORM Test Track to serve our own internal need for a "real LMS" test environment, so it is optimized for quick and efficient content testing. It is completely free for the public to use, and the same underlying SCORM Engine implementation is currently used by over 60 LMSs. You can also tweak how content is launched to simulate different LMS behaviors.

-If you do encounter a problem with a SCORM integration, the first thing you should ask is whether or not the LMS vendor has a certified SCORM implementation. If the answer is no, ask them if they have at least passed the Conformance Test Suite. Many times vendors will claim to be conformant without really knowing what that means. If you are working with an LMS that hasn't passed the Conformance Test Suite, chances are you are going to have to make some adjustments no matter how good your implementation is.

-If you get to the stage of looking at HTTP streams, check out Firebug, it's an add-on to Firefox that provides HTTP sniffing as well as robust JavaScript debugging and analysis. Since SCORM is mostly JavaScript based, it can come in quite handy.

-As you develop for different LMSs and come across all their quirks, try to keep a common code base of all the workarounds necessary to be compatible with as many LMSs as you can. We've done this over the last 6 years with our SCORM Driver software with great success. You'll find that many LMSs share common quirks, so "remembering" them in a reusable code base pays great dividends over time.
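Purely as an illustration of "remembering" quirks in a reusable code base, one could keep them in a single table keyed by LMS and have one shared setter consult it. The vendor keys and quirk flags below are invented for the example, not drawn from any real product:

```javascript
// Hypothetical quirk table: each entry records the workarounds a given LMS
// installation needs. Vendor names and flags here are invented examples.
var QUIRKS = {
  "generic":  { commitAfterEverySet: false },
  "vendor-a": { commitAfterEverySet: true }
};

function quirksFor(lmsName) {
  return QUIRKS[lmsName] || QUIRKS["generic"];
}

// Shared setter used by every course; workarounds live here, not in courses.
function setValue(api, lmsName, key, value) {
  var result = api.LMSSetValue(key, value);
  if (quirksFor(lmsName).commitAfterEverySet) {
    api.LMSCommit(""); // e.g., an LMS that loses data unless committed at once
  }
  return result;
}
```

Each new quirk discovered on a project gets added to the table once, and every subsequent course picks it up for free.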

JackSlash said...

We use the Rustici test LMS to confirm our courses, Tony. That has been really helpful, but you can still have glitches when you get to a client LMS (recently had this with a course going to the Learn.com LMS). Sometimes it's no more than bandwidth or new users at the client facility. But even those simple issues cause problems that are every bit as frustrating.

Mark Aberdour said...

Thanks Mike, we're going to give your SCORM Test Track a whirl, always useful to have another test tool in the arsenal!

With all the interoperability problems we have getting validated SCORM content to work seamlessly with supposedly SCORM-compliant LMSs, I do often wonder just how successful SCORM as a standard really is. If the point of the standard is to avoid content purchasers becoming locked in to particular LMS vendors, and for content providers to be able to sell into a wider range of organisations, then from what I am seeing and reading, the SCORM standard is increasingly looking like it is failing.

Maybe instead of updating the standard with nice new features like sequencing, what's really needed is an industry-wide 'back to basics' push? Open standards like SCORM lead to market growth so it's clearly in everyone's interests for the standard to work. It just seems to be going off the rails a bit right now...

Mike Rustici said...

Hi Mark,

I can see where you're coming from, but I also have to disagree a bit. While SCORM 2004 courses using sequencing currently suffer from some interoperability issues, SCORM 2004 courses that only use the functionality of a SCORM 1.2 course are actually very interoperable.

I sometimes like to use the analogy of a Microsoft product. When Microsoft first releases a product, it's quirky and a bit rough around the edges. It's not until the first service pack that things generally stabilize. It's kind of the same way in SCORM. SCORM 1.2 was a great success; it received lots of adoption and was far and away more interoperable than anything that came before it. Yet, there were a few places where it left room for interpretation, and that led to some interoperability issues here and there.

Then came SCORM 2004. It's not mentioned frequently enough, but SCORM 2004 actually did an outstanding job of tightening down all the little holes that were left open in SCORM 1.2. If you look at the Content Packaging and Runtime parts of the specification (i.e. the parts of SCORM 1.2 that carried over into SCORM 2004), they are actually tremendously stable and interoperable. They are a significant improvement over SCORM 1.2 and are likely good enough to be considered "done".

SCORM 2004 sometimes gets a black eye from people who are frustrated about the lack of interoperability in the Sequencing part of the specification. Sequencing is anything but simple. It is a massive and complex addition to the specification. While it is constantly evolving and getting better, sequencing is still in that first release phase awaiting a few good service packs. What the critics fail to mention is that if you want to create interoperable content, nobody says you have to use sequencing. If you stick to 1.2 style content under SCORM 2004, you should have great results in creating interoperable content.

I agree that we should sometimes get back to basics. ADL / LETSI seems to share a similar vision. One of the items being discussed for SCORM 2.0 is the concept of a CORE SCORM, which focuses on the basic interoperability issues and leaves more complicated or specialized issues to be tacked on to solve domain-specific problems. At this point, it's all discussion and rumor, but I would expect to see some significant improvements in the future while continuing to maintain backwards compatibility.

I have published some similar thoughts on the evolution of SCORM on our blog as well if you're interested.

Mike

Tony Karrer said...

Mark and Mike,

I've got to say that I agree: the fact that there are still little gotchas all the time with loading and running SCORM courses under SCORM-compliant LMS products is a bit of a surprise. My guess is that most people creating content (like 95%+) don't care about things like sequencing, etc. They are likely building a course with a single SCO.

In these cases, there shouldn't be any question about how to report scores back and how the LMS and/or course decides whether the course is considered complete, etc. There are a few semantics that always seem to cause the gotchas.

So, while defining the SCORM CORE in SCORM 2.0 is a good idea, I think all of us would welcome a more definitive, probably tightened definition of SCORM compliance for courses and LMSes that would avoid the issues for these 95%+ simple cases.

BW022 said...

Tony,

I am somewhat puzzled by the question. How does any other software company ensure compatibility with say various operating systems, web-browsers, databases, etc.? Easy – they buy any third-party software necessary, fill a room full of machines, hire QA staff, and install and test their software against each.

Why is courseware different? Why assume that the client or someone else is supposed to provide systems for testing for courseware vendors but not say accounting systems?

This is not an impossible task. Some courseware vendors do provide lists of which LMSs they work with. Most LMS vendors have partnering programs. Many have evaluation copies, pilot licenses, or hosted test systems. Perhaps you can rent or lease the software? Most major LMS vendors have their own test labs, support, and professional services. All of them sell their LMSs. There are third-party companies which you can hire to test courseware against LMSs. Some options might just be a few phone calls and/or some paperwork; others might require time and effort to install and setup; and others might cost money.

Have you ever tried contacting SumTotal about what options they have available for third-party courseware testing? What did they say? How many other LMS vendors have you spoken to?

LMS vendors are not mystical entities. They also have a vested interest in ensuring that their clients do not have problems with courseware. I suspect that most do not enjoy wasting time on courseware vendors who have not done their due diligence – joining the ADL, running courseware on the sample run-time system, passing the SCORM courseware test suites, publishing compliance statements, testing with other LMSs, perhaps having courseware certified, going to PlugFests, contacting the LMS vendor before clients run into issues, maintaining QA departments, etc.

Tony Karrer said...

BW022 - great questions ... Some thoughts -

"Why is courseware different? Why assume that the client or someone else is supposed to provide systems for testing for courseware vendors but not say accounting systems?"

But the problem is that the scale is quite different. It takes a LOT to get an LMS somewhere for testing. And most courseware creation is on a very small scale.

I think you are thinking about the large-scale courseware vendors who produce tons of courses. This is not the norm. The vast majority is produced on a very small scale.

"Perhaps you can rent or lease the software? Most major LMS vendors have their own test labs, support, and professional services."

Again it's a cost issue. How much would this cost? If they had something relatively easy, cheap to use for the small guy, that's great. But not my experience.

"All of them sell their LMSs."

Even if it was free, it would take too much to get it set up and test it.

"There are third-party companies which you can hire to test courseware against LMSs."

Some of them commented in this post and I would love to hear about experiences that people have using third parties - or the LMS vendor for that matter.

Anyone have experience with this?

"Have you ever tried contacting SumTotal about what options they have available for third-party courseware testing? What did they say? How many other LMS vendors have you spoken to?"

Short answer is no. I have worked with several of the LMS vendors when we were working on larger scale implementations, but my sense is that once the vendor is done doing the install, they are not interested in working with 3rd party courseware developers. Maybe I'm wrong on this.

Any comments from the LMS vendors?

Any experience out there?

BW022 said...

Tony,

I would strongly recommend taking a step back and looking at how your last post would look to a client, potential client, or an LMS vendor. QA is not optional in today’s software industry. If you are too small to perform QA, then any serious company is going to consider you are too small to be selling software. Again, I see no reason to treat courseware differently than any other software.

Re: Contacting SumTotal
>Short answer is no.
You can’t be serious? It isn’t worth five minutes of your time to phone someone, send off an e-mail, or check out their website, yet it is worth your time posting to the world how hard it is to get hold of an LMS or get your courseware tested?

>Any comments from the LMS vendors?
I’ll merely restate my previous assumption – I suspect that most [LMS vendors] do not enjoy wasting time on courseware vendors who have not done their due diligence.

mike said...

I am an Implementation Manager with Element K. While we offer an LMS (KnowledgeHub) and provide testing/QA for KHUB, we also have an alternative that sidesteps the LMS "problem". With our Content Server offering, we can host any content for play through Content Server. We are "content agnostic", if you will. We have managed services functions where we will certify our clients' content (off the shelf or custom) for play. I thought this might be something your readers might want to consider. You can reach me at mike_lally@elementk.com.

Mike Rustici said...

BW022,

>Why is courseware different?

For several reasons:

1) You talk about buying a room full of computers to test against different OS's, browsers and databases. How many different products are we talking about here? In any of those categories, you can easily get to well over 90% of market share by testing 3 product families (for instance IE, Firefox and Safari). The LMS market is much, much more fragmented. The top 10 LMS vendors don't even come close to 90% of the market share, never mind the fact that many clients of the larger, most popular LMSs will customize each implementation.

2) Go back to Tony's original post and look at the situation he's talking about "We are developing a SCORM or AICC compliant course either custom or using an authoring tool." It sounds like he's in a situation where he is making a specific course for a specific client targeting a specific environment. This situation is much more analogous to a small software consultancy developing a custom application for a client. In the case Tony describes that client has a non-standard environment but won't give the developer access to test on it, yet still expects seamless interoperability.

3) "most [LMS vendors] do not enjoy wasting time on courseware vendors who have not done their due diligence." Agreed, nobody wants to waste their time with somebody who hasn't done their homework. The rub comes when there is a problem even after everybody has done their homework. We deal with our clients' integration problems all the time. Unfortunately the prevalent attitude (of both the LMS and content developers) is to point the finger (and sometimes raise a finger) rather than constructively and collaboratively trying to solve the problem.

Tony Karrer said...

Mike - Element K - the testing hub sounds interesting. Is there more information on it somewhere?

Tony Karrer said...

BW022 - I appreciate your challenge here. And I did take a step back.

First - to clarify, the original question came from someone else who was facing the situation around Docent.

That said - we are in the midst of 3 different projects with major LMS vendor installations at clients where our stuff needs to integrate. These are not big projects. And there is no expectation by the client that we will have tested with the LMS prior to a technical test delivery or beta delivery. They do expect that we have done standard SCORM testing.

By the way, we definitely do QA - but our issue is specifically not having the particular LMS available to us. Instead, just like the person who sent the question, most people do their development using other test tools and then work with the LMS staff to install and QA it.

You are right that we would prefer to test with the specific LMS version that is installed prior to involving the LMS staff. If that were available to us, we would definitely test. And in some cases it is. Most often it's not. But, truly I may be missing it.

Maybe there is another approach.

That's what I'm asking now.

BW022 - I'm somewhat surprised that you are surprised by this discussion, since it is common practice in this industry to have things operate this way. I would say that we (TechEmpower) test more than most. But, again, I welcome you challenging my assumptions. It did cause me to step back.

BW022 said...

Tony,

I understand the issue. I hear it often. Courseware vendors stating that they can’t get access to LMSs. However, it is always stated as fact – no proof is offered and everyone tends to just accept it.

I suspect it is far from being impossible. We test several courseware titles a week from clients, partners, courseware vendors, and consultants. We have Citrix boxes, pilot licenses, hosting companies, etc. We do not have huge issues getting hold of MSDN licenses, WebEx, credit card processing software, courseware, CRM systems, authoring tools, etc. which we require in development. Many courseware vendors do have lists of LMS vendors they have partnered with and tested against, so it is possible.

I could be wrong. Maybe everyone else requires fees, ISO-9000 certification, minimum sales figures, etc.? In which case, someone could at least give reasons why LMSs are impossible for small courseware vendors to get access to.


What I do find shocking is the attitude of courseware vendors, consultants, etc. If the courseware industry followed the same software-engineering best practices as other reasonable software companies, the problem would go away. If you don’t have access to the third-party software needed to test what you are going to build, then you don’t quote on the project, you don’t write any code, and you don’t sell it.

Yet, courseware is somehow different? Courseware vendors can quote/sell first and then worry about getting hold of the LMS later. You say “not big projects”, “client does not expect”, and “common in the industry”; Mike says “too many LMSs” or “custom software”. We never have this issue with third-party integration in our client services department, because it is forbidden from quoting on any work without a plan covering access to the third-party software, documentation, support, training, etc. needed for the project. The size of the project, what the client expects, what other companies do or don’t do, whether the project requires one or twenty sets of third-party software, etc. is irrelevant. In fact, we want to know this before we quote, so that if it is impossible (or too expensive) to get hold of the software, we can decline to quote on the project.

The answer is so easy. Don’t accept projects until after you have the LMS. If you can’t get the LMS, don’t accept the project. It is a simple software-engineering rule. The attitude that it is not necessary is directly refuted by the fact that nearly everyone who does run into integration issues is not following the rule.

Anonymous said...

>The answer is so easy. Don’t accept projects until after you have the LMS. If you can’t get the LMS, don’t accept the project. It is a simple software-engineering rule. The attitude that it is not necessary is directly refuted by the fact that nearly everyone who does run into integration issues is not following the rule.


I understand where you are coming from but this is a sure-fire way to go out of business. LMS Integration isn't painful if you follow the sensible steps:

1. Test your courseware against the ADL SCORM test suite or AICC test suite
2. Test against another LMS - whether proprietary, or Reload, or similar. Check writebacks, the values of the main SCORM variables, etc.
3. Send a test course to the client and ask them to import and run and send back information from a debug console or similar to confirm the support for SCORM in the LMS.
4. Deliver the course
5. Consult with the client to iron out any niggles such as user experience, extra windows, etc. (ideally this gets done at the same time as the test course); this should be done with a technical LMS person (not a project sponsor)

Access to LMSs is costly (believe me) unless the client has a good (valuable) relationship with the LMS provider. I have integrated with 30+ LMSs and, following a process like the one above, have successfully integrated with all. About 90% are plug-and-play; 10% require additional consultation with the client, but if they know it is a possibility at project inception then the issue goes away.

My favourite was a client who had dealt with a courseware company for three months and hadn't integrated the courseware. They called us. We said we would have a course running in 24 hours and did. The incumbent was out and we were in.


Sorry about the anonymity, but this is my first post and I am just checking out how the land lies.

DGR

BW022 said...

DGR,

>I understand where you are coming from but this is a sure-fire way to go out of business.

That is untrue since we, our partners (including courseware vendors), and many of our clients are in business. Other software companies are in business and follow that principle. The costs of not doing this are (hopefully) so obvious that I won’t bother listing them unless someone needs me to.


>LMS Integration isn't painful if you follow the sensible steps: …

No offense, but do you know how bad that looks as a practice?

As Tony lays out in his original article, we are working under the assumption that the client is unwilling or unable to test or provide access to their LMS, and therefore your process is a complete non-starter in this case.

However, even leaving aside this fact that they won’t – why should they? You aren’t paying them for their time or the use of their LMS; they are not qualified to track down SCORM integration issues; they are not going to accept liability for QA; it is more time consuming for them than you; etc. I suspect that many clients have learned this lesson – hence why Tony’s contact (and many others) are running into the issue.

Lots of vendors follow your steps with our clients. Our introduction is typically a panicked client calling before a major rollout. Support spends about ten minutes before saying, “Don’t waste any more of your time on this. Send me a copy of the courseware and the contact information for the vendor.” This does not endear such courseware vendors to us, as it was completely unnecessary to get to this point. The courseware vendor could have contacted us before steps 3 and 5. Things go smoother if you have access to each other’s support documents, compliance statements, test suite results, QA services, Citrix boxes, evaluation/pilot licenses, support forums, a support department, a client service department, etc.

Even if some, most, or every other LMS vendor made it impossible to get such services – that is no reason to inflict those steps on clients using LMSs which do make such services available, nor to contact the LMS vendor only after the fact, nor to give the entire industry a bad name. If some LMS vendors make it hard to get hold of testing services… ok… make their clients go through steps 3 and 5. Perhaps they’ll change their policies if clients learn that integration is smoother and easier with other LMSs? And the same goes for courseware vendors. If a client asks me which courseware vendor to go with, I can’t ignore your business practices. I’ll say integration is easier with vendors who have step 3 as “contact LMS vendor”.

In fairness, most courseware vendors we run into with no pre-sales contact typically have not followed your steps 1 and 2. I suspect a lot of courseware following your first two steps would sail past most LMSs. It is certainly sound advice. I’ll include it under due diligence before contacting the specific LMS vendor. The vast majority of courseware we test is made by clients, so it rarely has steps 1 or 2 done before we receive it.

As a side note, if you have contacted various LMS vendors, I’m sure folks would love an idea of exactly why you can’t get access to their software, support, and/or testing facilities. I’m curious as to what courseware vendors consider unreasonable in terms of effort, costs, set-up, etc. to get an LMS. I’m also curious why LMS vendors would want to force their clients through steps 3 and 5 – especially if step 6 is a panicked call from the client anyway.

Anonymous said...

I have a project in which I need to convert scores of Docent 6.5 to SCORM 1.2 and test it in the ADL Test Suite. I opened the course in Docent Outliner and converted it into SCORM 1.2. But I don't know how to test it in the ADL Test Suite, as I am not seeing the Index.html file in the course structure. Normally I test courses developed in Flash by opening the lead Index.html file through the ADL suite, but in the Docent course I am not able to identify the lead file that I could open through the test suite. Any help?

Mark said...

If you're using the ADL Test Suite, you need to either have a SCORM package already (a zip file) or have extracted those files into a folder. Then you launch the tester and select either the package or the imsmanifest file. I'm not sure exactly what you mean by converting Docent 6.5 scores to SCORM 1.2. That doesn't really make much sense to me; can you explain more?
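For what it's worth, the Test Suite's content package test keys off the imsmanifest.xml at the root of the package, not any particular HTML file; it is the manifest's resource href that identifies the launch file. A minimal SCORM 1.2 manifest looks roughly like the sketch below (the identifiers and the index.html filename are placeholders, not values from the course discussed above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<manifest identifier="com.example.course" version="1.0"
    xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
    xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <metadata>
    <schema>ADL SCORM</schema>
    <schemaversion>1.2</schemaversion>
  </metadata>
  <organizations default="ORG-1">
    <organization identifier="ORG-1">
      <title>Sample Course</title>
      <item identifier="ITEM-1" identifierref="RES-1">
        <title>Lesson 1</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <!-- adlcp:scormtype="sco" marks this resource as the launchable,
         API-communicating unit; href points at its launch file. -->
    <resource identifier="RES-1" type="webcontent"
        adlcp:scormtype="sco" href="index.html">
      <file href="index.html"/>
    </resource>
  </resources>
</manifest>
```

If the exported Docent package has no manifest like this at its root, the ADL tester will have nothing to select, which may be why no lead file is showing up.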

Mark said...

I'm not sure if anyone is still reading this post, but I just read down from the top and thought I'd share a small tidbit.

I have contacted SumTotal before, as well as other LMS vendors, and I'm lucky if I can keep them on the phone for more than 5 minutes. None that I have experience with will speak to you unless you are a customer of theirs. They keep saying to have the owner of the LMS call, or asking whether you have a support contract.

The best way to troubleshoot is through experience, the ADL tester, and sometimes an HTTP analyzer. Other than that, good luck with Docent 6.5 missing API calls or SumTotal acting weird when installed on a secure server. A SCORM package is not universal and won't function the same in all LMS environments – but if it did, it would be boring and we would have nothing to blog about, right?

-M

Tony Karrer said...

Mark - thanks for the new comments. People still see this post periodically. Certainly having your input on what the vendors said is helpful for everyone.

BW022 said...

Mark,

I have no interest in SumTotal, but they do list a number of content partners on their website. Have you asked what is necessary to become one of their alliance partners and what services and/or support they provide?

In the end, it doesn't matter what reason you come up with for why you can't get your courseware tested under SumTotal -- why you can't afford it, why you can't get support, etc. -- the client has fifteen other courseware vendors to choose from who somehow managed to get their courseware tested under SumTotal.

Ella said...

Slightly off topic, but I wonder if anyone has a test script or checklist for beta testing e-learning courses. I've recently been tasked with this and while I try to make sure I test all the features and functionality, I'm certain that there should be a more tightly controlled process. Any thoughts?

BW022 said...

Ella,

The ADL includes a test suite for testing LMSs, courseware, and content packages.

http://www.adlnet.gov/downloads/DownloadPage.aspx?ID=277

Passing the test is the minimum necessary to say you are SCORM compliant.

However, if you are serious about courseware development, I would recommend getting hold of a real LMS for development, preliminary testing, and to ensure it works in the real world. Some are free, others you may need to buy, lease, partner with the LMS company, etc.

je55ic said...

My company develops and sells e-learning content, and we also offer hosting services for clients without LMSs. My position as Technical Implementation Manager requires me to troubleshoot any technical integration issues between our content and various client environments/setups. As such, it's been really helpful to have access to Docent 6.5, SumTotal 7.5, Pathlore 6.6 and Saba 5.4 SP2 to test the existing and new content that we develop, for all my troubleshooting needs. We pay for test instances of Docent, SumTotal and Pathlore through a company called Pinneast (www.pinneast.com). Hope this helps.

-Jessica

Faxme said...

I'm thinking about developing an online course using Moodle. Is Moodle itself SCORM compliant?

eshiika said...

I am not able to test my LMS course using the ADL test suite; it only accepts its own inbuilt courses. Please reply.

Mike Rustici said...

eshiika, you are probably running the LMS test in the ADL test suite. You might need to run the content package test in order to test your own content.

Anonymous said...

Hi all,
I need a bit of help.
Trying to open a course with IE I receive a message:
API not set!
Launch SCO.aspx........ID....will not be located.
Since it is more than difficult to contact the creator and ask him to change the code, please help me to do something (as a user) and become able to view the course.

Thank you in advance for your help
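The "API not set" message above is the classic symptom of a SCO failing API discovery: SCORM 1.2 content must locate the LMS-provided API object by searching its parent frames and, failing that, the opener window. As a rough sketch of the standard discovery routine (along the lines of ADL's well-known sample algorithm; the function names here are illustrative, not from the course in question):

```javascript
// SCORM 1.2 content cannot assume the LMS API adapter lives in its own
// window: the LMS player usually places it in a parent frame or in the
// window that opened the course. The usual remedy is to climb the
// window hierarchy looking for window.API.
function findAPI(win) {
  var attempts = 0;
  // Climb at most 7 levels of parent frames, as ADL's sample code does.
  while (win.API == null && win.parent != null && win.parent !== win) {
    attempts++;
    if (attempts > 7) {
      return null; // gave up: this is when you see "API not set"
    }
    win = win.parent;
  }
  return win.API || null;
}

function getAPI(win) {
  // Check the current frame chain first, then the opener's chain.
  var api = findAPI(win);
  if (api == null && win.opener != null) {
    api = findAPI(win.opener);
  }
  return api;
}
```

If a routine like this returns null inside the LMS player, the content is searching the wrong window hierarchy (for example, it was launched in a new window but never checks opener), which only the course author can fix; as a user, about all you can try is launching the course from within the LMS frameset rather than from a bookmarked URL.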