Tony Karrer's eLearning Blog on e-Learning Trends

Wednesday, May 31, 2006

The Real HCM Maturity Model

I read an interesting article over on Learning Circuits entitled HCM Maturity Model. Two years ago, I might have simply nodded my head and moved on, but now I think that something very different is happening. First, their model is presented as:

Their definition of the "Knowledge Targeting Stage" includes:

By enabling line-of-business managers to collaborate with HR and training departments to tailor learning to their people, real business value begins to emerge as employees extend their knowledge beyond basic skill sets to specialized talents that have a direct impact on business performance.

The knowledge targeting stage of HCM provides HR and training leaders a unique opportunity to connect with business leaders, understand their challenges and deliver tangible value to them.

As you move up the maturity model you get into bigger and more centralized solutions with the ultimate solution being an all-you-can-eat tool. Oh, did I mention that the author comes from Saba? A good company, but an obvious bias.

Now here's the reality...

While I work a lot with LMS products (including Saba) and believe that they are really great for many things, I also find that they can easily become a barrier. For example, what if I just want to put up some quick-hit information? I'm not going to put it under the LMS. This is the same thing I talked about in my post: eLearning Technology: Tools for On-Demand Information - An LMS?

Thursday, May 25, 2006

Shift in Blended Learning - Example of Melding of Training and Support

In a previous post Shift in eLearning from Pure Courseware towards Reference Hybrids, I talked about the shift from Courseware towards other kinds of Blended Learning solutions with a greater emphasis in information sources.

As an example from the software training world, what we are seeing more and more is a melding of Training and Support materials. In other words, our blended learning model often used to look like:

now it looks like:

Note: the midpoint represents the launch of the system. The key differences here are:

  1. Support materials such as help, cheat sheets and manuals are seamlessly integrated with training and used prior to actual system launch. The training materials are also seamlessly integrated with the support materials so that they can be easily accessed at the time of use.
  2. "Training" is handled over time with a series of learning events, including events scheduled post-launch that are often more like office hours. This spacing is known to have a much better impact.

Keywords: eLearning Trends

Shift in eLearning from Pure Courseware towards Reference Hybrids

Just when you've made the transition from the prior generation of CBT authoring tools (e.g., Authorware, Toolbook) to the new generation of WBT authoring tools (e.g., Captivate, Lectora), it looks like things are slowly shifting again.
The shift I'm seeing is away from the design of pure "courseware" solutions
and much more to "reference hybrid" solutions.

To explain this, I need to step back and deal with the fact that terminology around eLearning Patterns is problematic.

In my mind, "courseware" is interactive (to some level) instruction run asynchronously. It is created via an Authoring Tool or a Learning Content Management System. Often there's periodic quizzing to test understanding. It's designed to hit particular instructional objectives. It's the stuff you see demonstrated at every Training conference over the past ten years. Oh, and it almost definitely has a NEXT BUTTON.

"Reference" is static content - meaning no interaction other than allowing the user to link from page-to-page and to search. It is asynchronous. It is normally a series of web pages, but can be PDF or other document types. It can be created using Wiki software, a content management system, web editing software or even Microsoft Word stored as HTML. It's designed to provide either real-time support for work tasks or near real-time support for look up. Often it is designed around particular job functions and tasks to provide good on-the-job support. It almost certainly does not have a next button and should have search. You probably don't see many demonstrations of these kinds of solutions, because they aren't sexy.

I realize that these terms are vague, so let me go see what other people have to say. If you look at various eLearning Glossaries: Wikipedia eLearning Glossary, WorldWideLearn eLearning Glossary, Cybermedia Creations eLearning Glossary, you'll find that there are hopeless definitions of "courseware" and no definition of "reference." Reference sometimes comes out as "job aids" or "online support" or "online help" or various other things. Each of these other terms is slightly more specific than "reference" as they generally imply a bit more about the specific structure of the content. Thus, "reference" to me is a good umbrella term.

By the way, if you can help me ... Am I missing an alternative term for "reference?" Are we just calling these "web pages?"

I did try another avenue to find better definitions. I went over to Brandon Hall's Awards Site. He has awards for "custom content" organized by the type of function they were supporting, e.g., sales. I would think that "content" is an inclusive term for courseware and reference. However, if you look at the judging criteria, the first question is: "How engaging is this entry?" So, I've got to assume they really are looking for courseware and not for reference material (which is inherently less engaging - and in fact you could argue that you don't want it to be engaging; you want it to be quick and to the point).

The other category is Learning Technology. The sub-categories here are:

  • Course Development Tools
  • Software and System Simulation Development Tools
  • Soft Skills and Technical Simulation Development Tools
  • Tests or Instructional Games Creation Tools
  • Rapid Content Creation Tools
  • Live E-Learning/Virtual Classroom Technology
  • Just-in-Time Learning Technology
  • Innovations in Learner Management Technology
  • Innovations in Learning Content Creation and Learning Content Management Technology
  • Open

None of this makes me think about "reference," but maybe it would be included in either "Rapid Content Creation" or "Just-in-Time."

Okay, now that I'm done ranting about terminology, here's the real point ...

We are seeing a significant shift in development away from mostly creating courseware to creating more-and-more reference materials designed to be just-in-time support.

Even more so, we are seeing a shift towards Hybrid Reference and Courseware combinations where the Courseware is embedded within the Reference. So, if you don't quite understand the concept or you want to make sure you provide a nice introduction, you embed that within the web pages.

As an example of this, we've created several hybrid reference/courseware solutions that are designed to both introduce and support the use of software. Traditionally, we would have built courses in something like Captivate and pointed users to go take these courses first. We would have separately created a "support" site that would have a FAQ and help on various tasks.

In the hybrid solution, we created the support site as the first element you go to and put a prominent "First Time User" link on the home page. This page takes them to instructions on how to get up and running. Most of the content is presented as static web pages that tell how to perform particular tasks, but some of the pages contain embedded Captivate movies to demonstrate or simulate use of the system.

This design has given us several advantages:
  • End-users can get started with the application quickly and receive incremental help on the use of the system as they need it. We've eliminated most up-front training.
  • End-users only see one solution that provides "help on using the application" as opposed to seeing "training" and "support" separately.
  • It costs less to produce because there's greater content sharing between training and support materials and because we build more of the content as reference which costs less.

I'll be curious to hear if other people are seeing a similar shift in what they are building.

Keywords: eLearning Trends, eLearning Resources

Wednesday, May 24, 2006

Surveys in eLearning

Through two recent experiences I've come to realize that many people in eLearning are not using Survey tools nearly as much as I would have thought.

For my recent Collaborative Learning eLearning 2.0 Class, I used a SurveyMonkey survey to ask the class members about their background, interests and availability. You can see the survey results on the Course Wiki.

SurveyMonkey is very easy to use. It's free up to a limit. It will even help send the surveys, track who's taken them and nag people who have not completed the survey.

And, of course, there are lots of other tools out there that are equally easy to use and equally free. I've used Zoomerang before and it's great as well.

So, why am I hearing that most people aren't aware of these tools or aren't using them for pre- and post-course surveys?

Keywords: eLearning Resources

Email, Knowledge/Content Management - Email as a Future Application Interface

James Robertson's excellent Column Two blog pointed me to an interesting article by Seth Gottlieb - Email and Content Management which provides some practical suggestions about how to move from email based content management towards better mechanisms.

While I agree with Seth's main contention that email, especially email with attachments, makes content management much harder, I actually think that Seth is swimming against a very, very strong current and is probably going to get sucked into the ocean shortly. He may know this since he points us to another article that explains The Good In Email (or Why Email Is Still The Most Adopted Collaboration Tool).

And, I personally believe that email is going to become more and more the "front end" of many of our applications. Many of the systems we build these days are workflow applications that often email the people involved to notify them or even allow them to respond. This is our way to "get in front of the user." And it works extremely well. And most every application is starting to do this. As we begin to get more sophisticated about workflow, we are going to see this increase.
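To make this concrete, here's a minimal sketch (in Python, with entirely hypothetical addresses, document names and URLs) of the kind of workflow notification these applications send - the email itself carries the actions the user can take, so the inbox becomes the application's front end:

```python
from email.message import EmailMessage

def build_approval_request(to_addr: str, doc_title: str, approve_url: str) -> EmailMessage:
    """Compose a workflow notification that lets the recipient act from their inbox."""
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["From"] = "workflow@example.com"  # hypothetical sender address
    msg["Subject"] = f"Approval needed: {doc_title}"
    msg.set_content(
        f"A document is waiting for your review: {doc_title}\n"
        f"Approve or reject here: {approve_url}\n"
        "You can also reply APPROVE or REJECT to this message."
    )
    return msg

# Actually sending it is one more call, e.g.:
#   smtplib.SMTP("mailhost.example.com").send_message(msg)
```

The design choice worth noting is that both a link and a reply keyword are offered, so the user can respond from whichever client they are in.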

So, while I would love to believe that much of the current communication that occurs through email will be migrated to other kinds of vehicles with more appropriate persistence and searchability characteristics (e.g., wikis), I think that most users are not heading that direction today and what is a more likely trend is to have email become more integrated so that it acts seamlessly with our CM/KM solutions.

One Third of Adults "Not Learning" - BBC - We Need a Better Definition of Learning

I found this via Donald Clark's blog. It pointed to an article by the BBC - Third of adults 'not learning'

The study defines learning as being not only taking formal courses but also practising, studying or reading to develop skills, knowledge, abilities or understanding of something.

This can even be part-time at home. It does not have to have been finished or to have led to a qualification.

The survey found that 20% of adults said they were currently learning, with more than 42% having done so in the past three years.

What I really found fascinating was how the study authors defined learning and then how it was interpreted by people who answered.

  • They left out any mention of TV or radio and the words "listening" or "watching." I guess people don't learn anything from watching or listening to BBC programs.
  • They used the word "to" in their definition, which I would interpret to mean they are only including "intentional learning," not "unexpected learning." In other words, if I learned something by reading, but I hadn't set out to learn that thing, then it really doesn't qualify as "learning" in terms of how the question was phrased.

I guess I believe that even someone who might not sit down "to develop skills, knowledge, abilities or understanding of something" might still watch the occasional SportsCenter and find out a bit more about how the Heat's offense works.

My point is that it's somewhat dangerous for all of us to call ourselves "learning professionals" and to work in this field if the common definition of learning excludes informal and unexpected learning.

Tuesday, May 23, 2006

Tags, Search Effectiveness, Personal Benefits

A couple of interesting recent posts and my experience in my Collaborative Learning Class has me thinking about the usefulness of Tags both personally and in workgroups.

From Bill Ives - Where Tagging Works and Where Tagging Doesn’t Work – Search Engine Lowdown
I guess I tend to agree with Danny Sullivan about the tagging and search, but that is not the original intention of tagging. If I want to search on a key word, I will still go to Google as the most efficient way. If I have the time to go exploring through multiple links and see the interrelations between key words, I might go to del.icio.us. However, if I want to set up a way to store and share links on a particular topic, I will use del.icio.us, which I have done already in co-authoring an article.
Interestingly, Bill points to a search on Google for "Web 2.0" where he got 75,900,000 hits with the famous O'Reilly article at the top of the list - which is a pretty dang good result and makes sense given Google's in-bound link based algorithm. If you do a similar search on del.icio.us, you first realize (as Bill found) that tags cannot have spaces, so you actually need to look for "web2.0" - you can see what you get by looking up that tag there.

I definitely don't think the results are nearly as good as what you get in Google. But look on the right side for the "related tags," which are how you can find related items.

I completely agree with Bill's assessment: del.icio.us is more useful if you are trying to find related terms to search against, but the quality of the results doesn't seem to be there.

A closely related, great series of articles can be found at What are the Personal Benefits of Tagging?
One thing that the most useful of these reasons all have in common is that
they allow the user to express tags using personal vocabulary.

I personally have found that because I've switched to Yahoo MyWeb, which has full-text search across my bookmarked pages, I've come to use tags mostly to represent two things:

  • Actions - I tag items with "blogthis" if I plan to come back and write it up in a blog.
  • Sharing - I tag items that I plan to share with a specific tag so that others in my group can find it.

So for me, it's not quite the folksonomy effect that most people talk about, but based on these articles, I'm starting to think that's what other people are finding as well.

Beta Program, Email List, Acquisition - A Case Study in What Not to Do

Let me set the context and then let excerpts from the emails tell the rest of the story. A company with a very good product (the leader in its category) asked its leading users to participate in a Beta program right around the time they were acquired. The acquired company established a listserv mailing list and put all of their top users on the list... Here are a few of the emails (there are about 80 in total), but you'll get the idea:

Dec 8, 2005
Once the beta is ready I am sure they'll let us know first... No point asking every day.
Dec 8, 2005 - From official at company that was acquired
You will receive an email once we start the beta. Until then, I'd like to ask you not to post messages on this beta list. Thanks in advance for your understanding. The Product Team
Jan 6, 2006
Any word on the Beta?
Apr 11, 2006
Any word on the Beta?
Apr 11, 2006
I received an email about registering for the Beta from the new company. When I responded to be sure that they weren't putting me in twice, I was told that they had already filled out the group! I had responded immediately upon receiving the notice!
Apr 11, 2006
I had the same experience. Probably the new company has a different way to select beta testers...
May 22, 2006
On the new company's blog, it says that the product is in full beta, and that they have already received feedback. Yet, I have not seen anything about the product in over a month. Am I missing something? Thanks.
May 22, 2006
I believe that the prior Beta program is gone. The new company runs it under a different program called 'Prerelease', not beta. I am not sure how to apply, the process is much different now.
May 22, 2006
I don't know what's more pathetic: us all waiting for an email announcing the start of this beta that apparently was never going to come, or having to read third-hand that the program we were led to believe we were part of had changed - with the company, in effect through inaction, saying 'screw the old beta testers, no need for us to at least let them know that they're no longer beta testers!' If anyone from the new company is reading this, and I doubt you are, next time show a little common courtesy for your customers.
May 23, 2006
I am looking for the message they are trying to send. When a company asks people to participate in a beta test, are they not identifying the group of people who are most passionate about your product and who are early adopters? So if they then send them the (unspoken) message that you do not need them... what exactly is the company trying to say to these passionate early adopters?

It is an interesting strategy.
Discussion of alternative products starts on the list. A list of 10 competitors, including open source options, is being discussed.

May 23, 2006
Abandoning their products is pretty easy. There are plenty of good alternatives out there, and they all output the same types of files....

And this is just TOO FUNNY...
May 23, 2006 -
To: a member of the list
Sent: Tuesday, May 23, 2006 11:24 AM
Subject: RE: Beta program

Would you be so kind and send this message to the list (I am not authorized to post) - Hopefully, I'll find out what's going on shortly.

I don't yet know how this story will end. Will the users all find alternative products? Will the new company figure out how to post on their own email list?

Monday, May 22, 2006

Web 2.0 - Mainstream Term

These are somewhat telling about how mainstream Web 2.0 has become.

Over on Read/Write Web, an interesting article by Richard MacManus - Coming to Terms with Web 2.0
Then on 18 December 2005 I made the infamous declaration that "Web 2.0 is dead. R.I.P.". Ever wish you hadn't pressed the 'publish' button?

So what's 2006 brought? Believe it or not, I think it's brought acceptance of the term 'Web 2.0'. That's actually caught me by surprise - I got it wrong. Web 2.0 hasn't died, it's actually morphed into a mainstream term that Gartner and IBM use.

Gartner Says Web 2.0 Offers Many Opportunities for Growth, But Few Enterprises Will Immediately Adopt All Aspects Necessary for Significant Business Impact
While it is straightforward to add specific technologies, such as Ajax or RSS to products, platforms and applications, it is more difficult to add a social dimension.

By 2008, the majority of Global 1000 companies will quickly adopt several technology-related aspects of Web 2.0, but will be slow to adopt the aspects of Web 2.0 that have a social dimension, and the result will be a slow impact on business, according to Gartner, Inc.

Keywords: eLearning 2.0, Web 2.0

Search - Implications on Knowledge Work

This post was sparked by a couple of recent articles:

Babson Knowledge: How Google Plans to Change the Scope of Googling (And Why Information and Knowledge Workers Should Care).

InfoWorld: Reinventing the Intranet

These articles point out what we already know:
It is generally easier to find stuff that is in the mass of public information than it is to find stuff inside our own corporations.
While there is some defense of Corporate IT in that it is harder to get at information that is stored in a wide variety of systems (e.g., ERP, CRM, etc.), it is still surprising that we are not seeing more rapid adoption of search technologies inside the firewall. But, likely, you are at fault as well; after all,
What desktop search are you using?
If the answer is "none" ... then get with it. Desktop search via tools such as X1 will change your life. You will find that you don't spend nearly the time worrying about categorizing your own content into folders. I used to spend a lot of time worrying about rules in my email. Not anymore. Just search and you will find.

Soon, these tools will be expanding out to your internal network, your internal systems and basically every piece of information.

Interestingly, I think the algorithms that Google relies on for searching the public web (based on incoming links) will not work nearly as well inside the firewall as older algorithms based on frequency, semantic interpretation and other techniques.

Keywords: eLearning Trends, eLearning 2.0, Web 2.0

Firewalls and Security in Software as a Service

One of the interesting outcomes of my recent course - Collaborative Learning Using Web 2.0 Tools - A Summary - was general consensus around:
  1. Software as a Service is Great for Learning Professionals inside Corporations
  2. Firewall restrictions still pose a problem for SOME services
  3. Security is a concern, but generally should not stop use

The reason that Software as a Service is so attractive is that it is often hard to get Corporate IT to spend time on getting even simple software packages set-up and even harder to get them to agree to support these packages. Thus, while we are excited about wikis, blogs, discussion groups, etc., the practical reality is that, unless they already exist somewhere and we can piggy-back on those implementations, we are not going to be able to get them implemented by Corporate IT. Thus, there is real attraction in being able to sign up for hosted services that provide these tools without Corporate IT being involved.

For us to be successful doing this, we first need to make sure that the system will work with whatever firewall restrictions exist. For example, in our course, we found that Yahoo Groups were restricted in some corporate environments. Elluminate did not work through several firewalls, so we had to switch to WebEx. The Yahoo Toolbar (for MyWeb) couldn't be installed on locked desktops; instead, we should have used del.icio.us. We had no trouble with our PBwiki. The good news is that there are lots of these services in most categories, and thus, the best advice is:

Test any service you are thinking of using in different locations and desktops to make sure that you are able to use the service effectively given firewall restrictions.

Do not believe any vendor claim that "it works through firewalls" because a firewall can be configured to stop anything it wants. That's its job.
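In that spirit, a quick reachability check can be run from each office location and locked-down desktop before committing to a tool. A minimal sketch in Python - the service names and URLs below are placeholders you would swap for the real endpoints you plan to use:

```python
import urllib.request

def reachable(url: str, timeout: float = 5.0, opener=urllib.request.urlopen) -> bool:
    """Return True if the service answers at all from this network location."""
    try:
        with opener(url, timeout=timeout):
            return True
    except Exception:
        # Firewall blocks, DNS failures, timeouts all land here.
        return False

# Hypothetical endpoints - substitute the actual services under evaluation.
services = {
    "wiki": "https://wiki.example.com",
    "webmeeting": "https://meeting.example.com",
}

def survey(services: dict, checker=reachable) -> dict:
    """Check every candidate service and report which ones get through."""
    return {name: checker(url) for name, url in services.items()}
```

Run `survey(services)` from each location; any `False` entries tell you which tools that site's firewall will break before your learners do.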

The other big hurdle is the question of security. What's your exposure by having your content at a hosted location? The first part of the answer is whether outside parties (not you or the host) can hack into the system and get at your content. Generally, I think you will find that hosts provide fairly reasonable controls, but you will want to check into their security approach.

The second part is that there is some set of administrators who provide the hosting who will have the ability to get in and see your content. The host may make it difficult for the administrator to get in there, but often it's not that difficult. Really, this is the same situation as what you face in internal software with some set of Corporate IT staff having access to content (likely including email). In the case of hosted solutions, the added "risk" is that the administrators are employees of an outside company. On the other hand, you probably have better recourse against the host provider if the administrator does something wrong than you would against your own employee.

The security issue is not new. There are likely lots of content types that get stored externally by your organization. They might be using a hosted CRM. They might be using an email system that handles spam filtering and archiving. Chances are, the content you are putting up in your learning solution is far less of a risk than what is already getting stored out there. Which brings us to the first defense ... while the risk is probably low that you will actually have information leak out:

Try to limit content to information that would cause little damage to the company if it were made public.

What if you need to work with content that is confidential and would potentially represent a risk? Well then you are going to need to go through the same protocols you would use internally to vet the system and likely you will again need to involve your IT staff because they are likely the ones who make these determinations. This will slow down your implementation time, but is not nearly the hurdle you have trying to bring software in-house.

Will they derail your process? If you look out at what's happening, you find mixed reviews. In an eWeek article: Security May Dog Software as a Service they provide a mixed answer:

the biggest challenge for companies such as Microsoft that see their future in on-demand software may be getting customers to understand and be comfortable with the model.

And, the current state of network and application security at most companies is poor enough to make it hard to imagine on-demand deployments being any worse, experts agree.

You are still on the hook according to Software as a Service and Security:
A company must show due diligence in its relationships with third-party providers to ensure that those providers maintain and comply with U.S. and international regulations to which that company is subject. Under such regulations, it is the responsibility of the company—not the software as a service provider—to protect sensitive information.

The advice from an article in CFO Magazine:
Data security. Although SaaS vendors invariably emphasize the resources they devote to security, many customers remain uncomfortable with their employee and customer data flying over the Internet, not to mention potentially residing on the same data-center server as their rivals'. "Look at security. Do the due diligence. Make sure the vendor has the right premises and that protecting your data is its top concern," counsels David Brooks, director of CRM at Magma Design. Juniper Networks CIO Kim Perdikou insists on modifying SaaS contracts so that she has the right to do periodic security audits.

What's the bottom line? Chances are that you are not going to run into much of an issue. Try to keep the content to things where there is low risk. And where you have sensitive data, bring in IT staff to audit the security, bless the vendor(s), and check the protection in the contracts. It's still better than having to install software behind the firewall.

In talking with a lot of different CTOs from software development companies in Southern California, it appears this is the way forward.

Keywords: eLearning 2.0, Web 2.0

Friday, May 19, 2006

Elves, Measuring Results and Informal Learning

Brent and I have been having a nice blog discussion. Our previous posts discuss what should be measured: Intermediate Factors in Learning, Intermediate Factors - Impact Many Measure One. And we finally seem to be agreeing with one exception. And this exception relates closely to my earlier concern eLearning Technology: Informal Learning is Too Important to Leave to Chance.

This discussion makes me think back to a question that I used to ask in my Computer Science Project class (based on something I read, but I now forget who it was - Fred Brooks maybe):
If an elf appeared and offered to give you a program that met your spec, how happy would you be?

After the initial jubilation wears off, the class realizes that there is some real concern about whether the program actually works as intended, and without having insight into the guts of what's going on, it feels very uncomfortable. How do I know it works? How do I know what its limitations might be?

It turns out that you really want more than just a program. You want one where you know how and why it works.

So, back to learning and measuring results. I actually don't want "just the outputs." I want to know how and why it's working. In the case of improving customer satisfaction based on knowledge of Store Layout and Product Knowledge, I want to know whether we've increased their knowledge and whether customer satisfaction has improved. While my client only cares about customer satisfaction, if it remains the same while knowledge increases, this is an important data point. It tells us about the system.

So, on to informal learning. Brent said in From Product Focus to Audience Focus:

The process is continuous and if our “training solution” is organic, dynamic, and flexible, it is very difficult to measure using the current method of measuring learning products. My point is “who cares”. If we have set up environments that help people collaborate, and support their informal learning, we should see output improvements.

And this is our only point of remaining disagreement. I would be much more comfortable if you could explain the internals of this system: how you know it works, when it will work and when it won't. As I said in Intermediate Factors in Learning ...
If you create an "organic, dynamic, flexible" learning solution but can't explain how it impacts the end numbers, then: (a) you won't get credit, (b) you won't know if you can repeat it successfully, and (c) you won't know if it's really working.

Keywords: eLearning Trends, Informal Learning

Thursday, May 18, 2006

Intermediate Factors in Learning

In Measure Intermediate or Final Factors, Brent responded to my posting: Technology: Intermediate Factors in Learning.

Brent and I (and Jay Cross and lots of others) agree that measures of butts in seats, number of completions, etc. are generally not that useful in telling what impact we are having on what matters to the business.

Where Brent and I really seem to disagree, and I've seen this in other places, is on the importance of Intermediate Factors.

The key to almost every engagement for me is understanding how human performance drives the business. Yes, the client hires me to improve business results which is always ultimately around Revenue or Cost. Most of the time the client already measures intermediate factors such as Customer Satisfaction, Loyalty, Quality, etc. All of these are known to have impact on Revenue and Cost.

Most of my clients have some understanding of how human performance impacts these measures as well, but most often they have not fleshed this out to the level that is needed. Thus, if they tell me that they care about Revenue and they really want to look at improving Customer Satisfaction because that is the biggest predictor of Revenue, then I will drill down to what impacts Customer Satisfaction. Some aspects I won't be able to affect, but other aspects are within our ability to influence. So, we will continue down the path.

If they are a more sophisticated organization and have Customer Satisfaction Surveys, then likely we'll already have some very interesting intermediate factors defined. In home improvement retail, associates' ability to tell customers where particular products are and to get them to the product is a big factor in customer satisfaction. Together with my client, I will now agree that what I'm really trying to do is to impact these further intermediate measures, such as the results of the Cust Sat Survey questions around Associate Help in Finding Products. Likely I will further drill down on this to break it into Knowledge and Skill Components such as (a) Knowledge of Store Layout, (b) Knowledge of Product Categories/Types, (c) Handling of Customer Questions around Product Location.

I then do the most important thing with my client. I get agreement that what they are hiring me for is not to increase Revenue (end Factor), but rather to provide mechanisms that will impact these Knowledge and Skill Components in a way that improves scores on the particular Question (all are intermediate factors). In fact, when I work with the store managers as part of the intervention, I will make sure that they understand the relationships here and understand why these intermediate factors are important. Ultimately, the Causal Relationship that we believe in is:

  • If we improve Knowledge of Store Layout, Product Categories/Types, and How to Handle Customer Questions,
  • we will improve the customer experience (as exhibited in how they rate us on those questions),
  • which will improve customer satisfaction,
  • which will improve revenue.

Much of what we do in Learning is making sure we understand these Intermediate Factors.

Keywords: eLearning Trends, Informal Learning

Wednesday, May 17, 2006

Tracking Without an LMS

Based on an earlier post - Tools for On-Demand Information - An LMS?, I received a couple of questions around tracking.

Then today, I saw a post on TrDev about tracking without an LMS and thought I should maybe clarify what I often see as the choices around tracking:

a. Click tracking
b. Custom tracking
c. LMS tracking

Click Tracking

In Click Tracking, you rely on looking at logs of what pages have been clicked on and get reports via log file analysis (web analytics) tools such as WebTrends. These tools will tell you:
  • How many users have visited each page (HTML page)
  • When users are visiting
  • How long users stay

This is very standard technology that likely your IT shop can provide for you. If they cannot, then you can do what I've done on this blog and embed SiteMeter onto all of your web pages and it will give you similar kinds of reports. In fact, if you go to the link at the bottom of my blog (you can't see this in the RSS feed), you can see what traffic I get.
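To make the mechanics concrete, here is a minimal sketch of what these log-analysis tools compute under the hood: counting page hits per URL from web server access logs. The log lines and paths below are invented for illustration.

```python
import re
from collections import Counter

# Matches the request path out of a Common Log Format line.
LOG_LINE = re.compile(r'"(?:GET|POST) (\S+) HTTP')

def page_hits(log_lines):
    """Count hits per requested page, like a basic web analytics report."""
    hits = Counter()
    for line in log_lines:
        match = LOG_LINE.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits

# Hypothetical access-log lines for a two-page course.
sample = [
    '10.0.0.1 - - [17/May/2006:10:00:00] "GET /course/page1.html HTTP/1.1" 200 512',
    '10.0.0.2 - - [17/May/2006:10:01:00] "GET /course/page1.html HTTP/1.1" 200 512',
    '10.0.0.1 - - [17/May/2006:10:02:00] "GET /course/page2.html HTTP/1.1" 200 480',
]
print(page_hits(sample).most_common())  # page1.html twice, page2.html once
```

Real tools add time-on-page and visit sessions on top of this, but the core data is just these per-page hit counts.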

What you don't get with Click Tracking (without using some tricks) is the ability to see what any individual user did on the system. Thus, you couldn't tell if John or Sue finished the course. So, you have to answer the question:

Do I need to know if people are completing the course?

If the answer is no, then Click Tracking may be enough; the other aspect of this solution is to create your course in a way that is easily tracked. Remember that Click Tracking only tells you what page was clicked on. This means that you need separate pages for your course. If you create a single, big Captivate Flash file, you will have no clickstream data. Instead you need to break the Captivate movies up (which is good practice anyhow) and put separate Captivate movies on each page.

LMS Tracking

I'm skipping Custom for a second. LMS tracking relies on creating a SCORM or AICC course which then communicates with the LMS in order to provide details of score, sections completed, which user it is, etc.

There are two issues with LMS tracking. First, many people do not have an LMS available to them. Second, even if you have an LMS you may not want to require users to login before they access content. This is discussed in Tools for On-Demand Information - An LMS?

Custom Tracking

While it is becoming less common as prices for LMS products have gone down and more hosted LMS products have become available, there are still times when we build custom tracking solutions. If you have no IT support available to you (even for simple programming), then this option is not available. However, there are some very simple things you can do to quickly and easily track your courses. While there are many solutions and lots of possible permutations, the basic approaches are either a Simple Database or an Enhanced Click Stream.

In a Simple Database approach, users will be asked to enter their name (sometimes at the start and sometimes at the end) in a simple form that will be recorded in a database. A simple web page is created that dumps out these results. There are lots and lots of subtleties here, but this is very simple to pull together and will give you a record of what a specific individual did. This approach is good when you only need a simple report of who has completed the content and do not need details of how they got there.
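As a rough sketch of the Simple Database approach (the table layout, names, and course ID are all hypothetical), the server-side piece can be as little as this:

```python
import sqlite3
from datetime import datetime

# An in-memory database for illustration; real use would point at a file.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE IF NOT EXISTS completions
                (name TEXT, course TEXT, completed_at TEXT)""")

def record_completion(name, course):
    """Called when the user submits their name on the completion page."""
    conn.execute("INSERT INTO completions VALUES (?, ?, ?)",
                 (name, course, datetime.now().isoformat()))
    conn.commit()

def completion_report(course):
    """The simple page that dumps out who has completed the content."""
    return conn.execute(
        "SELECT name, completed_at FROM completions WHERE course = ?",
        (course,)).fetchall()

record_completion("John", "store-layout-101")
record_completion("Sue", "store-layout-101")
for name, when in completion_report("store-layout-101"):
    print(name, when)
```

In real use the names would arrive from a web form post rather than direct function calls, but the data model is this small.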

In Enhanced Clickstream, we will continue to rely on a tool like WebTrends, but we will put in place a simple bit of code that enhances the clickstream data (the web log file) with information about the particular user. Normally, we rely on asking the user up-front who they are (and then embed a cookie for repeat visits). This way, we can encode each page hit in the log file with the user information. WebTrends and other such tools can look at these parameters and give details of what pages that user has gone to. If you have a single completion page, it is easy to get a report on who has "completed" the course. This approach is good when you want to get more details of what individuals are doing on the system.
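A minimal sketch of the Enhanced Clickstream idea (the log format and the `user` parameter name are assumptions for illustration): once each page hit carries the user's identity, a completion report is just a filter over the log.

```python
import re
from urllib.parse import urlparse, parse_qs

REQUEST = re.compile(r'"GET (\S+) HTTP')

def completions(log_lines, completion_page="/course/done.html"):
    """Return the user ids whose clickstream reached the completion page."""
    done = set()
    for line in log_lines:
        match = REQUEST.search(line)
        if not match:
            continue
        url = urlparse(match.group(1))
        if url.path == completion_page:
            # Each hit was tagged with ?user=... based on the user's cookie.
            done.update(parse_qs(url.query).get("user", []))
    return done

# Hypothetical log lines, each hit tagged with the identified user.
sample = [
    '... "GET /course/page1.html?user=john HTTP/1.1" 200',
    '... "GET /course/done.html?user=john HTTP/1.1" 200',
    '... "GET /course/page1.html?user=sue HTTP/1.1" 200',
]
print(completions(sample))  # only john reached the completion page
```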

Keywords: eLearning Resources

Tuesday, May 16, 2006

Collaborative Learning Using Web 2.0 Tools - A Summary


Over the past six weeks, I’ve been leading a course:

Collaborative Learning Using Web 2.0 / eLearning 2.0 Approaches

Course Description: The purpose of this course is to give you an opportunity to learn about collaborative learning by participating in collaborative learning. This course is designed to teach how to design and build collaborative learning experiences using Web 2.0 / eLearning 2.0 approaches.

You can find out more about the course itself from the Wiki at:

The basic structure of the course was:

  • Six weeks long, each week had a one-hour virtual classroom session
  • First two weeks were introductions to the tools and to collaborative learning
  • Middle three weeks were the design (as a team) of collaborative learning projects, facilitation/participation in projects.
  • Final week was a summary discussion

The participants in the class were corporate learning professionals from a variety of medium to large organizations.


This has been a great learning experience for me. I thought it might be interesting to provide some of the feedback from course participants and some of the insights from having conducted this course. I’m going to likely have additional posts based on the outcomes of this course. (Note: all quotes below are from participants.)

  • Nuvvo was initially going to be used for the course as a registration system, for communications, and for some content presentation. However, I found it offered little value over separate tools such as Yahoo Groups and a PBWiki. Through Nuvvo I did receive several registration requests, so it at least offers free advertising.
  • There is significant interest in the topic. I sent out about 10 emails to friends and colleagues and quickly found about 20 people wanting to take the course (I got back 2 or 3 people at several companies). However, the low barrier to entry (free course offered by a friend) created an opportunity for a disconnect between my expectations for participants and their “commitment.”
“In order for the class to be effective I would be charging up front. Make some accountability for the folks to ensure the committment is there.”
  • An introductory survey, conducted using SurveyMonkey, worked very well! I would highly recommend SurveyMonkey. Using the survey I was quickly able to determine interest areas, when people would be able to participate, and level of understanding of different technologies (Survey Results). My only problem was that I didn’t really have a way to change the class design significantly based on the results, and I should have. See the next topic. Interestingly, one of the projects within the class (created by the participants) used a survey as well (which was a surprise, but speaks to how easy surveys are to use these days).

“Use a more comprehensive survey to learn more about the participants and why they are taking the course and how they plan on using the things they learn. This might provide more insight in developing various exercises or homework - pairing people with like needs.”

  • When I originally conceived the course, I assumed that most attendees would know about Blogs, Wikis, etc. generally, but would not have experience using them. This turned out not to be the case. Because of this I could easily have used another two weeks to more gradually introduce the tools.
"introducing 3 new technologies (blogs, BlogLines, wikis) at the same time is a bit much"
“Specific introductory directions would have been helpful.”
"liked to have started the week with more background and knowledge on blogs and wikis ... so my focus could have been on richer information sharing"
“Have a pre-class experience for those unfamiliar with the tools to learn more about them by having demonstrations or showing examples. Or dedicate the first 2 sessions to learning and applying the tools one at a time in our homework. In this way, those people who are already familiar with them could be given the choice of missing those sessions.”
  • All professionals (including learning professionals) are extremely busy. Even though the attendees in this class were fairly dedicated, it still is hard to find time for 3-5 hours per week. Ideally the course would have required less research time on topics. It is hard to define “assignments” that take relatively short amounts of time yet are interesting and deep enough. Part of this is the natural reluctance to share partial thinking.
"Doing research on the web is time consuming"
"I find that I am hesitant to write anything unless I have really thought it through."
  • Firewalls and restrictive corporate environments caused us considerable grief: Yahoo Groups were restricted in some corporate environments; Elluminate did not work through several firewalls, so we had to switch to WebEx, which really goofed up the start of the course; Yahoo Toolbar (for MyWeb) couldn’t be installed on locked desktops.
  • Software as a Service (SaaS) is the only way this could have worked. While I came into this believing in SaaS, I’m leaving it believing even more.

  • Limit class size … teams of 4 worked well … 22 individuals acting individually in the early part of the course did not.

  • I think that we achieved a different kind of understanding around Blogs, Wikis, Discussion Groups, etc. by actively using them as part of the class. Most of the participants gained real value from actively working with the tools as part of a learning experience, BUT there is definite frustration as well in having to simultaneously get up to speed on the tools and learn about using them. Given the widely different levels of experience and propensity for using the tools, it was very difficult to balance learning about the tools and using them as part of learning. In the future, I will separate the two entirely.

"Let’s hear it for failure based learning"

  • Students were given a first-hand experience relative to "control" in collaborative learning, both as participants and as leaders. As participants, they were given a lot of autonomy, which sometimes worked well and many times did not. As leaders of their individual projects, they had mixed results with the participants actively engaging on the project. Also, I was fairly open on many issues such as assigning roles in teams and establishing norms – but some teams suffered because of this:
"Team collaboration might have been better if we had assigned roles soon after Tuesday's meeting"
"I found that in our collaborative environment, it was much easier to get busy and not participate as much as I would have when forced to face my facilitator or team members in an actual conference call or live meeting encounter. It became apparent that deadlines would need to be set and enforced in a corporate collaborative environment to ensure things kept progressing."
  • You still need to enforce timelines, norms, etc. While it’s nice to try to leave things open, it may be more effective to be somewhat dictatorial.
“Meetings are great to keep things moving because they are a deadline”
  • Each week, students would reflect on what they learned and how the class was working for them through a Plus/Delta assignment. This worked very well and sparked interesting discussions each week. But you need a thick skin. Most of the quotes here come from the Plus/Deltas. However, there was some question of the format:

"I am not sold on the value of a plus / delta post. I prefer a more free form comment on the learning of the week. The important element of the post from a learning perspective is learner reflection. I think general comments on the week allows this reflection to be more meaningful, at least to me it is."

  • The tools generally work pretty well for collaboration
"Our new collaboration tools were essential as our team had challenges with time schedules and difficulties coming together."
"signing up on Yahoo 360 and Yahoo My Web went well"
"sharing links with My Web is a lot easier than e-mailing them"
"Yahoo Bookmarks allows you to see your bookmarks from home or office"
"Yahoo groups worked well for exchanging information with team members"
"Signing up for Yahoo 360 and My Web were very quick and easy"
"I was surprised at how easy add-ins were to incorporate"
  • The projects in the course (designed by participants) were a great learning experience. If anything, more of this kind of learning would have been better.

  • Timing is a real challenge. Ideally you would get people working in pairs or teams quickly, but with this kind of class you expect some level of non-participation and you don’t want people teamed with folks who will drop. So, you start individually, then move to more team-based work. However, most of the participants felt that pairs/teams were more effective.

  • Our teams had mixed results in really collaborating virtually. This was primarily a function of timing. If you could do several cycles in a short amount of time, then you could get good collaboration.

Keywords: eLearning 2.0, Web 2.0, Collaborative Learning

Thursday, May 11, 2006

What does Google Trends tell us?

Google Trends is quite interesting to play around with to see how different search terms have been used within Google. For example, I looked at eLearning as compared with Knowledge Management...

You can see that eLearning and Knowledge Management have both been trending downward in searches. Does that imply something as is suggested in: Organic KM: KM Is Losing Steam - Evidence From "The Database Of Intentions" ?

I'm not so sure that you can directly jump to the conclusion that interest in these topics is waning. Rather, I think with both Knowledge Management and eLearning the conversations have shifted as they've become more day-to-day kinds of concerns. On the other hand, I do think you can glean interest in other topics such as:

Which to me seems like a fairly accurate depiction of the interest that I've heard at conferences over the last few years.

Keywords: eLearning Trends

Wednesday, May 10, 2006

Fun Headline Generator

headline generator
This is a fun headline generator tool: Headline Generator

You give it a date, a headline, and the news story, and it generates the newspaper for you. A fun little thing to do in your eLearning course. So go generate headlines!

Update - here's another type of generator. The Sign Generator.
Keywords: eLearning Resources

Tools for On-Demand Information - An LMS?

I was pointed to the article in CLO Magazine: Tools for On-Demand Information by Donald Clark.

This happens to be a topic that I really believe in, and we are finding more and more that we are building less "courseware" and much more reference material, job aids, etc. Often our solutions are tightly integrated so you can seamlessly move between reference material and the associated courseware (which may have details and practice opportunities).

But, I was dismayed to read:

Many learning organizations have already implemented an LMS, so there is an opportunity to enhance this infrastructure, making it part of a deliberate strategy to take advantage of the more ubiquitous workplace technologies.

Does anyone really believe that an LMS is a good way to support "on-demand information?"

The practical reality is that we most often make an early choice about what will be "under the LMS" and what will be "outside the LMS." Under the LMS implies that you want it to be tracked and are willing to make the user go through a bit of an additional hurdle to get there. Outside the LMS implies that you want the end user to go directly to the resource via links on the intranet or through search, and don't want the LMS to get in the way of this quick access, at the cost of being able to track who is doing what with the content.

Some LMS products have tried to provide deep linking and automatic registration, but it is rare that this still turns out to be as easy as just building content using Wikis, CMS, Portal or whatever the relevant toolset is that you have for building information resources.

So, I hate to be the bearer of bad news, but if you are thinking about On-Demand Information (and you should be), you probably are not going to want to start with the concept of "leveraging your LMS."

In fairness to the author (and to IBM where the author comes from), they actually do a great job of providing seamless access to all kinds of information resources including learning resources. But IBM is the best of the best in terms of how they use technology for these kinds of problems and 99% of other organizations are going to need to take a very different approach to get there.

Keywords: eLearning Trends

Tuesday, May 09, 2006

Point Solutions vs. Suites and Composition

One of the more interesting back-stories in Blended Learning / eLearning 2.0 is the classic question of point-solutions vs. suites.

As background for the question, take a look at LAMS and particularly the demonstrations.

The description of LAMS in their FAQ is interesting:

LAMS is focussed on a very specific aspect of e-learning – sequences of learning activities, particularly collaborative activities. LAMS provides rich functionality for designing a “flow” of learning activities, especially where these activities rely on collaboration between students to drive the learning process. LAMS is a comprehensive system for the design and implementation of learning activity sequences – it includes an authoring environment, a “run-time” environment for implementation of sequences with groups of learners, and a monitoring environment so teachers can see student progress in real-time (and view past activities). It also includes a user administration system for system administrators. LAMS can include content delivery (and quizzes) as single-learner activities within a sequence, and you can point to content held elsewhere from within a LAMS sequence via a URL. However, LAMS does not attempt to replicate the course administration functionality of a typical LMS/VLE/CMS.

As an example, here's a screen shot of a workflow defined:

LAMS allows the specification of a blended learning sequence via a workflow diagram. Very interesting concept.

It's also worth looking at what Michael Feldstein and Patrick Masson are doing around an LMOS.

In the corporate space, probably the closest thing comes from Bill Bruck at q2Learning.

All of these aim at the need of being able to compose together a variety of intervention types into a Blended Learning solution. The LAMS approach is good to look at, because it looks like the workflow problem that this really is.

I think it's pretty safe to assume that the big corporate LMS vendors will jump on this bandwagon soon enough by adding a workflow tool into their systems to finally be able to support richer specifications of blended learning programs.

In fact, my contention is that Workflow itself is going to be a central part of what we are doing even beyond composing together blended learning. See my post: eLearning Technology: Blurry Vision of Future of Work (Workflow, eLearning 2.0, Web 2.0, Decision Systems)

However, what's happening here is almost the opposite of what is the direction being suggested in the Web 2.0 world. Web 2.0 says "create lots of easily composable, point solutions that can be mashed together."

The providers I've listed seem to say: we'll provide you all the tools that you can compose into your blend. While I want and need the composition, I don't necessarily like the tools being offered by these vendors. Their particular implementations of voting, forums, questions, chat, surveys, wikis, etc. are not the best out there. But, instead of allowing me to compose together point solutions using their workflow solution, I'm locked into their solutions.
I can understand this to a degree today given the lack of standards around things like identity, but it seems pretty clear that the path is towards suites. As a person who likes to develop interesting parts and be able to choose best-of-breed, the trend troubles me.

This is also coming out in the authoring world. Can you compose your course out of smaller components, or does your authoring tool provide all the components you need?

Undoubtedly, we are going to see overlap between the suites, point solutions, and authoring tools in terms of capabilities as they reach into areas such as comments, peer-review, etc. Today the overlap is most evident in testing. You can use your LMS's test solution, your authoring tool's test solution, or Questionmark Perception. This is an easy problem because the interface between a test and an LMS is standard. Just don't try to have a course authored in one tool link directly into a test authored in another without first going back to the LMS - even though it would be MUCH NICER to be able to link directly to the test from the content.

Still, my chief complaint is that I want to be able to have my best of breed solutions, but I want to also have easy integration. Buying everything from a single LMOS or LAMS vendor makes me worried that I'm not going to get my best of breed and that the promise of Web 2.0 to have composable elements is not going to come together. For more on this see my posts on Composition:

Promise of Web 2.0 and eLearning 2.0 - Comparison to Macros, IDEs, and Visual Basic

Typepad Widgets - A Sign of Things to Come in eLearning Authoring / Development

Authoring in eLearning 2.0 / Add-ins & Mash-ups

Keywords: eLearning Trends, eLearning 2.0, Web 2.0

Intermediate Factors in Learning

I apologize for this post, but you are going to have to bear with me as I take you through my statement that:

The most important work we do in corporate learning is to understand Intermediate Factors and how to positively impact them.

I've recently read several comments and posts that talk about "only measuring the outcomes in learning." To point to one, take a look at: From Product Focus to Audience Focus.

In the corporate world we should only really care if the learning is transferred to the job…period! Is the output increased, or of higher quality because of our learning intervention. This has always been a problem for training departments because we look at everything we do as a product, and we “evaluate” if the product had impact. The approach is totally wrong.

While I completely agree that corporate learning is all about driving human performance that leads to business results, I have a real problem with the implication that you should only measure the outputs.

I've worked on several learning solutions that were tied closely to employee and customer surveys, and so I've had a nice opportunity to work with some really geeky, analytic wonks who use things like regression analysis and command serious dollars to figure out what Intermediate Factors ultimately drive results.

For example, we might be working with a bank that knows that loyalty, share of wallet, and customer profitability are the ultimate business measures. But the real question is "What drives these factors?"

A survey research person will create a Causal Model that attempts to explain the Intermediate Factors that ultimately drive these numbers. For example, Satisfaction with the Advice the bank provides, Knowledge of Staff around products, etc., may be Intermediate Factors in their models. Each of these is then tied back to Survey Questions. Then, through analysis of survey responses in comparison to actual numbers, we can determine the correlation between these intermediate factors and the business drivers. This is INCREDIBLY IMPORTANT information!
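As a toy illustration of the kind of analysis involved (the numbers are invented, and real analysts use far richer regression models), here is a Pearson correlation between an intermediate factor and a business driver:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient, computed by hand."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented branch-level data: average "satisfaction with advice" survey
# score vs. revenue growth (%) for five branches.
advice_score = [3.1, 3.5, 3.8, 4.2, 4.6]
revenue_growth = [1.0, 1.8, 2.1, 3.0, 3.4]
print(round(pearson(advice_score, revenue_growth), 2))  # strongly positive
```

A strong correlation like this is what justifies aiming a learning intervention at the intermediate factor rather than directly at revenue.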

Why is this so important? In most cases, I can't change the ultimate output directly. However, I often can aim my solutions at the Intermediate Factors. I can improve advice, increase knowledge. These are the performance drivers I go after.

Ah, well, the people who talk about "only measuring output" will say that all we've done here is understand the "outputs" and there's definite truth to that argument. I would certainly agree that these are important "outputs" to also measure. In fact, I would say that these can be used to define and measure the performance that we are going for.

However, even if I now recognize that Providing Quality advice is the performance measure (as measured in survey responses), I still have more work to do in order to determine the drivers of this. As learning professionals, we then spend our time understanding what practices drive this performance. We identify a series of additional intermediate factors (product knowledge, solution selling, etc.) that we need to attack.

The bottom line is that for us to be able to have a systematic method for improving the "output" we need to be able to define all of these intermediate factors AND we need to measure the impact we are having on intermediate factors. Again,

The most important work we do in corporate learning is to understand Intermediate Factors and how to positively impact them.

Yes, I care about the end result, but unless you can tell me the intermediate factors, how you will impact them, and ideally measure your impact on them, why should I believe that your learning solution is going to work?

Brent said in his post:

The process is continuous and if our “training solution” is organic, dynamic, and flexible, it is very difficult to measure using the current method of measuring learning products. My point is “who cares”. If we have set up environments that help people collaborate, and support their informal learning, we should see output improvements.

"Who cares"? Well, I do. And, actually, the business does. If you create an "organic, dynamic, flexible" learning solution but can't explain how it impacts the end numbers, then: (a) you won't get credit, (b) you won't know if you can repeat it successfully, and (c) you won't know if it's really working.

Keywords: eLearning Trends, Informal Learning

Monday, May 08, 2006

Why Kids Blog (David Warlick)

I just read David Warlick's post Why Kids Blog and one comment jumped out at me (among several great comments on the value of blogging by teachers):

In fifteen years of teaching, I have never seen anything come along even CLOSE to motivating students to write - like blogging does.

Mark Ahlness

What's interesting is how the shift to writing (my Kindergarten child is writing every day in school) along with the power of blogging makes for some interesting possibilities in the future.

Definite food for thought.

Keywords: eLearning Trends

Friday, May 05, 2006

Web 2.0 Adoption in the Enterprise - It's Personal

I've previously written about adoption of Web 2.0 and eLearning 2.0 tools inside of corporate environments. eLearning Technology: Enterprise 2.0 - What's the PU?

There's been quite a bit of discussion around this topic (or similar topics) recently:

and I think that the "7 Reasons" captures my same feeling on adoption:

people will only use software that benefits them personally at every step

So, to modify my earlier question "What's the PU (Perceived Usefulness)?" The question should really be:

What's the Perceived Immediate Value to Me?

This, of course, flies in the face of the normal discussion around Web 2.0, eLearning 2.0, Enterprise 2.0, and all the other 2.0s. The normal description is that Web 2.0 is all about social software.

In corporate environments, it can't be about Social; it needs to first be Personal.

If you have thoughts on the personal value to you of Blogs, Wikis, Social Bookmarking (without the social), etc., I would love to hear them. I'm going to be posting on this topic, but would love input.

Keywords: eLearning 2.0, Web 2.0, Enterprise 2.0

Tuesday, May 02, 2006

Chambers - "Next Market Transition is Collaboration"

John Chambers, CEO of Cisco Systems, said in 2000 ...

The next big killer application for the Internet is going to be education.
Education over the Internet is going to be so big it is going to make email
usage look like a rounding error.

Now says (Keynote at Interop, May 2, 2006):

Next market transition is collaboration.

While he is naturally always looking for the next media-rich applications that will push traffic over the networks Cisco enables, it is also interesting to hear this as we in the corporate learning world have begun the same transition.

Keywords: eLearning Trends

Monday, May 01, 2006

Informal Learning is Too Important to Leave to Chance

An interesting comment from Jay Cross (Mr. Informal Learning) relative to my earlier eLearning 2.0: Informal Learning, Communities, Bottom-up vs. Top-Down got me thinking.

Jay says -
Serendipity is cool and I always leave room for it, but the learning I write about is always intentional. Otherwise, it's a tough sell.

So, we agree that most learning professionals are being asked for Intentional Learning (as opposed to Unexpected). Jay and I also agree that performance and business results are what matter at the end of the day, not seat time.

Where I'm not sure if we agree is whether intermediate results such as learning objectives, skill development opportunities, etc. are important in order to ensure that we will reach the performance and business results.

Maybe it's just me, but some folks (especially those coming from a background in Communities of Practice) believe that you foster learning but you don't control it. You provide the environment and people will learn.

Jay commented:
The question "How can I make sure that I'm able to hit my learning objectives if I don't control the content and the learning process?" assumes that you are ever in control. I think the learner should always be in control.

Well, maybe it's just semantics around learner control vs. learning professional control, but I'm not a big fan of leaving the learning to chance. Don't get me wrong. I like to create fairly open learning experiences, e.g., collaborative learning through discussions. I like people to have lots of opportunity to find their answers along the way themselves. But, I'm personally much more confident if I have a set of performance objectives that I can use to derive learning objectives and skill development opportunities around. I want to put structure in place that guides the learner along the way. I want to put them back "inbounds" as needed. All of this gives me a much higher degree of confidence that I will achieve the performance and business results I want at the end of the day.

Again, I would point us to George Siemens in his article Theories for Informal Learning Design:

Informal learning is too important to leave to chance. But why don't we have theories that provide guidelines?

What we need are practical guidelines, lots of examples, etc.

I'll be saying more on this shortly based on the results of my Collaborative Learning class that I'm teaching right now.

Keywords: Informal Learning