Tony Karrer's eLearning Blog on e-Learning Trends

Sunday, September 30, 2007

Web 2.0 - Consumer vs. Enterprise Use

Jack Vinson (in Web2.0 as opposed to Enterprise2.0) pointed me to Ben Gardner's post The difference between Web2.0 & Enterprise2.0, which looks at how Web 2.0 plays out in the consumer space versus when it's adopted in the enterprise. Ben tells us:
Web2.0 vs Enterprise2.0 [excerpt]
  • User: Millions vs Hundreds
  • Mind set: Fun vs Work
  • Organisational structure: Flat vs Hierarchical
  • Attitude: Sharing vs Hoarding
  • Skill set: Digitally savvy vs Digitally averse
  • Visibility: Anonymity vs Recognition
  • Society: Public vs Private
  • Cultural: Innovative vs Mundane
While I don't entirely agree with how he describes corporate cultures and the attitudes of people inside corporations, there are definitely differences in how these tools get used once you move them inside the corporate walls. And there is certainly an issue of adoption inside corporations.

I've talked about adoption of web 2.0 tools in the enterprise before in this blog. The main points across those posts are already pretty clear:
  • Organizations should attempt to adopt tools in concert with what workers are already adopting. In other words, if Facebook and del.icio.us are already being used, then try to align your strategies with those tools rather than trying to provide behind-the-firewall solutions.
  • Organizations should provide clear definitions of policies around web 2.0 use. These policies should be along the lines of IBM's blogging guidelines, rather than joining the 41% of organizations that apparently ban Facebook and other such applications via firewall rules.
  • Organizations should look to establish champions throughout the organization who can help lead incremental adoption where it makes sense. The champion will help others to recognize opportunities and value.
  • Organizations should build new work and learning skills in the workforce.

Friday, September 28, 2007

KM 2.0, Enterprise 2.0 & eLearning 2.0 Worlds Getting Closer

For a while, I've been trying to figure out how eLearning 2.0 relates to KM 2.0 and Enterprise 2.0 - certainly there is a LOT of cross-over in what's being discussed. Funnily enough, it turns out that:

DevLearn 2007 & KM World 2007 are both in San Jose, Nov. 6-8 - and both in the downtown area.

Maybe we can recruit some of the KM bloggers who are attending to come join us for Beer and Bloggers on Nov. 7. Some of the folks I see listed are people I read all the time: Dave Snowden, Dave Pollard, David Gurteen (apparently you have to be named David to work in KM), and Ross Dawson (but since his name is Ross, now I'm not so sure about him).

If we don't do something, it seems like a horribly missed opportunity.

eLearning Startups - New Wave Coming

I'm not sure if other people are seeing this, but based on a bunch of conversations and calls, it appears that a wave of new eLearning startups is being created right now, and the trend is accelerating. Because of my background doing startup software development with leading startups (e.g., eHarmony - see Matching Algorithm - MyShape, LoanToolbox), my background in eLearning, and the fact that I'm a CTO-type person, I may get more of these kinds of calls than most people, so I'm curious if other folks are seeing this as well.

What's interesting about the new wave is that most are targeting opportunities outside of traditional eLearning solutions. Traditionally, we've seen start-ups that focus on authoring tools, virtual classrooms, learning management, content, and services. When you look back at 2000-2002, the companies getting VC funding were folks like Outstart, Hyperwave, Infocast, Element K, Pathlore, Vuepoint, and Knowledge Impact. Some of that is still happening, with folks like Cornerstone getting $32M. But a lot of what you see today is consolidation among players in these areas.

Today the startups in eLearning sit in smaller niches or attack tangential opportunities in eLearning. They are going after things like:
  • specialized tools and content that meet particular industry or audience needs
  • games and simulations
  • web 2.0 approaches that leverage distributed content creation, social aspects as part of learning, collaborative learning and editing.
This is likely a natural outgrowth of the maturing of the industry. The larger players become focused on bigger and bigger opportunities. The bottom of established markets gets eaten by lower-cost players, e.g., open source. And innovation comes from nimble start-ups who can attack smaller market opportunities. What's interesting, though, is how many of these "smaller" opportunities have the ability through network effects to grow very large.

One thing that's curious to me: I can't find a good resource that shows which companies have received funding in our world. Where are the analysts? Is there a list somewhere? How could I find this out?

Wednesday, September 26, 2007

Blogging a Conference

I just saw a blog that appears to have been created solely as a means to interact and capture thoughts and ideas during the Brandon Hall Innovations conference. It's a great way for folks not at the conference to get a sense of what's going on there. And I bet it's a fantastic learning and networking experience for the blogger. Kudos to the BH folks for encouraging this, even if there were issues.

Tuesday, September 25, 2007

Networking to a Job - Practical Advice

I just posted Networking to a Job - Practical Advice on my other blog. If you are looking at how you can leverage your network to find a job, it may be valuable. Literally, I've just told three people the exact same thing in the last week so I thought I had to blog it.

Monday, September 24, 2007

Software Upgrade Training and Support

BJ Schone just posted a great question about a particular situation. They are upgrading their PeopleSoft implementation. He tells us:
I don’t know much about the new system, but I understand that it is quite an overhaul; one estimate said we would need 80+ hours of face-to-face training. However, due to logistics, time, and money, it appears we will be training about 80% of these employees using a combination of self-study eLearning courses and webinars. Everything will be tracked in our LMS.

Applications training can be excruciatingly boring, especially when taken as a self-study eLearning course. These courses generally consist of step-by-step instructions where the learner watches a task as it is performed, and then they try the task on their own in a simulated environment. I’m worried that we’ll bore people to tears and that they’ll mindlessly follow along with the step-by-step directions…and then not retain anything.

How would you tackle this? What ideas do you have?

What a great question BJ! It's also an incredible example of something good to do in a blog post! I plan to mention this in future presentations. I hope you get some good ideas.

Some quick thoughts -

1. Are people changing just the software version and interfaces, or are process changes happening as well? Many times, upgrades are about changes in interface rather than process change, so training can focus on how you accomplish specific, already-understood tasks in the new interface. This greatly simplifies the training and support needs. However, if you are moving from PeopleSoft to Oracle, you may be looking at process changes, which will require additional work.

2. Assuming you are talking about an upgrade that is primarily an interface change, then any training and support design starts and ends with roles/jobs and tasks. Identify the primary tasks that represent 90% of what people will do on the system, find out which tasks are likely to be problem areas or possible sources of costly errors, and figure out who performs which tasks in your organization and how often. That gives you the basis for your learning design (see the rough prioritization sketch after these numbered points).

3. I agree that 80 hours is WAY TOO MUCH for a whole lot of reasons. I generally shoot to make any course as short as possible. See How Long Should an eLearning Course Be? In this case, it's about identifying the minimum amount of upfront training required to get various jobs/roles minimally proficient so they can start on the new system, and then dribbling additional information to them as appropriate.

4. The minimal training will include:
4a. Overview of the new system and interface
4b. Common interface conventions
4c. How you get help/additional training on specific tasks
4d. Training on minimum set of tasks

5. Provide a hybrid reference / course solution for the remainder of the tasks - something I generally call a hybrid reference solution. It explains the details of any task and provides access to additional walkthroughs or simulations.

6. More broadly, I also like to see up-front communications, introductory sessions on the system (virtual classroom), minimal training, access to the hybrid reference and support, and follow-on "office hours" that allow people to ask questions and report problems they've found. Office hours can be an amazingly effective tool. We can tell everyone going in that there will be challenges as you get into the system: if it's something you need immediate help on, here's how you get help; if you can hold it for office hours and ask it then, everyone can see the issue and see how it's solved. Note: office hours need to be held based on job/role in the organization. They are held daily at the start and then taper off over time.

7. Pilot your training solution along with the pilot of the new system.
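
Going back to point 2 - here's a rough sketch (mine, not from BJ's situation) of how you might turn a task inventory into training priorities by weighting how often each task is performed against how costly an error would be. The task names and numbers are hypothetical.

```python
# Hypothetical task inventory for a system upgrade. Priority = frequency x error cost
# is a crude way to decide which tasks belong in the minimal upfront training.
tasks = [
    {"task": "Enter time sheet",       "role": "All staff", "per_year": 52, "error_cost": 1},
    {"task": "Approve purchase order", "role": "Managers",  "per_year": 12, "error_cost": 4},
    {"task": "Run quarter-end close",  "role": "Finance",   "per_year": 4,  "error_cost": 9},
]

# Sort tasks from highest to lowest priority and print a simple report.
for t in sorted(tasks, key=lambda t: t["per_year"] * t["error_cost"], reverse=True):
    print(f"{t['task']:<26} {t['role']:<10} priority={t['per_year'] * t['error_cost']}")
```

Whatever lands above your cutoff goes into the minimal training (point 4); the rest lands in the hybrid reference (point 5).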

You can also check out my post on: Software Simulation eLearning for some other thoughts and links to posts.

I know I'm forgetting a bunch, but as you begin to shape your solution, more will likely come to me.

How Valid are Research Results? And What to Do

Based on my post - LMS Satisfaction Features and Barriers - I received some feedback that the numbers didn't correspond to what other research has shown. As a case in point, in my post I said:
To me, it's interesting to see news like CornerStone OnDemand Raises $32M - Sept. 17 2007 - when their satisfaction score is being reported so low. That seems like a disconnect. I know the folks from CornerStone, and their product does some really interesting things. They are moving towards a broader suite that focuses on other aspects of Talent Management. But given the number of respondents who've rated them low on satisfaction, it would give me pause during a selection process (so that makes me worried about their sales), and it also would have worried me as a potential investor.
It was pointed out to me that Cornerstone has been rated much higher by other researchers, in particular, Bersin. Looking at Cornerstone's site:

“Cornerstone’s customers are not only satisfied with their solutions, but also recognize that Cornerstone can grow with their business, as evidenced by their leading loyalty ranking.”

“For years, Cornerstone has provided an excellent integrated talent management solution for performance, learning, and succession management. Cornerstone's customers - among some of the world's largest enterprises - tell us that Cornerstone OnDemand's solution is highly configurable and flexible, a key strength in the performance management market… Cornerstone's 'new breed' of integrated talent management is not only in demand among its customer base, but is also catching on across the industry.”

Further, according to a PDF sent to me it appears that Bersin's research rated Cornerstone as having market leading customer satisfaction and customer loyalty numbers.

So which research is valid? What should you believe? Are customers satisfied with Cornerstone or are they not satisfied? And why the discrepancy?

How Research is Conducted

So let's start with how Bersin and the eLearningGuild gather their data. The eLearningGuild asks industry experts to formulate survey questions that are then sent to their members, who are asked to fill out the surveys via email. The membership is pretty large, so the LMS survey got roughly 1,300 respondents, with some overlap where multiple responses from the same company are combined into a single response. In the case of Cornerstone, 9 of the companies represented in the rating were corporations with 1,000+ learners and 1,000+ employees. In total, 20 different people rated Cornerstone, representing roughly 18 organizations.

I'm less familiar with the details of Bersin's approach. What they say on their web site is:
Our patent-pending methodology relies on primary research directly to corporate users, vendors, consultants and industry leaders. We regularly interview training and HR managers, conduct large surveys, meet with managers and executives, and seek-out new sources of information. We develop detailed case studies every month. We have the industry's largest database of statistics, financials, trends, and best practices in corporate training (more than 1/2 million data elements).
Normally what this means is that each company that is going to be in the survey will be asked to provide a list of customers who can be interviewed and respond to the survey. However, it's not clear from their description what the source of the interviews is or who the "large surveys" go to.

So the eLearningGuild gets surveys by asking a large audience and receiving somewhat random inputs based on who responds. Bersin takes a directed research approach, likely based on lists provided by the vendors.

Impact

Obviously, these two approaches are likely going to have different results.

Bersin's approach is much more likely to be similar to what you would find when you check references during a selection process. The vendor is providing the names of customers, and you would expect those customers to be happy with the product. If they aren't, then you should be pretty worried. Bersin research also provides more information about where the vendors are focused as products, how they position themselves in the market, and other interesting information. In my opinion, in terms of evaluating real satisfaction, Bersin's numbers are highly suspect.

The eLearningGuild approach is going to get a much more diverse set of inputs and is much more likely to find people who are not satisfied with the product. If the eLearningGuild can reach a large enough set of customers of the product and you get a random sample of those customers, then I would tend to believe those numbers over the numbers produced by Bersin.

But the big "if" there was whether you reach enough customers. The eLearningGuild received roughly 1,300 respondents. The problem is that once you go beyond the market leaders, the number of respondents on smaller vendors becomes small. Only 3 companies rated TEDS and they rated it really low. I'm not sure I'd believe that is representative of all customers.

So, if you are looking at a relatively niche product, the eLearningGuild is not likely to produce meaningful numbers. On the other hand, if you are considering TEDS, the eLearningGuild has found 3 customers who are not happy with the product, and that's good to know.
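
To put a number on why low counts are suspect, here's a back-of-the-envelope sketch (my own, with an assumed spread of ratings, not the Guild's data) of how wide the uncertainty on an average satisfaction score is with 3 respondents versus 27.

```python
import math

def halfwidth_95(stdev, n, t_crit):
    """Approximate 95% confidence half-width for a mean rating: t * s / sqrt(n)."""
    return t_crit * stdev / math.sqrt(n)

# Assume individual ratings vary with a standard deviation of ~1.5 points on a
# 1-10 scale (a made-up but plausible figure). t_crit is the two-sided 95%
# t value for n-1 degrees of freedom.
print(round(halfwidth_95(1.5, 3, 4.30), 1))   # 3 respondents  -> about +/- 3.7 points
print(round(halfwidth_95(1.5, 27, 2.06), 1))  # 27 respondents -> about +/- 0.6 points
```

With only three raters, the average could easily be off by a few points in either direction - which is exactly why I wouldn't treat the TEDS number as representative.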

In the case of Cornerstone, despite having glowing reviews from Bersin, there are some customers who are not satisfied with the product. As I said before, that would give me pause during a selection process and would cause me to ask:
  • Why are the 9 customers rating Cornerstone lower?
  • Was it a feature / needs mismatch?
  • Were they having trouble with features?
  • Are the dissatisfiers things that I would care about?
  • How could I find out?
The last question is probably the most important. And right now, the unfortunate problem is that it may be relatively hard to find out. The eLearningGuild makes the aggregate findings available, but there's no ability to drill down to find specific reasons, nor do they provide any kind of social networking to get to those respondents. Note: Steve Wexler (head of the Guild's research group) and I have discussed how they could do this, but it wouldn't be easy.

So, it's on us to figure it out on our own. And what we're describing is essentially the reference checking that occurs toward the tail end of the selection process. This takes some work, so it's definitely not something you should do across more than a couple of vendors.

I would use LinkedIn to find people who've been using the product at various places, especially at companies similar to your own. I would also use the vendor's client list (not their reference list) and call, or again use LinkedIn, to network into those clients and find the right people to talk to. Most people are very happy to share their experience, especially with someone who is about to select the same LMS. However, don't be surprised if you find people who are still mid-implementation or otherwise can't be all that helpful. So, it takes some work.

I welcome any thoughts and feedback on this.

Also, I obviously highly recommend using LinkedIn as a means of professional social networking. If you are a LinkedIn user, consider connecting with me.

Friday, September 21, 2007

Role and Voice of Community Leaders - Your Input Needed

Based on the post He Had a Bad Day, Mark Oehlert, Heidi Fisk and I have had an email conversation that raises interesting broader questions around the role, voice and communication norms of the people who lead / organize groups like the eLearningGuild.

  • Should they try to stay behind the scenes and put members out in front?
  • Should they have a strong voice and opinion that champions approaches or direction for the members?
  • Do they need to remain neutral?
  • How can they avoid abusing high profile, influential positions?
  • When they have valuable resources - free, membership-only, or for a fee - that could be of value to someone who posts a question, what's appropriate? Should they point people to them? Should they sit back and hope someone else does?
Certainly, if you look at the eLearningGuild, Masie, ASTD, VNU, Brandon Hall, etc., you will find very different approaches taken by the leaders of each. They clearly have different answers to these questions. But, based on the conversation with Heidi, I know they must struggle with this question.

Mark, Heidi and I have shared our opinions on this via email, but I'm very curious to hear what you think about this issue.

Please take a minute or two to share your thoughts in the comments section (or in a blog post).

Thursday, September 20, 2007

LMS Satisfaction Features and Barriers

Update, Oct. 2010. The data below is getting just a bit old at this point. It still might give you an idea of what's out there, but I'd be careful relying on it too much in your LMS selection process.

Here are some more recent resources, such as my series on Selecting a Learning Management System (LMS). Additional, more recent LMS-related posts from my blog:
  1. LMS and Social Learning- eLearning Technology, March 31, 2009
  2. Open Source LMS- eLearning Technology, December 10, 2009
  3. Test SCORM Courses with an LMS- eLearning Technology, January 13, 2008
  4. LMS Team Size and Time - Wow 23 Months!- eLearning Technology, October 31, 2007
  5. How long does it take to select an LMS?- eLearning Technology, January 8, 2009
  6. Test LMS- eLearning Technology, September 10, 2008
  7. Communities / Social Networking and LMS Merger- eLearning Technology, December 6, 2007
  8. What Goes in the LMS?- eLearning Technology, February 5, 2009
  9. Rapid Learning Management Systems- eLearning Technology, October 20, 2009
  10. What Makes an LMS Easy to Use?- eLearning Technology, February 11, 2010
  11. Leading with an LMS - Harmful to Your Health (or Skipping Stages in Bersin's Four Stage Model)- eLearning Technology, June 6, 2006
  12. LMS RFP- eLearning Technology, October 25, 2007
  13. Larger LMS Audience Means Lower Satisfaction- eLearning Technology, April 18, 2008
  14. LMS - Questions- eLearning Technology, November 5, 2007
  15. Moodle in Corporations- eLearning Technology, April 4, 2007
  16. Custom LMS Anymore?- eLearning Technology, January 5, 2010
  17. LMS Tracking of Podcasts and Video Casts- eLearning Technology, February 9, 2010
  18. One Week to Select an LMS – No Way- eLearning Technology, February 16, 2010
  19. LMS Solution for Simple Partner Compliance Training
  20. Low Cost LMS
And you should check out: Learning Management Systems (LMS) for a more extensive list.

I thought I had previously written about this, but I realized while posting Learning Management Systems (LMS) Gotchas that I hadn't talked about the results from participating in the eLearningGuild's Learning Management System research report. You can get the abstract for the report and sign up for a webinar on Sept. 26th on that same page. I was one of several authors who collaborated on the survey questions and wrote different sections of the report.

I find the survey data itself to be pretty interesting and useful for some important aspects of selection. The survey asks demographic and company questions of all the participants, so you know things like the industry, size of company, number of learners, respondent's role, etc. It then asks questions such as overall satisfaction, satisfaction with particular features, whether you are planning to replace your LMS, primary barriers, importance of features, etc.
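
For the technically inclined, here's a small sketch of the kind of slicing I mean, using pandas on a few made-up survey rows (the column names and values are illustrative, not the Guild's actual data format).

```python
import pandas as pd

# Hypothetical survey rows; the real report has many more demographic columns.
df = pd.DataFrame([
    {"product": "Product A", "industry": "Manufacturing",  "employees": 5000, "learners": 3000, "satisfaction": 6.1},
    {"product": "Product A", "industry": "Biotech/Pharma", "employees": 1200, "learners": 1500, "satisfaction": 7.0},
    {"product": "Product B", "industry": "Manufacturing",  "employees": 800,  "learners": 400,  "satisfaction": 8.2},
])

# Overall satisfaction for large corporations (1,000+ employees and learners),
# averaged per product - the same kind of cut as the charts below.
large = df[(df["employees"] >= 1000) & (df["learners"] >= 1000)]
print(large.groupby("product")["satisfaction"].agg(["count", "mean"]))
```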

Overall Satisfaction in Large Corporations (Count >= 3)

[Chart: LMS Satisfaction - Large Corporations]

Overall Satisfaction in Small and Medium Corporations (Count >= 3)

[Chart: Learning Management System Satisfaction - Small and Medium Corporations]

Overall Satisfaction in Education and Government (Count >= 3)

[Chart: LMS Satisfaction - Education and Government]

And to show some of the power of the slicing, I've got a graph of overall satisfaction for Manufacturing and Biotech/Pharma companies with more than 1,000 employees and 1,000 learners, with a count of 1 or more.

[Chart: LMS Satisfaction - Manufacturing and Biotech/Pharma]

Some notes on these numbers...
  • These are being reported out of the larger eLearningGuild population, and I would claim that they are likely more accurate than survey research that goes through the vendors themselves. However, when you get low counts, the numbers are highly suspect.
  • Several of the LMS vendors appear more than once, such as Oracle. Most of the time (as is the case with Oracle) these are different products. Sometimes it's how they were referred to by the survey respondents and it hasn't been cleaned up in the data.
Some comments on the satisfaction numbers themselves...
  • The dissatisfaction numbers are high in general, and this is not being reported only in this research. I mentioned it previously in LMS Dissatisfaction on the Rise. An LMS is a big, expensive tool that takes quite a bit of work and is generally harder than you think it's going to be when you start out. It's why I often try to convince people not to Lead with an LMS.
  • To me, it's interesting to see news like CornerStone OnDemand Raises $32M - Sept. 17 2007 - when their satisfaction score is being reported so low. That seems like a disconnect. I know the folks from CornerStone, and their product does some really interesting things. They are moving towards a broader suite that focuses on other aspects of Talent Management. But given the number of respondents who've rated them low on satisfaction, it would give me pause during a selection process (so that makes me worried about their sales), and it also would have worried me as a potential investor. Maybe it's good I didn't post this until today.
  • Moodle scores very high in satisfaction, but we need to qualify that result a bit. I personally feel that Moodle is good at the limited stuff it does. It's free, which improves satisfaction. But it's not really an enterprise LMS and has some very serious deficiencies when it comes to many of the needs of corporate training departments. However, I would be concerned if I were a starter LMS vendor, because Moodle is going to cause you grief at the bottom of the market. If nothing else, it creates the perception that there's a free competitor.
  • SkillPort scores very high on satisfaction as well. However, like Moodle, it only addresses particular needs, so most often it's a starter LMS. I do highly recommend using a starter LMS if you are new and don't necessarily know what you'll need later on. Note: this describes a lot of the market, whether they admit it or not. Of course, if you are consciously choosing something as a starter LMS, make sure that everyone knows it's a starter and that you plan to move in a few years.
  • Oracle's offerings (Learning Management and PeopleSoft Enterprise Learning Management, though less so its iLearning) and SAP Enterprise Learning score better than I would have expected. For the past several years during selections, I've found that they trail Saba and SumTotal on needed capabilities and were hard to use for particular tasks that were important to the clients involved. Based on the satisfaction numbers, it appears that they are catching up, and it will be interesting to see where they are during the next evaluation. As these systems catch up, the market gets really hard for Saba and SumTotal. Over the past few years, they've been able to fend off Oracle, PeopleSoft and SAP through superior products. If the playing field is more level, then it becomes harder to argue why you wouldn't want an integrated solution. So large companies with Oracle or SAP implementations are going to start moving more toward those packages.
  • The numbers displayed above don't show this, but you can slice the numbers according to level in the organization and by role. No surprise, but the people who have to work with the LMS day-to-day give much lower satisfaction numbers than the managers and directors who get the results and the reports. Performing tasks (use cases) in an LMS is harder than it seems like it should be, and that results in lower satisfaction.
  • I wouldn't read too much into any of these numbers. They can definitely be useful for seeing which LMS products are being used in companies similar to yours, and how LMS vendors stack up on particular features. But you still need to look at differentiating use cases to choose a vendor that will work for you.

Other data in the report points to particular barriers and the importance of particular features during Learning Management System (LMS) Selection.

Barriers



[Chart: LMS Barriers]


Importance of Features



[Chart: LMS Features]


Some notes on Barriers and Feature Importance:
  • Good news for me - "problems with third-party consultant" scores low as a barrier. :)
  • A lot of the other factors are pretty significant barriers, e.g., cost, customization, flexibility.
  • On the features importance, looking at the aggregate numbers is not very interesting. However, looking at numbers for your particular type of industry, size, etc. can yield more interesting results.
Some other posts in my blog around Learning Management Systems:


LMS Systems (product - vendor - count - average satisfaction):

LearnFlex - Operitel Corporation - 1 - 10.00
Extention LMS - Acadia HCS - 1 - 9.60
NetDimensions Enterprise .. - NetDimensions - 1 - 9.30
OutStart Evolution LMS - OutStart - 2 - 8.40
Moodle - Moodle - 6 - 8.33
Oracle iLearning - Oracle - 4 - 8.25
TopClass e-Learning Suite - WBT Systems - 2 - 8.25
SkillSoft SkillPort - Skillsoft - 12 - 8.03
CourseMill - Trivantis - 1 - 8.00
Compliance LMS - Pro-ductivity Systems - 1 - 7.50
Oracle LMS - Oracle - 14 - 7.48
ResultsOnDemand - SumTotal Systems Inc. - 3 - 7.33
LearnCenter - Learn.com - 5 - 7.18
WBT Manager - Integrity eLearning - 1 - 6.90
Articulate Online - Articulate - 1 - 6.60
Saba Learning Suite - Saba - 9 - 6.37
TotalLMS - SumTotal Systems Inc. - 27 - 6.29
GeoMaestro LMS - GeoLearning - 6 - 6.27
Plateau Learning Managem.. - Plateau Systems, LTD - 22 - 6.24
Blackboard Academic Suite - Blackboard, Inc. - 8 - 6.19
PeopleSoft Enterprise Lear.. - Oracle - 2 - 6.15
Saba Enterprise - Saba - 23 - 5.97
SAP Learning Solution - SAP - 10 - 5.97
Virtual Learning System - Plateau Systems, LTD - 1 - 5.90
Cornerstone OnDemand - Cornerstone - 4 - 5.80
Training Partner™ Learning.. - GeoMetrix Data Systems Inc. - 4 - 5.68
IBM Lotus Learning Manag.. - IBM - 1 - 5.30
KnowledgePlanet Learning - KnowledgePlanet - 3 - 4.50
ViewCentral - ViewCentral - 3 - 4.43
KnowledgeNet Platform LMS - Thomson NETg - 2 - 4.10
GeoConnect - GeoLearning - 1 - 3.90
Enterprise Knowledge Man.. - Generation 21 Learning Syste.. - 1 - 3.00
IntraLearn 5.0 - IntraLearn Software Corporati.. - 2 - 2.70
Pinnacle Learning Manage.. - Learnframe, Inc. - 1 - 2.50
SIGAL® - Technomedia Training Inc - 1 - 2.30
Enterprise Knowledge Asse.. - Generation 21 Learning Syste.. - 1 - 2.10
Meridian KSI Knowledge Ce.. - Meridian KSI - 1 - 1.60
Vuepoint VLS - CertPoint Systems - (count and score not listed)
Adobe - (count and score not listed)

LMS Barriers:

  • The Cost
  • IT Support
  • Customization
  • Integration with other systems (content, HR, ERP, ..)
  • Legacy system integration
  • Mind set to move learning online
  • Problems with vendor
  • Support from management
  • Flexibility for future requirements
  • Support from stakeholders
  • Clear business goals
  • Security
  • Support from learners
  • Tool/vendor selection
  • Administration
  • Problems with third-party consultant
  • Compliance

LMS Features:

  • Tracking, reporting, and measurement
  • Content delivery
  • Assessment and testing
  • Asynchronous e-Learning
  • Blended learning
  • Standards (SCORM and AICC)
  • Training history
  • The ability to support different models and sequences of blended..
  • Ability to create an index so that people can find a particular topic
  • Security
  • User and group management
  • Registration
  • The integration with single sign-on so logging in is not required
  • Competency and skills
  • Synchronous e-Learning
  • Regulatory Compliance
  • Collaborative learning
  • Certification
  • Catalog
  • Instructor-led training management
  • The ability to support specific and complex business process models

Wednesday, September 19, 2007

He Had a Bad Day

I just saw a post by Mark Oehlert - I'm tired people, so I'm only going to say this about 1,000 more times... - and it appears that Mark has had a bad day...
The point (hard to see though as it is) however, is that with learning - the changes are going on inside our own heads and bodies. We are acted upon by outside forces but ultimately learning is an internal act mediated by our own individual/collective contexts. What learning is NOT is a product. It can NOT be shrink-wrapped. It can NOT be updated to version 1.2. It does NOT rely on a particular OS or even give a crap about what version of the Web we happen to on. Learning scoffs at mergers of companies and at specifications like CORDRA and SCORM. This misunderstanding has led us to countless, pointless discussions about lots of issues but ROI makes a great example (you can't really measure what's going on in someone's head can you? - no but you can measure performance - but people aren't selling Performance management systems (those would be PMSs and marketing would NOT let that happen).

We can do better or worse at creating opportunities for people to learn. We can use methodologies and technologies that seem to have a positive impact on peoples' ability to learn; but we are NOT selling learning. So let's freakin' STOP talking about learning like its a product. Hey LMS CEO - you ever manage "a learning"? Hey authoring tool person - you ever make "a learning"? Can you send me one?

So how about for pete's sake, we all agree to start indulging in some semantic accuracy.
I was going to comment on his blog, but for some reason it required registration with no way to register. So, instead, I've pasted my comment here...
Mark, wow. Are things okay? Is it that Oregon State is going to get beat up by Oregon this year in college football?

I hate to tell you, but while you are technically correct that learning is an outcome of something else you can still sell "learning solutions" - solutions designed to achieve learning as an outcome.

And good luck with getting us to stop using the term learning instead of training. I'd personally rather have us talk performance, but the industry has landed on learning, e.g., CLO, LMS, WLP, oh and eLearning.

Of course, when I think about it, we should be talking outcomes. So learning is better than training as a term.
Hopefully Mark gets some help working through this issue. :)
... Maybe I should buy a beer for him in San Jose.

Tuesday, September 18, 2007

Time Spent on Blogging

A frequent question I get at the end of any presentation on eLearning 2.0 is how much time I spend on blog reading and writing, and how anyone can work that into their already too-busy life. I also got a form of this among the questions from the grad student (I'm slowly working through these).

Some quick thoughts -

Scanning Activity Replaced

I likely spend an average of 30-45 minutes per day reading/writing as part of what I call "scanning" - staying up to speed on what's going on. I realize this is high, but much of my professional work is staying up to speed.

Blog reading and writing as part of scanning has been a replacement of other activities that I used to do - reading magazines and books. Originally, I didn't plan to replace these activities as I considered the information from blogs to be quite different. However, I've found that I'm not nearly as willing to read through a magazine, book or newspaper in depth because they don't seem nearly as meaningful or targeted as blog posts. Most of my free subscriptions have lapsed.

My guess is that most people can relate this to the experience of sitting in a large conference session with a not so great speaker thinking - "I could have got more value by searching the topic on Google and reading about it than sitting here for an hour." The same feeling starts to happen when you read blogs and then go and read magazines. A magazine generally has very superficial content, slightly off target for your needs, and is really pretty dang poor compared to blog posts. Of course, a lot of blog posts you skip very quickly. Scanning really becomes scanning. But after you've done really fast scans on blogs to find the interesting nuggets, you quickly find yourself flipping through the magazine and maybe finding one article that's worth reading at any level of depth - maybe.

Except for last month's T+D article on Blogging for Learning and Networking of course. ;)

Natural Part of Knowledge Work

I've found that I've also adopted a practice of blogging as a natural part of my knowledge work. When I'm doing some research on a topic as part of a project, as I think through the problem, find solutions, etc., it is quite natural to blog these things. This is separate from scanning as described above. And again, it's a replacement for other forms of note-taking. I still take lots of electronic notes, but as I formulate more specific ideas, I definitely use the blog as a sounding board.

In terms of time, I'm pretty sure that this has been a replacement that doesn't cost me time and sometimes saves me a lot of time via great feedback. In other words, I likely would have spent this time anyhow on that task, it's just that I keep this information stored in a new way.

A recent post talks about exactly this issue and suggests blogging your brainstorms, your R&D, your initial discoveries.

I'd add to that post - blog your questions.

Suggestion for Starting Out

I normally suggest that people start small and try to commit an hour a week to scanning - reading and writing about things in parts of the field that interest them. That's more of a way to get used to the blog itself.

Likely the more powerful effect comes when you are assigned a research task and you use the blog as part of that task.

Other posts to check out:

Sunday, September 16, 2007

Risk of Identity Theft Due to Social Networking and Blogging

First question -
Identity theft and electronic stalking are scary issues. The more you participate in blogs, discussion groups and other social networking tools, the more information there is about you in the world for anyone to access. Does this concern you? If yes, what guidelines do you follow to minimize the risk? If no, why not?
I am definitely concerned about identity theft, and less so about electronic stalking. Of course, the situation with Kathy Sierra was an eye opener.

The basic guideline I have is not to put anything in my blog, on a social network, or even in a site registration (even when that site is supposed to keep the information private) that I wouldn't want to be public. If you assume that everything you are doing, including what you write in email and IM, is fully public, you tend to protect yourself. In my mind, the risk is pretty high already simply from the people-search tools that exist on the web. Going to Intelius hints at the information that's readily available.

If I look at what I put in my blog and on social networks, I don't believe it creates real additional risk because of the type of content I provide. Unfortunately, there will continue to be some extreme cases, but the actual risk, if you use precautions, is not significantly higher.

I'm curious if anyone does believe that blogging, social networking, etc. poses much of a risk?

Saturday, September 15, 2007

Blogging as Part of Classroom Experience

Note: 9/17/2007 - fixed link to correct instructor's blog. Some good comments coming in.

I've received several good questions from Kirsten Morton that I'll be answering over the next few blog posts. She is a graduate student in adult education and learning technologies at the University of Colorado. This semester, one of her courses focuses on trends in eLearning and requires that each student start a blog.

It was interesting to visit her blog, the instructor's blog and some fellow student blogs:
Note: get with it Keith - you are definitely lagging the rest.

This is a GREAT educational tool to use in association with a class. I did something similar in my Collaborative Learning Using Web 2.0 Tools class. All students were required to use blogs as the primary writing tool for assignments. Further, they were required to provide feedback via comments or blog posts on each other's work.

I like that the instructor in this class is going one step further by asking the students to network to find individuals to answer questions. This is a great way to learn how to do this outside of a class environment.

Thursday, September 13, 2007

Delicious Upgrade

Yesterday, at my presentation Introduction to eLearning 2.0 - ASTD OC, I mentioned that Yahoo MyWeb had a few features that del.icio.us was sorely missing. And that had made me previously conclude: Yahoo MyWeb better than del.icio.us, rollyo, et.al. for Personal / Group Learning

However, I told everyone that del.icio.us had been acquired by Yahoo and it was likely that these features would slowly get integrated. And today, I just saw Delicious 2.0.

It looks like search across content of bookmarked pages has been addressed. Not sure whether control of sharing has been.

Wish I had an invite so I could tell. :)

Monday, September 10, 2007

Local Lectures vs. First Class Lectures

This is a topic that I've been wondering about for a while and Donald Clark hits on it in his post:
Professor Lewin - “It sounds arrogant, I know, but it’s better to see a first class lecture on video than a mediocre one in the flesh."

Use the FREE stuff because it’s better. This is a simple solution to a massive problem. Students are already voting with their fingers and dumping their third-rate, real, local lectures for first-rate, online, global lectures. The same can apply to most standard teaching and training lectures.

Why would a student attend lectures by a professor that aren't great just because they are local? In large universities where there is little to no interaction, and what interaction there is happens with Teaching Assistants, why not have the lecture come from the absolute best teacher (hopefully one who is known in the field as well, so you can drop names)? You can still have a professor or teaching assistant handle the interaction.

There is also then the question of where your degree is really from at that point. How does branding work anymore? Professor Lewin is from MIT. Do you get some kind of MIT credit? It gets thorny really quick, but there's no questioning:
It’s better to see a first class lecture on video than a mediocre one in the flesh.

LMS Conversation with Tracy Continues

This is a somewhat different experience for me. Tracy Hamilton and I are having a weird kind of slow, public conversation using our blogs. But it also feels like a reasonable way to do it - which makes it somehow even more weird. With that caveat -

You can find the beginning of the conversation and my list of Learning Management Systems (LMS) Gotchas.

In response, Tracy just posted LMS - I don't want to be Little Miss Should've. Among her points:
Gotcha #1...Unrealistic Expectation....okay maybe, this is a definite we'll see. I don't think we are aiming too high here.
Just because you aren't aiming high doesn't mean you and your stakeholders won't be disappointed in what you get. In fact, what seems like it should be really simple can sometimes be hard in an LMS, especially if you don't configure things right from the start.
Gotcha #4 - Failing to Identify Key Differentiating Use Cases...I'm going to admit here....I'm not sure what this one means. I'm going to need some help clarifying this mistake.
Sorry - without the details that are in the report, it's a bit cryptic. A "Use Case" is an example of how a particular user will use the system. Tracy - you actually have the beginnings of some use cases when you say:
I really would just love something that is going to track our staff education (outside the organization), something to help with booking the internal courses, and something to launch and better yet marked the annoying core curriculum (still marking the paper ones from June).
You would just need to flesh these out a bit to tell us what you really mean when you say "track" and "booking" and "marking" - the verbs all indicate you want the system to do something for you. What exactly is that?

The other key aspect of the gotcha is "differentiating." When you go to select, which use cases are a bit different, such that they will really make one LMS better for your needs than another? Most of your use cases will be quite normal, and LMS products will generally handle them the same way.

Of course, this gotcha relates quite closely to the other gotchas. Without the details of what people will be doing with the system (the use cases), you aren't really in a position to test it, nor are you in a position to have realistic expectations about how well the LMS will handle those cases.
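
To make this concrete, here's a rough, invented fleshing-out of "track our staff education (outside the organization)" into something testable. The details are hypothetical, but this is the level of specificity I mean.

```python
# One use case written as a structure you could walk through with each vendor.
use_case = {
    "name": "Record external course completion",
    "actor": "Staff member plus their manager",
    "trigger": "Employee completes a workshop or course outside the organization",
    "steps": [
        "Employee submits course name, provider, date, and a certificate upload",
        "Manager reviews and approves the record",
        "LMS adds the item to the employee's training history and transcript",
        "Compliance report counts the external credit toward annual requirements",
    ],
    # The differentiator is the part you actually score vendors on.
    "differentiator": "External (non-catalog) items appear on the same transcript "
                      "and reports as internal courses, without administrator rework",
}

for step in use_case["steps"]:
    print("-", step)
```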

Tracy - I look forward to hearing more about your experience. And, your description of the process to get the LMS starting in 2003 will resonate with MANY (if not most) people in our field.

Friday, September 07, 2007

End of an Era - Authorware

Just saw an Adobe announcement that they plan to discontinue development of Authorware. Since it was the first multimedia authoring tool I ever used, it certainly is bringing back some memories of some pretty cool work (and headaches).

Goodbye Authorware. Thanks for the memories.

25 Interactions for eLearning - Free eBook

BJ Schone has published a nice little eBook and an associated blog that describe interactions you can use in eLearning to make the learning more fun and engaging. Some of the interactions he covers:
  • Scatter steps
  • Order of importance
  • Find the mismatch
  • Story-based questions
  • Scavenger hunt
  • Branching stories
I was planning to also point folks to other lists of these kinds of interactions (which I believe are out there), but for some reason I couldn't easily find them. Would love some pointers to other such lists to provide here.

In the meantime, a couple of ideas for where you can find other similar kinds of descriptions of eLearning interactions:
  • Look at Articulate's products - tabbed interaction, process, timeline, pyramid diagram, labeled graphic, interactive FAQ, media tour, circle diagram, guided image, FAQ and more.
  • Similarly you can look at Adobe's various products to get ideas - look at Dreamweaver + Coursebuilder and Captivate especially.
  • Raptivity's list of interaction types.
If you know of a list of eLearning interaction types (or articles on it) - please drop a comment.

Thursday, September 06, 2007

Learning Management Systems (LMS) Gotchas

I regularly read Tracy Hamilton's blog. She's been writing about getting a new LMS. I hate to say it, but this is likely not going to go well. It will be fun to keep reading about it, probably much better than having to be Tracy during the process. In her latest post: What's Happening With My New LMS?!?! she tells us:
Oh and my little meeting about my job role and what path this LMS will lead me down. NOTHING!!!! We can't possibly discuss that until we have the tool in place and know exactly in what capacity we will use the darn thing.
Uh oh. They are bringing in an LMS and they don't know how they'll use it nor do they know who will be responsible in what ways for the LMS. Ouch.

For the eLearningGuild, I wrote a description of the 10 Gotchas in LMS Selection, Installation and Configuration as part of their Research Report on Learning Management Systems. I'm not sure I can publish the details of my section in my blog, but the abstract is available for the report (and has some good stuff in it). Luckily it has the Table of Contents for my section which lists my Gotchas:
  • Gotcha #1 – Starting With an Unrealistic Expectation of What You Need
  • Gotcha #2 – Missing a Key Stakeholder
  • Gotcha #3 – Failing to Get Agreement on the Process with Key Stakeholders
  • Gotcha #4 – Failing to Identify Key Differentiating Use Cases
  • Gotcha #5 – Coupling Content Authoring with LMS Selection
  • Gotcha #6 – Not Testing a LMS
  • Gotcha #7 – Failing to Ask a Critical Question or Two
  • Gotcha #8 – Poor Contract Negotiations
  • Gotcha #9 – Tripping on the Models
  • Gotcha #10 – Customization
So far it appears that Tracy's company has hit Gotchas 1, 4, and 6. Likely also several others.

Tracy, keep up the reporting!

Have Work and Learning Changed or the Way We Do Work and Learning?

I'm struggling a bit with the question of whether what we need to do around Work and Learning has changed, or whether it's only the Context and the Methods that have changed.

Let me try to explain the question and then I'm hoping I'll get a bit of help on answers.

I believe that the Context in which we do our Work and Learning is changing:
  • Ever increasing pace of information creation
  • Greater volume of information
  • Improved accessibility of these large quantities of information
  • Wider and more varied sources of information
  • Larger numbers of people creating content
  • Greater access to wider networks of people
and I believe that there are many new methods, skills, tools, knowledge around how to accomplish our work and learning (see Needed Skills for New Media).

However, when I look at what I do day-to-day and what other people do day-to-day as part of their work and learning, I don't see it as really being different. Basically, I see us:
  • Staying generally knowledgeable about our industry, our job skills, etc.
  • Performing research tasks to figure things out (tacit knowledge work), e.g., what eLearning tools should I be using?
  • Performing transactional information work where we get some piece of information and quickly do something with it, e.g., respond to email
  • Acquiring knowledge/skills in new domains
That would have been accurate for many years, right? So has anything really changed about these core tasks? And what do you call this core part? Is there a really good description of these work and learning tasks?

Banning Phrases - Adoption by Corporations

I heard (on the radio this morning) about a mayor in a Russian city who banned various phrases for city employees. The mayor explains his rationale as:
  • City officials should help improve people's lives and solve their problems, not make excuses.
  • I am tired of civil servants telling me that problems were impossible to solve, rather than offering practical solutions.
  • The use of these expressions by city administration officials while speaking to the head of the city will speed their departure.
The phrases include:
  • What am I supposed to do?
  • I'm not dealing with this
  • We're having lunch
  • The working day is over
  • Somebody else has the documents
  • I think I was off sick at the time
  • It's lunch time
This got me thinking that something similar should maybe be adopted by other organizations. Certainly there are a lot of phrases that become part of people's lexicon that help them avoid helping to solve problems. For me, it's one of the most frustrating things to deal with. Non-answers. Avoidance.

Thoughts?

Tuesday, September 04, 2007

Learning and Networking with a Blog (Deleted Scenes)

My article in Training + Development Magazine was just published - Learning and Networking with a Blog. You must launch the reader and then go to page 20 to find the article.

Unfortunately, a couple of things got cut between the final edits from T+D and when it was published. So here are two associated pieces of content that are the article's "deleted scenes":

Getting Started with Blogging

Probably the first step in blogging is learning about blogging for yourself before you try to use it for your organization. To help with this, it might be worth looking at a discussion in October 2006 on the question – “Should All Learning Professionals Be Blogging?” You can find a somewhat tongue-in-cheek summary - Top Ten Reasons to Blog and Not to Blog.

Assuming you decide that it’s something worth trying, then you should first sign up with an RSS Reader such as Bloglines or Google Reader. Then subscribe to blogs. Two good lists of related blogs can be found at:

http://www.articulate.com/blog/the-19-best-elearning-blogs/

http://elearningtech.blogspot.com/2007/02/top-ten-elearning-blogs.html

Then you will want to sign up on a blogging system such as Blogger, and begin to write blog posts. To help you get connected with other bloggers, it's good practice to link your posts to their posts and leave comments on their blogs. Another way to get connected is to contribute to ASTD's Learning Circuits monthly Big Question. You can find this on the Learning Circuits Blog.
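
If you'd rather script your scanning than click through a reader, here's a minimal sketch using Python's feedparser library. The feed URLs are just examples; any blog's RSS or Atom feed will work.

```python
# pip install feedparser
import feedparser

feeds = [
    "http://elearningtech.blogspot.com/feeds/posts/default",     # this blog (example URL)
    "http://learningcircuits.blogspot.com/feeds/posts/default",  # Learning Circuits Blog (example URL)
]

for url in feeds:
    parsed = feedparser.parse(url)
    print(parsed.feed.get("title", url))
    for entry in parsed.entries[:5]:  # latest five posts from each feed
        print("  -", entry.get("title", "(untitled)"), entry.get("link", ""))
```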

Once you are comfortable with blogging, you can begin to focus on where and how it should be used in your organization. IBM, well-known for fostering blogs and Wikis, has established guidelines that provide a good model:

http://www.ibm.com/blogs/zz/en/guidelines.html


Common Questions and Issues around Blogging
There are a few questions that commonly come up around blogging:

What should I write about?

The best advice here is to write about what interests you. By being interested, you will be interesting. Most workplace learning professionals write about topics such as projects, challenges, answers to questions they see on other blogs, answers to questions they get asked at work or by peers, and links to interesting content or documents.

How much time does it take?

Blogging can consume a large amount of time so it’s a good idea to start small and work up. Most bloggers will tell you that it has replaced other learning activities and that they seamlessly work it into their day. As they research a new topic, they find they can write a blog post that will help them capture their thoughts. Typical specific answers range from 30 minutes per day to an hour per week.

Can I write about confidential matters?

Some companies, such as Motorola and IBM, greatly encourage their staff to blog. They have established policies about what is acceptable or not to write in a blog post. The basic answer is that unless it is a private blog, everything you write is public and will exist forever. No confidential information should ever be put in a blog post. Most bloggers find that it is rather easy to turn specific questions or issues on a project into generic discussions that contain no confidential information.