Comments on eLearning Technology: Data Driven

Tony Karrer (2008-12-17):

Virginia - that's a good point that the numbers don't show you everything. In fact, qualitative measures can also be very instructive. Better still, have store managers go around and talk to customers - that was often done as well. Ultimately, these were hard numbers that people were being measured on, so at the end of the day, it mattered to everyone.

V Yonkers (2008-12-17):

Sorry, Tony, but I just have to say it: you are such a technocrat!

Having worked in market research for a number of years, I used many of the formulas and quantitative analyses you are talking about. However, the best researchers were those who were able to read between the lines, gathering bits of information that were never made explicit, along with qualitative comments. It was so frustrating that I could tell you what was really going on in a market, but the higher-ups wanted it proven. More often than not, my analysis would pan out a while later, and people would come to me to find out how I had predicted the behavior. It was simple: I listened to and interacted with the people we interviewed and collected information from.

I see the same problem in corporate education. If I have to hear one more time about the "deliverables" without there ever being any mention of "learning", I'm going to scream.
Don't get me wrong: I think there needs to be quantitative support as well as qualitative. However, just as anecdotal evidence is not enough to make a good decision, neither are surveys alone. There needs to be a mixed-methods approach to really understand what is going on.

On the other hand, I like the focus of this post on looking at both short- and long-term outcomes. It is important to identify the factors that will indicate changes due to training and monitor them over a period of time.

Tony Karrer (2008-12-12):

Joe - fantastic question. And I like your idea of doing this via a spreadsheet. Likely a lot of it could be done that way. Or spreadsheet plus SharePoint.

In terms of metrics used: in the specific example I was using, the dashboard included both sales and customer satisfaction, but only the satisfaction data was tied to the performance suggestion. I've done a combined version before, aimed at an environment similar to what you are describing (high-end products sold via dealerships). But there we had more specific sales and satisfaction data, and you could tie particular units to salespeople and dealerships.

I'm a bit surprised to hear that you have a tough time associating satisfaction numbers as easily as sales numbers. In my experience it's the opposite. It all depends on the specific data you have. In the case cited, most of the specific questions and derived data points were fairly closely tied to performance/behavior. That's often not the case with sales numbers: your in-store sales dropped x% in this category.
Hard to know what exactly is at issue.

That said, you can often ask someone to do a diagnostic, or have several people do diagnostics, to help surface possible reasons for the issue and then go from there.

We should maybe connect on this at some point.

Joe Deegan (2008-12-11):

Hi Tony,

I'm curious. In this case, is the data strictly from customer satisfaction surveys, or are you also using sales statistics? I am an instructional designer for a retail sales organization, and we have implemented a similar approach by developing an Excel pivot-table report that provides very detailed sales statistics and a "dashboard" of overall sales associate and store performance. We have made attempts at using data from customer satisfaction surveys but have had a very difficult time using that data to point to specific performance improvement opportunities. Sales statistics, on the other hand, make opportunities for improvement glaringly clear and allow you to pinpoint a specific flaw in a sales presentation. In my case the retail stores are selling high-priced merchandise, and it sounds like that may not be the case in this situation.
I'm curious whether you have had success using sales statistics, and wondering if I am not taking full advantage of data from customer satisfaction surveys.

Tony Karrer (2008-12-11):

I saw this comment from Stephen Downes that I thought should be cited here:

"I would caution that such an approach is designed specifically for a well-understood environment with clearly definable outcomes. Tying a store training initiative to sales, for example, works because (a) you know that you want 'sales' as an outcome, and (b) there is a way to effectively measure sales. Blindly applying the same methodology to less concrete domains would be a mistake, however. All of that said, Karrer's article is a well-written explanation with concrete examples and solid analysis."

I would tend to agree that unless you have solid data and metrics that you are trying to optimize, this kind of approach will not work. Sometimes there are proxies for what you are trying to get at. For example, you don't actually measure loyalty - it takes too long - rather, you survey for satisfaction. Similarly, you might do Level 3 surveys.

All that said, you can get some negative effects if you focus on the wrong things. People are very adept at optimizing for what they believe matters.
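As a purely hypothetical illustration of the kind of pivot-table dashboard discussed in this thread (the column names, stores, associates, and figures below are invented for the example and do not come from either organization's data), here is a minimal pandas sketch that aggregates transaction-level sales and survey scores per store and associate, much as an Excel pivot table would:

```python
import pandas as pd

# Hypothetical transaction-level data: one row per sale,
# with an attached 1-5 customer satisfaction survey score.
data = pd.DataFrame({
    "store":        ["A", "A", "B", "B", "B"],
    "associate":    ["Kim", "Lee", "Pat", "Pat", "Sam"],
    "sales":        [120.0, 80.0, 200.0, 150.0, 90.0],
    "satisfaction": [4, 5, 3, 4, 2],
})

# Pivot to a per-store, per-associate "dashboard":
# total sales alongside mean satisfaction score.
dashboard = pd.pivot_table(
    data,
    index=["store", "associate"],
    values=["sales", "satisfaction"],
    aggfunc={"sales": "sum", "satisfaction": "mean"},
)
print(dashboard)
```

Because both metrics sit in the same table keyed by store and associate, a drop in satisfaction can be read next to the corresponding sales figures, which is one way to start tying survey data to specific performance-improvement opportunities.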