-
A Bad Employee Or A Bad Week
Matthew J. Hays, PhD
Senior Director of Research and Analytics
What’s Going On When A Learner Refuses To Learn?
On a humid summer evening, George Williams was clocked at 78 mph in a 35 mph zone. He ran four red lights as the police followed him. He finally stopped, leaped out of his car, and ran into a building.
What do you think of George so far?
A few steps into the building, he collapsed. Nurses swarmed him, applying compresses to the wounds they could see. After four hours of surgery and three months of physical therapy, George made a full recovery.
Now, what do you think of George?
What Is the Fundamental Attribution Error?
The fundamental attribution error (FAE) is assuming that someone’s behavior is due to their nature rather than their circumstances. We all fall prey to it. The numbskull who cuts you off in traffic is clearly a terrible person.
Of course, when you’ve cut someone off in traffic, it’s been for a good reason. You were about to be late for a meeting, or someone was hurt, or one of your kids had a restroom emergency-type situation going on. You can excuse your own behavior, but there are no excuses for others. If someone’s acting like a jerk, they must be a jerk.
Full disclosure: I made this error for years. I run research and analytics for an online eLearning platform (Amplifire). In many other systems, you can just put a stapler on the spacebar, go get a coffee, and your training is done when you come back. But in ours, you have to actually master the material. We’ve developed sophisticated behavior analysis algorithms that try to identify when someone attempts to rush through Amplifire without learning. For the longest time, I referred to people who do this—who interact with the system in a disengaged or disingenuous manner—as “goofballs.”
An example of goofballery: responding “I don’t know yet” in less time than it could possibly take to read a question, over and over and over again. (We even have messaging that pops up and makes it clear that their strategy won’t work. One of the messages concludes by telling them, “Look, you might as well just learn.”)
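To make that pattern concrete, here’s a minimal sketch of the kind of timing rule such an algorithm might apply. The field names, thresholds, and streak length are hypothetical placeholders for illustration, not Amplifire’s actual schema or behavior model.

```python
# Illustrative sketch only: field names (response_ms, chosen_answer, question_text)
# and thresholds are assumptions, not Amplifire's actual algorithm.

MS_PER_WORD = 150   # rough lower bound on silent reading speed per word
FLOOR_MS = 1000     # even a glance at a question takes about a second

def min_read_time_ms(question_text: str) -> int:
    """Smallest plausible time needed to actually read the question."""
    return FLOOR_MS + MS_PER_WORD * len(question_text.split())

def looks_disengaged(response: dict) -> bool:
    """Flag an 'I don't know yet' answer given faster than the question could be read."""
    return (
        response["chosen_answer"] == "I don't know yet"
        and response["response_ms"] < min_read_time_ms(response["question_text"])
    )

def rushed_streak(responses: list[dict], streak_len: int = 5) -> bool:
    """True if the learner produced `streak_len` such responses in a row."""
    run = 0
    for r in responses:
        run = run + 1 if looks_disengaged(r) else 0
        if run >= streak_len:
            return True
    return False
```

A single fast response proves nothing; it’s the streak, repeated over and over, that suggests disengagement rather than a momentary lapse.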
But what I need to keep in mind is that someone who engages in goofballery isn’t necessarily a goofball. I learned this lesson quite acutely from one of our clients. Let’s call them Acme. They were using Amplifire in a several-week-long onboarding program for new call center agents.
Even before we had algorithmic behavior categorization, we had some idea of what a goofball looked like in our standard reports. A couple of Acme learners fit the bill perfectly. They rushed through questions but took longer to learn than average and struggled mightily to grasp even basic concepts…it was clear that something other than normal, engaged learning behavior was going on.
I’ve been a big fan of naming reports after the question they answer. Sometimes a report user wants a Learner Progress report, sure. But sometimes, they just want to know who isn’t done. So we ought to give them a “Who Isn’t Done?” report.
The goofballs at Acme made me want to make a “Who Should I Reprimand?” or “Who’s Screwing Around?” report. Maybe even a “Who Should I Fire?” report. It’s exactly the kind of insight that call center clients are looking for. The success metrics almost calculate themselves: cost savings from reducing training class size, plus improved call performance once the trainees become agents (since you’ve filtered out the ones who appeared not to care about the company or how to do well at it).
Cooler heads prevailed, and we instead went to Acme with a couple of names and a curious tone. We asked what they thought might be going on and let them follow up with the trainees. They came back to us with awe and thanks…but not for the reasons the FAE had led me to expect. Instead, one trainee was living in his car in the training center parking lot. You can imagine that that kind of stress would make you disengage from parts of your job you couldn’t tell were important. The other trainee’s father had just died. He was in no shape for a training program that week…but was able to pull it together the following week.
Who Needs Your Help?
My new name for the goofball report is “Who needs YOUR help?” These trainees needed the opposite of getting fired; they needed support. Our data identified that, but I fell victim to the fundamental attribution error. They weren’t bad employees. They were employees going through a bad time.
My team and I are trying to avoid the FAE in our report labels and data interpretations. I’m also adopting this in my personal life. When someone cuts me off in traffic, I wish them good luck.
Well, I try to, anyway. I’m working on it.
-
Ladder-related incidents reduced by 31.5%
Ladder Accidents
Across all industries, the highest rates of fatal and nonfatal ladder fall incidents (LFIs) were in two occupational groups: construction and extraction (e.g., mining), followed by installation, maintenance, and repair occupations.
Every year, over 300 people die in ladder-related accidents, and thousands suffer disabling injuries. According to a report from Liberty Mutual, the direct compensation and medical treatments associated with these falls cost American businesses $4.6 billion a year.
Ladder Safety Training
For this telecom/cable service provider, technicians who experienced a ladder fall averaged 81.25 days of lost work. Their top executives employed Amplifire to reduce the number of incidents.
Amplifire worked with their L&D team to develop a Ladder Safety training course that focused on how to perform safety checks according to OSHA guidelines. The course was then rolled out to 862 supervisors. Each supervisor has 15 subordinates.
Confidently Held Misinformation
Confidently held misinformation (CHM) exists in all individuals and organizations. It is one of the largest contributors to costly errors.
CHM exists when an individual is sure they are right, but they are wrong. It creates misjudgments and mistakes. Misplaced confidence can be perilous—and result in adverse incidents.
Amplifire has the unique power to detect and correct CHM. The platform requires learners to state their certainty when they answer questions. The system then classifies which questions were answered confidently but incorrectly—representing confidently held misinformation—and customizes a module in real-time that will lead the learner to rapid mastery of the topic.
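As a rough illustration of that classification, here is a minimal sketch that treats each answered question as a pair of signals: correct or not, and confident or not. The category labels other than confidently held misinformation are placeholders for this example, not Amplifire’s official terminology.

```python
# Illustrative sketch of the confidence-by-correctness classification described above.
# Category names other than "confidently held misinformation" are assumptions.

def classify_response(correct: bool, confident: bool) -> str:
    """Map one answered question to a knowledge state."""
    if correct and confident:
        return "mastery"                          # sure and right
    if correct and not confident:
        return "uncertainty"                      # right but unsure
    if not correct and confident:
        return "confidently held misinformation"  # sure but wrong: the risky state
    return "acknowledged gap"                     # unsure and wrong

# Example: a learner who is certain of an incorrect answer
print(classify_response(correct=False, confident=True))
# -> confidently held misinformation
```

Questions that land in the confident-but-wrong cell are the ones the adaptive module targets first, because that is where misplaced confidence can translate into an error on the job.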
The cognitive science behind the platform has proven itself in over one billion learner interactions.
Knowledge Variation
The variation of knowledge among supervisors was high, with some supervisors quite misinformed and others showing confident mastery of the topic. Surprisingly, some of the more senior supervisors held more CHM than their more junior counterparts.
Download the full case study
to see the training ROI realized by this telecom provider.
-
Using True-False Questions to Promote Learning
Matthew J. Hays, PhD
Senior Director of Research and Analytics
Tests don’t just evaluate what you know. The act of being tested actually makes your memory stronger. Think about it. You want to improve your ability to retrieve information from your brain. What could be better practice than…retrieving information from your brain?
Unfortunately, tests get a bad rap – but some are perceived to be even worse than others. It seems like the harder it is to make a test and evaluate its responses, the better its reputation. Essay tests are supposedly the best, while everyone looks down on multiple-choice tests (even though they shouldn’t). True-false tests are at the very bottom of the heap.
But they don’t have to be.
In a series of studies published in July of 2020, researchers in the Bjork Learning and Forgetting Lab at UCLA revealed how to construct true-false questions in a way that enhances learning beyond what traditional true-false questions achieve.
For example, suppose you wanted someone to learn that Steamboat Geyser is the tallest geyser in Yellowstone Park. After having them read about geysers, you could use a true-false test to reinforce what they read. On that memory-enhancing test, you could use a true item: “Steamboat Geyser is the tallest geyser in Yellowstone Park, true or false?” You could also use a false item: “Castle Geyser is the tallest geyser in Yellowstone Park, true or false?”
These questions are somewhat limited in their instructional value. When an item is true, it helps people learn only about the topic of the true-false statement. For example, the Steamboat question above only reinforces that Steamboat is the tallest. When an item is false, however, it helps people learn only about information related to the true-false statement. For example, the Castle question above only reinforces that Steamboat is the tallest – but does nothing to enhance learning about Castle Geyser itself.
The Bjork lab’s discovery: Inserting a contrasting clause into the question activates more concepts in your brain. For example, you could make a true item with a contrasting clause like this: “Steamboat Geyser (not Castle Geyser) is the tallest geyser in Yellowstone Park, true or false?” You could also make a false contrasted item like this: “Castle Geyser (not Steamboat Geyser) is the tallest geyser in Yellowstone Park, true or false?” Both questions help reinforce information about both geysers.
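To see how the four item types relate, here is a minimal sketch that builds all of them from a single fact. The helper name and the fact triple are made up for illustration; this is not the Bjork lab’s materials generator or Amplifire’s authoring tool.

```python
# Illustrative sketch of the four true-false item types discussed above.
# The function name and fact fields are hypothetical.

def true_false_items(attribute: str, target: str, competitor: str) -> dict:
    """Build standard and contrasted true-false statements for one fact."""
    return {
        "true":             f"{target} {attribute}, true or false?",
        "false":            f"{competitor} {attribute}, true or false?",
        "true_contrasted":  f"{target} (not {competitor}) {attribute}, true or false?",
        "false_contrasted": f"{competitor} (not {target}) {attribute}, true or false?",
    }

items = true_false_items(
    attribute="is the tallest geyser in Yellowstone Park",
    target="Steamboat Geyser",
    competitor="Castle Geyser",
)
for kind, text in items.items():
    print(f"{kind}: {text}")
```

The contrasted variants are the ones the UCLA studies found most useful, because each statement forces the learner to think about both geysers at once.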
We are building these findings into the Amplifire platform, our authoring system, and our content analytics. If you’d like to know more about the science or the software, reach out here!
-
Amplifire’s CEO Reflects on 2020: What This Year Has Taught Us
Bob Burgin
Amplifire CEO
2020 – a challenging year for everyone. Enough said?
Well, maybe not. I have to say that it has been truly inspiring to witness our clients’ heroic response during this trying time. Health systems’ dedicated staff took significant personal risk to care for us. T-Mobile kept lines open while delivering superior customer care. Our higher education clients helped over one million students succeed in their efforts to learn remotely. Professionals across the country found the motivation to earn certifications as their career paths were at risk. More than anything else, we thank you. Your tenacity during hard times is inspiring.
As for Amplifire, well, I think it has been a good year. You see, the world never entirely goes back to what it was before a time like this. Something we have all known for years but struggled to bring about just advanced by more than a decade in only nine months: the recognition that sophisticated adaptive online training, done right, significantly outperforms classroom training.
I wouldn’t wish this year on anyone, but sometimes out of terrible events comes powerful change. I suspect we will get on airplanes for business a little less often now that we have broken the self-consciousness barriers and learned to look into each other’s eyes on a web call. And we have a new pattern of staying connected with loved ones across the country and the world.
And I similarly believe we will not go back to putting people in classrooms and teaching to the lowest proficiency in the room. This is the promise of adaptivity—people get instruction tailored to where they are on the path to mastery. Some go fast, others take more time to absorb information, but everyone can become proficient.
In 2020, we expanded our client relationships and added new clients. And as the radical move to online adaptive training hit hard, the demand for our advanced capabilities grew, and we added a host of new reseller partners fully embracing this new way of learning.
It was a challenging year, without a doubt. In April, we were taking pay cuts to ensure our ability to keep our no-layoff promise to our employees during COVID. Like everyone, we had no idea what the future would hold.
We will continue our focus on advancing the art and science of Knowledge Engineering to drive improved performance. It has never been truer than at this moment that we live and work in a knowledge economy, where human flourishing derives from the information stored in our minds as long-term memory.
It turns out that the most powerful, complicated, amazing computing device is not in some university basement. Not calculating rocket trajectories at NASA or finding obscure info for Google. Although it designed all the computing devices ever used.
It’s the human mind.
And finding ways to load knowledge into the human mind, well, that’s our obsession.
As 2020 draws to a close, we wish you our best and share our hopes for an exhilarating 2021.
And a very special thank you to our team, who never blinked.
Bob Burgin
CEO
-
Amplifire Secures Patent for Analytics Regarding the Confidence of Learners
BOULDER, Colo., Nov. 2, 2020 /PRNewswire/ — Amplifire, an eLearning company, announced today the award of an additional patent by the United States Patent Office. Amplifire holds patents issued in the US, EU, Australia, Canada, Japan, South Korea, and other countries and jurisdictions worldwide. The new patent is titled Display and Report Generation Platform for Testing Results.
The new patent, US Patent No. 10,803,765, is directed to aspects of Amplifire’s learning platform, which includes unique Answer Key and Reporting Dashboard features. The Answer Key allows learners to signify both their confidence and answer choice in one click, which fosters greater metacognition. The Reporting Dashboard builds visual analytics displaying a learner’s misinformation, uncertainty, and struggle in bar charts and heatmaps. Search and sorting features allow managers or instructors to see their organization’s knowledge at any scale, from individual to team to division, and across the enterprise.
“Our customers view knowledge as a strategic way to compete.”
Confidence measures shown in the reporting dashboard are essential because confidence is the precursor to human behavior. It appears as internal thoughts such as, “I’ve got this,” or, “I haven’t a clue what to do.” The Amplifire dashboard reports and sorts using the confidence a learner displayed when they answered questions in the assessment phase of learning and subsequent refreshers. The most dangerous form of confidence occurs when a learner is sure but incorrect, a state referred to as confidently held misinformation, because it drives them toward a mistake.
The ability to see how confidence is bound to knowledge gives learning officers, administrators, and instructors a window into the risk of future mistakes in their workforce. Visualizing human fallibilities such as misinformation, uncertainty, and struggle lends unprecedented guidance to managers and instructors. For the first time, they can see the people who improve in the platform, where pockets of risk lie, and who needs at-the-elbow help.
Amplifire CEO, Bob Burgin, noted, “We are proud that the US Patent Office noticed our reporting dashboard’s unique features and awarded our efforts with a patent. Amplifire’s product development team regularly thinks up new ways to help people overcome the knowledge problems inherent in the human condition. Our customers view knowledge as a way to compete. They understand it’s strategically in their interest to help their people reach new levels of performance.”
About Amplifire
With over 2.4 billion learner interactions, Amplifire (www.amplifire.com) is the leading adaptive learning platform built from discoveries in brain science that help learners master information faster, retain knowledge longer, and perform their jobs better. It detects and corrects the knowledge gaps and misinformation that exist in the minds of all humans so they can better attain their real potential. Healthcare, education, and Fortune 500 companies use Amplifire’s patented learning algorithms, analytics, and diagnostics to drive exceptional outcomes with a significant return on their investment.
-
Self-Regulated Learning: Beliefs, Techniques, and Illusions
Authors
Robert A. Bjork,1 John Dunlosky,2 and Nate Kornell3
1Department of Psychology, University of California, Los Angeles, California 90095, 2Department of Psychology, Kent State University, Kent, Ohio 44242, 3Department of Psychology, Williams College, Williamstown, Massachusetts 01267
Abstract
Knowing how to manage one’s own learning has become increasingly important in recent years, as both the need and the opportunities for individuals to learn on their own outside of formal classroom settings have grown. During that same period, however, research on learning, memory, and metacognitive processes has provided evidence that people often have a faulty mental model of how they learn and remember, making them prone to both misassessing and mismanaging their own learning. After a discussion of what learners need to understand in order to become effective stewards of their own learning, we first review research on what people believe about how they learn and then review research on how people’s ongoing assessments of their own learning are influenced by current performance and the subjective sense of fluency. We conclude with a discussion of societal assumptions and attitudes that can be counterproductive in terms of individuals becoming maximally effective learners.
The Need for Self-Managed Learning
Our complex and rapidly changing world creates a need for self-initiated and self-managed learning. Knowing how to manage one’s own learning activities has become, in short, an important survival tool. In this review we summarize recent research on what people do and do not understand about the learning activities and processes that promote comprehension, retention, and transfer.
Importantly, recent research has revealed that there is in fact much that we, as learners, do not tend to know about how best to assess and manage our own learning. For reasons that are not entirely clear, our intuitions and introspections appear to be unreliable as a guide to how we should manage our own learning activities. One might expect that our intuitions and practices would be informed by what Bjork (2011) has called the “trials and errors of everyday living and learning,” but that appears not to be the case. Nor do customs and standard practices in training and education seem to be informed, at least reliably, by any such understanding.
-
Learning Versus Performance: An Integrative Review
Authors
Nicholas C. Soderstrom and Robert A. Bjork
Department of Psychology, University of California, Los Angeles
Abstract
The primary goal of instruction should be to facilitate long-term learning—that is, to create relatively permanent changes in comprehension, understanding, and skills of the types that will support long-term retention and transfer. During the instruction or training process, however, what we can observe and measure is performance, which is often an unreliable index of whether the relatively long-term changes that constitute learning have taken place. The time-honored distinction between learning and performance dates back decades, spurred by early animal and motor-skills research that revealed that learning can occur even when no discernible changes in performance are observed. More recently, the converse has also been shown—specifically, that improvements in performance can fail to yield significant learning—and, in fact, that certain manipulations can have opposite effects on learning and performance. We review the extant literature in the motor- and verbal-learning domains that necessitates the distinction between learning and performance. In addition, we examine research in metacognition that suggests that people often mistakenly interpret their performance during acquisition as a reliable guide to long-term learning. These and other considerations suggest that the learning–performance distinction is critical and has vast practical and theoretical implications.
The Goal of Instruction
Whether in the classroom or on the field, the major goal of instruction is, or at least should be, to equip learners with knowledge or skills that are both durable and flexible. We want knowledge and skills to be durable in the sense of remaining accessible across periods of disuse and to be flexible in the sense of being accessible in the various contexts in which they are relevant, not simply in contexts that match those experienced during instruction.
In other words, instruction should endeavor to facilitate learning, which refers to the relatively permanent changes in behavior or knowledge that support long-term retention and transfer. Paradoxically, however, such learning needs to be distinguished from performance, which refers to the temporary fluctuations in behavior or knowledge that can be observed and measured during or immediately after the acquisition process.
The distinction between learning and performance is crucial because there now exists overwhelming empirical evidence showing that considerable learning can occur in the absence of any performance gains and, conversely, that substantial changes in performance often fail to translate into corresponding changes in learning.
-
The Beauty of Multiple Choice
It’s now well understood that self-testing is perhaps the world’s most powerful trigger for turning on the mental mechanisms that create long-term memory and the ability to recall something in the future. Memory storage, like most brain processes, is a pattern of neurons connected at their synapses by electrochemical processes. Retrieving a memory through self-testing strengthens that pattern better than any other technique.
Urban myth would have us believe that multiple choice tests are mostly good for quick grading. But in 2009, researchers showed that this myth was wrong. It turns out that…
- Multiple choice tests increase the retention and recall of information.
- They are better than recall tests, essay questions, or extra study for building durable memory.
- They do not promote the mental phenomenon of “retrieval-induced forgetting.”
Multiple choice is not merely the recognition of a correct answer. It can more accurately be described as recognition combined with recall because the alternative answers must each be contemplated in turn and then selected as right or rejected as wrong. Multiple choice retrieval unleashes powerful effects that increase both retrieval strength and storage strength of the correct answer plus the alternatives. It’s a virtuous circle of memory fortification.
Solving the Problem of Retrieval-Induced Forgetting
Research has shown that multiple choice solves a serious problem with other kinds of testing—the problem of retrieval-induced forgetting, which occurs when the brain is asked to retrieve a memory. For example, suppose I ask you to name the fourth planet from the sun. If you recall that it’s Mars, then the memory of Mars is strengthened, as we would imagine, but related memories become weaker—the positions of Jupiter and Venus are suppressed. This is something we would not have imagined until it was shown experimentally.
Think of it this way. The memory of new learning is in a kind of competition with the memory of related, but older learning. To avoid confusion and to help make rapid decisions, animals and people need to instantly know which memory is most up to date. To accomplish this, a mental system evolved to suppress the strength of older, related information that may compete with the more recent and likely more relevant information. New information causes old information to be forgotten.
“ The results imply that taking a multiple-choice test not only improves one’s ability to recall that information, but also improves one’s ability to recall related information.” — Little & Bjork, UCLA
Enhancing the Memory of Related Information
Multiple-choice tests do not damage access to related information. Just the opposite, they enhance the retrieval strength of related information. In multiple-choice tests, a learner’s brain is asked to compare and contrast the truth of the competing alternatives. This strengthens the memory trace of those alternatives—quite the opposite of what occurs in the brain during a recall test, where forgetting of the alternatives is encouraged by the suppression of related memory.
Multiple Choice Plus Confidence in Amplifire
Confidence appraisals magnify the superior effects of multiple-choice testing. Thinking about one’s confidence focuses the brain’s resources and attention through activated dopamine circuits. The spotlight of consciousness is brought to bear on the question at hand. With confidence focusing attention on the plausible alternatives of a well-constructed multiple-choice question, learning and memory benefit greatly. Students who use Amplifire report that their attention is focused, that learning occurs more rapidly, and that future recall is easier.
-
Amplifire Crosses 2 Billion Learner Interactions in Its Adaptive Learning Platform
BOULDER, Colo., May 13, 2019 — Amplifire today announced it has reached more than 2 billion learner interactions in its patented adaptive learning platform. That’s a 9% increase in user interactions from the previous year. The company has also added more than 827,797 unique learners to its platform during this time period.
Amplifire engages learners and creates memorable learning experiences by adapting to each learner in real time. The platform detects and corrects knowledge gaps and misinformation that can lead to poor performance and cause errors.
The system includes a robust reporting suite that identifies misinformation, gaps, and struggle at the individual, team, location, and systemic level. Organizations can use this to improve performance. With 2 billion learner interactions, the system is able to deliver insights on learner behavior, predict who is at risk, and intervene where necessary to keep learners on the path towards mastery.
“Reaching 2 billion learner interactions is a significant milestone,” said Nick Hjort, SVP of Product and Development. “Over the past three years, Amplifire has seen learner interactions grow exponentially, which reflects that Amplifire continues to be a leader in a crowded education technology market.”
“It has been an exciting year for us,” said Bob Burgin, Amplifire CEO and Founder of the Healthcare Alliance. “We continue to see growth in the Higher Education market while expanding our reach in the Healthcare and Call Center space. More importantly, we are making a difference by helping individuals focus on areas where they have knowledge gaps or are struggling so they can perform at their highest potential. We remain committed to improving learning efficiency.”
Learn how Amplifire improves performance, saving organizations money and time at https://amplifire.com/case-studies/
-
The Testing Effect: Putting Mom to the Test
Remember those flashcards mom made you cycle through every morning while she drove you to school? Times tables, vocab words. Then she’d start quizzing you on your spelling words. As much as you might hate to admit it, these methods worked. You remembered the information she was trying to pound into your head. But why did it work? The answer: the testing effect.
The testing effect has been given many names over the years—active recall, retrieval practice—but they all refer to the same mechanism: asking your brain to remember and retrieve information on cue. And, as Henry Roediger and Jeffrey Karpicke put it, “Testing has a powerful effect on long-term retention.”
Some might say, “Flashcards are a lot of work. What about just re-reading? That’s got to be as good as testing, right?” Wrong. Re-reading material might make you better at reading the test questions, but making yourself pull the information out of your brain makes you better at answering the questions.
Studies have compared students who prepared for an exam by testing themselves against students who prepared by reading. The result: the self-testing students received higher scores on the exam than the re-reading students. Furthermore, the success of testing is linked to timing. The power of the testing effect increases as more time passes between the practice test and the actual exam.
Research has also shown that multiple practice tests can further improve retention and test performance. Repeatedly asking your brain to dig around and produce sought-after information creates stronger connections and retrieval pathways in the brain. In one study, students who repeatedly practiced retrieval (testing) doubled their proportion of correct responses on the final test compared to students who only practiced retrieval once.
So what can we learn from the testing effect? Not only are quizzes the way to go when we want to remember information, but also—at least when it comes to flashcards and spelling quizzes—mom was right.