When we think of assessments or tests in a learning context, we think of them as tools for instructors to measure how much a student knows. But if the goal is to ensure that learners retain as much material as possible for as long as possible, what if the most effective use of assessments comes before learners have studied anything?

At Amplifire, we’re concerned with better learning: remembering more, faster, over a longer period of time. And frankly, we believe that is everyone’s goal when it comes to learning. Our platform is built on brain science principles and employs the Socratic questioning method to stimulate learning. Our approach to assessment is science-based, and, as it turns out, assessments themselves can promote stronger, longer-lasting memories. However, not all assessments are created equal. We’d like to share some of our research-backed testing strategies for better learning.

Best type of test for learning: Multiple choice 

As we mentioned, not all assessment strategies are created equal. Some resources will say that open-ended, essay-style questions are the best way for students to practice recalling information. There are many types of test questions out there, including true/false, fill-in-the-blank, the aforementioned essay, and more. But according to research by some of the world’s leading cognitive scientists, the multiple-choice format is the best type of test question to truly promote better, long-lasting learning.

After all, testing isn’t simply about evaluating what learners know — the act of testing can be a tool to improve memory. In a study about testing efficacy, researchers Elizabeth Ligon Bjork, of the Bjork Learning and Forgetting Laboratory at UCLA, and Jeri L. Little concluded,  

“Answering multiple-choice questions…can enhance performance on a later test, not only on questions about the information previously tested, but also on questions about related information not previously tested — in particular, on questions about information pertaining to the previously incorrect alternatives.” 

They then investigated why this is true. A multiple-choice question presents a prompt with several answer choices, of which one or more are correct and one or more are incorrect. The test-taker must read and weigh all of the choices to determine which is correct, invoking a “search and retrieval” of information stored in the brain. Multiple choice is therefore not merely about recognizing the correct answer: it combines recognition and recall, tapping into the cognitive processes that increase memory retrieval and storage strength.

How to design an effective multiple-choice test 

There are several criteria that ensure your multiple-choice assessment is optimized for maximum efficacy. Factors like timing, feedback, and content design all contribute to the quality of the questions and ultimately the quality of retention derived from the assessment. That is because these factors are known cognitive triggers — mechanisms that maximize the brain’s natural memory processes. The multiple-choice format, combined with these factors, can help any learner retain more. 

Timing: Give the test before studying

Tests are most often given after learning and studying to gauge learners’ retention. Learners implement a variety of study methods — from flashcards to notes, from cramming to spaced-out prep — in the hopes of achieving a high score. But if the ultimate goal is truly to learn and retain as much as possible, then it might surprise you that the best time to give a test isn’t after students learn material, but before they study anything. 

The reason pretesting is so effective for learning is that it invokes a cognitive trigger known as priming. According to a 2010 study, pretesting before learning and studying results in higher test scores than traditional studying alone, regardless of whether learners perform well on the pretest.

Not only is pretesting conducive to better learning, but it also helps learners and instructors gauge existing knowledge, so they know where to focus attention. Rather than wasting time learning what they already know, learners can focus on topics they struggle with. 

Feedback: Provide correct answers and explanations

If you’re using multiple-choice assessments as a pretest, the format also allows for feedback, either during learning or after the test is completed. Feedback is a great way to directly access the brain’s natural memory-storage process and help learners retain more correct information.

Studies have shown that feedback in the form of a correct answer and explanation can improve retention dramatically. This is especially true for low-confidence, correct answers. Let’s face it, we’re not always confident about the answers we give, but feedback including the right answer can help solidify correct information in learners’ minds. Therefore, offering an opportunity for feedback (slightly delayed after a learner answers a question) can greatly enhance retention. 

Content design: Make incorrect answers work harder

When it comes to writing multiple-choice questions, every part of the question matters, including the incorrect answers. The key to designing an effective multiple-choice test is to make sure all of the content serves a purpose. The study by Elizabeth Bjork and Jeri L. Little not only determined that the multiple-choice format led to better learning outcomes, but also identified what type of content produced them. On a later post-test, the researchers compared two groups: one that had pretested with questions whose incorrect alternatives were non-competitive (obviously wrong next to the correct answer), and one that had pretested with questions whose incorrect alternatives were competitive (intentionally plausible and similar to the correct answer). The content of the incorrect alternatives made a measurable difference in later test performance on related material: learners who took the pretest with competitive alternatives outperformed the other group. Content matters, even in the incorrect responses.
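To make the distinction concrete, here is a minimal Python sketch of how a question author might flag each incorrect alternative as competitive or non-competitive. The names (`Choice`, `Question`, `competitive_distractors`) and the sample question are hypothetical illustrations, not Amplifire’s data model:

```python
from dataclasses import dataclass, field

@dataclass
class Choice:
    text: str
    correct: bool = False
    competitive: bool = False  # a plausible distractor from the same domain

@dataclass
class Question:
    prompt: str
    choices: list
    explanation: str = ""

# Competitive distractors name real concepts from the same domain, so the
# learner must retrieve knowledge about each one in order to rule it out.
q = Question(
    prompt="Which chamber of the heart pumps oxygenated blood to the body?",
    choices=[
        Choice("Left ventricle", correct=True),
        Choice("Right ventricle", competitive=True),  # plausible, same domain
        Choice("Left atrium", competitive=True),      # plausible, same domain
        Choice("A bicycle pump"),                     # non-competitive: obviously wrong
    ],
    explanation="The left ventricle pumps oxygenated blood through the aorta.",
)

def competitive_distractors(question):
    """Return the incorrect alternatives that are intentionally plausible."""
    return [c.text for c in question.choices if c.competitive and not c.correct]
```

A question audit could use a helper like `competitive_distractors` to confirm that every item carries at least one plausible alternative, since those are the choices that drive the “search and retrieval” effect described above.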

These triggers are built into Amplifire’s eLearning platform, along with a few others. As we mentioned in our discussion of feedback, learners aren’t always confident in the answers they choose, which isn’t a bad thing. Another trigger, known as metacognition, is woven into the platform’s answering process: learners can indicate whether they feel confident or unsure about their answer, or indicate that they don’t know the answer. The act of thinking about how much you know strengthens the pathways around that information, promoting stronger learning.
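A simplified sketch of how these two triggers, a confidence rating and slightly delayed corrective feedback, might fit together in code. This is a hypothetical flow with invented names (`Confidence`, `grade`), not Amplifire’s implementation:

```python
import time
from enum import Enum

class Confidence(Enum):
    SURE = "I'm sure"
    UNSURE = "I'm unsure"
    UNKNOWN = "I don't know"

def grade(answer, correct_answer, confidence, explanation, delay_seconds=0):
    """Grade one response and return feedback text.

    Feedback always includes the correct answer and an explanation; the
    research discussed above suggests this helps solidify even correct
    answers that were given with low confidence.
    """
    if delay_seconds:
        time.sleep(delay_seconds)  # slight delay before showing feedback
    is_correct = answer == correct_answer
    if is_correct and confidence is Confidence.SURE:
        status = "Correct and confident"
    elif is_correct:
        status = "Correct, but low confidence: review reinforces this memory"
    else:
        status = "Incorrect"
    return f"{status}. Correct answer: {correct_answer}. {explanation}"
```

Capturing the confidence level before grading is what makes the metacognition trigger possible: the learner must first judge how much they know, and the feedback can then be tailored to that judgment.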

So, while the multiple-choice format has a reputation as an easier form of testing, it is the most effective way to test, not only for evaluation purposes but for more thorough learning. By enhancing your multiple-choice pretests with these cognitive triggers, you set learners up for success: stronger memories and better retention, with less time spent studying overall.

From the beginning, Amplifire has relied on innovative brain science to guide its product development to create the most effective learning and training solution, perfectly tailored to the way the human brain works. Learn more about how Amplifire helps people learn better and faster by checking out a demo.