In today’s post, our team shares tips that you can use when piloting the flipped classroom in your course or program. Learn more about defining the scope of your pilot, evaluating your pilot, and avoiding common pitfalls in pilot design and assessment.
Before you decide to fully implement the flipped classroom in your class, course, or curriculum, you should consider testing its effectiveness in your local context. This process of trying a new teaching method or curriculum model, such as the flipped classroom, is known as piloting. When you pilot an innovation, you don’t know in advance if it will work in your particular situation.
But if you're going to put in the time to test the innovation, you likely hope it will work well enough that you can use it again with a new group of learners next year. While there is no single precise definition of the word “pilot” in health professions education, piloting is an important practice for advancing the field.
Why?
If you want to implement the flipped classroom or any new educational idea in your program, you will need to gain faculty and administrator support for it. If, after piloting the flipped classroom, you can show that it improves students’ learning outcomes and satisfaction, then you can use these data to convince your colleagues that your successful pilot merits a wider roll-out.
In this article, we share tips for piloting the flipped classroom so you can measure its effectiveness. Read on to learn how to set up a well-designed pilot.
Tips for piloting
Define the scope of your pilot
As the first step in planning your pilot of the flipped classroom, you need to define its scope. How you define the scope of your pilot will depend on your role within your institution.
For example, if you are a single lecturer who gives one lecture each year, you control one tiny piece of the curriculum. If you decide to run a pilot during this lecture, it may well take up the entirety of your lecture time, which may limit your ability to run a comprehensive pilot.
However, if you are the Vice Dean of Education in a health professions program, you have a much larger fraction of students’ curricular time at your disposal, and can design your pilot to have a larger scope if you choose.
The key point here is that your role as an educator in your institution or program will determine how wide you can make your pilot’s scope. When designing a pilot, you should also consider what scope it will need to achieve your scholarship objectives or gain support from your colleagues for the flipped classroom or another innovation.
Ensure the stakes of the pilot are low
The first time you pilot a new curriculum model, such as the flipped classroom, everything will take longer than you expect. Keep this fact in mind as you work on the pilot: assume it will take more time than you planned so you don’t “overpromise and underdeliver.”
In addition, the 70%/30% rule from health professions education can help you keep the stakes low when piloting an innovation: if you get a pilot to 70% of where you want it to be, you should launch.
Many health professions educators are perfectionists, and if you wait for everything in the pilot to be perfect before you launch it, you never will. Plus, you can iterate on the results of your pilot every year, because each new cohort of students lets you build on your 70% success and get closer to the idealized 100% perfect curriculum.
Finally, when piloting, you can keep the stakes low by working with an audience that is friendly and open to providing feedback. Just make sure this audience can provide useful feedback, meaning its perspective is close to that of your target audience.
Set the evaluation strategy for your pilot
If you prepare as much as possible before your pilot, you will learn more from it. Therefore, you should ensure that the educational design and implementation of the pilot itself are sound.
To start, your pilot will need to have an evaluation strategy before it begins. The most common assessment of a pilot involves asking students for their opinions of it via Likert-scale items—strongly disagree to strongly agree.
You should ask students these questions before your pilot and after it. This way, you have a pre-/post- comparison of the pilot’s effectiveness and can see if students thought the new version of your class or course improved on the old version of it.
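If you collect matched pre- and post-pilot responses, even a simple paired comparison can show whether ratings shifted. Below is a minimal sketch in Python, assuming responses are stored as 1–5 ratings in a CSV with “pre” and “post” columns; the file name, column names, and choice of a Wilcoxon signed-rank test are illustrative assumptions, not a prescribed analysis.

```python
# Minimal sketch: compare matched pre-/post-pilot Likert ratings (1-5 scale).
# Assumes a CSV with one row per student and columns "pre" and "post";
# the file name and column names here are hypothetical.
import pandas as pd
from scipy.stats import wilcoxon

responses = pd.read_csv("pilot_likert_responses.csv")  # columns: student_id, pre, post

# A paired, non-parametric test is a reasonable default for ordinal Likert data.
stat, p_value = wilcoxon(responses["pre"], responses["post"])

print(f"Median pre rating:  {responses['pre'].median():.1f}")
print(f"Median post rating: {responses['post'].median():.1f}")
print(f"Wilcoxon signed-rank p-value: {p_value:.3f}")
```

A non-parametric paired test suits ordinal Likert data, but your assessment office or a local statistician may recommend a different approach for your context.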
With individual Likert items, you can only look at each student’s responses in isolation; you can’t see how students’ opinions relate to one another unless they evaluate the changed course together in the same room.
Therefore, consider running a focus group with volunteer students who are willing to speak with you (via Zoom if necessary). During the focus group, you can ask the same questions from the survey along with additional follow-up questions.
When designing focus group questions, ask yourself, “What questions will elicit the back-and-forth I want to see between respondents?”
How to learn from your pilot
When evaluating and learning from your pilot of the flipped classroom, student satisfaction should not be your only determinant of its success. Why? Because student satisfaction measures are the equivalent of students giving you the “thumbs up” on Yelp. While these data can be useful, they don’t tell you if students’ learning improved.
Beyond these measures, you want to determine if students met your learning objectives for the sessions you piloted. Did they learn everything you wanted them to learn? You measure this metric by testing students.
Just like you test students’ knowledge acquisition in your lecture-based curriculum, test them after your pilot to see if they got information right or wrong. Then you can compare the test results from students who received the lecture-based curriculum against results from students who participated in flipped classroom sessions to see if they still met your learning objectives.
To further assess the efficacy of the flipped classroom, you can look at whether students retain what they learned after participating in a flipped session. One way to assess retention would be to test students two or three months after the flipped class.
This type of downstream study is difficult to do, but it can help demonstrate whether the flipped classroom video lecture outperformed the traditional lecture in terms of knowledge retention.
Common pitfalls when piloting and how to avoid them
Many educators who have run pilots of the flipped classroom or other curricular innovations have run into pitfalls. Here, we’ll share some of these common pitfalls you may encounter when piloting, and how to avoid them.
Errors in pilot design
If you decide to run a pilot, you need to dedicate enough time to implement the innovation. Most educators who run a pilot for the first time think it will take less time to implement than it actually does. You can avoid this pitfall by maintaining realistic expectations for your pilot and planning it well in advance.
Another error in pilot design is attempting to create or adopt new resources without first looking at the existing resources your institution provides.
For example, if your institution provides Google Meet and other faculty already use it, record your lecture there rather than on Zoom so students have a more cohesive experience across their entire curriculum. Connect with your institution’s ed-tech or IT department to learn which resources your program endorses and provides.
One final error in pilot design is failing to seek your colleagues’ feedback on the pilot or on your flipped classroom lectures. Drafts of your work always get better when peers provide feedback before you launch them with your students.

Errors in pilot assessment
The biggest mistake when evaluating your pilot is not doing any assessment at all! While you might not think this happens often, it does. Design a survey or assessment for your pilot and seek feedback on it before the pilot begins.
Another error in assessing pilots is not administering the evaluation soon enough after the pilot. Why? Because if you evaluate a pilot two or three months (or even two or three weeks) later, your students will not remember it as well as they did immediately after experiencing it.
Sources of funding for pilots
When seeking funding for your pilot, start at your own institution. After all, they pay your salary to educate, so they may have additional funds to support your desire to pilot and innovate. When approaching your school for pilot funding, you can ask for some protected salary time to pursue this initiative and see if they will allow you to work on it for one or more curricular hours per week.
Additionally, local or regional educator-centric organizations may provide funding for pilots. One example is the AAMC, whose Group on Educational Affairs (GEA) is organized into regional groups.
Each regional group has grant money for small pilots or projects that medical school faculty in that region can apply for. Beyond the regional level, you can also look at the national and international levels for funding.
While this example focuses narrowly on medical school education, there are parallel organizations for other health professions. These organizations likely have similar national and regional structures with their own sources of funding for educational pilots.
Resources for your flipped classroom pilot
If you are looking for engaging pre-learning resources to use in your flipped classroom pilot, we encourage you to check out the Osmosis platform for educators. It features 1,600+ videos on different health professions education topics, as well as a library of practice questions, flashcards, and notes. Let us know if you find other recommended resources so together we can help other educators “Reach further!”
Try Osmosis today! Access your free trial and find out why millions of clinicians and caregivers love learning with us.