Flipped and Blended Learning: What the Evidence Shows in 2026
Flipped and blended learning evidence: van Alten's meta-analysis, Teachers College blended +0.35, West Point RCT findings — plus when flipped classrooms actually work.

Flipped and blended learning occupied a particularly noisy corner of education discourse in the 2010s, with claims about transforming classrooms through pre-class video. A decade on, the evidence base has matured enough to say something useful: blended learning reliably outperforms fully online, often modestly outperforms fully face-to-face, and flipped learning works — when specific design conditions are met. This evidence review sets out what the research actually shows, when the effects appear, and what teachers should take from the findings.
What flipped and blended learning are
Blended learning is any course design that combines face-to-face and online modes. Flipped learning is a specific blended-learning design in which direct input is shifted partly to asynchronous study — typically a short video and a readiness quiz — so that live lesson time can be used for application, discussion, problem solving, and feedback.
The distinction matters. Most blended learning is not flipped; most flipped classrooms are blended. The research on the two is overlapping but not identical, and it is worth keeping them separate when interpreting effect sizes.
What the research actually shows
The evidence is moderately positive and substantially moderated by implementation.
The major flipped-classroom meta-analysis (van Alten and colleagues) found a small positive effect on learning outcomes and no overall effect on satisfaction. The moderator findings are the practically useful bit: gains were higher when face-to-face time was not reduced (i.e., the pre-class video added to rather than replaced in-class teaching), and when quizzes were built into the pre-class design.
The older U.S. Department of Education / Teachers College review found that blended conditions outperformed fully face-to-face teaching by a larger margin than fully online conditions did, with a blended-vs-face-to-face effect of around +0.35. A later meta-analysis of objective performance measures reported g ≈ 0.385, with stronger effects in STEM subjects.
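One way to make effect sizes of this magnitude concrete is to convert them to percentile shifts. Under a normality assumption (a simplification, not part of the cited studies), an effect size g moves the average treated student to the Φ(g) percentile of the comparison group. A minimal sketch:

```python
from math import erf, sqrt

def percentile_of_average_treated_student(g: float) -> float:
    """Under a normality assumption, an effect size g places the average
    treated student at this percentile of the untreated distribution."""
    phi = 0.5 * (1.0 + erf(g / sqrt(2.0)))  # standard normal CDF at g
    return 100.0 * phi

# Blended-vs-face-to-face (~+0.35) and the later estimate (g ~ 0.385):
for g in (0.35, 0.385):
    print(f"g = {g}: ~{percentile_of_average_treated_student(g):.0f}th percentile")
```

On this reading, an effect around +0.35 to +0.39 moves the average student from the 50th to roughly the 64th–65th percentile — modest but real, which matches the tone of the syntheses.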
Rigorous trials remain mixed. A Danish macroeconomics RCT found a positive but statistically nonsignificant average effect because student resistance reduced participation — a reminder that flipped learning can be undone by students who do not do the pre-work. A West Point randomised trial of flipped versus traditional teaching found short-term mathematics gains, no economics gains, and — importantly — a widened achievement gap: higher-attaining students benefited more than lower-attaining students. That gap-widening is a critical equity finding, and it is what separates well-run flipped implementations from badly run ones.
The aggregate synthesis: flipped and blended learning are modestly positive on average, with stronger effects in STEM, when live class time is protected, and when pre-class accountability is explicit. They can widen attainment gaps if students who don't prepare are left to sink in the live session.
Why flipped and blended learning work (when they do)
The mechanism is straightforward. Face-to-face time with a teacher is the scarce and expensive resource; pre-class video is cheap. If you can use pre-class video to cover the initial exposure to content — where students are passively receiving information anyway — you free up the expensive in-class time for the higher-value activity: application, dialogue, immediate feedback, targeted reteaching.
The effect appears in the research when the design actually does this. It disappears when:
- the pre-class video is too long or low-quality, so students don't watch
- there is no readiness check, so students arrive unprepared
- live class time is spent on a second delivery of the same content, turning "flipped" into "duplicated"
- the live tasks do not genuinely require the pre-class preparation
Core design principles
A well-designed flipped or blended lesson has five features.
- Short, focused pre-class materials. A ten-minute video, not a 40-minute lecture. A short reading, not a chapter.
- A built-in readiness check. A short quiz, a low-stakes pre-class question, a one-paragraph prediction. Something that makes preparation visible.
- Live tasks that depend on preparation. If the in-class task can be done by an unprepared student, the preparation stops mattering.
- A catch-up pathway. Some students will arrive under-prepared, whether because of access, disengagement, or genuine difficulty. Have a planned response that doesn't derail the live session.
- Protected face-to-face time. The pre-class component adds to contact hours, rather than replacing them. Live time stays focused on high-value activity.
Classroom examples across phases
Primary. Flipped learning in pure form is a poor fit for early primary — young pupils generally can't be relied on to complete pre-class preparation, and equity of access to home video is unreliable. A defensible primary variant is an occasional family-viewable phonics recap video, kept short, with core instruction remaining fully live.
Secondary. Year 12 physics. Students watch a ten-minute explainer on rotational kinematics the evening before the lesson and complete a three-question readiness quiz that populates a dashboard. The teacher scans the dashboard before the lesson and starts with a targeted reteach for the most common mistake. The rest of the lesson is guided derivation and problem-solving — tasks that genuinely need the pre-learned content.
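The dashboard scan in the secondary example reduces to a simple aggregation: given each student's quiz responses, find the single most frequently chosen wrong answer, which tells the teacher what to reteach first. A minimal sketch — the data shape and names here are hypothetical illustrations, not a real product API:

```python
from collections import Counter

# Hypothetical readiness-quiz responses: student -> answers to Q1..Q3
responses = {
    "amara": ["B", "C", "A"],
    "ben":   ["B", "D", "A"],
    "chloe": ["A", "D", "A"],
    "dev":   ["B", "D", "C"],
}
answer_key = ["B", "C", "A"]

def most_common_mistake(responses, answer_key):
    """Return (question_number, wrong_answer, count) for the single most
    frequent wrong answer across all students."""
    mistakes = Counter(
        (q, given)
        for answers in responses.values()
        for q, (given, correct) in enumerate(zip(answers, answer_key))
        if given != correct
    )
    (q, wrong), count = mistakes.most_common(1)[0]
    return q + 1, wrong, count

print(most_common_mistake(responses, answer_key))  # -> (2, 'D', 3)
```

Here three of four students picked "D" on question 2, so the targeted reteach starts with whatever misconception distractor "D" was written to catch.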
Tertiary. First-year economics. Students complete a short recorded lecture and a readiness check covering the core supply-and-demand logic before the seminar. Seminar time is spent on case analysis, small-group problem solving, and individual feedback. Unprepared students have a catch-up option: a 20-minute pre-seminar live walkthrough with a TA.
Where flipped learning goes wrong
The failure modes recur in the literature and the trials.
- Replacing teaching time with video. The flipped lesson assumes the student prepares. If they don't, and the teacher can't restart content delivery without destroying the design, the system collapses.
- Long, unfocused videos. Engagement with pre-class video drops sharply past 10–12 minutes. Many flipped classrooms fail because the videos are just recorded full lectures.
- No readiness check. Without a short check, teacher and students have no shared information about who has prepared. The teacher has to guess.
- Equity failure. Students without reliable home internet or quiet study time are structurally disadvantaged, and students who arrive unprepared fall further behind in application-heavy sessions. That is the pattern behind the West Point widening-gap finding.
- Live tasks that don't depend on preparation. If an unprepared student can do the in-class task, preparation becomes optional and the method unravels.
Best fit and poor fit
Best fit: secondary and tertiary contexts where students can handle some independent preparation and have reliable digital access. STEM subjects show stronger effects in the meta-analyses than the humanities do. The fit is particularly strong for technical subjects where worked examples are valuable and in-class problem-solving time is scarce.
Poor fit: primary education; contexts with poor home digital access; learners who haven't developed the self-regulation to complete pre-work reliably.
Evidence gaps: the bulk of the research is secondary and tertiary, and rigorous causal evidence in primary education remains very limited.
Teacher requirements, assessment, and resources
Medium to high up-front preparation. Platform support — a learning management system that handles video hosting, quiz delivery, and participation analytics — is effectively a prerequisite. Analytics routines matter: the teacher needs to be able to see who prepared and who didn't before the live session.
Evaluate not just grades but completion of pre-work, quality of in-class participation, and — critically — equity of access. A flipped design that lifts the top half and leaves the bottom half behind is not neutral; it is worsening the gap.
For a classroom-ready flipped template, see Flipped Classroom Lesson Plan.
How TAyumira supports flipped and blended learning
TAyumira supports the flipped classroom as one of its ten research-backed teaching methods. When you pick it, the generator produces:
- A short, focused pre-class explainer — script, slide deck, or talking points for a teacher-recorded video
- A readiness quiz aligned to the specific pre-class content, with distractors mapped to common misconceptions
- In-class tasks that genuinely depend on the pre-class preparation
- A catch-up pathway for students who arrive under-prepared
- An equity check: prompts to consider access and preparation before you run the lesson
Start for free — the Free tier covers the full workflow.
FAQ
What is the effect size of flipped learning?
Van Alten and colleagues' flipped-classroom meta-analysis reports a small positive effect on learning outcomes and no overall effect on satisfaction. The effects are larger when face-to-face time is not reduced and when pre-class quizzes are built in. A West Point RCT produced short-term mathematics gains but widened achievement gaps — a significant equity finding.
Is blended learning the same as flipped learning?
No. Blended learning is any design that combines face-to-face and online modes. Flipped learning is a specific blended design in which direct input is shifted to asynchronous pre-class study so that live time can be used for application. Most flipped classrooms are blended; most blended learning is not flipped.
Why did the West Point trial widen the achievement gap?
Higher-prior-attainment students tended to prepare more thoroughly and benefited more from the application-heavy in-class tasks. Lower-prior-attainment students more often arrived under-prepared and fell further behind. The gap-widening is a design failure — not an inevitable property of flipped learning — but it only gets solved with explicit catch-up pathways and genuine preparation accountability.
How long should a pre-class video be?
Under 12 minutes is a reasonable ceiling. Engagement falls sharply past that. If you need more material, make two shorter videos rather than one long one, and build the readiness check so completing both actually matters.
Is flipped learning suitable for primary school?
Rigorous causal evidence in primary is very limited, and the equity and self-regulation requirements are demanding for young learners. An occasional family-viewable recap video is defensible; running a full flipped model as standard is not well-supported by the evidence.
Related evidence reviews
- Explicit Instruction Evidence
- Retrieval Practice and Spaced Practice Evidence
- Cooperative Learning Evidence
- Formative Assessment Evidence
Sources
- van Alten, D. C. D., Phielix, C., Janssen, J., & Kester, L. (2019). Effects of flipping the classroom on learning outcomes and satisfaction: a meta-analysis. Educational Research Review.
- Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of Evidence-Based Practices in Online Learning — Teachers College / U.S. Department of Education.
- Setren, E., Greenberg, K., Moore, O., & Yankovich, M. (2021). Effects of Flipped Classroom Instruction: Evidence from a Randomized Trial. Education Finance and Policy.
- OECD. Digital education and digital equity frameworks.
Try one flipped lesson on your next unit
Pick one lesson where in-class time would genuinely be better used for application than delivery. Record a ten-minute explainer. Write a three-question readiness check. Design an in-class task that requires the preparation. Plan a short catch-up pathway for students who arrive under-prepared. If you want the explainer, readiness quiz, and in-class task generated, create a free TAyumira account.

