Problem-Based Learning: What the Meta-Analyses Actually Found
Problem-based learning evidence: Barrows' McMaster origin, Dochy's meta-analysis, Walker and Leary d ≈ 0.13, REL West Problem Based Economics trial — and when PBL fits.

Problem-based learning is often confused with project-based learning — they share the acronym PBL and not much else. Problem-based learning is a specific tutorial-based pedagogy that begins with an ill-structured problem and uses it to drive small-group inquiry, with the teacher facilitating rather than presenting. It was invented for medical education, has become standard in many professional education contexts, and has a research base that is more modest than its reputation suggests. This evidence review sets out what problem-based learning actually is, what the meta-analyses really found, and where it genuinely works.
What problem-based learning is
Problem-based learning (PBL) begins with a realistic, ill-structured problem or case. Students encounter the problem before they have been taught the content needed to solve it. They discuss their initial hypotheses, identify what they need to learn, study independently between sessions, then reconvene to apply their new knowledge to the problem. The teacher is a facilitator rather than a lecturer.
The Howard Barrows model, invented at McMaster Medical School in the late 1960s, is the canonical version. A small group of students meets with a tutor around a clinical case. They work through a structured cycle: analyse the case, generate hypotheses, identify learning issues, study independently, reconvene, apply, reflect.
Problem-based learning is not project-based learning. A project has an output artefact; a problem has a resolution and a set of learning issues. A project runs for weeks and culminates in front of a public audience; a problem takes a tutorial cycle and produces understanding.
What the research actually shows
The headline numbers are less flattering than the method's reputation would suggest.
The classic meta-analysis by Dochy and colleagues found a consistent positive picture for skills outcomes — problem-solving, reasoning, clinical judgement — but a weaker and less consistent picture for knowledge outcomes. Students in PBL sometimes learned slightly less immediately than students in comparison conditions, but retained more over time. That is a real and interesting pattern: initial knowledge acquisition may lag, while long-term retention and application may improve.
Walker and Leary's broader cross-disciplinary meta-analysis of 82 studies and 201 outcomes found a modest overall d ≈ 0.13. That is a small effect. Like the Dochy finding, it signals substantial moderation by implementation, assessment type, and discipline.
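To make the d ≈ 0.13 figure concrete, the sketch below computes Cohen's d, the standardised mean difference that meta-analyses like Walker and Leary's aggregate. The group means, standard deviations, and sample sizes are hypothetical numbers chosen to illustrate the scale of the effect, not data from any study.

```python
from math import sqrt

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d: difference in group means divided by the pooled SD."""
    pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical example: PBL group mean 62.6, comparison mean 60.0,
# both with SD 20 and n = 50. A d of 0.13 means the groups differ
# by about an eighth of a standard deviation.
d = cohens_d(62.6, 20, 50, 60.0, 20, 50)
print(round(d, 2))  # 0.13
```

Seen this way, d ≈ 0.13 corresponds to heavily overlapping score distributions, which is why the moderators (assessment type, discipline, implementation) matter more than the headline average.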
High-quality K–12 programme evidence exists but is narrow. The REL West randomised trial of Problem Based Economics found a significant positive impact on Grade 12 economics learning and problem-solving, after teachers received a 40-hour subject and pedagogy course. The 40-hour training requirement is itself a finding: PBL effects depend on tutor preparation.
In professional education, later meta-analytic work continues to report advantages in skills and in post-graduation competencies — exactly the domain in which PBL was invented and for which its assumptions are cleanest.
The defensible synthesis: PBL is strong for skills, reasoning, transfer, and long-term retention, especially in professional education. It is modest for immediate knowledge acquisition. It depends heavily on tutor expertise and case design. The biggest risk in practice is under-scaffolded novices who are expected to inquire productively about material in which they have no footing.
The PBL cycle, as Barrows meant it
A disciplined problem-based learning cycle has six moves.
- Present the problem. A realistic case presented before content instruction.
- Initial hypotheses discussion. Students generate explanations and plausible approaches based on what they already know.
- Identify learning issues. What do we need to learn before we can go further?
- Self-study. Students research the identified issues between tutorials.
- Reconvene, apply, explain. Students return, apply the new knowledge to the case, and explain their reasoning to peers.
- Tutor-facilitated reflection and knowledge-check. A short reflection and a knowledge-check to make sure the core content has been genuinely learned.
The tutor's job is not to present. It is to ask good questions, to spot when the group is stuck in a way that needs a resource nudge, and to close the loop with a knowledge-check at the end.
Examples across phases
Primary. Problem-based learning in pure form is a poor fit for early primary — students lack the schema to drive productive inquiry from an ill-structured problem. A lightly scaffolded version — a local-transport problem with heavy teacher guidance and lots of explicit intermediate instruction — is sometimes called problem-based learning but is really closer to problem-led teaching.
Secondary. A-level economics case on inflation. The tutor presents a short case describing recent UK inflation patterns and policy responses. Students discuss initial explanations, identify what they need to learn (demand-pull vs cost-push, monetary policy transmission, sticky prices), self-study for a week, reconvene to apply their new knowledge, and end with a justified policy recommendation and a short knowledge-check.
Tertiary. Medical tutorial on chest pain. A four-student tutorial group encounters a chest-pain case. Initial differential diagnosis based on prior knowledge. Learning issues identified (cardiac vs pulmonary vs musculoskeletal, diagnostic reasoning, key investigations). Individual self-study. Reconvene to develop a management plan. Tutor-facilitated reflection and a short knowledge-check on the covered content.
Where PBL fails
The failure modes concentrate around one issue: minimal guidance with novices.
- Novices without sufficient prior knowledge. A student who has no grounding in the content area cannot usefully generate hypotheses. The discussion becomes guessing.
- Weak tutor facilitation. PBL depends on the tutor asking the right questions at the right moments. A tutor who either presents (turning PBL into a lecture) or disappears (turning it into undirected group work) collapses the method.
- Cases that don't drive the intended content. If the case doesn't naturally require the target content for its resolution, students will go around the content rather than through it.
- No knowledge-check at the end. Without a short check, PBL can produce satisfied students who have learned less content than they think.
- Over-ambitious case density. Trying to cover too much content through one case produces superficial engagement with everything and deep engagement with nothing.
Mitigation: supply high-quality tutor prompts, resource packs, checkpoints, and explicit teaching when misconceptions stall progress. When PBL is used with younger or less experienced learners, use a hybrid — explicit instruction on the core content, then a problem-based application cycle.
Best fit and poor fit
Best fit: late secondary and tertiary, particularly medicine, nursing, engineering, economics, and case-rich social sciences. Professional education generally. Anywhere the graduation criterion includes reasoning under uncertainty and the content is naturally case-shaped.
Poor fit: early primary (use problem-led explicit instruction instead); contexts without adequate prior knowledge to drive hypothesis generation; time-constrained revision for a knowledge-heavy examination; any setting where a 40-hour teacher development investment is not realistic.
Evidence gaps: direct classroom evidence in primary education is sparse. Most of what counts as "primary PBL" in practice is problem-led teaching rather than full Barrows-model PBL.
Teacher requirements, assessment, and resources
Tutor expertise is the single largest determinant of whether the method works. Cases, facilitation training, and resource packs matter more than physical resources. The REL West trial's 40-hour training investment is roughly the floor — not a ceiling — for serious implementation.
Assess with case analyses, oral reasoning, professional judgement rubrics, and — critically — knowledge tests that ensure the core content has genuinely been learned. The oral reasoning is where PBL's strengths show up; the knowledge test is where its weaknesses do, and assessing both is what produces honest evaluation.
For a classroom-ready template, see Problem-Based Learning Template.
How TAyumira supports problem-based learning
TAyumira supports problem-based learning as one of its ten research-backed teaching methods. When you pick it, the generator produces:
- A realistic, ill-structured case tied to the curriculum content you specify
- Tutor prompts for the initial hypothesis discussion
- A learning-issues template students can use between sessions
- A resource pack of pre-selected materials for self-study
- A knowledge-check at the end to verify content learning separately from reasoning
Start for free — the Free tier covers the full workflow.
FAQ
What is the effect size of problem-based learning?
The classic Dochy meta-analysis found a consistent positive picture for skills outcomes but a weaker and less consistent picture for knowledge outcomes. Walker and Leary's broader cross-disciplinary meta-analysis reported a modest overall d ≈ 0.13. In professional education, later work continues to report stronger effects on skills and post-graduation competencies.
What is the difference between problem-based learning and project-based learning?
Problem-based learning is a tutorial-cycle pedagogy built around an ill-structured case; students investigate, self-study, reconvene, and resolve. Project-based learning is a longer inquiry into a driving question that culminates in a public product. They share an acronym and not much else. The research bases are also separate and should not be blended.
Where did problem-based learning come from?
Howard Barrows developed problem-based learning at McMaster Medical School in the late 1960s as a response to the observation that medical students knew facts but could not reason clinically. The method spread through medical and health-professions education and has since moved into other professional disciplines.
Is PBL suitable for primary education?
In its pure Barrows-model form, probably not — primary students generally lack the schema to drive productive inquiry from an ill-structured case. Problem-led explicit instruction, where a problem frames the lesson but the teacher explicitly teaches the content needed, is a more defensible primary approach.
How much teacher training does PBL need?
The REL West Problem Based Economics trial required 40 hours of subject and pedagogy training before it produced its effects. That is a useful floor. PBL without investment in tutor training tends to default either to lectures with a case at the start, or to undirected group work — both of which wash out the effect.
Related evidence reviews
- Project-Based Learning Evidence
- Cooperative Learning Evidence
- Explicit Instruction Evidence
- Metacognition and Self-Regulated Learning Evidence
Sources
- Dochy, F., Segers, M., Van den Bossche, P., & Gijbels, D. (2003). Effects of problem-based learning: a meta-analysis.
- Walker, A., & Leary, H. (2009). A Problem Based Learning Meta-Analysis. Interdisciplinary Journal of Problem-Based Learning.
- REL West. Problem Based Economics trial.
- Medical College of Wisconsin. Problem-Based Learning overview.
Try one PBL cycle on a case you already teach
Pick one case or scenario from your current unit. Rewrite it so it comes before the relevant content rather than after. Script the tutor prompts for the initial hypotheses discussion. Plan the self-study resources. Add a short content knowledge-check at the end. If you want the case, prompts, and check generated for you, create a free TAyumira account.


