Principles and theory of good
education and assessment
practice
Anwar Ali Khan
Faculty of Medical Education
Samarkand Medical Institute
Uzbekistan
“to be trained is to arrive, to be educated is to continue to travel”
Kenneth Calman (Former CMO, UK)
Introduction and thanks
• Rector – Professor Jasur Alimjanovich Rizaev
• Prof. Agababyan L.R.
• Associate Prof. Khusinova Sh.A.
Anwar Ali Khan FRCGP
• Honorary Professor of Medical Education, Samarkand State
Medical University
• Professor of Primary Care Medical Education, University of East
London
• Medical Director, International Accreditation, Royal College of
GPs, UK
Ground rules
• Permission to prompt / interrupt
Plan of webinars
Principles and theory of educational evaluation practice
Writing MCQs, MEQs and SJT assignments
Writing assignments for OSCE and oral examinations
Workplace-based teaching and assessment tools
Psychometrics in Education and Assessment
Assessing difficult items, programmatic assessment
Face to face workshop on Teaching the Trainers
Aims and objectives
• Introduction to educational theory and adult learning
• Evolution of assessments
• Assessment and learning
• Purposes of assessment
• Objectives
• What makes for good assessment?
• Characteristics of assessment methods
• Utility
• Key issues
• Features of specific instruments
DEFINING LEARNING OUTCOMES
Learning outcomes: A journey
‘… if you don’t know
where you intend to go
before you start, you may
end up somewhere you
don’t want to be.’
Swanwick (2009)
Learner-centred approach
• What are the intended learning outcomes?
• What is the desired level of understanding?
• What kind of teaching and learning activities are required to
achieve the desired level of understanding?
Good teaching & learning
‘Good teaching is getting learners to use the level
of cognitive processes needed to achieve the
intended learning outcomes ….’
Biggs (2011)
A VOYAGE OF DISCOVERY
• You can’t discover new lands
without sometimes losing
sight of the shore
(Andre Gide)
YERKES-DODSON CURVE
• Performance rises with pressure up to an optimum, then falls away (an inverted-U curve running from REASSURE to CHALLENGE)
• Trainers should try to keep learners at this optimum level
LEARNING PREFERENCES
Honey & Mumford
Activist
Reflector
Theorist
Pragmatist
Activist
Activists learn best from activities in which there are:
•New experiences and challenges
•Excitement, change and variety
•Opportunities for just ‘having a go’.
Reflectors
Reflectors learn best from activities where they:
• Are allowed to observe and think
• Have time to think before acting
• Can carry out detailed research
Theorists
Theorists learn best from activities where they:
• Are intellectually challenged and can analyse situations as part of a concept or theory
• Can question basic ideas, assumptions or logic
• Are in situations with a clear purpose
Pragmatists
Pragmatists learn best from activities where they:
•See an obvious link with a ‘real life’ problem
•Have the chance to practise techniques with coaching or feedback
•Can concentrate on practical issues
THE INNER APPRENTICE
•Adults are self-motivating.
•Natural curiosity and desire to learn.
•The skill of an effective teacher is to provide support and challenge …
•… not just to pass on their knowledge didactically.
Neighbour, R. (1992)
The Inner Apprentice.
Newbury: Petroc
THE EDUCATIONAL PARADIGM
• Intended learning outcomes (by the end of the session, the learner will be able to …) – SYLLABUS
• Unintended learning (richer than tick-box education?) – CURRICULUM
• Assessment drives learning (sadly) – ASSESSMENT and TEACHING
Rees, C. (2004) Outcomes-based curricula in medical education: insights from educational theory, Medical Education, 38: 593-598.
THE EDUCATIONAL PRESCRIPTION
• Three factors contributing to effective learning:
1. Whether learning is based on a real problem or an actual patient.
2. Whether the learner searches for the evidence independently.
3. Whether reflection identifies clearly defined implications of the learning for actual practice.
Sackett’s Cube
PROBLEM-BASED LEARNING
• Sackett’s group introduced the concept of the ‘educational
prescription’.
• The skill of the effective teacher is to work with the learner to
produce a plan for problem-solving based on real patients.
• The learner implements the prescription, finds the evidence for
themselves, and ‘owns’ the resultant learning.
Sackett, D., Haynes, R., Guyatt, G., & Tugwell, P. (1995)
Clinical epidemiology: a basic science for clinical medicine, Second edn, London: Little Brown and Company. pp 404-423
Teaching Methods
• The skill of the effective teacher is to move beyond the traditional
method of simply passing on knowledge.
• Additional teaching methods involve the teacher as a
facilitator of learning, through questioning, promoting
autonomy in the learner, and encouraging learning
through discovery and reflective practice.
4 Teaching Methods
1. DIDACTIC
Telling / Statements
• Helping the learner to become
more ‘facts rich’ by providing
information in an undiluted
and direct way.
4 Teaching Methods
2. SOCRATIC
Questioning
Awareness-raising
• Helping the learner to become aware of the
limits of their knowledge, values or beliefs
through asking awareness-raising questions
(Neighbour 1992)
4 Teaching Methods
3. HEURISTIC
Promoting discovery
• Aims to encourage discovery by respecting
the autonomy of the learner and giving
them ownership of the issue.
(Knowles 1990)
4 Teaching Methods
4. COUNSELLING
Exploring feelings
• Reflective practice in which the teacher’s role
is to promote the exploration of the learner’s
feelings, and self-discovery through an
examination of their assumptions.
(Claxton 1996, Heron 1989)
Effective Trainers are FLEXIBLE
Didactic → Socratic → Heuristic → Counselling
(mainly for the TRAINER → mainly for the LEARNER)
THE JOHARI WINDOW
• Model for giving and receiving feedback.
• Developed by psychologists researching human personality at
the University of California in the 1950s: Joseph Luft & Harry Ingham.
• To promote self-awareness.
• To help people see things about themselves
that they didn’t know / couldn’t recognise.
FOUR ROOM HOUSE
with thanks to John Morris
ROOM 1 = PUBLIC AREA
• Contains things that are openly known and talked about - and which
may be seen as strengths or weaknesses.
• This is the self that the learner chooses to share with others
ROOM 2 = BLINDSPOT AREA
• Contains things that teachers observe that learners don't know about.
Could be positive or negative behaviours.
• Will affect the way that others act towards the learner.
ROOM 3 = UNKNOWN AREA
• Contains things that nobody knows about the learner - including the
teacher.
• This may be because learners have never exposed those areas of their
personality, or because they're buried deep in their subconscious.
ROOM 4 = PRIVATE AREA
• Contains aspects of the learner that they know about but choose to
keep hidden from teachers.
SIMPLIFIED LEARNING NEEDS
Use by Trainers
• Trainers should try to open up the public area.
• Make the other three areas as small as possible.
• This is done by regular and honest exchange of feedback and
disclosing personal feelings.
• Get to know what “makes the Learners tick”; what they find easy;
what they find difficult.
• Provide appropriate support.
Examinations are formidable even to
the best prepared, for the greatest
fool may ask more than the wisest
man can answer.
(Charles Colton, 1780-1832)
Assessment Evolution
• 1900-1960
• Schools for acculturation, basic skills, sorting of pupils
• Assessment as the gates and teachers as gatekeepers
• 1960-2000
• Shift in assessment purpose from sorting individuals to
schools, systems
• Emphasis on accountability
• 2000
• Schools for Learning
• Assessment As Learning
Types of Assessment
• Summative
• Formative
• Diagnostic
• Ipsative
• Norm-referenced
• Criterion-referenced
What – Why - How
What is assessment?
Why do we assess?
How do we assess?
Values for best practice
Assessment should:
…be valid
…be reliable
…be transparent
…motivate students to learn
…promote deep learning
…be fair
…be equitable
…be formative even when it is primarily intended to be summative
…be timely
…be incremental
…be redeemable
…be demanding
…enable the demonstration of excellence
…be efficient
…be manageable …
Race, Brown & Smith (2005)
Assessment
• What and how students learn depends to a major extent on
how they think they will be assessed
Biggs, Teaching for Quality Learning at University 1999
• The curriculum instructs teachers what to teach
• The exam instructs students what to learn – assessment
drives learning
Donald Melnick 1991
Assessment and Learning
Effective assessment empowers students to ask reflective questions
and consider a range of strategies for learning and acting …
… students move forward in their learning when they can use their
personal knowledge to construct meaning, have skills of self-
monitoring to realize that they don’t understand something, and
have ways of deciding what to do next
Earl 2003
It is very difficult for students to achieve a learning goal unless they
understand that goal and can assess what they need to do to reach
it … so self-assessment is essential to learning
Black et al. 2003
Assessment for learning (AfL)
‘AfL is the principle that all
assessment should
contribute to helping
students to learn and
succeed’
Sambell, McDowell & Montgomery (2013)
Constructive alignment
• Define the learning outcomes (LOs)
• Select learning/teaching activities that will enable students to attain
the LOs
• Assess how well students have met the LOs
• Design assessment methods to assess the LOs
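To make the alignment checkable, the mapping can be written down and audited: every LO should have at least one teaching/learning activity and at least one assessment task attached. A minimal Python sketch, with all outcome, activity and assessment names invented for illustration:

```python
# Minimal sketch of a constructive-alignment check. All outcome,
# activity and assessment names below are hypothetical.

learning_outcomes = ["interpret an ECG", "explain reliability", "give feedback"]

# Which teaching/learning activities address each LO
activities = {
    "interpret an ECG": ["ECG workshop"],
    "explain reliability": ["psychometrics seminar"],
    # "give feedback" is not yet taught anywhere
}

# Which assessment tasks test each LO
assessments = {
    "interpret an ECG": ["OSCE station 3"],
    "give feedback": ["mini-CEX"],
    # "explain reliability" is not yet assessed
}

for lo in learning_outcomes:
    taught = activities.get(lo, [])
    assessed = assessments.get(lo, [])
    if taught and assessed:
        print(f"aligned:    {lo}")
    else:
        print(f"MISALIGNED: {lo} (activities={taught}, assessments={assessed})")
```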
Formative assessment
Assessment for learning is
any assessment for which the first priority in its design and
practice is to serve the purpose of promoting pupils’
learning …
Such assessment becomes ‘formative assessment’ when
the evidence is used to adapt the teaching work to meet
learning needs
Black et al, 2002
Types of assessment
Summative: ‘the type of evaluation used at the end of a term, course, or programme for purposes of grading, certification, evaluation of progress, or research on the effectiveness of a curriculum, course of study, or educational plan’ (Bloom et al., 1981)

Formative: ‘the use of systematic evaluation in the process of curriculum construction, teaching and learning for the purpose of improving any of these processes’ (Bloom, 1971)
Miller's pyramid for clinical assessment
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67
From apex to base:
• Does – action
• Shows how – performance
• Knows how – competence
• Knows – knowledge
(The upper levels concern skills & behaviour; the lower levels concern cognition/knowledge.)
Cate, O. t. BMJ 2006;333:748-751
Miller's pyramid with levels of awareness
Peile, E. BMJ 2006;332:645
Copyright ©2006 BMJ Publishing Group Ltd.
Purposes of assessment
• Measuring academic achievement
• Certifying competence
• Assuring standards (for example, that an individual meets
predetermined minimal standards)
• Diagnosing problems
• Encouraging appropriate learning (what, how and when)
• Providing motivation and direction
• Evaluation of teacher or course effectiveness
• Predicting future performance of candidates
• Discriminating between candidates (for example for entry to
specialist training programmes)
What Assessment?
• Formative or summative?
• Content or process?
• Integrated or subject-specific?
• ‘Big picture’
• Triangulation
• Piloting
• Rating scales versus checklists
• Global rating versus specific anchors
Design
• What are we trying to assess?
• Knowledge
• Skills
• Process skills
• Content skills
• Perceptual skills
• Attitudes
• Learning outcomes/curriculum content
• Blueprint
• Validity
• Etc.
Five important questions
1. What should be assessed?
2. How should it be assessed?
3. Why should it be assessed?
4. When should it be assessed?
5. Who should assess?
Assessment Evaluation criteria
1. Validity - Does it measure what it is supposed to measure?
2. Reliability - Does it produce consistent results?
3. Educational impact
4. Cost effectiveness - Is it practical in terms of time & resources?
5. Acceptability
Validity
• How well does this assessment represent the idea/concept?
• Are the problems relevant & important to the curriculum?
• Does the content of the assessment link to the curriculum objectives?
• Usually mapped out in a blueprint (see the sketch after this slide)
• An instrument can be valid for one purpose and invalid for another.
• Not so much valid or invalid, but valid for what and for whom
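To illustrate blueprinting, here is a minimal Python sketch with hypothetical curriculum areas, weights and items (none of the figures come from the slides). It checks whether an item bank samples each area roughly in proportion to its intended weight:

```python
# Illustrative blueprint sketch; curriculum areas, weights and items
# are all hypothetical.

curriculum_weights = {        # intended share of the test
    "cardiology": 0.30,
    "respiratory": 0.25,
    "endocrinology": 0.25,
    "dermatology": 0.20,
}

item_bank = [                 # (item id, curriculum area assessed)
    ("Q1", "cardiology"), ("Q2", "cardiology"), ("Q3", "respiratory"),
    ("Q4", "endocrinology"), ("Q5", "cardiology"), ("Q6", "dermatology"),
]

total = len(item_bank)
for area, planned in curriculum_weights.items():
    actual = sum(1 for _, a in item_bank if a == area) / total
    flag = "" if abs(actual - planned) < 0.10 else "  <-- check sampling"
    print(f"{area:14s} planned {planned:.0%}  actual {actual:.0%}{flag}")
```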
Types of validity
• Face – Does this reflect real life?
• Content - The extent to which the content of the test matches the
instructional objectives.
• Criterion - The extent to which scores on the test are in agreement
with (concurrent validity) or predict (predictive validity) an external
criterion.
• Construct - The extent to which an assessment corresponds to
other variables, as predicted by some rationale or theory.
Reliability
• How sure am I that this assessment provides a good
representation of the ideas/concepts that it
measures?
• Sampling across a range of problems
• Length of the test
• Standardized patients
• Checklists administered & scored properly
• Examiner & Patient behaviour
• Choice of more than one tool
Types of Reliability
• Test-Retest – over time
• Internal Consistency - across the different items
• Inter-rater reliability - across different examiners
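As a worked illustration (not taken from the slides), the sketch below computes internal consistency as Cronbach's alpha on a small made-up score matrix, then applies the Spearman-Brown prophecy formula, which connects reliability to the test length mentioned on the previous slide:

```python
# Illustrative psychometrics sketch; all scores are hypothetical.
import statistics

# rows = candidates, columns = items (1 = correct, 0 = incorrect)
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 1, 0, 0],
]

k = len(scores[0])                                    # number of items
item_vars = [statistics.pvariance(col) for col in zip(*scores)]
total_var = statistics.pvariance([sum(row) for row in scores])

# Internal consistency: Cronbach's alpha
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")

# Spearman-Brown prophecy: predicted reliability if the test were
# lengthened by a factor n (here doubled, n = 2)
n = 2
predicted = (n * alpha) / (1 + (n - 1) * alpha)
print(f"Predicted alpha for a test {n}x as long = {predicted:.2f}")
```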
Reliability and Validity
• Reliable but not valid
• Valid but not reliable
• Reliable and valid
• Neither reliable nor valid
Reliability or validity?
• Reliability versus validity
‘Reliability is about competitive fairness to students. Validity is about
responsible justice to their future patients.’ (Marinker, 1997)
• Lots of work on aspects of reliability
• Far less on aspects of validity
• Trend to move away from reliability as the sole
criterion
Educational Impact
• Assessment may have facilitative or adverse effect on
learning
• Emphasises the important learning outcomes
• Provides feedback about the success of study
methods & so forth
Adverse effects of assessment
• Inappropriate influence on learning
• Creating a hurdle-jump, ‘pass & forget’ mentality
• Causing loss of self-esteem and humiliation amongst those
who are not ‘up to standard’
• Engendering unhelpful and possibly unhealthy
competitiveness
• Affecting progress and career decisions because of flawed
decisions.
Feasibility & Cost
• Feasibility - in terms of manpower, time, place, etc.
• Cost - in time, finances, resources
Acceptability/Authenticity
• How well does the assessment reflect the real
situation - face validity?
• Acceptable to commissioners, students, public, etc.
Utility of assessment
(after van der Vleuten, 1996)
Utility is a product of the relationship between:
• Reliability
• Validity
• Educational impact
• Acceptability
• Cost
Utility of assessment
Utility = R^w × V^w × E^w × A^w × (1/C)^w
(R = reliability, V = validity, E = educational impact, A = acceptability, C = cost; each w is the weight given to that component, and cost counts inversely)
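One common reading of this formula, sketched below with entirely hypothetical scores and weights: each criterion is rated 0-1, raised to its weight w, and the results multiplied together. Cost is entered as cost-efficiency (cheap scores high), which absorbs the inverse cost term; because the model is multiplicative, a zero on any criterion collapses the overall utility:

```python
# Illustrative sketch of the utility model; all scores and weights
# are hypothetical. Each criterion is rated 0-1; "cost_efficiency"
# is the inverse-cost term (a cheaper assessment scores higher).

criteria = {
    # criterion: (score 0-1, weight w)
    "reliability":        (0.80, 1.0),
    "validity":           (0.70, 1.0),
    "educational_impact": (0.90, 0.5),
    "acceptability":      (0.85, 0.5),
    "cost_efficiency":    (0.50, 0.5),
}

utility = 1.0
for name, (score, w) in criteria.items():
    utility *= score ** w        # multiplicative: a zero anywhere -> zero

print(f"utility = {utility:.2f}")
```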
Assessment methods:
• Knowledge tests – selected response (MCQ); constructed response (SAQ)
• Written papers – essays, MEQ or CRQ
• OSCE
• Multi-source feedback (360 degree appraisal)
• Video analysis
• Portfolios
• Mini-CEX
• Viva
• Incognito SPs
(Cartoon: “Self Assessment Sucks!”)
Clinical competence (and performance)
• Complex (!)
• Correlates with knowledge
• Not generic
• Content/case-specific
• There is a gap between competence and
performance
Delivery
• For maximum reliability need
• Lots of observations
• Lots of observers
• Lots of contexts
• Sampling
• Blueprinting
• Role of simulated/real patient in judgements
• Standard setting (see the sketch after this list)
• Feedback
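The list above names standard setting without detail. As one widely used example, the Angoff method (named here by way of illustration, not taken from the slides) asks each judge to estimate the probability that a borderline candidate would answer each item correctly; the pass mark is the mean estimate summed over items:

```python
# Illustrative Angoff standard-setting sketch; the judges' estimates
# below are hypothetical. Each value is a judge's estimate of the
# probability that a borderline (minimally competent) candidate
# would answer that item correctly.

# rows = judges, columns = items
judgements = [
    [0.6, 0.4, 0.8, 0.7],
    [0.5, 0.5, 0.9, 0.6],
    [0.7, 0.3, 0.8, 0.8],
]

n_judges = len(judgements)
n_items = len(judgements[0])

# Mean estimate per item; their sum is the borderline candidate's
# expected score, which serves as the cut score.
item_means = [sum(col) / n_judges for col in zip(*judgements)]
cut_score = sum(item_means)

print(f"cut score = {cut_score:.1f} / {n_items} ({cut_score / n_items:.0%})")
```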
Quality control
• Training/briefing
• Examiners
• Is selection more effective than training?
• Role players
• Patients
• Psychometric analysis
• Candidates’ feedback
Tools mapped to Miller’s pyramid:
• Does – work-based assessment, e.g. portfolios, MSF, videos, incognito SPs
• Shows how – OSCE, long & short cases, mini-CEX
• Knows how – problem-solving tests, e.g. MEQ?
• Knows – knowledge tests, e.g. MCQ/SAQ
Effective Feedback
“Feedback is among the most powerful moderators of learning”
Hattie, 2012
Assessment weaknesses
• Timing wrong – at the end
• MCQ: tests knowledge, not how to use it
• Oral and MEQ: candidates talk or write about things they learn on courses
• Simulated surgery: refer or defer, hand out imaginary leaflets, etc.
• Video: distortions of real general practice (selection); sharing options, safety netting, refer and defer
WPBA: The theoretical base…
An opportunity to re-couple teaching and learning with assessment
Authentic: gets as close as possible to the real situations in which
doctors work
Provides us with the only route into many aspects of professionalism
Tools
MSF
PSQ
COT
CEX
CBD
DOPS
Clinical Supervisor’s structured report
Linking assessment and teaching
Case discussions
• reveal learning needs
Use assessments in teaching
• COT for consulting skills
• CBD for understanding & decisions
Supervise knowledge learning
• but leave it to the Learner
• online learning, books, courses, etc.
Questions???
Next steps