Classroom Assessment – identifying, gathering, organizing, and interpreting data about student learning
Basic Concepts:
Test – a tool or instrument used to gather data
Evaluation – judging the data
Assessment – interpreting the data
Measurement – quantifying the data
Shift of Educational Focus from Content to Learning Outcomes (CMO No. 46, s. 2012)
OBE – a call for quality and accountability in education; transformational
-focuses on what students can actually do after being taught
Immediate Outcome – instant change in behavior as a result of learning
Deferred Outcome – long-term; manifests over time
Outcome Based Teaching & Learning
-Intended Learning Outcomes
-Teaching & Learning Activities
-Assessment Tasks
(Constructive Alignment of the 3 above is required)
Learning Outcomes in Different Levels
Lesson Outcome – topic, most specific
Course Outcome - subject
Program Outcome – what a graduate should attain by the end of the program
Degree Outcome (Institutional Outcomes – graduate attributes)
Institution’s Vision & Mission
-relevant source of students' learning expectations
Plan – outcomes
Do – teaching and learning
Check – assessment
Act – improve
Principles of OBE
Designing Down – design the curriculum back from the intended exit outcomes down to specific lesson outcomes
Clarity of Focus – focus on what learners should ultimately be able to do
High Expectations – expect all learners to succeed at high levels
Expanded Opportunities – give learners multiple chances and ways to learn and to demonstrate learning
Student Learning Outcomes – what we want our students to be able to do
Outcomes – clear learning results that learners have to demonstrate.
Content Standards
Performance Standards
Learning Competencies
Competence – mastery of knowledge, skills, and attitudes, a.k.a. the 3 domains of learning.
6 levels in the Cognitive Domain
Creating
Evaluating
Analyzing
Applying
Understanding
Remembering
Bloom's Affective Domain
Internalizing
Organizing
Valuing
Responding
Receiving
Psychomotor Domain
Naturalization
Articulation
Precision
Manipulation
Imitation
Marzano's New Taxonomy
Knowledge Domain – provides content
Information, Mental & Physical Procedures
Cognitive System – processes all the necessary information.
Knowledge retrieval, comprehension, analysis, knowledge utilization
Metacognitive System – sets goals and keeps track of how well they are being achieved
Self-System – decides whether to continue the current behavior or engage in a new activity
-beliefs about the importance of knowledge, efficacy, and emotions associated with knowledge
Determining the progress towards the attainment of Learning Outcomes
Purpose of Assessment:
Assessment FOR Learning (done before or during instruction)
Placement – places the student in a specific learning group or level based on strengths
Diagnostic – strengths & weakness
Formative – monitors progress
Assessment OF Learning (done after learning)
-an INTEGRAL part of the teaching–learning process
-finds out if the learning objectives were achieved
-basis for assigning grades
Summative Assessments
Exam
Assessment AS Learning (Self-Assessment)
-done for teachers to understand and perform well their role of assessing FOR and OF learning
-reflection
Mode of Assessments
Traditional – pen and paper test
Authentic – real life situations
Alternative – methods other than the paper-and-pen test, e.g., oral recitation, moving quiz
Performance – demonstration or product
Portfolio – multiple indicators of student progress
Assessing Student Learning Outcomes
Traditional Assessment
Selected Response Type :
Multiple Choice
Alternate Response
Matching Type
Constructed Response Type:
Short Answer
Problem Solving
Enumeration
Paper-and-pen test, recall, contrived tasks
Authentic Assessment
Essay, Performance, Simulation, Application
-asks students to perform real-world tasks
-meaningful performance tasks
-clear standards and public criteria
-quality products and performances
-positive interaction
-emphasis on metacognition
Non Test Assessment of Learning
-a formative assessment
-informal, impromptu feedback
-students' ability is measured through real tasks
-examples: portfolio, teacher observation, slates, daily assignments, journals, games, projects, debates, checklists, panel discussions, demonstrations, problem solving
Performance Assessment
Process – demonstration
Product – output or creation
Criteria in Selecting Performance Assessment Task
Generalizability
Authenticity
Multiple Foci
Teachability
Feasibility
Scorability
Fairness
Criterion-referenced – score is compared to a set standard
Norm-referenced – student's score is compared to other students' scores
Ipsative – student's new score is compared to their own previous score
Contextualized Assessment – focuses on students' construction of functioning knowledge (authentic)
Decontextualized Assessment – written exams (traditional)
-focuses on declarative and procedural knowledge
-no direct connection to real-life context
Portfolio Assessment – a systematic and organized collection of students' work.
Showcase – best output
Process – cognitive & psychomotor progress
Evaluation – diagnoses students' learning
Documentation – learning progress; has annotations (comments)
Included in Portfolio:
Artifacts – academic output
Reproductions – students' work done outside the classroom
Attestations – Evaluative notes of a teacher
Productions – goals, reflections, and captions
Elements of a Portfolio
Drafts
Dates
Entries
Table of Contents
Cover Letter
Reflections
Stages of Portfolio Development
Set Goals
Collect
Select
Organize
Reflect
Evaluate
Exhibit
Principles of Portfolio Assessment
Content – subject matter that is important for students to learn
Learning – students become active learners
Equity – allows students to demonstrate their learning styles and multiple intelligences
Equality – all students are given fair treatment
Types of e-Portfolio
School-Centered – assessment OF learning driven (e.g., exams)
Student-Centered – assessment FOR learning driven
Assessment – supports institutional outcomes assessment (internal)
Learning – self-assessment (students)
Career – showcases achievements to employers
Scoring Rubric
Holistic – general or overall impression
Analytic – specific; describes levels of performance per criterion
General – applicable across similar tasks or skills
Task-Specific – assesses skills specific to a given problem or task
Likert Scale – responses expressed as degrees of AGREE or DISAGREE
Rating Scale – similar to a Likert scale but uses numeric values
Types of Paper and Pencil Test
1. Selected Response Test
A. Binary Choice Items – gives students only two options
> True or False
-avoid giving clues
-avoid using specific determiners like always, never, all, usually
-avoid using trick questions
-keep item length similar
-avoid using negative statements and double negatives
-limit each item to a single sentence and a single idea
B. Multiple Choice Items
- students are asked to choose the correct or best answer from the choices
Stem – the question or problem part of the item
Multiple-choice item formats:
-question form
-incomplete-statement form
-negative stem
-group options
-contained options
Matching Type = Association Test
-has 2 columns
Column A – Premise
Column B – Response
Perfect Matching – each response is the answer to exactly one premise
Imperfect Matching – a response may be the answer to more than one premise
Constructed Response Test – avoid open-ended questions; put the blank at the end; use only one blank; blanks for the answers should be of equal length; provide sufficient answer space.
Short answer
Enumeration
Cloze Test
Fill in the blank
Identification
Essay
Do's and Don'ts in Test Construction
1. Thou shall not provide opaque directions
2. Thou shall not employ ambiguous statements in your assessment items
3. Thou shall not provide students clues to the correct answer
4. Thou shall not employ complex syntax in your assessment item
5. Thou shall not use vocabulary that is more advanced than required.
Scoring Biases & Errors
Internal Errors:
Health
Mood
Motivation
Test taking skills
Anxiety
Fatigue
General Ability
External Errors:
Directions
Item Ambiguity
Heat in Room
Lighting
Sample of Items
Observer differences and Bias
Test Instruction
Phases of Making a test
Phase 1 – Planning Stage
Phase 2 – Test Construction
Phase 3 – Test Administration
Phase 4 – Evaluation Stage
Item Analysis: Difficulty and Discrimination Index
Negative Discrimination – more examinees from the lower group answered the item correctly
Positive Discrimination – more examinees from the upper group answered the item correctly
0 -. 2 – very difficult – reject
. 21 -. 40 – difficult – revise
. 41 -. 60 – moderate/average – retain
. 61-. 80 – easy – revise
. 81-. 1.00- very easy – reject
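A minimal Python sketch of how these indices are computed, assuming a simple upper-group/lower-group split (commonly the top and bottom 27% of scorers); the function names and sample figures are illustrative, not from the source:

def difficulty_index(num_correct, num_examinees):
    # Proportion of examinees who answered the item correctly (0 to 1).
    return num_correct / num_examinees

def discrimination_index(upper_correct, lower_correct, group_size):
    # Positive: more upper-group examinees got the item right.
    # Negative: more lower-group examinees got the item right.
    return (upper_correct - lower_correct) / group_size

def interpret_difficulty(p):
    # Classification per the ranges listed above.
    if p <= 0.20:
        return "very difficult - reject"
    if p <= 0.40:
        return "difficult - revise"
    if p <= 0.60:
        return "moderate/average - retain"
    if p <= 0.80:
        return "easy - revise"
    return "very easy - reject"

# Example: 30 of 50 examinees answer an item correctly; in the 10-student
# upper and lower groups, 8 and 4 answer correctly, respectively.
print(interpret_difficulty(difficulty_index(30, 50)))  # moderate/average - retain
print(discrimination_index(8, 4, 10))                  # 0.4 (positive)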
Principles of a High Quality Assessment
Objectivity
Appropriateness
Balance
Validity – the test measures what it intends to measure
-Face Validity (physical appearance of the test)
-Construct Validity (convergent = measures of related constructs correlate; divergent = measures of unrelated constructs do not)
-Content Validity – items align with the intended learning outcomes
Reliability – consistency of scores
-Test–Retest – repetition of the same test to the same group
-Parallel/Equivalent Forms – two parallel forms of the test given to the same group of students
-Split-Half – the test is divided into two equivalent halves (Set A and Set B)
-Kuder–Richardson – correlates the proportion of students passing and not passing each item (see the sketch after this list)
Fairness – eliminating personal biases and giving all students equal opportunity
Practicality & Efficiency – usability
Authenticity
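A hedged sketch of the Kuder–Richardson formula 20 (KR-20) for a test scored right/wrong; the 4-student, 3-item response matrix is made up for illustration:

def kr20(item_matrix):
    # Rows = students, columns = items, entries = 1 (correct) or 0 (wrong).
    # KR-20 = (k / (k - 1)) * (1 - sum(p * q) / variance of total scores)
    n_students = len(item_matrix)
    n_items = len(item_matrix[0])
    pq_sum = 0.0
    for j in range(n_items):
        p = sum(row[j] for row in item_matrix) / n_students  # proportion passing
        pq_sum += p * (1 - p)                                # p * q
    totals = [sum(row) for row in item_matrix]               # total score per student
    mean = sum(totals) / n_students
    variance = sum((t - mean) ** 2 for t in totals) / n_students
    return (n_items / (n_items - 1)) * (1 - pq_sum / variance)

print(kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]))  # 0.75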
Factors Affecting Test Validity
TOS
Appropriateness of test items
Directions
Reading Vocabulary
Sentence Structure
Identifiable Pattern of Answer
Improper Arrangement of Test Items
Inadequate Time Limits
Test Reliability Enhancers
Use a sufficient number of items
Make sure the assessment procedures and scoring are objective and consistent
Types of Test
Administration
1. Individual
2. Group
Mode of Response
1. Oral
2. Written
3. Performance
Scoring
1. Objective
2. Subjective (Essay)
Sort of Responses Being Emphasized
1. Power – ample time; items of increasing difficulty
2. Speed – limited time; easy items
Test Construction
1. Standardized
2. Teacher Made
Types of Test According to Purpose
Personality – emotional and social adjustments
Intelligence
Attitude
Achievement – mastery of skills
Sociometric – likes and dislikes
Career – occupational
Measures of Variability
Range – highest score minus lowest score
Variance – the average of the squared deviations from the mean
Standard Deviation – the square root of the variance; more spread out = scores are heterogeneous, more clustered = scores are homogeneous (see the sketch below)
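An illustrative computation of these measures using Python's statistics module on made-up class scores (population formulas):

import statistics

scores = [78, 82, 85, 90, 95]               # hypothetical class scores
score_range = max(scores) - min(scores)     # range: 95 - 78 = 17
variance = statistics.pvariance(scores)     # mean of the squared deviations
std_dev = statistics.pstdev(scores)         # square root of the variance
# Large std_dev -> scores are spread out (heterogeneous);
# small std_dev -> scores are clustered (homogeneous).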
Measures of Relative Position (norm-referenced)
Percentile – divides the set of scores into 100 equal parts
Quartile – divides the set of scores into four
Decile – divides the set of scores into 10
Stanine – divides the set of scores into 9 (see the sketch after this list)
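A short sketch of these measures of relative position; statistics.quantiles (Python 3.8+) returns the cut points that divide the scores into n equal groups (the scores are made up):

import statistics

scores = [60, 65, 70, 72, 75, 80, 84, 88, 90, 95]   # hypothetical scores
q1, q2, q3 = statistics.quantiles(scores, n=4)      # quartile cut points
deciles = statistics.quantiles(scores, n=10)        # 9 decile cut points
percentiles = statistics.quantiles(scores, n=100)   # 99 percentile cut points
# Stanines band scores into 9 normalized groups and need a custom mapping.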
Positive Correlation – as one variable increases, the other also increases
Negative Correlation – as one variable increases, the other decreases
Zero Correlation – no relationship between the variables (see the sketch below)
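An illustrative check of these relationships; statistics.correlation (Python 3.10+) returns Pearson's r between -1 and +1, and the data below are made up:

import statistics

study_hours = [1, 2, 3, 4, 5]
test_scores = [60, 65, 72, 80, 88]
r = statistics.correlation(study_hours, test_scores)
# r near +1 -> positive correlation; near -1 -> negative; near 0 -> none.
print(round(r, 2))  # about 1.0 here, a strong positive correlation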
Measures of Shape (Kurtosis)
Leptokurtic – the most peaked; scores are the most clustered
Mesokurtic – moderately peaked (normal distribution)
Platykurtic – the flattest; scores are the most spread out
Skewness
Positively skewed – most scores are low (tail points to the right)
Symmetrical Distribution – scores are evenly distributed around the mean
Negatively skewed – most scores are high (tail points to the left)
K to 12 Grading System
Checklists and anecdotal records can be used for grading (Kindergarten)
Components (weighted per subject; see the sketch below):
Written Works
Performance Tasks
Quarterly Assessment
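A hedged sketch of how a quarterly grade combines the three components. Weights vary by subject under DepEd Order No. 8, s. 2015; the 40/40/20 split below (Science and Math, Grades 1-10) is used purely for illustration:

def quarterly_grade(ww_pct, pt_pct, qa_pct, weights=(0.40, 0.40, 0.20)):
    # Each argument is the component's percentage score (0-100);
    # weights = (Written Works, Performance Tasks, Quarterly Assessment).
    ww, pt, qa = weights
    return ww_pct * ww + pt_pct * pt + qa_pct * qa

# Example: WW 85%, PT 90%, QA 80% -> 0.40*85 + 0.40*90 + 0.20*80 = 86.0
print(quarterly_grade(85, 90, 80))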