Transcript

The Fundamentals of ASSESSMENT in Education

What is assessment, etc.?

Types of assessment

Assessment & Constructive Alignment

Assessment instruments

Analysing & Evaluating assessment data

Uses of assessment results

Credit

How does assessment relate to testing, measurement, and evaluation?

How can assessment be categorised?

How are LOs, T&L activities & assessment constructively aligned?

What instruments are used for assessment?

How do you analyse and evaluate assessment data?

What do you do with assessment results?

Assoc. Prof. Dr Nurulhuda Abd Rahman

2021 NURULHUDA A.R.

The process of COLLECTING DATA or information on what students know, are able to do, and/or their values/dispositions, with a PURPOSE, so as to assist in DECISION MAKING

Assessment

The process of collecting data or information on what students know and/or are able to do by giving a test in a fairly controlled situation so as to assist in decision making

Assessment, Testing, Measurement, & Evaluation

Definitions and relationship

TESTING

The process of applying a standard scale or a measuring tool to an object, event or situation (the process of converting data into numerals)

MEASUREMENT

The process of comparing data to a set of criteria and a standard for the purpose of judging the worth or quality

EVALUATION

How do the four concepts relate to each other?

Relationship

Examples of doing measurement:
1. When you want to measure the length of a table, you APPLY AN INSTRUMENT such as a ruler to the side of the table.
2. When you want to measure students' understanding, after the data has been collected through assessment, you do MEASUREMENT by scoring and then converting the raw score into a standardised score (APPLYING A STANDARD SCALE, e.g. a percentage).
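As a minimal sketch of example 2, the conversion of a raw score into a standardised percentage is a one-line calculation; the function name and the 36-out-of-45 raw score below are illustrative assumptions, not taken from the original.

```python
def standardise(raw_score: float, max_score: float) -> float:
    """Apply a standard scale (a percentage) to a raw score."""
    return raw_score / max_score * 100

# e.g. a raw score of 36 out of 45 becomes a standardised score of 80.0%
print(standardise(36, 45))
```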

e.g. of testing: giving a written test (MCQ, structured or essay items) or a performance test such as an oral test or playing an instrument. The main difference between testing and alternative assessment is that testing is carried out in fairly or highly controlled situations, whereas alternative assessments are usually done in less structured contexts and require an extended period of implementation.

When we evaluate, we compare the data or information that we have to a set of criteria and a standard in order to judge its worth or quality. e.g. 1: Suppose that, after measurement, a student scored 80% on his exam. What does 80% mean in terms of the quality of performance? To determine the quality, we compare it to a standard. For example, the UPSI standard states that a standardised score of 80-100% shows an excellent performance, and we designate this excellent performance the label 'A'. e.g. 2: Suppose a student teacher is undertaking teaching practice. To determine the quality of the teaching performance, we have a rubric with a set of criteria. We collect data through observation (assessment), compare the data to the stated criteria and the descriptions for each level of quality (e.g. weak, satisfactory, good, excellent), and then decide which description most resembles what the student teacher demonstrated.
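To make e.g. 1 concrete, here is a minimal sketch of judging a standardised score against a grade standard; only the 80-100% = 'A' (excellent) band is taken from the text, the remaining bands and labels are assumptions.

```python
# Hypothetical grade standard: only the 80-100% = 'A' (excellent) band comes
# from the example above; the other bands are illustrative assumptions.
GRADE_STANDARD = [
    (80, "A", "excellent"),
    (65, "B", "good"),
    (50, "C", "satisfactory"),
    (0,  "F", "weak"),
]

def evaluate(standardised_score: float) -> str:
    """Judge the quality of a standardised score against the standard."""
    for cut_off, grade, quality in GRADE_STANDARD:
        if standardised_score >= cut_off:
            return f"{grade} ({quality})"
    return "invalid score"

print(evaluate(80))  # A (excellent)
```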

1. e.g. of PURPOSE: to determine achievement (SPM standardised examination). e.g. of DECISION MAKING: (i) students can decide whether to further their studies or join the workforce based on their results; (ii) the teacher can decide whether an improvement in delivery needs to be implemented for her next batch of students (if her previous class result was not encouraging).
2. e.g. of PURPOSE: to determine understanding (Q&A during a lesson). e.g. of DECISION MAKING: (i) students will know whether they have understood the material and can decide whether they need to ask further questions to deepen their understanding if they have not understood the material well; (ii) the teacher will know whether students have understood her lesson well and can decide whether to explain further or move on to the next unit.

Assessment is the process of collecting data/information to assist in decision making. Testing is an assessment method that uses a test to collect data. Measurement quantifies and standardises the collected data. Evaluation is the process of judging the quality of the collected data, whether the standardised quantitative data from measurement (e.g. data from testing) or data collected directly from assessment (e.g. data from verbal questioning). After the data has been evaluated, the process goes back to assessment, whereby an informed decision can now be made.

Relationship between assessment, testing, measurement, & evaluation

Types of Assessment

according to the purpose of assessment

testing

Assessment by giving a test in a fairly or highly controlled situation/context.

performance

presentation

written reports

Q&A

diagnostic

reflection

metacognition

assessment for learning

assessment of learning

summative assessment

formative assessment

assessment as learning

alternative category

Purpose: giving grades

Reflecting on one's thinking processes: "thinking about thinking"

Purpose: To identify weaknesses and improve.
Steps to do reflection:
1. Identify a problem/issue
2. Identify the source of the problem
3. Relate relevant theories to explain the problem/issue
4. Identify new lessons learned/insights
5. Identify strengths to address the problem (if any)
6. Formulate actions to address the problem

To identify strengths and weaknesses

Verbal questioning throughout a lesson to monitor and assist learning

Purpose: Identify and monitor self-progress in learning

Purpose: Assisting students towards the attainment of the learning outcomes

Purpose: To determine achievement and give grades

Types of Assessment

according to the format of assessment

Quiz

examination

playing an instrument

teaching practice

concert

product

process

PERFORMANCE ASSESSMENT

paper & pencil test

traditional assessment

alternative assessment

Assessment that is not a paper-and-pencil test

project work

performance test

oral test

Note: If the PERFORMANCE ASSESSMENT tasks are similar to or closely resemble tasks that are performed in the context of the related profession, then it is also known as authentic assessment.

Constructive Alignment between LO, T&L activities, and Assessment

assessment

T&L activities

Learning outcome

constructive alignment

"Develop a lesson plan that is based on the STEM approach for the topic Specific Heat Capacity"

"Interactive lecture on the STEM approach and developing a STEM approach lesson plan"

"A lesson plan that is based on the STEM approach"

By lecturing interactively, students gain knowledge of the concept of the STEM approach, and by having a hands-on activity developing the lesson plan, students are guided towards achieving the LO.

By producing a lesson plan, the teacher can determine whether students are able to achieve the LO and at the same time, doing the lesson plan assists students in attaining the LO.

example

A condition where the T&L activities and assessment tasks support the attainment of the LOs and the assessment can determine the attainment of the LOs (Biggs, 2003).

What kinds of assessment tasks support the attainment of the LO? How will it be determined whether the LO has been achieved?

What are the learning activities that support the attainment of the LO?

What are the knowledge, skills and attitudes/values you want the students to achieve?

Constructive Alignment between LO, T&L activities, and Assessment

assessment

T&L activities

Learning outcome

constructive alignment

A condition where the T&L activities and assessment tasks support the attainment of the LOs and the assessment can determine the attainment of the LOs.

"Experiment to determine the relationship between the pressure of gas with fixed mass enclosed in a container and its volume"

"Doing an experiment to determine the relation between pressure of gas with fixed mass enclosed in a container and its volume"

"A lab report based on the experiment doneand/orAn observation on the process of doing the experiment"

By doing an experiment, students learn by doing, which is aligned with the intended LO.

By producing a lab report, the teacher is able to determine whether students know what is involved in experimenting, and with the added observation during the experiment, the teacher can also assess the process of experimenting.

rubric

An assessment tool that has descriptions for each level of frequency or quality

checklist

An assessment tool that shows the presence or the absence of certain attributes

rating scale

An assessment tool that shows the level of quality or frequency of the attributes

test

Usually refers to a collection of written items that have correct answers or a set of instructions for performance tests

Assessment Instruments

e.g. of test: examination or test papers & instructions for a performance test.

Example of a TST

Points to consider when planning and developing a test paper:
1. Use a Test Specification Table (TST) to help in planning.
2. Identify the type of test item (objective: e.g. MCQ, matching, true-false; subjective: e.g. structured or essay questions).
3. Decide the number of items for each type. Factors to consider include (i) the test duration (one objective question takes on average 1.5 min; structured and essay questions vary from 10-20 min each); (ii) the level of the LO for the topic to be included (higher-level LOs should be asked through subjective-type items); (iii) the student learning time (SLT) allocated for the topic (more SLT, more questions or more weightage for the question). A rough time-budget check is sketched after this list.
4. Plan for a normal distribution according to the level of difficulty (a few questions at the remember and understand levels, a lot at the apply and analyse levels, and a few at the evaluate and create levels). However, the level of the questions also depends on the level of the LO for the topic included (if the LO for a particular topic is at the understand level, the test question for that topic should be at the same level) and the purpose of the test (if the purpose is to assess critical thinking, the test items should range from the apply level up to the create level, not remember and understand).
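As referenced in point 3, a rough sketch of the time-budget check when deciding the number of items; the item counts, per-item times, and 120-minute duration below are illustrative assumptions, not prescribed values.

```python
# Illustrative time-budget check for a planned test paper (all numbers assumed).
planned_items = {
    "objective (MCQ)": {"count": 20, "minutes_each": 1.5},
    "structured":      {"count": 4,  "minutes_each": 10},
    "essay":           {"count": 2,  "minutes_each": 20},
}
test_duration = 120  # minutes available for the test

total = sum(i["count"] * i["minutes_each"] for i in planned_items.values())
print(f"Estimated answering time: {total:.0f} of {test_duration} minutes")
# Estimated answering time: 110 of 120 minutes
```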

e.g. of attributes: skills, attendance

e.g. of attributes: knowledge, behaviours, skills and strategies that students demonstrate. e.g. of a frequency scale: never, sometimes, often, always. e.g. of a quality scale: weak, good, excellent. A questionnaire is an example of a rating scale.

Rubrics have descriptors for each level of frequency or quality. e.g. if the level is "excellent", the descriptor might be "able to justify the choice of teaching method convincingly". Go to http://rubistar.4teachers.org/ to create a rubric.
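A rubric can also be held as a simple data structure when scoring; a minimal sketch follows, in which only the "excellent" descriptor comes from the example above and the other descriptors are assumed.

```python
# A rubric as a mapping from quality level to descriptor. Only the "excellent"
# descriptor is taken from the text above; the rest are illustrative assumptions.
rubric = {
    "excellent":    "able to justify the choice of teaching method convincingly",
    "good":         "able to justify the choice of teaching method with minor gaps",
    "satisfactory": "gives only a partial justification of the teaching method",
    "weak":         "unable to justify the choice of teaching method",
}

# Evaluating with the rubric: pick the level whose descriptor most resembles
# what was observed, e.g. during a teaching practice session.
observed_level = "good"
print(f"{observed_level}: {rubric[observed_level]}")
```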

Analysing assessment data

Quantitative data analyses

counts

percentage

mean, median, mode

Mean (for ordinal, interval, and ratio data with a normal distribution)
Median (for ordinal, interval, and ratio data with a skewed distribution)
Mode (for nominal, ordinal, interval, and ratio data)

uses min & max values

excludes min & max values

uses all data

range

frequency/percentage

descriptive analyses

variability

standard deviation

central tendency

quartile

Qualitative

"The difference being exhibited by data points within a data set, as related to each other or as related to the mean" (investopedia) or "the extent to which a distribution is stretched or squeezed" (wikipedia)

Analysing assessment data

Content analysis: analysis of the content of written responses, presentations, interviews, and audio and video recordings by categorising, summarising, and tabulating the data.
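A small sketch of the tabulating step of content analysis: after written responses have been read and categorised (coded) by hand, the categories can be counted and summarised. The category names and coded data below are made up for illustration.

```python
from collections import Counter

# Made-up codes assigned (by the teacher) to students' written responses
# after categorising them; the category names are assumptions.
coded_responses = [
    "correct", "misconception", "correct", "partial",
    "correct", "partial", "misconception", "correct",
]

# Tabulate the categories as counts and percentages.
for category, count in Counter(coded_responses).most_common():
    print(f"{category:15s} {count:2d}  {count / len(coded_responses):.0%}")
```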

QUALITATIVE DATA ANALYSIS

EVALUATING ASSESSMENT DATA

Compare measured data to a standard

e.g. 80-100% is an excellent performance

Compare collected data to a set of criteria

e.g. 1: compare qualitative data to a set of criteria in a rubric. e.g. 2: compare students' verbal answers to what the teacher has in mind.

Uses of assessment results

inform outcome attainment

Inform teachers and parents of the performance of each student in the class

1

feedback & feedforward FOR learners

Feedback informs students about their performance, strengths and weaknesses; feedforward informs the actions to be taken to improve overall learning

2

feedback & feedforward for lesson delivery

Inform teachers of the effectiveness of the delivery methods, the appropriateness of the assessment methods and learning outcomes, and what needs to be done to improve

3

feedback & feedforward For schools

Inform administrators of the competency of teachers, the adequacy of the provision of facilities/infrastructure/teacher development programmes, etc.

4

Feedback: Informing and judging students' past performance.
Feedforward: Informing and focusing on students' development in the future.

"Dump the Past, Embrace the Future, and Lead the Way to Change" (Hirsch, 2017)

References

  • Biggs, J. B. (2003). Teaching for quality learning at university (2nd ed.). Buckingham: Open University Press/Society for Research into Higher Education.
  • Hirsch, J. (2017). The feedback fix: Dump the past, embrace the future, and lead the way to change. Rowman & Littlefield Publishers.