
IMPORTANT TIPS FOR EFFECTIVE FACILITATION

 

A.    BEFORE THE WORKSHOP

*         Schedule ample time for planning

*         Take some time to get to know each other

*         Discuss each other’s style of planning and facilitating

*         Avoid making assumptions about one another

*         Take time to discuss your views about the workshop topic

*         Especially examine areas of disagreement

*         Discuss any concerns about potential challenges that participants may present

*         Agree on common goals for the workshop

*         Review each other’s triggers

*         Find out whether and when it is okay to interrupt

*         Decide how to keep track of time

*         Strategize about how to stick to the original outline and how to switch gears

*         Plan ways to give signals to one another

*         Divide facilitation of activities fairly

*         Share responsibility equally in preparing and bringing workshop materials and resources

*         Agree to arrive at the workshop site in time to set up and check in before the workshop begins

*         Schedule time after the workshop to debrief

 

B.    DURING THE WORKSHOP

*         Remember to keep a professional demeanor at all times

*         Keep communicating with each other throughout the workshop

*         Support and validate one another

*         During activities that don’t require constant attention, check in with one another

*         Include your co-facilitator even when you are leading an exercise or discussion, by asking, for example: “Do you have anything to add?”

*         Use lots of eye contact

*         Assert yourself if your co-facilitator is talking too much

*         Remember that it is okay to make mistakes

*         Take the initiative to step in if your co-facilitator misses an opportunity to address a myth

 

C.    AFTER THE WORKSHOP

*         If you can’t meet right after the workshop, schedule a time to debrief before you leave

*         Listen carefully to one another’s self-evaluation before giving feedback

*         Discuss what worked well

*         Examine what did not work

*         Brainstorm what could have been done differently

*         Use written evaluations as a reference point to talk about the workshop, and assess your effectiveness as co-facilitators

*         Name particular behaviors, for example: “When you kept interrupting me, I felt undermined and frustrated”, or “I got the impression that some participants were bored”, instead of “You always interrupt me” or “You were very controlling during the workshop.”

*         Realize the importance and potential difficulty of debriefing a challenging workshop

*         Make sure to share any clean-up or return of resource materials

*         REMEMBER: YOU HAVE MADE A DIFFERENCE

Assessment Memo or Memorandum


 

This is the new monster on the block – we just don’t know who the daddy is.

 

RESEARCHED TERMS

We tried to research the topic. As of (date of research?), we could not find these terms anywhere on the SAQA website or documentation, nor do they appear on any SETA website or documentation.

 

We cannot find any of these terms in the two official unit standards used by the ETDP SETA for OBE Programme Design, namely 123401 or 123394. Nor can we find them on any of the ETDP SETA’s (the Education, Training and Development Practices SETA’s) programme approval or evaluation documents.

 

This is, of course, the SETA that is responsible for these unit standards and the design of programmes. Never has this document been requested or checked in the past during programme submission or previous verification.

The Assessment Memo or Assessment Memorandum refers to a separate document needed during programme approval. When we checked the SETA’s requirements for programme approval or SETA verification, they only requested the following documents:

  1. Matrix/Programme application
  2. Learner Guide
  3. Workbook/Assessment Guide
  4. Assessment Guide, and in some cases the
  5. Mentors Guide

 

Once again, there was no mention of the Assessment Memorandum.

We did find a similar term in unit standard 115755, used for “Assessment Design,” namely:

The guide includes all support material and/or references to support material, including observations sheets, checklists, possible or required sources of evidence and guidance on expected quality of evidence including exemplars, memoranda or rubrics as applicable.

 

We then looked up the definitions of the two terms mentioned. The first search result for “memoranda” read:

 

memorandum: noun; plural memorandums or memoranda [mem-uh-ran-duh] /ˌmɛm əˈræn də/ – a short note designating something to be remembered, especially something to be done or acted upon in the future; a reminder; a record or written statement of something.

A note recording something for future use.

 

And also,

 

a written message in business or diplomacy.

 

As well as,

 

a document recording the terms of a contract or other legal details.

 

The other word pulled up these results –

 

rubric: a heading on a document.

 

a set of instructions or rules.

CONCLUSION

We can only assume that the Assessment Memo or Assessment Memorandum – terms that are not used once on any SETA or SAQA documents relating to this process – must refer to this “memoranda or rubrics as applicable.”

 

This is a pity, because there are no resources on the internet, nor any SETA reference, that provide an explanation of this. A clear explanation would eliminate a lot of confusion around this topic, especially for new Training Providers.

 

CREATING THE ASSESSMENT MEMO OR MEMORANDUM

Some argue that this could form part of the model answers of your Assessment Guide. But why, then, is it required as a separate document pack during verification? At any rate, this is how we would recommend creating the document going forward:

 

–  Create a separate document, calling it the Assessment Memo Cover Page, that makes reference to your Unit Standard details, and maybe give it a “confidential” watermark, footnote or disclaimer of some sort (a rough sketch of such a document follows below).

–  Include model answers for each activity/assessment activity in this guide – we’re not recommending any particular format. You may also want to include the following, depending on the topic or structure of your activities:

*  Support material and/or references that were provided to the learner – which he/she can use as resources (we mean resources and references that were given to the learner during the induction or facilitation).

*  Observation sheets – these should be in the Assessment Guide already if used previously.

*  Checklists – to check that the learner’s response is complete or that all required activities were handed in.

*  Possible or required sources of evidence – or of course your model answers, or guidelines on how learners were asked to answer, or could answer, the question.

*  Expected quality of evidence – maybe include the number of pages, size of response, number of words, how many points will be allocated to this activity, and so forth.
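To make this a little more concrete, here is a rough sketch in Python of the kind of structure such a memo could follow. It is purely our own illustration – the field names and the example details are made up for demonstration and are not prescribed by any SETA or SAQA document.

# Purely illustrative sketch of an Assessment Memo structure; the field names
# are our own suggestion, not a SETA or SAQA requirement.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MemoItem:
    activity: str                          # e.g. "Knowledge Question 1"
    model_answer: str                      # expected answer or answer guidelines
    sources_of_evidence: List[str] = field(default_factory=list)
    expected_quality: str = ""             # e.g. "half a page, 5 marks"

@dataclass
class AssessmentMemo:
    unit_standard: str                     # reference to your Unit Standard details
    confidential: bool = True              # mirrors the "confidential" disclaimer
    items: List[MemoItem] = field(default_factory=list)

# Hypothetical example entry (content invented for illustration):
memo = AssessmentMemo(
    unit_standard="123401",
    items=[MemoItem(activity="Knowledge Question 1",
                    model_answer="Outline of the expected answer goes here.",
                    sources_of_evidence=["Learner Guide, Section 2"],
                    expected_quality="Half a page, 5 marks")],
)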

 

In the meantime, we hope that this helps, clears up some of the confusion and possibly gives a direction going forward.

 

Conduct Assessment for dummies part 2


In this short video we look at Conducting Assessment for dummies part 2.

Links used in this video clip.

SAQA Website: www.saqa.org.za

TRAINYOUCAN Video Blog: www.youcantrain.co.za


See our video online here: http://youtu.be/xunlJOg30vc

ASSESSMENT PROCESS


GENERAL RULES

  1. Use a black pen.
  2. No single words or terms.
  3. Time costs money.
  4. No empty spaces.
  5. Evidence, evidence and evidence.

PLAN FOR ASSESSMENT

What? = Did you review the unit standard? Equipment, workplace and documents required; do you have the scope to assess this; are you registered with that SETA?

Where? = Place, venue, arranged with whom?

When? = When will this happen – it might be a series of events over more than one day; who did you confirm this with?

How? = What type of instructions will you use – are any role-players involved? (See the sketch below.)
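As a rough illustration only (our own sketch, not an official SETA template – the details below are invented), the four planning questions can be captured in a simple checklist so that nothing is left blank before the assessment:

# Our own illustrative planning checklist; the content is made up for demonstration.
plan = {
    "What":  "Unit standard reviewed; equipment, workplace and documents listed; scope confirmed",
    "Where": "Training room 2, arranged with the workplace supervisor",
    "When":  "Two sessions over two days, dates confirmed with the learner",
    "How":   "Written knowledge questions plus an observed role-play",
}

# Flag any planning question that was left unanswered.
missing = [question for question, answer in plan.items() if not answer.strip()]
if missing:
    print("Plan incomplete, still to answer:", ", ".join(missing))
else:
    print("All four planning questions have been answered.")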

PREPARE FOR ASSESSMENT

This can be a meeting with the learner (at least 24 hours before the time) to make arrangements for the assessment.

Why is this important? Try and answer the questions below and see for yourself.

  • What happens if the learner comes to the assessment and he is not prepared? (Cost factor – who is responsible? | disciplinary | cost)
  • What must the learner bring? What happens if he tells you he did not know that he had to do something, or bring a form along?
  • Say, for example, the learner has special needs and you did not address them. He appeals and you get called into the SETA’s offices to answer.
  • Going for a test is stressful, so put the learner at ease and explain the process.
  • Do your own pre-assessment to see if the learner is ready or not. Ask him any question about the learning, or check how far he is with his projects or activities.
  • If you must do a role-play, who is going to help you with this?

CONDUCT ASSESSMENT

Examples of Instruments:
  • Concept Maps – A diagramming technique for assessing how well students see the “big picture”.
  • Concept Tests – Conceptual multiple-choice questions that are useful in large classes.
  • Knowledge Survey – Students answer whether they could answer a survey of course content questions.
  • Exams – Find tips on how to make exams better assessment instruments.
  • Oral Presentations – Tips for evaluating student presentations.
  • Poster Presentations – Tips for evaluating poster presentations.
  • Peer Review – Having students assess themselves and each other.
  • Portfolios – A collection of evidence to demonstrate mastery of a given set of concepts.
  • Rubrics – A set of evaluation criteria based on learning goals and student performance.
  • Written Reports – Tips for assessing written reports.

Other assessment types include concept sketches, case studies, seminar-style courses, mathematical thinking and performance assessments.

 Forms of Evidence

Evidence can come from a variety of sources. The assessor needs to ensure that he/she has enough evidence to make an accurate judgement about a learner’s competence.

  •  Evidence of knowledge:        Assess the ability to recall information (written or oral examination).
  •  Evidence of applied knowledge:      Assess the ability to apply knowledge and demonstrate performance in the workplace.
  •  Evidence of understanding:             Assess the ability to understand the impact of applied knowledge in the context of the workplace.
  •  Evidence of problem solving:           Assess the ability to analyse a problem and provide effective solutions.

 Types of evidence

  • Direct evidence: Evidence produced by the learner and direct observation of performance while executing the task.
  • Indirect evidence: Evidence produced about the learner, either from another source or by the learner him/herself.
  • Supplementary evidence: Refers to past achievements of what the learner is capable of doing.

Sponsored by TRAINYOUCAN

TRAINYOUCAN is an accredited training provider through the South African Sector Education and Training Authority (SETA) and provides both accredited and customised learning programmes to organisations looking to maximise their investment in developing their staff.

 

 

Conduct Assessment for dummies part 3


In this short video we look at Conducting Assessment for dummies part 3.

Links used in this video clip.

SAQA Website: www.saqa.org.za

TRAINYOUCAN Video Blog: www.youcantrain.co.za


See our video online here: http://youtu.be/9Y9v_ZddKjI

ASSESSMENT PROCESS


GENERAL RULES

  1. Use a black pen.
  2. No single words or terms.
  3. Time costs money.
  4. No empty spaces.
  5. Evidence, evidence and evidence.

PLAN FOR ASSESSMENT

What? = Did you review the unit standard? Equipment, workplace and documents required; do you have the scope to assess this; are you registered with that SETA?

Where? = Place, venue, arranged with whom?

When? = When will this happen – it might be a series of events over more than one day; who did you confirm this with?

How? = What type of instructions will you use – are any role-players involved?

PREPARE FOR ASSESSMENT

This can be a meeting with the learner (at least 24 hours before the time) to make arrangements for the assessment.

Why is this important? Try and answer the questions below and see for yourself.

  • What happens if the learner comes to the assessment and he is not prepared? (Cost factor – who is responsible? | disciplinary | cost)
  • What must the learner bring? What happens if he tells you he did not know that he had to do something, or bring a form along?
  • Say, for example, the learner has special needs and you did not address them. He appeals and you get called into the SETA’s offices to answer.
  • Going for a test is stressful, so put the learner at ease and explain the process.
  • Do your own pre-assessment to see if the learner is ready or not. Ask him any question about the learning, or check how far he is with his projects or activities.
  • If you must do a role-play, who is going to help you with this?

CONDUCT ASSESSMENT

Examples of Instruments:
  • Concept Maps – A diagramming technique for assessing how well students see the “big picture”.
  • Concept Tests – Conceptual multiple-choice questions that are useful in large classes.
  • Knowledge Survey – Students answer whether they could answer a survey of course content questions.
  • Exams – Find tips on how to make exams better assessment instruments.
  • Oral Presentations – Tips for evaluating student presentations.
  • Poster Presentations – Tips for evaluating poster presentations.
  • Peer Review – Having students assess themselves and each other.
  • Portfolios – A collection of evidence to demonstrate mastery of a given set of concepts.
  • Rubrics – A set of evaluation criteria based on learning goals and student performance.
  • Written Reports – Tips for assessing written reports.

Other assessment types include concept sketches, case studies, seminar-style courses, mathematical thinking and performance assessments.

 Forms of Evidence

Evidence can come from a variety of sources. The assessor needs to ensure that he/she has enough evidence to make an accurate judgement about a learner’s competence.

  •  Evidence of knowledge:        Assess the ability to recall information (written or oral examination).
  •  Evidence of applied knowledge:      Assess the ability to apply knowledge and demonstrate performance in the workplace.
  •  Evidence of understanding:             Assess the ability to understand the impact of applied knowledge in the context of the workplace.
  •  Evidence of problem solving:           Assess the ability to analyse a problem and provide effective solutions.

 Types of evidence

  • Direct evidence: Evidence produced by the learner and direct observation of performance while executing the task.
  • Indirect evidence: Evidence produced about the learner, either from another source or by the learner him/herself.
  • Supplementary evidence: Refers to past achievements of what the learner is capable of doing.

ASSESSMENT JUDGEMENTS

  1. You can only find someone “COMPETENT” over a collection of questions, a complete instrument or a full unit standard. This means that you cannot mark the person competent for each individual question or instruction (see the sketch after this list).
  2. We rate individual knowledge questions or instructions with:
      • a rating scale
      • Meets Requirements / Does Not Meet Requirements
      • Yes / No
  3. You must collect a) evidence to prove that the assessment took place, b) evidence that the learner can perform the task, and c) evidence that he/she can practically perform the skill, or has performed it in the workplace.
  4. Remember the rules of evidence:
  • valid
  • authentic
  • consistent
  • sufficient
  • current
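To illustrate point 1 above, here is a small sketch in Python (our own example, not an official rule or tool): each individual question is rated Yes/No, and only one overall judgement of “Competent” or “Not Yet Competent” is made over the collection as a whole.

# Illustrative only: individual questions get a Yes/No rating; the single
# Competent / Not Yet Competent decision is made over the whole collection.
ratings = {
    "Knowledge question 1": "Yes",
    "Knowledge question 2": "Yes",
    "Practical observation": "No",   # e.g. workplace evidence still outstanding
}

overall = "Competent" if all(r == "Yes" for r in ratings.values()) else "Not Yet Competent"
print("Overall judgement for the unit standard:", overall)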

 FEEDBACK

This is where the Appeals Process always comes into play. Learners claim that you never provided feedback or told them what they did wrong. GET PROOF THAT YOU PROVIDED FEEDBACK!

  • STRENGTHS AND WEAKNESSES OF FEEDBACK
  •  TYPE AND MANNER IN WHICH FEEDBACK IS PROVIDED
    • Providing Constructive Feedback
    • Constructive feedback – An essential element of assessment
  •  FEEDBACK OBTAINED FROM CANDIDATE
  •  DISPUTES AND APPEALS
  •  RECORDING OF FEEDBACK

REVIEW PROCESSES

  • REVIEW STRENGTHS AND WEAKNESSES
  • FEEDBACK FROM RELEVANT PARTIES
  • WEAKNESSES IDENTIFIED

Ever found problems with the Assessment Process or the Guide, and no one seems to take care of fixing it? Well, this is where you provide feedback and review the entire assessment process to ensure it was done properly.

Now what do you think is going to happen when the Moderator moderates your Assessments four weeks later and finds that you did not sign documents or include all the evidence in your Assessment Guide? He changes the Assessment Decision from “Competent” to “Not Yet Competent”. The learner phones you and asks what’s going on.

Sponsored by TRAINYOUCAN

TRAINYOUCAN is an accredited training provider through the South African Sector Education and Training Authority (SETA) and provides both accredited and customised learning programmes to organisations looking to maximise their investment in developing their staff.

 

 


Conduct Assessment for dummies part 1


In this short video we look at Conducting Assessment for dummies part 1.

Links used in this video clip.

SAQA Website: www.saqa.org.za

TRAINYOUCAN Video Blog: www.youcantrain.co.za


See our video online here: http://youtu.be/19dPI_hqL6U

Traditional

Traditional education, also known as back-to-basics, conventional education or customary education, refers to long-established customs found in schools that society has traditionally deemed appropriate.

 OBE (Outcome-Based Education)

Methods of outcome-based education (OBE) are student-centered learning methods that focus on empirically measuring student performance (the “outcome”).

 COMPETENT

 -You can perform the job. (We refer also to the outcome of the actual skill.)

-Someone assessed you physically to ensure you can do the job.

-Evidence was collected to provide proof that you are competent.

 NYC  (Not Yet Competent)

 -You can’t perform the job. (There might be one small part that you missed)

-You cannot perform the outcome or the skill on your own.

-Some of the evidence could not be collected as proof that you can perform the function.

-You get an opportunity to revisit the learning and try again in another assessment.

 NQF  (National Qualifications Framework)

The National Qualifications Framework (NQF) is a Framework on which standards and qualifications, agreed to by education and training stakeholders throughout the country, are registered. It came into being through the South African Qualifications Authority Act (No. 58 of 1995, Government Gazette No. 1521, 4 October 1995), which provides for ‘the development and implementation of a National Qualifications Framework’.

NQF levels

 RPL

Recognition of Prior Learning is a process whereby people’s prior learning can be formally recognised in terms of registered qualifications and unit standards, regardless of where and how the learning was attained. RPL acknowledges that people never stop learning, whether it takes place formally at an educational institution, or whether it happens informally.

The process of RPL is as follows:

  • Identifying what a person knows and can do;
  • Matching the person’s knowledge, skills and experience to specific standards and the associated assessment criteria of a qualification;
  • Assessing the learning against those standards; and
  • Crediting the person for skills, knowledge and experience built up through formal, informal and non-formal learning that occurred in the past

KEY PRINCIPLES OF ASSESSMENT

Appropriateness – The method of assessment must be suited to the performance being assessed.
Fairness – The method of assessment must not present any barriers to achievements which are not related to the evidence.
Manageability – The methods used must make for easily arranged, cost-effective assessments that do not interfere with learning.
Time efficient – Assessments must not interfere with normal daily activities or productivity.
Integration into work or learning – Evidence collection must be integrated into the work or learning process where it is appropriate and feasible.
Validity – The assessment must focus on the requirements laid down in the standard; i.e. the assessment must be fit for purpose.
Direct – The activities in the assessment must mirror the conditions of actual performance as closely as possible.
Authenticity – The assessor must be satisfied that the work being assessed is attributable to the person being assessed.
Sufficient – The following questions can guide the assessor:

  • Valid
  • Sufficient
  • Authentic
  • Currency
  • Relevancy
  • Consistency

Systematic – Planning and recording must be sufficiently rigorous to ensure that assessment is fair.
Open – Learners must contribute to the planning and accumulation of evidence. Assessment candidates must understand the assessment process and the criteria that apply.
Consistent – The same assessor must make the same judgement in similar circumstances. The judgement made must be parallel to the judgement which would be made by other assessors.

 ASSESSMENT TYPES

FORMATIVE ASSESSMENT

• Designed to support the teaching and learning process
• Assists in the planning of future learning
• Diagnoses the learner’s strengths and weaknesses
• Provides feedback to the learner on his/her progress
• Helps to make decisions on the readiness of learners to do a summative assessment
• Is developmental in nature
• Credits/certificates are not awarded

SUMMATIVE ASSESSMENT

• Takes place at the end of a learning programme (qualification, unit standard, or part qualification)
• Determines whether the learner is competent or not yet competent
• In knowledge and inputs-based systems, this usually occurs after a specified period of study, e.g. one year
• In OBET, learner-readiness determines when assessments will take place
• Is carried out when the assessor and the learner agree that the learner is ready for assessment

ASSESSMENT PROCESS


Sponsored by TRAINYOUCAN

TRAINYOUCAN is an accredited training provider through the South African Sector Education and Training Authority (SETA) and provides both accredited and customised learning programmes to organisations looking to maximise their investment in developing their staff.

 

 

The fundamentals of effective assessment: Twelve principles

The twelve principles below address practical assessment issues. They are united by a single idea: assessment is at the heart of the whole teaching and learning process.

1. Assessment should help students to learn.
2. Assessment must be consistent with the objectives of the course and what is taught and learnt.
3. Variety in types of assessment allows a range of different learning outcomes to be assessed. It also keeps students interested.
4. Students need to understand clearly what is expected of them in assessed tasks.
5. Criteria for assessment should be detailed, transparent and justifiable.
6. Students need specific and timely feedback on their work – not just a grade.
7. Too much assessment is unnecessary and may be counter-productive.
8. Assessment should be undertaken with an awareness that an assessor may be called upon to justify a student’s result.
9. The best starting point for countering plagiarism is in the design of the assessment tasks.
10. Group assessment needs to be carefully planned and structured.
11. When planning and wording assignments or questions, it is vital to mentally check their appropriateness to all students in the class, whatever their cultural differences.
12. Systematic analysis of students’ performance on assessed tasks can help identify areas of the curriculum which need improvement.

An explanation of the principles

1. Assessment should help students to learn.

Educational assessment has at least two main functions: it is part of a system of accreditation and it fosters student learning. These functions are generally described as ‘summative’ and ‘formative’ respectively. It is a useful theoretical distinction, although in practice the two purposes tend to be intertwined. Too often, the former function dominates discussion at the expense of the latter. Yet formative assessment is crucial to effective learning. In its broadest sense, it refers to the whole process of learners testing their understandings with and against others, especially the experts – their teachers. On the basis of feedback, learners modify and develop those understandings. This feedback can be given in different forms: in responses to students’ contributions in class, as well as written or oral commentary on their work. Some of these views will also form the basis for a summative judgement and the generation of marks and grades. A good deal of it will not. If assessment is conceptualised in this way, it is not an irksome ‘add on’ to teaching and learning, but is understood to be an integral part of the process.

2. Assessment must be consistent with the objectives of the course and what is taught and learnt.

The stated objectives of any given course of study in a university cover a wide range of understandings, higher order intellectual skills and values. If the assessment tasks do not test these outcomes, the statements remain empty pieties. Students behave in a quite rational way in ‘reading’ the true objectives of a program from the nature of the assessment. If an exam tests rote learning, that is apparently what is valued by assessors. Even if classes have encouraged more interesting and probing thought, students will judge that to be peripheral if it is not reflected in the nature of the required tasks. Assessment, then, is very powerful in driving students’ learning behaviours. Assessment tasks must be designed to foster a more demanding and challenging approach to learning, so that valued outcomes, such as the capacity to analyse and synthesise, are in fact rewarded.

3. Variety in types of assessment allows a range of different learning outcomes to be assessed. It also keeps students interested.

Until recently, there has tended to be a predictable uniformity in university assessment – if not across, then certainly within disciplines. The 3,000-word essay, the lab report, the multiple-choice or short essay exam have been standard in different fields. A long tradition suggests that such forms have been quite valid for the assessment of certain outcomes but, as increasing emphasis has fallen on the development of skills (generic as well as subject-specific), some gaps have become obvious. To take the most obvious examples, as oral communication and teamwork skills are increasingly defined as important outcomes in many courses, these skills have to be assessed by new means. Teachers have started to question how often a student needs to show that he/she can write a 3,000-word analytical essay, for instance. Innovative and creative approaches to assessment are increasingly in evidence – often the result of probing thought about what a course is really trying to achieve. As long as they are clearly explained, such tasks can enhance student interest and motivation – and are usually a lot more interesting and rewarding for academics to assess.

4. Students need to understand clearly what is expected of them in assessed tasks.

This issue has two dimensions, one intellectual, the other practical. Confusion in either can make students very anxious and lead to unproductive work. The second is the easiest to address. Students have the right to a clear statement of the assessment schedule in any subject, preferably in the first class, with topics, dates, weightings, submission procedures, penalties for late submission, etc. They should also have a strong, specific statement about the nature of plagiarism and its consequences (this can perhaps be dealt with at the course level, early in first year). Any variation of these requirements during a program could have legal implications and should be approached very carefully (and probably with the advice of a head of department).

The intellectual dimension requires a delicate balance, which is perhaps part of the ‘art’ of teaching. Understandably, students want to know exactly what they have to do to gain good marks or grades. Teachers can do a lot to assist them with this – and a great deal more than has usually been done in the past. They can set out criteria by which each task will be judged (see below), they can discuss the task in class before submission (and afterwards, with a view to the next task), they can provide sample answers, offer examples of good writing in the discipline, and so on. What they cannot do is reduce success to a formula that is easy to follow. To do so would be to discourage some of the higher order skills that university study attempts to develop. These skills require some room for individuality, originality, creativity, the unexpected. A graduated approach may be part of the answer, with strong direction provided in the early years and increasing encouragement of individual approaches as students progress.
5. Criteria for assessment should be detailed, transparent and justifiable.

In recent years, many academic teachers have provided students with statements of criteria against which their work is assessed. Many students coming straight from secondary school are used to working with such statements and may expect them. The question of how detailed these should be is a matter of judgement. It seems that students find very general statements about ‘advanced analytical skills’ and the like of little use. On the other hand, as discussed above, it is reductive and counter-productive to try to pin everything down. Probably such statements can never be more than a guide, but in certain ways their usefulness is clear. They can indicate, for instance, that grammar and spelling will be taken into account, or that a certain range of reference to sources is expected. It is probably helpful to look at some examples from colleagues. In drafting these statements, it is important to keep checking against the subject and course objectives and to give some thought to how the criteria – and the balance between them – can be justified to students and perhaps others.

6. Students need specific and timely feedback on their work – not just a grade.

As argued above, an important (and arguably the primary) function of assessment is helping students to learn. A mark or a grade tells students something about the effectiveness of their learning, but not very much. They will know that they have succeeded or failed by the assessor’s standards, but often will have little idea of why. If they are to recover from failure, or deepen their understanding, they need to have explanations – and suggestions for improvement. This means that blanket statements about the general quality of analysis, say, may be of little use. Really conscientious marking involves pointing out each individual flaw in logic or inadequacy of treatment. The reality of academics’ workloads means that a strategically selective approach is required, particularly if one considers the second aspect of this issue – that feedback needs to be relatively quick to be effective. A guiding principle is that students should get feedback on one piece of work in time for this to be of benefit for the next. A useful strategy for overwhelmed markers is to comment intensively on one section of a piece of work, as an example of how the student should go about addressing any problems. This is particularly useful when dealing with poor expression.

7. Too much assessment is unnecessary and may be counter-productive.

In setting an assessment regime for any subject, academic staff need to be aware of students’ overall workload – and, as far as possible, the deadlines for other subjects they may be doing. A very heavy assessment load does not allow students time to comprehend and explore material: it tends to push them into shallow, rote approaches to study, where they try to find shortcuts and formulae for tasks, without really understanding underlying principles. If students are faced with too much ‘busywork’, particularly if this involves a repetitive approach, they are likely to lose interest and motivation. As suggested above, there is no need to keep asking students to demonstrate a particular skill. Academic staff often worry about ensuring that students ‘cover’ a given body of knowledge. This concern can dominate thinking about how a course should be taught and assessed. Given the rapid expansion of knowledge in all fields, it is increasingly accepted that students cannot be expected to retain everything in working memory, and that greater emphasis should be placed on their learning how to access information when they need it – and how to use it when they do. If this approach is accepted, it is possible to define the important principles and concepts that need to be understood and to ensure that students cover these by strategic structuring of assessment tasks. Apart from the benefits to students of carefully targeted assessment, it is obviously in the interests of staff to avoid an oppressive assessment workload.

8. Assessment should be undertaken with an awareness that an assessor may be called upon to justify a student’s result.

It is important for universities that their assessment practices are transparent and demonstrably fair and reasonable, especially when assessment is associated with professional accreditation. Staff need to be careful about assessment procedures and records, without becoming paranoid about possible dangers. They should be able to demonstrate that students have been given adequate information and notice about assessment and that uniform procedures have been followed in grading them. There will always be room for ‘professional judgement’ but this should be arguable in terms of standards generally recognised within a field or profession. In areas where a high level of subjectivity is involved (such as some of the humanities and arts) it is highly advisable that an individual assessor’s judgement is confirmed by others. Some areas require panel assessments (of artistic performance, for instance); in others, key pieces of work are double-marked or, minimally, failing marks are referred to a second examiner. Practice in this area varies considerably according to discipline. It is vital that new academics are thoroughly acquainted with their departmental or faculty policies relating to marking. It is a good idea to consult with colleagues to moderate the assessment of student performance, especially when there is a high level of subjectivity involved.

9. The best starting point for countering plagiarism is in the design of the assessment tasks.

Plagiarism can be a problem in many areas of university teaching. A wide range of strategies for countering plagiarism is set out in another section of this website. Many university teachers believe that the most effective plagiarism minimisation strategy lies in the design of the tasks set for assessment. This involves not only avoiding the practice of repeating a few well-worn questions from year to year, but also requiring the kinds of analysis and/or creativity that preclude the direct use of others’ thinking. In some areas, students can be required to base their reflection on a context specific to the course. They can be asked to critique others’ work, such as journal articles. While not solving all problems associated with plagiarism, such measures can go a long way towards making it pointless.

10. Group assessment needs to be carefully planned and structured.

More and more courses are incorporating group projects into their assessment regimes, in response to an increasing emphasis on the need for students to be able to learn and work together. This development opens up many opportunities for innovative thinking about assessment, but it is fraught with dangers. At the moment, many students dislike group work and group assessment, particularly when the assessment is of the group as a whole (that is, when all members receive the same mark). Certainly there seem to be students, and academic staff, who have had bad experiences with this form of assessment. Nevertheless, group work is a valuable component of the higher education curriculum. It needs to be planned and structured very carefully, and students have to be systematically prepared to undertake group tasks. Another section of this website offers suggestions about how to make this important form of assessment effective and rewarding.

11. When planning and wording assignments or questions, it is vital to mentally check their appropriateness to all students in the class, whatever their cultural differences.

Staff working with students of non-English speaking backgrounds point to the fact that the phrasing of many assessment tasks often needlessly exacerbates difficulties for these students. They also argue, however, that simpler, clearer wording would benefit all students. Sometimes essay topics, for instance, seem to be aimed more at a teacher’s colleagues than at undergraduate students. Beyond the issues associated with language, there are many ways in which tasks can be unintentionally biased against some student groups – if references are quite specific to a particular culture, for instance, or if a task, such as oral presentation, may be more difficult for women from some countries. The important thing is for teachers to be sensitive to such implications. This does not necessarily mean that tasks such as the latter example should be eliminated, but it may mean that students disadvantaged in such ways should be given special assistance. Many universities or faculties have special provisions relating to assessment for disadvantaged students. Disabled students, for example, may be allowed extra time for examinations or the use of special equipment. New staff should acquaint themselves with all policies in this area. The Equal Opportunity office, or its equivalent, is a good starting point.

12. Systematic analysis of students’ performance on assessed tasks can help identify areas of the curriculum which need improvement.

The work submitted by students for assessment is a valuable source of feedback for staff on the effectiveness of their teaching. If certain areas are clearly not understood by significant numbers of students, this signals the need for urgent attention. It can be very helpful to approach the analysis somewhat formally – perhaps in the form of a regular review by all staff involved in the subject or course. Such a review can also monitor the effectiveness of assessment procedures in testing the desired outcomes of the program.
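As a small illustration of such an analysis (our own sketch, with invented marks and question labels), per-question averages can be computed from the marked scripts to flag topics that may need attention:

# Illustrative sketch: average mark per question, flagging possible weak areas.
results = {
    "Q1 (definitions)":   [8, 7, 9, 6],   # marks out of 10 for each student
    "Q2 (application)":   [4, 5, 3, 4],
    "Q3 (case analysis)": [7, 6, 8, 7],
}

for question, marks in results.items():
    average = sum(marks) / len(marks)
    note = "  <- candidate area for curriculum review" if average < 5 else ""
    print(f"{question}: average {average:.1f}/10{note}")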