Evaluation of Employee Training Assessment Process

 

Scenario Analysis Report

 

TABLE OF CONTENTS

1 INTRODUCTION
2 GUIDING PRINCIPLES OF ASSESSMENT
2.1 Purpose
2.2 Validity
2.3 Reliability
2.4 Grading and Marking
2.5 Learner Demographics and Cultural Backgrounds
2.6 Marking and Grading Schemes
3 SUITABILITY OF ASSESSMENT PROCESSES FOR PURPOSE
4 RECOMMENDATIONS
4.1 Conduct an Environmental Scan
4.2 Explain the Purpose
4.3 Develop and Implement a Clear Marking Scheme or Matrix
4.4 Use Consistent Language
4.5 Develop and Implement a Marking Matrix
5 REFERENCES
APPENDIX 1

1 INTRODUCTION

 

The purpose of this report is to evaluate the employee training assessment process currently used by the staff at ‘Easy-go’ to place new recruits into one of two training levels (refer to Appendix 1).

This report highlights key areas of concern within the assessment process and provides recommendations for change based on theoretical models and methods of teaching. These recommendations should be taken into consideration to improve the reliability and validity of the company’s future assessments. This report can also be used as a reference or training tool for current and new staff working in a teaching or training capacity.

The key areas of concern addressed in this report are: purpose and validity, reliability, grading and marking, and learner demographics and cultural backgrounds.

2 GUIDING PRINCIPLES OF ASSESSMENT

 

2.1 Purpose

According to Jones (2015), the first step in composing an effective assessment process is to define the assessment objective. This process can be guided by theory or by an analysis of content or job functions, and it includes a description of the test content and concepts to be measured. Once established, the purpose of the assessment provides a foundation for subsequent test construction, scoring and evaluation activities (Haddad & Demsky, 1995).

2.2 Validity

Assessment validity is a key educational concept required in the effective formation of testing and assessment (Kulasegaram & Rangachari, 2018). Validity refers to whether a test measures what it claims to measure and whether the test was appropriate to the focus of the assessment. Validity is frequently regarded as the most important concept in educational testing because it emphasises the meaning placed on test results (Van der Walt & Steyn (Jr.), 2019).

2.3 Reliability

Reliability of assessment can be described as the accuracy and precision of measurement, and therefore its reproducibility (“Assessment Matters”, 2012). When an assessment provides an accurate and precise measurement of student learning, it will produce the same, consistent result regardless of when the assessment occurs or who does the marking (“Assessment Matters”, 2012).

4.4 Grading and Marking

A marking matrix must be clear and concise. For any assessment task, students deserve to know what is expected of them and how decisions about the quality of their work will be made, i.e. how their work will be marked and graded. If the marking matrix is subject to the teacher’s individual interpretation or limited by ambiguity, then the assessment will not repeatedly yield the same result and its reliability must be questioned (“School education: the National Plan for School Improvement – Parliament of Australia”, 2019).

2.5 Learner Demographics and Cultural Backgrounds

The need to recognise that people from culturally and linguistically diverse backgrounds are not homogenous is consistently noted in literature and research relating to cultural diversity (Queensland Government, 2010). There are many different cultural and ethnic groups, considerable diversity within each of these groups, and many other factors that affect each person’s identity; a person may also have a bicultural or multicultural heritage. While a person’s cultural, ethnic or religious identity is likely to have a significant influence, either conscious or unconscious, on their beliefs, behaviour, values and attitudes, a range of other factors are also relevant. These include, for example: the person’s age, gender, education and socioeconomic status; their level of proficiency in English; the length of time they have been living in Australia; whether the person is a first, second or later generation Australian; and the extent to which they identify with a particular cultural or ethnic group (Queensland Government, 2010). Different assessments are not required for each cultural or diverse group, but consideration must be given to the instrument or tool used to create or implement the assessment.

An environmental scan is a planned and purposeful process of gathering, analysing, and reporting current data and information about the characteristics, strengths and needs of a demographic group (Pellegrino, Chudowsky & Glaser, 2001).

2.6 Marking and Grading Schemes

Marking and grading schemes perform a vital role in criterion-referenced assessments. They clearly explain how a student is graded or marked, and every mark is accounted for. This assists students to identify and meet the teacher’s assessment criteria while promoting deeper and more self-directed learning. With a clear marking scheme or matrix, students are able to evaluate themselves and teachers can account for every mark the student has gained or lost. This builds student confidence in the assessor and the assessment system as a whole (Curtin University Teaching and Learning, 2010).

 

3 SUITABILITY OF ASSESSMENT PROCESSES FOR PURPOSE

In the assessment used by ‘Easy-go’ staff there is no clear purpose or goal for the assessment. The provided information states only that “The purpose of the test is to divide students into groups of 10 within two levels”. It offers no insight into the reason for allocating recruits to one of the two groups, no explanation of why the group allocation is required, and nothing to explain how the marking determines which group each learner will be placed in.

The test asks the teacher to numerically score the student’s ability to ‘use a pleasant voice’. Exam questions, instructions or statements that are ambiguous or confusing cannot be considered valid or reliable (Drost, 2009). The teacher’s, or in this case the staff trainer’s, knowledge and understanding will differ from the learner’s. Statements such as ‘use a pleasant voice’ may seem clear to the teacher but can appear vague, ambiguous or misleading to the learner.

The marking system implemented in the ‘Easy-go’ assessment is unclear and subject to the individual marker’s interpretation. There is no appropriate marking matrix, scheme or guideline. “If assessment defines the curriculum, so marking schemes define assessment” (Morgan, Dunn, Parry & O’Reilly, 2004).

Effective assessment begins by knowing who the students are and how this will affect their educational outcomes (Rhodes, Ochoa & Ortiz, 2005). ‘Easy-go’ acknowledges that its students come from a range of cultural backgrounds, education levels and age groups; however, its testing process remains inflexible.

‘Easy-go’ provides one test for the new recruits regardless of their age, educational level or cultural background. Furthermore, there is no evidence that the company has conducted an environmental scan to assess baseline English language skills, as recommended by Rhodes, Ochoa and Ortiz (2005).

The test scores the student’s ability to ‘listen carefully’. It does not determine the student’s ability to effectively comprehend the question or context, making the item ambiguous and difficult to measure objectively.

The instruction is also confusing as it is culturally biased, containing a concept that may be unfamiliar to particular groups of learners due to their ethnicity.

Paralanguage refers to the non-speech sounds that speakers can use to modify the meaning of their speech, and these vary across cultures. The power, clarity and volume with which we speak convey meanings that vary between cultures; for example, British-English speakers use volume to convey anger, while Indian-English speakers use loudness to demand attention. There are also cross-cultural differences in the normal baseline volume of speech; for example, Asians and Europeans tend to speak at lower volumes than North Americans. What is considered ‘pleasant speech’ in one culture is not necessarily ‘pleasant’ in another (Menzies, 2015).

4 RECOMMENDATIONS

4.1 Conduct an Environmental Scan

An environmental scan will identify existing resources, services, instruments, tools and programs that are available to the new recruits and learners at ‘Easy-go’. An environmental scan will also help ‘Easy-go’ to better understand the needs of the students (new recruits). The findings of the scan will help ‘Easy-go’ teachers see what services, resources and aids may be required for their learners, along with identifying the student(s) who need greater EAL or cultural awareness support (Snapp, 2006).

4.2 Explain the Purpose

Teachers must provide clear and easy-to-follow criteria and also explain the items on the criteria as well as the scoring standards or scales used. Dunn et al. (2004:242) suggest ways to demystify marking criteria, such as providing students with a copy of the marking criteria and asking them to prepare questions for clarification before they attempt the task. Guesswork will lead to misunderstandings about the set purpose, objectives or goals. Dunn et al. (2004:242) also recommend the distribution of a glossary of assessment terms to students with the marking criteria. Employing consensus moderation processes will ensure that the standards required of students to achieve a particular mark or grade are consistent and comparable across all new recruits.

4.3 Develop and Implement a Clear Marking Scheme or Matrix

Clear, easy-to-follow criteria are equally important at the level of the marking scheme itself. Orsmond, Merry & Reiling (2000) suggest ways to demystify marking criteria, such as providing students with a copy of the marking criteria and asking them to prepare questions for clarification before they attempt the task; guesswork will lead to misunderstandings about the set objectives. McDowell, Sambell & Davison (2009) echo this view: “briefings and handbooks of information are useful but students often need a more substantial induction particularly to help them to understand assessment criteria.” Orsmond, Merry & Reiling (2000) also recommend the distribution of a glossary of assessment terms to students with the marking criteria.

4.4 Use Consistent Language

Ensure test questions are clear, unambiguous and directly related to the assessment context. Avoid complex and convoluted sentence constructions, double negatives, and idiomatic language that may be difficult for students, especially EAL/D students, to understand. It is imperative that students understand the question in order to avoid unintended results or backwash (Carnegie Mellon University, 2013). Be explicit about what the student needs to do and about what ‘Easy-go’ expects of them regarding how they are to answer the questions. For example, rather than asking markers to judge whether a recruit ‘uses a pleasant voice’, the criterion could describe observable behaviours, such as greeting the caller and maintaining an even speaking volume.

4.5 Develop and Implement a Marking Matrix

The purpose of a marking matrix is to score students based on their effort, performance and ability to follow directions. A marking matrix assists in communicating the standards of the assessment task to the students and markers, and it is an effective way to implement standards-based assessment (Allen, 2009). The marking matrix should contain descriptors of the standards for a number of criteria, usually in the form of a grid or matrix (Allen, 2009).
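
For illustration only, one criterion from the ‘Easy-go’ role-play placement test could be set out in a marking matrix as follows. This is a hypothetical sketch of the format; the criterion wording and descriptors are assumptions made for the purpose of the example and are not part of the company’s current scheme.

Criterion: Listens to the caller and responds to the enquiry
1 mark – Frequently talks over the caller or misses key details of the enquiry.
2 marks – Allows the caller to finish and addresses most details of the enquiry.
3 marks – Allows the caller to finish, confirms the key details back to the caller and addresses the enquiry in full.

A descriptor set of this kind allows different trainers to award the same mark for the same performance, supporting the reliability principles outlined in section 2.3, and gives learners a clear account of how every mark is gained or lost.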

5 REFERENCES

 

  • Allen, C. (2009). How to make a marking rubric. Retrieved from http://www.mdc.edu/sailearn/documents/4.1%20Rubric%20Workshop%20Handout-Mary%20Allen.pdf
  • Assessment Matters. (2012). Retrieved from https://app.griffith.edu.au/assessment-matters/docs/design-assessment/principles/reliable
  • Curtin University Teaching and Learning. (2010). Developing Appropriate Assessment Tasks. Retrieved from https://clt.curtin.edu.au/local/downloads/learning_teaching/tl_handbook/tlbookchap5_2012.pdf
  • Haddad, W., & Demsky, T. (1995). Education policy-planning: An applied framework. Retrieved from http://www.unesco.org/education/pdf/11_200.pdf
  • Jones, D. (2015). Assessment For Learning. Retrieved from https://dera.ioe.ac.uk/7800/1/AssessmentforLearning.pdf
  • Kulasegaram, K., & Rangachari, P. (2018). Beyond “formative”: assessments to enrich student learning. Retrieved from https://www.physiology.org/doi/full/10.1152/advan.00122.2017
  • McDowell, E., Sambell, K., & Davison, G. (2009). Assessment for learning: a brief history and review of terminology. Retrieved from http://nrl.northumbria.ac.uk/1433/
  • Menzies, F. (2015). Paralanguage Across Cultures. Retrieved from https://cultureplusconsulting.com/2015/04/16/paralanguage-across-cultures/
  • Morgan, C., Dunn, L., Parry, S., & O’Reilly, M. (2004). The student assessment handbook. London: Routledge Falmer.
  • Orsmond, P., Merry, S., & Reiling, K. (2000). The Use of Student Derived Marking Criteria in Peer and Self-assessment. Assessment & Evaluation in Higher Education, 25(1), 23-38. doi: 10.1080/02602930050025006
  • Pellegrino, J., Chudowsky, N., & Glaser, R. (2001). Knowing what students know (1st ed.). Washington, DC: National Academies Press.
  • Queensland Government. (2010). Working with people from culturally and linguistically diverse backgrounds. Retrieved from https://www.communities.qld.gov.au/resources/childsafety/practice-manual/prac-paper-working-cald.pdf
  • Ramsden, P. (2000). Learning to teach in higher education. London: Routledge.
  • School education: the National Plan for School Improvement – Parliament of Australia. (2019). Retrieved from https://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/pubs/rp/BudgetReview201314/SchoolNPSI
  • Carnegie Mellon University. (2013). Creating Exams – Eberly Center. Retrieved from https://www.cmu.edu/teaching/assessment/assesslearning/creatingexams.html
  • Van der Walt, J., & Steyn (Jr.), H. (2019). The validation of language tests. Retrieved from https://www.ajol.info/index.php/spl/article/viewFile/116343/105877

APPENDIX 1

Scenario

“Easy-go” language college has just started a course to train recruits in basic telephone skills suitable for working in a call centre. They have made a request for an evaluation of the assessment processes used to place students in an appropriate level. Provided with the request was the following information.

Staff

Staff with a range of age and experience are involved in the teaching program.

Students

The students are a mix of ages and language proficiency, including native speakers and EAL/D students.

Pathway

Students come from a range of education backgrounds including high school and mature age students who are undergoing a career change.

Curriculum

The course comprises the following units:

  1. Being prepared, be professional
  2. Answering a customer enquiry          
  3. Putting a caller on hold          
  4. Giving spoken feedback signals                    
  5. Directing the conversation     
  6. Leaving a positive lasting impression
  7. Phone etiquette          
  8. Dealing with difficult calls      

Test Purpose

The purpose of the test is to divide students into groups of 10 within two levels.

Placement test

The students have a role-play phone call with a teacher. The students are evaluated on the basis of the following rubric.
