Appendices

Appendix A - Forms

Faculty Roster Form

(for review of full-time faculty candidates prior to interview)


Appendix G - Pay Method Code Table – 2020

Pay Method Code Table (July 2020)

Appendix J - Template for Writing Learning Outcomes

This template may be used to write learning outcomes for a workshop, course, or program.

To use the template, work backwards: begin with the far-right column. Note that “in here” means in the course, program, or workshop, while “out there” refers to what students do after they leave, and for an extended period afterward. Stiehl (2017) even suggests this extended period lasts for life.

Directions

  1. Enter answers to the question about the intended outcomes (far-right column).
  2. Identify assessment tasks.
  3. List the skills that must be mastered to demonstrate the outcome.
  4. List the concepts and issues that students must understand to demonstrate the outcome.
Concepts & Issues: What must students understand to demonstrate the outcome?
Skills: What skills must students master to demonstrate the outcome?
Assessment Tasks: What will students do “in here” to demonstrate the outcome?
Intended Outcomes: What do students need to be able to DO “out there” that we are responsible for “in here”?
     
Adapted from the Program/Course-Workshop Outcomes Guide, a resource by Ruth Stiehl (2017) in The New Outcome Primers Series 2.0, which includes six “primers” on outcomes and related topics. Published by The Learning Organization, Corvallis, Oregon. Visit www.outcomeprimer.com.

Appendix K - Assessing the Quality of Intended Outcome Statements

Copyright © 2016 Stiehl & Sours. Limited rights granted to photocopy for instructor and classroom use.

Template: Scoring Guide—Assessing the Quality of Intended Outcome Statements

Rating scale: 1 = absent, 2 = minimally met, 3 = adequately met, 4 = exceptionally met

For each characteristic of good learning outcome statements below, circle a rating and note any suggestions or improvements.

1. Action (rating: 1 2 3 4)
All the statements are written in active voice, and the action words have been carefully chosen to describe the intention.

2. Context (rating: 1 2 3 4)
All the statements describe what you envision students doing “after” and “outside” this academic experience, because of this experience.

3. Scope (rating: 1 2 3 4)
Given the time and resources available, the outcome statements represent reasonable expectations for students.

4. Complexity (rating: 1 2 3 4)
The statements, as a whole, have sufficient substance to drive decisions about what students need to learn in this experience.

5. Brevity and Clarity (rating: 1 2 3 4)
The language is concise and clear, easily understood by students and stakeholders.




Appendix L - Program Learning Outcomes Curriculum Map

When mapping program learning outcomes (PLOs) to curriculum or assessment, be sure to indicate who is doing the mapping (individual faculty names or a department), the name of the program, and the date.

Begin by listing each PLO. Add rows if needed.

# Program Learning Outcome
1
2
3
4
5
6
7

Continue by following these steps. Examples are provided in the first three rows. Add rows if needed.

  1. Enter each PLO in the first row.
  2. Enter the appropriate course (point of instruction) in the first column.
  3. For each PLO, indicate whether and when the required skills for that outcome are introduced (I) for the first time, reinforced (R), giving students a chance to practice the skill(s), or emphasized (E), with an expectation of student mastery. Not every outcome will be covered in every course, but every outcome should be covered at some level and assessed at some point in the program.
Point of Instruction (When are the required skills to achieve the PLO covered, and when are students required to demonstrate mastery?) | PLO #1: _______ | PLO #2: _______ | PLO #3: _______ | PLO #4: _______ | PLO #5: _______ | PLO #6: _______
Example: Course 1 | I | | I | I | |
Example: Course 2 | R | I | E | R | |
Example: Course 3 | | R | E | E | |

Appendix M - Course Learning Outcomes Curriculum Map

When mapping course learning outcomes (CLOs) to curriculum or assessment, be sure to indicate who is doing the mapping (individual faculty names or a department), the name of the course, and the date.

Begin by listing each CLO. Add rows if needed.

# Course Learning Outcome
1
2
3
4
5
6
7

Continue by following these steps. Examples are provided in the first three rows. Add rows if needed.

  1. Enter each CLO in the first row.
  2. Enter the appropriate point of instruction in the first column.
  3. For each CLO, indicate whether and when the required skills for that outcome are introduced (I) for the first time, reinforced (R), giving students a chance to practice the skill(s), or emphasized (E), with an expectation of student mastery. Not every outcome will be covered at every point of instruction, but every outcome should be covered at some level and assessed at some point in the course.
Point of Instruction (When are the required skills to achieve the CLO covered? Enter unit, chapter, point in time, etc. in this column.) | CLO #1: _______ | CLO #2: _______ | CLO #3: _______ | CLO #4: _______ | CLO #5: _______ | CLO #6: _______
Example: Ch. 1 | I | I | I | R | |
Example: Ch. 2 | R | I | R | E | |
Example: Last week in semester | R | E | E | E | |

Appendix N - Assessment Plan Templates

Program Assessment Plan

Save the document and add columns as needed for additional program learning outcomes (PLOs).

Assessment Plan for [insert program name and code]

Cycle: [insert cycle dates]

 | PLO #1: Enter PLO. | PLO #2: | PLO #3:
Measure(s) (How will the PLO be assessed?) | Enter the measure, specifying in what class it is administered, and provide any scoring details. | |
Achievement Target(s) (What is an acceptable performance level?) | Enter the percent of students expected to meet the standard, and clearly state the standard. | |
Which, if any, ILO(s) are supported?* | Enter any institution-level outcome supported by the PLO. | |

Appendix O - Assessing the Assessment

Adapted from the “Template: Scoring Guide—Assessing the Quality of Content and Assessment Description” (available at http://outcomeprimers.com/templates/).

Template: Assessing the Quality of an Assessment Plan

Rating scale: 1 = absent, 2 = minimally met, 3 = adequately met, 4 = exceptionally met

For each area to assess below, circle a rating and note any suggestions or improvements.

1. Purpose and Alignment (rating: 1 2 3 4)
Selected assessments purposefully measure an intended outcome.

2. Content (rating: 1 2 3 4)
Selected assessments are affirmed by content experts (faculty, staff, or literature).

3. Accurate Information (rating: 1 2 3 4)
Selected assessments provide information that is as accurate and valid as possible.

4. Multiple and Direct Measures (rating: 1 2 3 4)
The assessment plan includes multiple measures, with at least one direct, authentic measure of student learning for each learning outcome.

5. Appropriate Standards (rating: 1 2 3 4)
Achievement targets are clearly stated and justified by faculty who teach the related content.

6. Data Collection (rating: 1 2 3 4)
Data collection processes are explained and appropriate.

7. Use of Results (rating: 1 2 3 4)
The plan includes ways to share, discuss, and use the results to improve student or institutional learning.




Appendix P - Outcomes and Assessment Evaluation Rubric

Adapted from the Assessment Progress Template (APT) Evaluation Rubric

James Madison University © 2013 Keston H. Fulcher, Donna L. Sundre & Javarro A. Russell

Full version: https://www.jmu.edu/assessment/_files/APT_Rubric_sp2015.pdf

Rating scale: 1 = Beginning, 2 = Developing, 3 = Good, 4 = Exemplary

1. Student-centered learning outcomes (Clarity and Specificity)
1 – Beginning: No outcomes stated.
2 – Developing: Outcomes present, but with imprecise verbs (e.g., know, understand) and a vague description of the content, skill, or attitudinal domain.
3 – Good: Outcomes generally contain precise verbs and a rich description of the content, skill, or attitudinal domain.
4 – Exemplary: All outcomes stated with clarity and specificity, including precise verbs and a rich description of the content, skill, or attitudinal domain.

2. Course/learning experiences that are mapped to outcomes
1 – Beginning: No activities/courses listed.
2 – Developing: Activities/courses listed, but the link to outcomes is absent.
3 – Good: Most outcomes have classes and/or activities linked to them.
4 – Exemplary: All outcomes have classes and/or activities linked to them.

3. Systematic method for evaluating progress on outcomes

A. Relationship between measures and outcomes
1 – Beginning: Seemingly no relationship between outcomes and measures.
2 – Developing: At a superficial level, it appears the content assessed by the measures matches the outcomes, but no explanation is provided.
3 – Good: General detail about how outcomes relate to measures is provided; for example, the faculty wrote items to match the outcomes, or the instrument was selected “because its general description appeared to match our outcomes.”
4 – Exemplary: Detail is provided regarding the outcome-to-measure match. Specific items on the test are linked to outcomes, and the match is affirmed by faculty subject experts (e.g., through a backwards translation).

B. Types of measures
1 – Beginning: No measures indicated.
2 – Developing: Most outcomes assessed primarily via indirect measures (e.g., surveys).
3 – Good: Most outcomes assessed primarily via direct measures.
4 – Exemplary: All outcomes assessed using at least one direct measure (e.g., tests, essays).

C. Specification of desired results for outcomes
1 – Beginning: No a priori desired results for outcomes.
2 – Developing: A desired result is stated (e.g., student growth, comparison to the previous year’s data, comparison to faculty standards, performance versus a criterion) but without specificity (e.g., “students will perform better than last year”).
3 – Good: The desired result is specified (e.g., student performance will improve by at least 5 points next cycle; at least 80% of students will meet criteria). “Gathering baseline data” is acceptable for this rating.
4 – Exemplary: The desired result is specified and justified (e.g., “Last year the typical student scored 20 points on measure x. Content coverage has been extended, which should improve the average score to at least 22 points.”).

D. Data collection and research design integrity
1 – Beginning: No information is provided about the data collection process, or data were not collected.
2 – Developing: Limited information is provided about data collection, such as who and how many took the assessment, but not enough to judge the veracity of the process (e.g., thirty-five seniors took the test).
3 – Good: Enough information is provided to understand the data collection process, such as a description of the sample, testing protocol, testing conditions, and student motivation. Nevertheless, several methodological flaws are evident, such as unrepresentative sampling, inappropriate testing conditions, a single rater for ratings, or a mismatch with the specification of desired results.
4 – Exemplary: The data collection process is clearly explained and is appropriate to the specification of desired results (e.g., representative sampling, adequate motivation, two or more trained raters for performance assessment, a pre-post design to measure gain, a defended cutoff for performance versus a criterion).

Appendix Q - Template for Writing an Assessment Report

In most cases, reports must be adjusted for the specific audience, and some type of summary should be part of the report. However, the following information should always be included as report components. Save the table and expand rows as needed.

Outcome | Assessment Tool(s) | Standard(s) (or Benchmark) | Achievement Target(s) | Results (include comparison to previous cycle) | Improvement Strategies for Next Cycle

Explanations

  • Outcome – Each outcome should be reported separately.
  • Assessment tool – This is an assignment, test, project, etc., that is used to measure an outcome. More than one assessment may be used for a given outcome, but every outcome should have a unique assessment or unique assessment items.
  • Standard – This is a minimum score, rating, or other unit of achievement that is acceptable for satisfactory performance.
  • Achievement target – A target is the minimum percentage of students who are expected to meet the standard on an assessment.
  • Results – Report the results as they relate to the achievement target, discuss any findings and related details, and discuss noted trends as they relate to previous cycle(s). For example:
    • If the target is that 75% of students will score at least 90 on an exam and 82% do, write, “82% of students achieved a score of 90 or higher.”
    • Include the number of students who participated and any other relevant details, such as fewer class meetings due to college closures or a lower or higher number of students than usual.
    • Note any trends toward increased or decreased performance, offering thoughts on contributing factors.
  • Improvement strategies for next cycle – Assessment reports should always include a description of any strategies planned in an effort to improve results.