
Program Learning Outcomes Assessment

The PLO Assessment Process is one of the three components of the Integrated Review of Academic Programs (IRAP).  PLO review and assessment take place every three years and include:

  • Creating or revising program learning outcomes for every degree program (or major)
  • Mapping programs to curricular and co-curricular experiences from introduction through mastery
  • Following a department assessment plan to review PLOs
  • Conferring as a program regarding PLO assessment results against department standards for mastery
  • Revising curriculum and/or assessment process in light of PLO Assessment results


The posts below discuss various aspects of our PLO Assessment process.  Please comment and share–assessment should lead to a conversation, not a lecture.

Program Assessment Report – Timeline for Departments

Faculty Perspectives about Assessment

Image: cover of the winter 2017 AAHLE quarterly edition

What happens when faculty are at the heart of learning outcomes assessment?

The winter 2017 edition of the Association for the Assessment of Learning in Higher Education (AAHLE) quarterly focuses on faculty perspectives on and experiences with assessment.  Because assessment conversations are so often led by administrators, many faculty report feeling like they are on the outside looking in (or looking away).  At Boise State, however, faculty are at the heart of our assessment work.  The articles in this edition bring faculty voices into conversations about assessing student learning, ranging from increasing evaluation reliability to questions of process and perspective.

How will PAR Reports be shared?

A question was asked about how the outcomes of the Program Assessment Reports will be reviewed and then shared.  As we state in our recruitment information for PAR Peer Reviewers:

Boise State has incorporated its process for program review and has, in the tradition of academic scholarship, added peer review for Program Assessment Reports (PAR). After reports are completed and submitted by May 1st, teams of faculty will provide peer review to the departments. Reviewers, working in interdisciplinary triads, will apply a standard rubric to each report and generate a response for each department. The intent is to accurately identify the strengths and weaknesses of the assessment reports and processes for our peers. Ultimately, programs will have clearer visions of needed improvements for teaching and learning to provide an ever-better degree for our students.

Once the reviews are completed, feedback reports will be shared with the departments/programs.  Part of the process includes a feedback loop, with a departmental response to the evaluation due October 1st.  What might the responses include?  If action has already been taken on questions raised during the review process, for example, the program can document that.  Or, if there is a misunderstanding of the report information, clarification can be offered.

Because the PAR process is part of our accreditation responsibilities, a summary of the data will be included in our accreditation report. That report is available to the campus; however, the level of detail is limited for most programs.  We’ll be highlighting a few programs and sharing those with NWCCU evaluators, as we have done in the past.  Prior reports are available for anyone who would like to review them.

Anthropology Program Assessment Overview

Using Assessment to Guide Dynamic Academic Programs

As a small department, Anthropology has been using assessment as a tool to continuously gather empirical data and other measures to answer questions about their students’ learning, experiences, and the paths they take after earning their degree at Boise State.  This brief case provides a glimpse into the way that this department, which sits in the heart of the Arts and Sciences, embraces assessment data in its various forms to inform change.


In 2005 assessment expert Barbara Walvoord visited Boise State. John Ziker, then a new professor in the department, adopted Walvoord’s “clear and simple” approach of drawing on what we do every day as faculty and departments to support our assessment questions. The department approaches assessment as a “committee of the whole,” reviewing data and instituting curricular and other changes based on assessment findings.

Program Learning Outcomes and Assessment Methods

Anthropology posts its four undergraduate learning outcomes on its website. Each learning outcome is broadly written, with related sub-outcomes listed beneath it. Having recently revised these outcomes as a WIDER Persist project, the department has streamlined them into groups that are covered through the program curricula.

Like the PLOs, the curriculum map is posted on the department website so that students, faculty, and advisors can easily refer to it. According to Ziker, advisors and faculty currently use the learning outcomes in conversations with new students and are considering building the curriculum map into advising meetings with upper division students as part of their indirect assessment methods.

The department used final portfolios as the centerpiece of its assessment efforts for decades. These binders, submitted by students in their final year, were collected during the fall semester, evaluated in person by the faculty beginning at Thanksgiving, and discussed in a senior interview with each student prior to graduation.  A departmental rubric guided the faculty review and helped standardize the feedback to the students and for the department.  Until recently, the process was labor intensive and restrictive, since the binders needed to remain in the department while being available to multiple faculty simultaneously. The faculty used these portfolios to consider student progress toward program goals, and the portfolios served as the basis for curricular and programmatic changes considered by the faculty.

In 2015 Anthropology shifted to using Digication e-Portfolios. The online format offered a number of benefits: the faculty could review portfolios on their own schedule and from anywhere, the department was able to “collect” e-portfolios in both fall and spring, and the new tool made reading and responding to students much easier and less time consuming. On the other hand, faculty have noted that the number of artifacts students add to their e-Portfolios has declined. Students seemed more inclined to save conference fliers or event handouts when they could add them to a paper portfolio. It is not clear what triggered this change, and understanding it better might be an area of exploration at some point. As a result of the tendency to include less in the e-Portfolio, the department is looking at ways to change or increase messaging to current students about building the e-portfolio.

Another innovative and beneficial assessment effort conducted by the department was a study of alumni LinkedIn profiles.  Ziker explained that many graduates connect with him through LinkedIn.  Through that platform, he was able to view alumni job titles and then connect those titles to the Department of Labor employment category codes.  In addition, the department has been increasing its efforts to collect “testimonials.”  These brief snapshots of Anthropology graduates are featured prominently on the department website to tell the story of the exciting paths graduates have taken.  In addition to providing excellent public relations, testimonials and LinkedIn pages contribute a rich source of indirect data for the program.  In fact, these sources and others contributed to the decision to launch a new 12 credit hour online certificate in Design Ethnography, a collaborative endeavor between Anthropology and the College of Innovation and Design.

Connecting Outcomes and Curriculum

In 2012, with the introduction of the University Learning Outcomes and the Foundational Studies Program, Anthropology made adjustments to their curriculum to align the coverage of written and oral communication, critical thinking, and teamwork. New Communication in the Disciplines (CID) and Finishing Foundations (FF) courses serve as the home for content and skills to meet these university learning outcomes which have corollaries in the Anthropology outcomes.  For example, while the term paper remains a signature assignment in the Finishing Foundations course, the department added peer review in the CID and FF courses to support the development of teamwork strategies.

The CID and FF courses also provide scaffolding for the program to build in student reflection on the PLOs and the way these outcomes prepare students for life after graduation. They also begin priming students to provide feedback to the department through the graduating student survey and alumni surveys. Collecting this end-of-program indirect assessment is key to the department’s program assessment plan.  Anthropology has emphasized collecting narratives, or testimonials, from its students to share with future students and to tell the story of what can be done with an anthropology degree. The testimonials, which are linked from the department website, serve several purposes: they provide a narrative about possible paths for new students, they celebrate the accomplishments of alumni, and they inform faculty conversations about curricular changes.  These data contribute to the courses and to the program in general.

Another department activity that both generates assessment data and supports collecting data is the annual majors meeting. This meeting gives students and faculty an opportunity to interact, lets the department share information about events and activities, and serves as an annual check-in for students.

As the program moves into its 2017 Program Assessment Reporting cycle, it will revisit the outcomes and assessment plans that it has put into place since its last program review in 2011. From there, the faculty will determine its next steps.  Questions about the ways that Anthropology continues to collect and evaluate assessment data can be directed to Dr. John Ziker, Chair, Department of Anthropology.

15 1/2 Weeks and Counting Down: Program Assessment Reporting

We are 15 1/2 weeks and counting down until the PAR Reports will be due this year (May 1, 2017).  As I write this, we are formalizing our last details for recruiting peer review faculty and ramping up for this semester’s workshops as well as briefing our Instructional Design staff in the Instructional Design and Educational Assessment Shop (IDEA Shop) to support your work.

Here is a check list for you to consider:

1. Have your faculty reviewed the Program Learning Outcomes for each program? Do changes need to be made?  The PLOs on record for each program are posted on our assessment web page, and registration for the PLO Workshop is open.

2. What is the status of your program curriculum maps?  Have your faculty reviewed them?  A curriculum mapping resource is available online, and a workshop is scheduled for February 3rd from 9-10:30 am; registration is open.

3. What is the status of your assessment plan?  Programs are at various points in their assessment cycles — the report will ask you to provide an update on the program’s assessment plan, results, and next steps.  Workshops to help faculty think through the fundamental steps of an action-based assessment plan are also on the calendar for this spring.  See the CTL Workshop calendar for events on February 17th and March 9th.

4. Have you reviewed the Google folders shared with you, which include the reporting documents for each program?  (If not, search “shared with me” in Google Drive for my name; call or email me if you need assistance with this step.) Add the shared folder to your drive for easy access.

Some departments have found it helpful to hold a workshop for their own faculty or for the team that is or will be working on the PAR process. If you would like to schedule a workshop, please contact Teresa Focarile in the Center for Teaching and Learning.

Please know that we in Institutional Research, The Center for Teaching and Learning, and the Instructional Designers in the IDEA Shop are here to support your work.

Stop, Drop, and Reflect: Tips to Quickly Evaluate Courses for Student Learning

Image: a coffee mug about teaching (Image Source: TheHoldFastery)

I saw a post this morning from a teacher friend that captures how many of us feel at the end of a semester.  Maybe everything is not on fire for you, but the last push of grading leaves many of us feeling like it is.  When we are that depleted, the last thing we want to do is spend time reflecting on what those grades tell us.  But stop before you run off — course assessments capture the story of our students’ learning and our teaching.

Assessment guru Linda Suskie reminds us that, despite exhaustion, we need to use this moment to dig into assessments.  She recommends, “pick at least one key test or assignment in one course whose scores aren’t where you’d like them. Your analysis and reflection on that one test or assignment will lead you into the habit of using the assessment evidence in front of you….”  As Suskie argues, investing time right now will improve your teaching.  I know the eggnog is tempting.  Pour yourself a cup and sit down with one assignment to see what it tells you.  See the full blog post; it’s worth a quick read.

A Helpful Assessment Guidebook

So, all of this assessment talk makes your head spin?  You have company, and (luckily!) plenty of readily available assistance.

One resource that I found not too long ago is “Learning Outcomes Assessment: A Practitioner’s Handbook” (Goff et al., 2015), a publication of the Higher Education Quality Council of Ontario (HEQCO).  This very readable online text draws from higher education scholars and provides simple strategies that departments can use to design learning outcomes and assessment approaches.  It also provides several case studies to contextualize the theoretical framework for assessment explained in the book.

As others have written, successful assessment hinges on shifting the organizational culture from one that looks at assessment as a hoop through which academics must jump every five or so years to one that sees assessment as genuinely supporting teaching and learning.  To that end, the book also provides metrics that programs can use to evaluate their own efforts to develop assessment approaches and to follow through with those plans.  Formative and summative assessment of student learning, of our programs, and of our processes all support the ongoing effort to harness the transformative power of assessment.

Learning Outcomes Verbs

Sorting through the archives of the National Institute for Learning Outcomes Assessment (NILOA), you will find scores of helpful papers intended to guide efforts to craft and conduct assessment of learning in higher education.  As our University embarks on a new phase of program learning outcomes assessment, I highlight one article that may help those who are re-evaluating program learning outcomes statements and considering how to assess them.  “To Imagine a Verb: The Language and Syntax of Learning Outcomes Statements” (Adelman, 2015) offers an alternative to Bloom’s taxonomy, which, while useful, can lead to passive statements with no clear path to assessment.  Adelman earned his stripes as an assessment expert in the US and abroad, serving as a Senior Research Analyst at the US Department of Education for 27 years and as a Senior Associate at the Institute for Higher Education Policy since 2006.

What follows are a few guidelines gleaned from Adelman’s essay.

1. Think carefully about the language used for learning outcomes statements. The statement is a declaration of learning that graduating students demonstrate as a result of a particular academic program.

2. Adelman counsels pairing verbs that are operational (e.g. gathering, analyzing, creating, arranging) with the object of the operation to clearly demonstrate learning (e.g. “analyzing regulatory conditions,” or “visually displays the functional groups of organic molecules”).

3. Work recursively between current or planned summative projects or assignments and learning outcomes statements.  For every learning outcomes statement, clearly identify at least three projects, assignments, or activities so that students and faculty alike will be able to look at a program learning outcome and say, “Ah! I know what that means.”

4. Statements including “ability,” “capacity,” “teamwork,” “communicate,” and “critical thinking” steer program learning outcomes into foggy territory. All are essentially unobservable and can be replaced with words that provide transparency for faculty and students.

5. Adelman provides a set of “productive active, operational verb groups” structured around function rather than level of thinking, as Bloom’s taxonomy is.  Words may appear in more than one grouping, and the following serve as examples rather than an exhaustive list.  See below for a sampling of this approach:

Verbs indicating the modes of student characterization of the objects of knowledge or the materials of production, performance, or exhibit

Categorize, classify, define, describe, determine, frame, identify, prioritize, specify

Verbs describing what students do in processing data and allied information

Calculate, determine, estimate, manipulate, measure, solve, test, arrange, assemble, collate, organize, sort

Verbs describing what students do in explaining a position, creation, set of observations, or a text

Articulate, clarify, explicate, illustrate, interpret, outline, translate, elaborate, elucidate

Verbs falling under the cognitive activities we group under “analyze”

Compare, contrast, differentiate, distinguish, formulate, map, match, equate

Verbs describing what students do when they “inquire”

Examine, experiment, explore, hypothesize, investigate, research, test

Verbs describing what students do when they combine ideas, materials, observations

Assimilate, consolidate, merge, connect, integrate, link, synthesize, summarize

Verbs that describe what students do in various forms of “making”

Build, compose, construct, craft, create, design, develop, generate, model, shape, simulate

Verbs that describe the various ways in which students utilize the materials of learning

Apply, carry out, conduct, demonstrate, employ, implement, perform, produce, use

Verbs that describe various executive functions students perform

Operate, administer, control, coordinate, engage, lead, maintain, manage, navigate, optimize, plan

Verbs that describe forms of deliberative activity in which students engage

Argue, challenge, debate, defend, justify, resolve, dispute, advocate, persuade

Verbs that reference the types of communication in which we ask students to engage:

Report, edit, encode/decode, pantomime (v), map, display, draw/diagram

Collaborate, contribute, negotiate, feed back

Verbs that describe what students do in rethinking or reconstructing:

Accommodate, adapt, adjust, improve, modify, refine, reflect, review

For more of the discussion and description of the approaches Adelman advocates for writing learning outcomes statements, please see the full NILOA paper.