SCUP 2018 Session Recap
by Janet Webb, Institutional Projects Coordinator, Laramie County Community College
Session: Connecting the Dots: Assessment, Accountability, Analytics, and Accreditation
Presenters: Linda Baer | Ann Hill Duin, Professor, University of Minnesota-Twin Cities
As a first-time attendee at SCUP’s annual conference, and as a student member, I was seeking any sort of familiarity when planning which sessions to attend. So when I spotted the session, “Connecting the Dots: Assessment, Accountability, Analytics, and Accreditation,” led by Linda L. Baer and Ann Hill Duin, I was comforted. I had not only read but also cited Baer’s article, for which the session was named, in a term paper the previous semester.
The session did not disappoint. It provided insight into many of the dimensions of institutional performance, how competing demands within these dimensions of institutional performance can harm data efforts, and how planners can “connect the dots” to manage these competing demands.
There are four dimensions of institutional performance: accountability, accreditation, assessment, and analytics. Generally, accountability and accreditation influence how you perform assessment and analytics. Unfortunately, accountability and accreditation make competing demands that can bring about paradoxes.
Accountability must satisfy demands from state priorities, academic concerns, and market forces. These demands are not always aligned and often pull away from each other, creating tension.
Accreditation, on the other hand, serves various roles for an institution. These roles include gatekeeper, auditor, and driver of continuous improvement—three roles that can compete with each other, creating paradoxes.
These competing demands can lead to fragmented assessment and analytics. Baer suggests that in order to “connect the dots” between these fragmented efforts, we must ensure our institutions focus on the core functions of assessment and analytics: using the data we receive to gain insight and take action. This will help us navigate the multiple dimensions of performance and the paradoxes they bring.
Duin brought a real-life example to the presentation, discussing the issue of disaggregated data at the University of Minnesota-Twin Cities (UMN). Her case study highlighted the institution’s efforts to improve alignment in learning analytics. As UMN studied what was helping students most in online courses, it realized its data was disaggregated. So UMN focused on “building out a comprehensive data platform as a way to normalize data across systems to then draw insights from the full student use.” This work improved the alignment of metrics and outcomes across the institution.
During the session, we were given an opportunity to discuss these issues with each other. We were asked to consider the dimensions of performance and answer a series of questions about our home institutions.
These discussions elicited a variety of perspectives and approaches and encouraged the exchange of ideas among the participants in my discussion group.
Overall, the session was informative, engaging, and highly relevant, as many institutions face competing demands from accountability, accreditation, assessment, and analytics. More intentional effort, planning, and integration are needed to improve performance in each of these dimensions.
As Baer said, “As planners you need to be concerned about where do you fit [within the framework of paradoxes], when, and what are the metrics that matter.” So we must position ourselves to make our best effort at “connecting the dots and mov[ing] from data to insight to action to improve planning accountability.”
More recaps coming soon.