I had the pleasure of being at the CACUSS Assessment Institute recently, and it was such a rewarding and exciting experience for me. I met so many professionals from across the country who are looking to improve their programs and have more data to make informed decisions. It was truly inspiring to be a part of some folks' assessment journey and hear all the 'ah-ha' moments that happened along the way. There are very few things in my career as rewarding as helping people become excited (instead of scared!) about assessment. The fact that my mentee group bought me a cool cat card, signed it with their thanks, and made an effort to show their appreciation was a cherry on top. If you were in my mentor group and you are reading this: you inspired me so much with your dedication, engagement, and willingness to challenge yourself to be a better assessment professional - thank you!
There were a lot of themes that came out of the institute, including decolonizing assessment, which I am still digging into and processing. I hope to share some of my key takeaways from that conversation that I think will help me be a better educator in the future and more aware of the idea: "not about me without me". In the meantime, I wanted to share a conversation that I feel like I had over and over at the institute - thinking about assessment as a process, and not just a practice.
There are a lot of assessment models out there, which discuss identifying outcomes, selecting measures, engaging in assessment, using data to identify and bridge gaps, making program improvements, and revising outcomes. This is very important to ensuring that we think about assessment as something we are constantly doing, and not just a box to tick and say we are done. However, I think we tend to get caught up in doing whatever kind of assessment or evaluation we prefer or feels safest, or in many cases, whatever our administration requests of us. It can be very easy to become complacent and rely on an annual survey to tell us everything we need to know, or to count the number of students who participated in our programs. But what does that actually tell us? I want to know 'why' our students feel that way, not just that they do, or not just that a handful accessed our services. I want to know what they gained from that experience.
In an assessment presentation in 2017, I put together what I like to call the Puzzle Pieces of Assessment (Wills, 2017) to highlight that a variety of tools and techniques need to be used in combination to more effectively share our results. From my experience in student affairs, we generally focus most of our time, effort, and resources on the pink and purple pieces - are our students satisfied, and how many participated? While I do think that is a vital part of our work, I also believe we need to think beyond experience and engagement, and really spend time intentionally digging into learning and effectiveness.
If we think about our programs as a puzzle, without all the pieces you might have to make assumptions about the rest of the picture. That's similar to how I feel about assessment - without all of the pieces, it can be difficult to share a meaningful story that not only tells the 'what' but also includes the 'why'. Why do our students learn from our programs? Why are there gaps between what we thought they would learn and what they actually acquired? Why is our program working [or not]? While qualitative data in surveys can provide a lot of this context, using formative assessment throughout experiences can also give us that information in advance. We shouldn't have to wait until we finish a program to send along a survey and realize that our students might have been satisfied but may not have learned what our outcome intended. Instead, formative assessment can help us learn about our students throughout the experience, bridge gaps along the way, and gain context. That way, when we do use surveys, we have context for why we received some of those results. It also means we don't have to wonder, "So we had 500 students attend, but we don't actually know what they gained from the experience". Instead, we can say that we had 500 students, and not only does our survey data indicate they were satisfied with their experience, they demonstrated that they can list 3 on-campus resources for personal support in the future. See the difference?
People often seem to think I only value formative assessment, but that isn't true. I think formative assessment is a very easy tool to use to gain information in a meaningful, manageable, and measurable way, especially for student affairs professionals/educators who facilitate workshops, orientation programs, and 1:1 coaching. However, I don't think that formative assessment can paint a whole picture of our students' experience on its own... just like I don't think that you can make ALL your decisions based on survey results on a Likert scale. Even more importantly, I don't think that quantity alone = learning and effectiveness. So instead of thinking about assessment as something you have to do after a program, or a box to tick, think about how you can view it as a process that happens both during and after a learning experience. Think about how demonstrating learning, evaluating satisfaction, and including metrics together can help you prove your value, worth, and program effectiveness. After all, as educators we have a responsibility not only to provide a learning experience but also to continue improving it for our current and future students!
A creative educator striving to enhance the holistic student experience and committed to exploring personal strengths and fulfillment.