This One Strategy Changed the Way I Think About Assessment
By Dorothy Hayden, Virginia Military Institute
Why do we assess our programs? There are plenty of jargon-laden reasons we can give to explain why we assess. When I started exploring the topic of assessment in career services about five years ago, I desperately searched for a document that would tell me why such diverse stakeholders wanted learning outcomes, survey results, and first-destination data in rapid succession. My research helped me understand just how diverse our stakeholder groups are, but it was not giving me a good reason why I should assess.
What motivates you to assess your programs, your advising appointments, or other work? I kept reading book after book on assessment and evaluation looking for my answer. Nothing. Finally, I came across the idea of iteration assessment, which requires building, operationalizing, reflecting on, and improving the process. The word reflect stood out to me. I had always considered assessment a purely quantitative, analytical process. Viewing assessment through the lens of reflection, and reading Marilee Bresciani Ludvik's 2006 book, "Outcomes-Based Academic and Co-Curricular Program Review: A Compilation of Institutional Good Practices," I found a meaningful connection between assessing programs and leading programs. I took a step back and thought about a few of my office's most recent assessments. What was the common goal? Why take the time to do pre- and post-testing? What did this assessment help us achieve? I realized that the common underlying theme of all this work was to help our students.
Fast forward to today: I have developed a set of questions that helps me concurrently build a program and its assessment.
- How does this program benefit our students? As my colleague Dr. Alicia Monroe so aptly says, "It's all about the students." When we begin to develop an idea, we need to ask ourselves how our students will benefit from it. In offices that are frequently understaffed, every program and every assessment needs to have a purpose. Our goal in career services is to serve our students.
- How will the time invested in evaluating X help you to answer questions previously unanswered? Whether you’re running a pilot program or a career fair for the 100th time, you need to be clear on what you are hoping to assess. Do not collect data for the sake of collecting data.
- When during the iteration process will I evaluate and reflect on the process? If you don't identify targeted times to stop and review, you probably won't assess until June. Set time aside to review your program, the evaluation tool, and preliminary results. This can be as simple as meeting with your team to review the results collected over a set period of time.
- How can we improve this process for the future? The iteration process is intended to be a continuous cycle. Evaluate how the process, from idea development to outcomes review, can be made simpler, more efficient, and more effective. All of this sounds nice, but you need to know that your assessment efforts will sometimes fail. Even if you are an expert at program assessment, your tools and measures will at some point be flawed, provide inadequate information, or yield no significant results. Just like the iteration process itself, you need to set aside the time and space to reflect on what you are doing with the intent to improve for the future.
In learning about assessment, I realized that assessing in career services is about leadership, regardless of our position. We need to be willing to develop a vision, build a plan, run the plan, and take a careful look at the results to consider our outcomes and improve for the future.
Once you have taken all of these steps, consider sharing what you've learned with others. Did you know that EACE has an Assessment Resource Center where members can share and review assessment resources? If you find yourself stuck on an assessment issue or want to try something new, check out the Assessment Resource Center. It's free for members, and your contributions can help professionals throughout our region. Another resource that has helped me learn about different program ideas and assessment topics is EACE's Twitter Chats. From attending these chats, I have learned a lot about technology, assessment, and program development, and I have picked up some great ideas. The chats are held the second Tuesday of each month. Learn more here.
Dorothy Hayden is the Assistant Director of Career Services at Virginia Military Institute. Dorothy enjoys reading about new ideas, programming, and design. When Dorothy is not at work, you’ll likely find her volunteering, cooking, or creating something artistic.