
In my time as a graduate student in Higher Education, I've learned just how valuable assessment, evaluation, and research can be. While those three words may sound interchangeable, they carry distinctions that matter in the student affairs field. Research is conducted to generate new knowledge, or to replicate an existing study in order to challenge or reinforce its findings. Assessment is conducted to determine the effectiveness of various programs and practices. Evaluation uses research or assessment evidence to improve current programs and practices. Used together, the three tie improvement in student affairs into a nice, neat little package.

According to ACPA and NASPA, this competency "focuses on the ability to design, conduct, critique, and use various AER methodologies and the results obtained from them, to utilize AER processes and their results to inform practice, and to shape the political and ethical climate surrounding AER processes and uses in higher education" (ACPA/NASPA, 2015, p. 20). It is so important in student affairs because without it, professionals would miss key aspects of the job. First, we might miss the signs of effectiveness (or lack thereof) of our programs and initiatives (assessment). We would not be able to improve those programs and initiatives (evaluation). And finally, we would not be able to recognize trends in the development of our students, our field, and the world itself (research).

When beginning my assistantship at MIT in 2015, I knew that assessment was relatively important - but I had no idea just how important it could be, or how it ties into everything we do as student affairs professionals. When the semester started, my supervisors told me I would be handling and processing the Organization Membership Management (OMM) forms for all 37 FSILGs at MIT. Nearly 2,000 students - just under half of MIT's undergraduate population - would go through me. I thought it was dull and useless work, until I ended up being the one to analyze all of the OMM forms for important data: membership numbers, recruitment statistics, and even retention in each organization. It turned into a very large responsibility, and I can proudly say that I am much more proficient in Excel than I ever thought I would be.

Going on two years later, I have seen time and again the importance of these OMMs. By facilitating appropriate data collection for department-wide assessment and evaluation efforts, I am able to recognize trends and patterns within the community - such as which organizations are having retention issues, which are on an upward recruitment trend, or which may need some assistance filling the available spaces in their houses. To the left, you'll find a blank version of the Excel document for a fictional FSILG, so you can get a sense of the kind of information I was collecting and monitoring. Now, I oversee this process as it is managed by the graduate assistants in the office.
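To illustrate the kind of trend-spotting described above, here is a minimal Python sketch with made-up organization names and membership counts (the real OMM data was tracked in Excel, so everything here is hypothetical):

```python
# Hypothetical illustration: given per-semester membership counts for a few
# fictional organizations, compute semester-over-semester retention ratios
# and flag organizations whose membership is consistently declining.

membership = {
    # organization: [Fall 2015, Spring 2016, Fall 2016] member counts (made up)
    "Alpha Example": [52, 48, 45],
    "Beta Example": [30, 33, 37],
    "Gamma Example": [61, 60, 62],
}

def retention_rates(counts):
    """Ratio of each semester's membership to the previous semester's."""
    return [later / earlier for earlier, later in zip(counts, counts[1:])]

for org, counts in membership.items():
    rates = retention_rates(counts)
    trend = "declining" if all(r < 1.0 for r in rates) else "stable/growing"
    formatted = ", ".join(f"{r:.2f}" for r in rates)
    print(f"{org}: retention {formatted} -> {trend}")
```

A ratio below 1.0 means an organization lost members that semester; a run of such ratios is the kind of pattern that would prompt a conversation about recruitment or retention support.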

Every campus has its own form of an Institutional Review Board (IRB). I worked with mine in my undergraduate career to complete the study for my senior thesis. At the time, I didn't understand just how important an IRB actually is - I was just irritated at all of the hoops to jump through. After completing the Collaborative Institutional Training Initiative (CITI) training on Social and Behavioral Research, I understand why those hoops exist. A lot of important and confidential data is at stake when studying other people, and it is essential to collect data without compromising the rights of the people you study. By understanding and following IRB procedures early in my career, I will have that experience to draw on when I start research in my professional (or perhaps doctoral) life! My certificate of completion for my CITI training can be found to the right, by clicking the PDF icon.

 

According to Henning and Roberts (2016), learning outcomes offer many benefits in the assessment process in student affairs. They can help not only with the assessment itself, but also with the planning of the program being assessed. In my practicum this past fall at Lesley University, I worked in the Office of Student Activities as the Graduate Intern. In particular, I worked on the CommonLYNX retreat, a peer-led retreat centered on diversity and inclusion education. While the retreat has run for around five years, it had no assessment plan in place. As the Graduate Intern, I sought to change that by designing learning outcomes for the retreat - both for the Student Staff and for the Participants. The outcomes I designed can be found below.

For Student Staff:

 

  • Student Staff at CommonLYNX will be able to organize a presentation on at least one form of privilege that exists in today’s society.

  • Student Staff at CommonLYNX will be able to support retreat participants that may struggle with cognitive dissonance, as measured by a short pre- and post-retreat survey administered to participants.

 

For Retreat Participants:

 

  • Retreat Participants at CommonLYNX will be able to identify at least two privileges that they may hold.

  • Retreat Participants at CommonLYNX will be able to describe how at least three of their identities intersect.

  • Retreat Participants at CommonLYNX will be able to distinguish between three different forms of privilege that exist in today’s society.

  • Retreat Participants at CommonLYNX will be able to explain how wealth distribution in the United States of America affects at least two different groups of people.

  • Retreat Participants at CommonLYNX will be able to explain what the acronym LGBTQ stands for.

  • Retreat Participants at CommonLYNX will be able to identify the difference between gender and biological sex.

  • Retreat Participants at CommonLYNX will be able to distinguish between at least two religions.

  • Retreat Participants at CommonLYNX will be able to name at least one way that stereotypes can be harmful.

  • Retreat Participants at CommonLYNX will be able to articulate the difference between racism and prejudice.

  • Retreat Participants at CommonLYNX will be able to distinguish between mental and physical disabilities.

The format I used above is known as SWiBAT, or Students Will Be Able To, according to Henning and Roberts (2016). The idea is to start with "students will be able to..." and then add an action verb and a condition after it. For example, if we take the outcome "Student Staff will be able to organize a presentation on at least one form of privilege that exists in today’s society" and break it down, "Student Staff will be able to" is the beginning, "organize" is the action verb, and "a presentation on at least one form of privilege that exists in today's society" is the condition.

Another common way to write outcomes, according to Henning and Roberts (2016), is the acronym ABCD, although the formula is written in a different order: Condition + Audience + Behavior + Degree. Re-wording the same outcome from the last example using this formula, it would read: As a result of the CommonLYNX retreat, student staff will be able to organize a presentation on at least one form of privilege that exists in today’s society.

According to Schuh, Jones, and Harper (2011), outcomes should be aligned with program goals. Doing so links the program to larger goals - those of the department, the student affairs division, and even the institution itself. When this is done, "you can illustrate how well the program is contributing to meeting higher-level organization goals and strategic planning initiatives" (p. 328). Beyond that, linking your goals and outcomes allows you to plan the program better as a whole. While some of the outcomes I created could not be used this year, such as the pre- and post-retreat survey, the plan going forward is to use them in more in-depth planning in future years. Having developed this skill in my practicum and seen it play out in planning, I feel confident that the programs I run in the future will be more effective than they would have been had I not practiced this skill outside the classroom.

References

  • ACPA/NASPA. (2015). Professional Competency Areas for Student Affairs Educators.

  • Henning, G. W., & Roberts, D. (2016). Student Affairs Assessment: Theory to Practice. Sterling, VA: Stylus Publishing.

  • Schuh, J. H., Jones, S. R., & Harper, S. R. (2011). Student services: A handbook for the profession. San Francisco, CA: Jossey-Bass.

Assessment, Evaluation, and Research

Last Updated: 08/10/17
