April 2005

Student Evaluations of my Performance

One of my goals this first year was to reflect upon my experiences as an adviser. By candidly writing about noteworthy occurrences and admitting areas in need of improvement, I intended to use these reflections to improve my practice. Over the course of my journal entries, it has become more and more evident that advising is a field in which, the more experience one gains, the more one realizes how little one knows. It's quite humbling at times, but, with a positive attitude, the act of advising can be a pleasurable challenge. When I look at my work as a series of pleasurable challenges involving objective problem-solving integrated with teaching, counseling, empathy, humor, and teamwork, I feel a bit more grounded. Stated differently, when I acknowledge that advising is not an exact science but rather a work in progress, I am able to breathe a sigh of relief.

In graduate school, we looked at the definition of a profession versus an occupation, and every now and then I still refer back to a particular page in my text that presents Pavalko's (1971) occupational-professional model (as cited in Komives and Woodard, 2003). I have decided to include it here because it's nice to remind ourselves that we are in this field for the long haul and that the learning never stops. I prefer to think of advising and student affairs not as a professional practice that is never mastered but instead as a body of knowledge that continually builds upon the previous work of others.

Table 1

The Occupational-Professional Model
Dimensions                        Occupation                Profession
1. Theory                         Absent                    Present
2. Relevance to social values     Not relevant              Relevant
3. Training period  (a)           Short                     Long
                    (b)           Non-specialized           Specialized
                    (c)           Involves things           Involves symbols
                    (d)           Subculture unimportant    Subculture important
4. Motivation                     Self-interest             Service
5. Autonomy                       Absent                    Present
6. Commitment                     Short term                Long term
7. Sense of community             Low                       High
8. Code of ethics                 Underdeveloped            Highly developed

Source: Adapted from Pavalko (1971), as cited in Komives and Woodard (2003, p. 374).

It becomes evident in academic advising that the more I invest myself in my role(s), the more I feel myself moving along the continuum from occupation to profession. Because I truly love what I am doing, I sometimes wish to speed up the process! To aid in assessing my abilities as an adviser, I decided this month to develop a student survey for the first-year students I saw throughout the academic year. I chose only first-year students because they had not been exposed to other advisers, and I felt that this would give me the most unbiased responses. Also, because the first-year students on my caseload are required to see me quarterly, by the time they filled out one of my surveys, I would have advised them a minimum of three times. I realize that there are more formal advising assessment instruments available, but, for the purposes of my own improvement, I thought a self-designed (and quite informal) instrument would best suit my needs. It would provide timely feedback on the issues that I think are most important in an advising session. It would also be cost-effective, requiring nothing more than my time and the price of copies.

I have heralded the usefulness of the National Academic Advising Association (NACADA) listservs before, and here they enter once more into the equation. Within a week of asking other advisers to share similar “individual adviser surveys,” I had roughly fifteen different versions to work from. Most of the versions were abridged adaptations of formal advising assessment tools, modified to meet the needs of various advising units. I was able to pick and choose the questions I wanted on my survey, as well as the general layout and length. The final version of my survey takes about five minutes for a student to complete and uses Likert-scale items and two open-ended questions. As part of my annual review, which will take place in July, my director and I will review the responses. My hope is that combining the director's observations of my yearly performance with students' evaluations of my advising will give me the most effective feedback to improve and inform my future practice. Is it scary? Yes. Will it make me a better adviser? Most definitely.

Because individual adviser surveys are not used by my entire advising unit, it was up to me to figure out how to distribute them to students in an effective and confidential manner. Response rate is always an issue in assessment, so I decided to give the survey to students at the end of their last required advising appointment. They were then asked to complete the survey in the waiting area, insert it into the provided manila envelope, and place it in my mailbox. Students were not to put their names on the surveys, and I assured them that I would not look at the responses until all of the first-year students had been seen for the quarter. As more and more students filled out the surveys, they continued to pile up in the manila envelope, which helped students feel more comfortable, since placing a completed survey into the envelope was like randomly inserting a card back into the deck. Of course, this evaluation was by no means scientific or close to perfect, but I tried to adhere as closely as I could to proper research procedures.

Giving the surveys out at the end of every appointment allowed me to keep the full allotted time with each student. I did not want the surveys to affect the content covered in the advising appointments. I also made it clear to students that participation was completely voluntary, and a few students chose not to complete the survey. I did not inquire about their reasoning, but it would seem that (1) they may have had class directly after our appointment and therefore not enough time to complete a survey, or (2) they realized that the surveys could not be guaranteed to be confidential.

Within a few weeks, I had gathered over twenty surveys and was looking forward to a response rate of at least 80 percent of my first-year students. However, during the course of the month, our office was granted the financial resources to complete the standardized ACT Survey of Academic Advising. For me, that meant suspending my individual surveys as our entire unit began to administer the ACT assessments. Even though I didn't collect as many responses as I would have liked, I did learn several things from my informal assessment project. Some of the learning came during the process of creating and administering the survey, and some came as I looked over student responses.

The Process/Act of Administering

1. Don't try to reinvent the wheel! I could have wasted countless hours of research and development time trying to design a survey from scratch to meet my needs. I have learned that any idea is worth tossing out to the listservs for feedback, help, or suggestions. Referring again to the definition of a profession, one can see that it involves a high sense of community among its members. A profession's members also have a long-term commitment to a given cause, and their motivation is driven by service. The listservs are a perfect example of community. There are so many brilliant advisers out there, and collaborating creates a synergy of knowledge that makes us better and more efficient advisers.

2. Have a coworker look over the survey before you begin handing it out, and even go so far as to have a work-study student complete a trial run. I did not do this with my first version of the survey (I handed out three copies and eventually threw them out) and could have caught a few mistakes by getting feedback before beginning. What made perfect sense to me in terms of wording, phrasing, and questioning did not necessarily make perfect sense to others. Having another set of eyes look over my survey would have saved time in the end by increasing clarity and conciseness at the outset.

3. Students seemed to appreciate it when I told them why I was giving them the survey, and doing so likely increased their willingness to participate. I acknowledged that this had been my first year as a full-time adviser and that the surveys were intended to help me figure out what went well and what areas could use some improvement. A few students went so far as to say “that's a cool idea,” and I think knowing my purpose for the assessment tool reinforced the fact that I care about their advising experience.

Reading the Responses

Of course, the main issue here is that I would have liked more responses. Because of our unit's switch to the ACT instrument, I didn't gather as much useful information from my surveys as I had hoped. The sample size was too small, and I think the responses were a bit skewed because I failed to get any data from students who tend to show up later in the quarter. Most of my sample was composed of students who are always punctual and who did very well academically throughout the year. While it was flattering to read all the great responses, I know that some of my students wouldn't have rated me so highly. I really wanted that information and the reasons behind it. Next year, I guess.

The Likert-scale questions worked well in that they were simple and could be completed quickly. However, I gathered most of my meaningful information from the open-ended questions at the end of the survey. The one item I will modify next year is to strike the words “please be specific” and replace them with “please provide two or three examples.” Asking students to be specific still generated vague responses; I think asking for examples might help focus their thoughts a bit more.

The surveys were a great learning experience and a tool that will remain extremely valuable in the years to come. I enjoyed getting feedback about my performance from those who matter most: the students. And although it's a bit uncomfortable to receive raw honesty from students, it's also something that fuels me to continue improving as an adviser. It's so obvious that it's easy to overlook: one of the best ways to measure our performance is simply to ask our students how they feel about their advising experience.

General Undergraduate Academic Advising (GUAA) Adviser Evaluation

We are seeking feedback from you, the student, about the service you receive from your adviser. Please help us by completing this form. Thank you!

Directions: Please circle the response you feel is appropriate.

Table 2

General Undergraduate Academic Advising (GUAA) Adviser Evaluation
My adviser ...          1 = Strongly agree   3 = Neutral   5 = Strongly disagree   0 = Not applicable
1.  creates an atmosphere in which I feel comfortable                        1  2  3  4  5  0
2.  treats me as an individual with unique needs                             1  2  3  4  5  0
3.  is helpful in teaching me about EWU General Education Requirements       1  2  3  4  5  0
4.  is interested in my academic progress                                    1  2  3  4  5  0
5.  is helpful in assisting me if a problem arises                           1  2  3  4  5  0
6.  suggests other campus resources that are relevant/of interest to me      1  2  3  4  5  0
7.  teaches me to use the EWU Catalog                                        1  2  3  4  5  0
8.  responds to my phone calls and/or e-mails in a timely manner             1  2  3  4  5  0
9.  talks with me about long-range goals                                     1  2  3  4  5  0
10. is someone I would recommend to other students                           1  2  3  4  5  0

What has your academic adviser done that you found particularly helpful? Please be specific.

What could your adviser have done to be more helpful? Please be specific.

Please circle the appropriate response.

Number of times I have seen my adviser this year:    1-2 times    3-5 times    6 or more times