Summary: How does working under a social model of disability change how we do data collection in a disability resource office? How do we think differently about our work and what we document? This conversation with Sharon Downs (Assistant Vice Provost at University of Arkansas at Little Rock) explores some of the challenges and offers some starting points.
There is a lot of emphasis in the field on re-examining the work we do to focus on the environment. How does this shift to a social model of disability change how we assess our work and the data we collect in a disability services (DS) office?
Sharon Downs (SD): What we’re looking at is a completely different set of facts. With the old focus, data collection was all about how many students we see, what kinds of disabilities they have, how many interpreter hours were needed, and so on. The new focus is external: the goal is to develop an entire campus community that values inclusive environments. Data collection shifts from working with 650 students in a semester to looking at the entire campus community. The kinds of questions we’re exploring are completely different.
Tell us a little more about the questions you’re exploring.
SD: We focus a lot on the design of courses and syllabi. We’re trying to work department by department, to have some kind of systematic contact with every department in a proactive way this academic year. For example, we reach out and offer to do a faculty training or to review their curricula; we’re not just waiting for them to contact us. That gives us the opportunity to talk about the design of their courses and how current design flaws, which they may not see as design flaws, create barriers to students. I ask them to think back to when they’ve gotten one of those notifications from our office and being in compliance was a big deal; it was a lot of work. They always say, “Oh, I know exactly what you mean. I’ve got several stories.” We use that as a sort of training ground: if we can work with you to design your course a little differently, then when you get those notifications you won’t have to do much at all in the way of accommodation, because you will have already designed the course to work for the greatest number of people possible.
When I first start talking about universal design (UD) with faculty, I almost always get that glazed-over look, and they’re thinking, “Great, that’s one more thing I have to do, and I’m already underpaid and overworked.” The trick is to frame it as proactive work that benefits students but also benefits the professor tremendously in terms of being able to plan the semester and not being surprised. We’ve had professors who got a blind student in their class and reported spending 10-15 hours a week with us trying to get everything accessible. I love helping faculty understand that with better planning and more accessible course design, having a blind student in their class doesn’t require much extra effort at all.
What are some of the challenges with data collection when we make that kind of shift to proactive work?
SD: That information is more elusive to capture. With a focus on students it’s straightforward. One student has ‘x’ disability: mark it and we’re done. Or we used ‘x’ interpreter hours: mark it and done. Our challenge is to think more creatively about how to assess impact in the academic environment. We’ve tried a few things. Some have worked; some have not. I’ll be honest.
Whenever we do anything in a proactive manner, such as a faculty training in a department or a monthly meeting, we have an internal web form on our website where we track the topic, the department, how many attended, and how long it was, and it calculates the contact hours. That actually works pretty well.
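The contact-hours bookkeeping Sharon describes can be sketched in a few lines. The field names and the attendees-times-duration formula below are assumptions for illustration; the actual web form’s schema and calculation may differ.

```python
from dataclasses import dataclass

@dataclass
class OutreachEvent:
    """One proactive event logged via the internal web form.
    Field names are illustrative, not the DRC's actual schema."""
    topic: str
    department: str
    attendees: int
    duration_hours: float

    @property
    def contact_hours(self) -> float:
        # One common definition of contact hours: attendees x event length.
        return self.attendees * self.duration_hours

# Two hypothetical logged events.
events = [
    OutreachEvent("UD basics", "History", 12, 1.0),
    OutreachEvent("Syllabus review", "Biology", 8, 1.5),
]

# Year-end roll-up across all logged events.
total_contact_hours = sum(e.contact_hours for e in events)
print(total_contact_hours)  # 12*1.0 + 8*1.5 = 24.0
```

A spreadsheet formula does the same job; the point is only that logging topic, department, attendance, and duration per event makes the year-end total a mechanical roll-up.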
When we get into the less formal, less planned activities, capturing the data can be a challenge. What we want to end up with is a picture, at the end of the year, of the impact we’ve had on campus. That piece is sometimes very hard to capture. I may be on the phone for two hours with a professor, thinking through the logistics of redesigning their class. One phone call may not seem like much impact, but it can have a long-term impact for a lot of students. Tracking that impact is difficult, but we’re trying to capture what we have done and the efforts we have made to make changes on campus.
It sounds like there are some parts of systems change that are hard to quantify and put in numbers. Can you share some examples of ways you are starting to gather data about changes in the environment?
SD: We’re collecting data, but I don’t know how well we’re using it. Some of it’s old school, some of it’s new school. For example, we have a web form on our website for students to request books to be converted to accessible formats. We download that into a spreadsheet. This is all online, on the back end of the website, which saves staff time inputting the information manually.
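The form-to-spreadsheet step amounts to exporting submitted records as CSV. A minimal sketch, assuming hypothetical request records and field names (not the DRC’s actual back end):

```python
import csv

# Hypothetical book-conversion requests collected by the web form;
# the field names are illustrative, not the real form's schema.
requests = [
    {"student": "A. Student", "title": "Intro to Biology", "format": "EPUB"},
    {"student": "B. Student", "title": "World History", "format": "braille"},
]

# Export to a CSV file that opens directly in any spreadsheet program,
# so staff never re-key submissions by hand.
with open("book_requests.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["student", "title", "format"])
    writer.writeheader()
    writer.writerows(requests)
```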
We developed a Disability Resource Center (DRC) interaction tracking form. We tried to make it simple so that staff would use it. You select the date, the staff involved in which activities (all listed, so you just check a box), and the interaction type: proactive or reactive. We had to have a discussion as a staff about using this form, and I told staff I don’t want you spending a whole lot of your overworked selves documenting this. For example, the form isn’t for things like making a phone call or a student dropping in. We really want to use the form only to track interactions of meaningful length and content.
Tell us more about types of interactions you’ve identified on the tracking form.
SD: Sure. We have proactive options, and you check a box from a short menu. Topics are UD, implementing accommodations, relationship building, and other. Once you select the topic, you select the college. We’ve found that historically we focus on the same couple of colleges; the social sciences are mostly where we find people who want this information. We’re missing more of the hard-science areas and the folks who are less oriented to this. So as we collect this data on the tracking form, we check to see which colleges we have had no contact with. We track the number of contacts and the number of hours, broken down in half-hour increments. We also have reactive options on the form: student issue, professor issue, environmental barrier, educational barrier, materials barrier, and other.
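The gap check described here, finding colleges the office has had no contact with, amounts to a set difference over the logged interactions. This sketch uses hypothetical college names and a minimal record layout; the real tracking form stores more fields.

```python
from collections import defaultdict

# All colleges on campus (hypothetical names for illustration).
ALL_COLLEGES = {"Social Sciences", "Business", "Engineering", "Arts", "Sciences"}

# Each logged interaction: (college, interaction type, hours in half-hour steps).
interactions = [
    ("Social Sciences", "proactive", 1.0),
    ("Social Sciences", "proactive", 0.5),
    ("Arts", "reactive", 1.5),
]

# Roll up contacts and hours per college.
hours_by_college = defaultdict(float)
contacts_by_college = defaultdict(int)
for college, kind, hours in interactions:
    hours_by_college[college] += hours
    contacts_by_college[college] += 1

# Colleges with no logged contact this year: the outreach gap.
missing = ALL_COLLEGES - set(hours_by_college)
print(sorted(missing))  # ['Business', 'Engineering', 'Sciences']
```

The same question can be answered with a pivot table in a spreadsheet; what matters is that the form captures the college on every interaction so the gap is visible at all.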
How did you develop the interaction form? Did the staff do this together? Was a data person involved?
SD: I’m the web person in our department. When we have ideas we want to explore, we usually talk them through to flesh things out. I develop the online piece, share it with the group, and get feedback. Staff are pretty well versed in the social model, so the connection with data collection came pretty intuitively. Their biggest concern was making the form easy to use.
Some other data we collect include:
- Service evaluations of interpreters and transcribers. We get feedback from faculty at the middle and end of the semester. DRC service providers also evaluate their own semester, so we hear about challenges and what went well. And we get feedback from students who used interpreters and transcribers.
- Campus barriers form. There is a link on our website to a form for reporting a barrier on campus. It’s online, so it can be submitted by anyone (anonymously or with a name) to the DRC, and we get it to the right person on campus to address the issue.
- DRC training evaluation. This form is available online for people who have attended a training we provided.
- DRC feedback form. This is a form for gathering feedback from anyone. It’s on the DRC front page, and also in every DRC employee e-mail signature line.
Can you share any examples of ways you have used your data in an annual report?
SD: Two years ago we moved accommodated exams into the university testing center to integrate them with other testing. To get to that point, we surveyed faculty about their needs and wishes, and the feedback was very encouraging. After the change was implemented, we surveyed the faculty who used the university testing center, and we’re using that data to improve the process. The survey questions were simple: Did you use it? Did it meet your needs? Would you recommend it to others? Our focus was on the changes in the environment, and this data was included in the DRC annual report. Next we’ll survey students about their experience with accommodated exams.
Does collecting data about changes in the environment require collaboration with other campus partners?
SD: Yes, definitely. For our data, we got the testing center’s list of faculty users. They sent out the survey, but under a letter of support from our Vice Provost for Student Affairs. Both of our departments were able to use the data in our annual reports, which fit well with our provost’s emphasis on collaboration between departments.
Have you tried to track changes in student use of accommodations or in your office?
SD: I don’t know how to assess that because of the limitations in our system. A student meets with us for the first time, and we talk about barriers and accommodations. We don’t have students come back and meet with us unless they have a need, because doing so creates a burden for the student. If anything changes, we welcome them back in. We tell them, “If you have any barriers we didn’t anticipate, we can tweak your accommodations.”
Another challenge is that students may have accommodations listed but not need them in certain classes. A student may request letters to go out to faculty, but once they get into a class they may not need the accommodation, because the professor has designed the course so students have the option of taking a timed test or writing a paper. The student writes the paper and doesn’t need extended test time. If we just count the letters sent, we don’t know whether the student actually needed or used the accommodations, so it’s hard to track changes.
Any ideas about how that could be approached?
SD: I wonder about surveying students and asking about experiences where they didn’t use their accommodations. It would be easy to survey students who sent a faculty notification letter (FNL) and ask which accommodations they didn’t use and why. That would be good anecdotal information. Even just anecdotes and blurbs from students, such as “Speech class was easy, and I didn’t have to use accommodations because of the way the course was designed,” would look amazing in an annual report.
If the focus is on barriers in the environment, should we still be tracking student numbers?
SD: I would love to ditch it. However, people in administration often need this information, and at some point we’re going to need it for one reason or another: federal forms that have to be filled out to show we’re doing what we’re supposed to be doing, for example. We get asked for this data.
What advice do you have for newcomers to thinking about data collection under a social model of disability?
SD: You can’t go forward until you have vision and mission statements. Without them, it’s impossible to develop anything meaningful. Everyone on staff needs to help develop them, so you can be sure everyone is on board and they satisfy the needs of your division.
This helps you focus. When we develop our department’s strategic plan and a new task comes up, we can ask, “Should we do this? Does it fit our vision and goals?” Until you have this, you’re grasping at straws, and your data is not going to tell a cohesive story about your department. It’s absolutely critical that it’s developed with staff. You have to have buy-in; otherwise you, the director, are going to be doing all of the work.
Thanks to Sharon Downs for participating in this interview.
Contact information:
Sharon Downs, Assistant Vice Provost - Inclusion and Wellness
Associate Dean for Division Strategy
University of Arkansas at Little Rock, Division of Student Affairs
sadowns@ualr.edu