2010 Poll - Executive Summary

In February and March 2010, the Site License Software Consultation (Statistical) Team in the Office of the Chief Information Officer (OCIO) conducted the web-based CIO Technology Poll, a series of 15 questions (see Appendix B).  The survey was commissioned by the Office of the CIO at The Ohio State University. This annual omnibus satisfaction survey provides data used by departments within the OCIO to assess programs and services.

Summary of Results

Respondents were asked to rate their satisfaction with 13 different services, overall satisfaction, and the leadership provided by the Office of the CIO on a 5-point scale (1 = very dissatisfied to 5 = very satisfied). According to the 2010 OCIO Survey, the percentage of satisfied users ranged from a low of 46% to a high of 72% across these services and support areas. IT Support Service (8-HELP), Course Management System (Carmen), and Campus Wired Network (OSUNet) were identified as the three OCIO services and support areas with the highest degree of satisfaction. Student Information System, Media Services (streaming, videotaping, equipment loan), and Student Computing Center were the areas with the lowest satisfaction ratings (see Appendices A and B for details).

The poll also provided room for customers to offer comments and constructive feedback.  The comments (see Appendix D) have been collated and organized by service category.  Common themes include distaste for Buckeye Mail, frustration with the 30 MB storage quota in WebMail, and numerous suggestions for both Carmen and SIS. The Office of the CIO has an opportunity to conduct deeper analysis of these findings, prioritize an action plan, and focus its efforts on the areas of greatest need.

Methodology of Conducting the Survey

This survey of OSU faculty, staff, graduate students, and undergraduate students on the Columbus campus is based on web questionnaires completed during Winter Quarter 2010.  Completed questionnaires were collected from 366 faculty, 529 staff, 367 graduate students, and 301 undergraduate students.  Faculty and staff were randomly selected from a Human Resources database, excluding clinical house faculty, for a total sample of 2,000 faculty and 2,000 staff.  In addition, 2,500 graduate students and 2,500 undergraduate students from the Columbus campus were randomly selected from a database provided by the Office of the Registrar.
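
As a minimal sketch of the sampling step described above, the snippet below draws simple random samples of the stated sizes from hypothetical population lists. The list names and sizes are illustrative assumptions, not the actual Human Resources or Registrar extracts.

    import random

    def draw_sample(population, size, seed=2010):
        """Draw a simple random sample without replacement."""
        rng = random.Random(seed)
        return rng.sample(population, size)

    # Hypothetical population frames; the real frames came from the HR and Registrar databases.
    faculty_frame = [f"faculty_{i}" for i in range(30000)]
    graduate_frame = [f"grad_{i}" for i in range(12000)]

    faculty_sample = draw_sample(faculty_frame, 2000)    # 2,000 faculty
    graduate_sample = draw_sample(graduate_frame, 2500)  # 2,500 graduate students
    print(len(faculty_sample), len(graduate_sample))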

A notification e-mail was sent to all sampled individuals. This e-mail briefly described the survey, advised recipients that a subsequent e-mail would direct them to the survey web site, and identified the sender address and subject line that the invitation message would use. Invitations explaining the survey, each containing a uniquely coded URL link to the web survey, were then sent to each individual at his or her published OSU e-mail address. This code made it possible to track responses and reduced the chance of un-sampled individuals completing a questionnaire. To increase the response rate across all groups (faculty, staff, graduate students, undergraduate students), a reminder e-mail was sent to all individuals who had not yet completed the questionnaire by the deadline. A summary of the questionnaire timeline is shown in the table below:
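
As an illustrative sketch of how uniquely coded invitation links can be generated and recorded, the snippet below pairs each sampled address with a hard-to-guess token. The base URL, addresses, and file name are placeholder assumptions, not the tool actually used to administer the poll.

    import csv
    import secrets

    BASE_URL = "https://survey.example.osu.edu/ciopoll"  # placeholder URL, not the real survey site

    def build_invitations(addresses):
        """Pair each sampled e-mail address with a unique, hard-to-guess survey link."""
        invitations = {}
        for address in addresses:
            token = secrets.token_urlsafe(16)            # unique code embedded in the link
            invitations[address] = f"{BASE_URL}?id={token}"
        return invitations

    # Hypothetical addresses for illustration only.
    sample = ["buckeye.1@osu.edu", "buckeye.2@osu.edu"]
    links = build_invitations(sample)
    with open("invitations.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerows(links.items())                  # kept to match responses back to invitees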

Table 1: Summary of the Survey Timeline

    Date e-mail notification sent    2/19/2010
    Date survey launched             2/23/2010
    Date reminder sent               3/02/2010
    Date survey closed               3/10/2010

Response Rate and Margin of Error

The response rates and margins of sampling error at the 95% confidence level are presented in the table below. The margins of error show the +/- difference, in percentage points, that can be expected between percentages estimated from the sample and those that would be obtained by interviewing all individuals in the population.

Table 2: Margins of Error for the Survey

    Grouping         Response Rate          Margin of Sampling Error at
                                            95% Confidence Level (+/- pct. points)
    Faculty          366/1985 = 18.44%      5.0
    Staff            529/1993 = 26.54%      4.1
    Graduate         367/2482 = 14.79%      5.0
    Undergraduate    301/2497 = 12.05%      5.6

Note: Respondents who opted out of taking the survey were removed from the denominator.
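
As a rough check on these figures, the reported margins are close to what the standard large-sample formula for a proportion gives at the most conservative value p = 0.5. The sketch below computes that approximation for each group's completed-response count; it is a simplified illustration that ignores any finite-population or design corrections the analysts may have applied.

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Approximate +/- margin (in percentage points) for a sample proportion."""
        return 100 * z * math.sqrt(p * (1 - p) / n)

    completed = {"Faculty": 366, "Staff": 529, "Graduate": 367, "Undergraduate": 301}
    for group, n in completed.items():
        print(f"{group}: +/- {margin_of_error(n):.1f} percentage points")
    # Prints roughly 5.1, 4.3, 5.1, and 5.6 -- in line with the table above.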

In addition to sampling error, the survey is subject to other potential sources of imprecision and bias, including question wording, question ordering, and the low response rate. The low response rate can be attributed to factors such as the timing of the survey, failure to recognize the OCIO, and over-surveying from previous requests. Non-participation in surveys is a growing concern. The notification e-mail sent to all sampled individuals was intended to increase participation by making recipients aware that they would receive the e-mail survey invitation.

A major obstacle to increasing participation rates for web surveys may be the mode itself. Many individuals have more than one e-mail address and may prefer an address that is not in the Human Resources database. Less preferred addresses may not be forwarded to preferred ones and may be checked only occasionally or not at all. Software filters may flag e-mail survey invitations as spam, and individuals may be reluctant to open e-mails from unfamiliar sources. In addition, many faculty e-mail accounts are over their allotted storage quota on the OSU mail server.

Data Analysis

After retrieving the data and performing basic exploratory analyses, bar charts (see Appendix B) and crosstabs (see Appendix C) were generated for each question by respondent group. A substantial number of respondents selected Neutral or Not Applicable when answering the survey questions.

A "Neutral" response can reflect any of these scenarios:

  • respondents feel ambivalent about the issue and do not feel strongly in either direction,
  • respondents do not want to express a strong opinion when doing so is not considered socially desirable,
  • or they do not remember a particular experience related to the issue being rated.

In this study, Neutral responses were treated as valid opinions and included in the computation of the percentages in the crosstabs and bar charts.

A "Not Applicable" or missing response can reflect either of the following two scenarios:

  • respondents do not have an opinion about the issue due to lack of knowledge or experience,
  • or they never developed an opinion about the issue because they find it irrelevant.

 "Not Applicable" and missing responses were, therefore, not considered in the computation of percentages on the crosstabs and bar charts. Within the crosstabs, actual counts and percentages of the population answering "Not Applicable" or not answering the question were noted. 

Survey Rationale

The CIO Technology Poll data provide operational metrics that help the OCIO achieve its strategic goals. The data are also used to assess customer satisfaction in order to improve the services and support provided by the Office of the CIO.

2010 Crosstabs and Questions (pdf)