Appendix D.  Methodology Details

Field Site Visits

JPS personnel visited the five district offices and one area office (both JPS and Convergys personnel visited one district office), and Convergys personnel visited an area office.  The team spent one to one and a half days at each field site, with the goal of meeting most of the staff in the seven offices; however, not all employees were available in all offices.  The team generally interviewed the Office Director and/or Enforcement Manager and, in the district offices, the Regional Attorney and Program Analyst.  The team also conducted several focus groups: one with the Charge Receipt/Technical Information Unit Supervisor, Office Automation Assistants (OAAs), Investigator Support Assistants (ISAs), Receptionists, and Secretaries (in offices where these jobs existed); one with Investigators; and, when available, one with Attorneys.  Finally, the team observed the processes the offices follow when handling communications from the NCC.

The JPS Team prepared an agenda and interview protocol for each field site visited.  The team asked employees to describe their roles in the office and their perceptions of the NCC's impact on their work and their office.  When meeting with management personnel, the team asked about their expectations for, and experiences with, the NCC.

The team asked support personnel (ISAs, OAAs, Receptionists, and Secretaries) how the NCC has affected their work, particularly how they perform intake, support the taking of charges, and handle telephone calls.  When meeting with Investigators, the team asked about the intake process and their experiences with, and views on, the NCC's impact.  The team also asked Investigators to identify their current tasks and whether their jobs had changed since NCC implementation, and asked their supervisors questions to help confirm the information the Investigators provided.

Surveys

Office Director Survey. The team sent this survey to the Office Director at each of the EEOC's 51 offices.  Where there was no Office Director, the survey went to the acting Office Director, the Enforcement Manager, or the District Director.  JPS sent the survey as a Microsoft Word document attached to an email.  Depending on the extensiveness of the responses, the team followed up with some directors to request additional information.

Prior to sending the survey, the team had the items reviewed by OFP, OIG, and an Enforcement Manager in the field.  The final survey contained ten questions, each of which required an answer from only one person per office; administering this separate survey therefore reduced the need to ask the same questions of every respondent to the electronic survey.

Among other topics, the questions asked about the intake process each office follows and when the office records the EASQ into the EEOC's Integrated Mission System (IMS).

The team initially asked for a response within one week.  When the team did not receive the requested information or did not understand a response, it followed up with the directors by email and/or telephone.  The team received a 100 percent response rate.  The survey is attached as Appendix E.

Electronic Survey.  The JPS Team drafted an electronic survey to learn about the experiences and opinions of field staff regarding the NCC.  Each survey item was reviewed by staff in OFP, the OIG, and a representative of the Union.  To ensure that the survey would perform as expected, the team then piloted it with five incumbents in one field office.  The final survey was then sent to all employees located in EEOC field offices.

The survey contained 27 items, most of them multiple-choice questions.  Sixteen of the multiple-choice questions asked respondents the extent to which they had experienced certain phenomena related to the NCC.  Responses to these questions were given on a seven-option scale (none, 1-20%, 21-40%, 41-60%, 61-80%, 81-100%, and N/A).
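In an analysis of such data, the six substantive scale points are typically coded ordinally, with the N/A option treated as missing rather than as a seventh level.  The short Python sketch below illustrates one such coding; it is an assumption about how the data could be handled, not a documented step in the team's analysis.

    # Ordinal coding for the six substantive scale points.
    # "N/A" is deliberately absent from the map: it is treated
    # as missing data rather than as a seventh scale value.
    SCALE = {
        "none": 0,
        "1-20%": 1,
        "21-40%": 2,
        "41-60%": 3,
        "61-80%": 4,
        "81-100%": 5,
    }

    def code_response(answer):
        """Return the ordinal code, or None for "N/A" or blank answers."""
        return SCALE.get(answer)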

The survey comprised four sections: (1) telephone calls pre- and post-NCC, (2) accuracy and completeness of EASQs received from the NCC, (3) GroupWise emails from the NCC, and (4) the role of the NCC.  A filtering question at the start of each of the first three sections ensured that respondents without relevant experience skipped that section.  Everyone was asked the questions about the NCC's initial and possible future role.  A final question gave respondents the opportunity to provide additional comments on areas covered by the survey or on other NCC-related issues.

The survey was available to all employees in the field from February 9 through February 23, 2006.  It was launched by an email from Aletha Brown, Inspector General, describing the nature of the survey and encouraging employee participation.  Within a few minutes after that email was sent, JPS sent an email with a link to the survey.  Two follow-up emails were sent: one seven days after launch and the second the day before the survey closed.  The reminder email from the Inspector General thanked employees who had taken the survey and encouraged non-respondents to participate; the JPS reminder email was sent only to non-respondents.

Separate from the team's survey, the EEOC sent three emails about the NCC to all employees while the survey was being administered.  The first email (a few hours before the survey was launched) reported that a link to the NCC's FAQs had been posted under “What's New” on the EEOC's internal website.  The day after the survey's launch, the EEOC sent a National Contact Center Newsletter to all employees.  One week after the survey's launch (coincident with the first reminder emails), the EEOC sent a report describing NCC activities for the month of January.  The team was advised that this was the first time information on NCC operations had been circulated to all employees in the field.  Sending this information while the survey was in the field could have been perceived as an effort to influence survey responses, and may therefore have depressed the response rate or colored the responses themselves.

The survey was sent to 1,798 employees located in the field.  Forty-one employees did not receive it: thirty-two were in the New Orleans office, which did not have Internet connectivity while the survey was in the field; one survey was undeliverable; and the remaining eight employees were out of the office the entire time the survey was open.  Therefore, 1,757 employees received the survey.  Of this total, 935 individuals responded, yielding a response rate of 53.2 percent.  Nineteen people completed only the first two questions; because both were filter questions, these surveys were excluded from the data analyses.
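The arithmetic behind these figures can be reproduced directly.  The short Python sketch below uses only the counts reported above; the variable names are illustrative.

    # Reproduce the reported survey response-rate arithmetic.
    surveys_sent = 1798
    not_received = 32 + 1 + 8                       # New Orleans outage, undeliverable, out of office
    surveys_received = surveys_sent - not_received  # 1,757 employees received the survey

    respondents = 935
    print(f"response rate: {respondents / surveys_received:.1%}")  # 53.2%

    # Nineteen respondents answered only the two filter questions and
    # were excluded, leaving 916 surveys for analysis.
    print(f"usable surveys: {respondents - 19}")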

The employees in the final sample were diverse in terms of tenure.  Of those taking the survey, 17 percent had been employed by the EEOC for 5 years or less, 20 percent for 6-10 years, 16 percent for 11-15 years, 17 percent for 16-20 years, and 30 percent for more than 20 years.

A response rate of 53.2 percent is good for a voluntary survey of this kind.  While the survey indicates what the respondents believed at the time they took it, the results are not generalizable to everyone in the field, because the characteristics of the respondents differ from those of the population, as indicated by the differing response rates across offices and job types.  For example, response rates by office (excluding the New Orleans office) and office type ranged from 48 percent to 60 percent.  Table 40 provides the response rate as a function of the type of work performed; a short sketch following the table shows how such rates are computed.  Rates ranged from a low of 30.6 percent for administrative staff to a high of 63.6 percent for individual support staff.  Had the respondents matched the field population on all relevant variables (e.g., office type and type of work), the results could have been generalized to the population; in practice, this typically occurs only when the response rate is very close to 100 percent.

Table 40.  Survey Response Rate by Work Performed

Type of Work Performed           Response Rate (%)
Admin                            30.6
Clerical                         46.2
District Director                60.0
Investigation                    57.7
IT                               59.6
Legal                            38.3
Legal - Federal                  38.8
Mediation                        47.9
OA (Enforcement or Legal)        59.1
OA (Enforcement)                 44.7
Outreach                         60.9
State and Local                  61.5
Support Staff - Federal          50.0
Support Staff - Individual       63.6
Support Staff - Investigation    47.8
Support Staff - Legal            50.0
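Each rate in Table 40 is the number of respondents in a job category divided by the number of employees in that category who received the survey.  The Python sketch below illustrates the computation with hypothetical per-category counts, since the denominators are not reported in this appendix; only the resulting percentages come from the actual survey.

    # Per-category response rate: respondents / surveys received.
    # The counts below are hypothetical, chosen only so that the
    # resulting percentages match two rows of Table 40.
    received = {"Admin": 49, "Investigation": 416}
    responded = {"Admin": 15, "Investigation": 240}

    for category, n_received in received.items():
        rate = responded[category] / n_received
        print(f"{category}: {rate:.1%}")   # Admin: 30.6%, Investigation: 57.7%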
