State and Federal Corrections Information Systems
An Inventory of Data Elements and an Assessment of Reporting Capabilities

A joint project:
Association of State Correctional Administrators
Corrections Program Office, OJP
Bureau of Justice Statistics
National Institute of Justice

August 1998, NCJ 170016

Acknowledgments

This report was prepared by the Urban Institute under the supervision of Allen Beck, Ph.D., of the Bureau of Justice Statistics. Laura Maruschak was the project monitor at BJS. The project is sponsored by the following: Corrections Program Office, Larry Meachum, Director; Bureau of Justice Statistics, Jan Chaiken, Ph.D., Director; and National Institute of Justice, Jeremy Travis, Director. Project assistance was provided by the Association of State Correctional Administrators (ASCA). The project was supported by BJS grant number 97-MU-MU-K007.

Principal staff for the project at the Urban Institute were William J. Sabol, Ph.D., Barbara Parthasarathy, Katherine Rosich, Mary Spence, and Mark Braza. Charles Friel, Ph.D., served as consultant to the project. Dave Williams and O. Jay Arwood designed and produced the report. Tom Hester of BJS provided editorial review. Marilyn Marbrook of BJS administered final publication.

The project staff acknowledge the cooperation and support of the ASCA State-Federal Committee, especially the contributions its members made at the Committee meeting of June 16, 1998, in St. Louis, Missouri. Committee members and other meeting participants, including Chairman Joseph Lehman, Kathy Hawk Sawyer, Robert Bayer, Mike Sullivan, Harold Clarke, Morris Thigpen, Ari Zavaras, Richard Lanham, Elaine Little, Larry Meachum, Phil Merkle, Camille Camp, and George Camp, provided excellent comments after reviewing the draft report. The project thanks the members of the project's Advisory Committee for their assistance and thanks all of the staff of the corrections departments who responded to our surveys and participated in interviews.

An electronic version of this report and the data analyzed in the report may be found on the Internet at the following address: http://www.ojp.usdoj.gov/bjs/

The Corrections Program Office, the Bureau of Justice Statistics, and the National Institute of Justice are components of the Office of Justice Programs, U.S. Department of Justice. The Association of State Correctional Administrators is a nonprofit membership organization dedicated to the improvement of correctional services and practices. The contents of this report do not necessarily reflect the views or policies of these organizations.

Contents

Introduction
Study objectives, methodology, data, and report organization
Objectives of the study
Methodology
Content of the questionnaires
Organization of the Inventory items
Table I. Stages and dimensions of offender-based data elements
Measures used to analyze Inventory survey responses
Rationale behind the availability indicators
Full-availability scores and ratings
Framework for defining and using the common core

Study objectives, methodology, data, and report organization

In a series of meetings, members of the State-Federal Committee of the Association of State Correctional Administrators and representatives from the Corrections Program Office (CPO), the Bureau of Justice Statistics (BJS), the National Institute of Justice (NIJ), the National Institute of Corrections, and the Federal Bureau of Prisons identified the need to assess the current status of offender-based information systems in corrections.
Correctional administrators expressed the need for a set of performance indicators that could be used to describe, measure, and compare management outcomes among departments of corrections. Administrators also noted that they often lack the basic information needed to formulate new policies or to defend existing practices. Researchers highlighted the difficulty of conducting comparative studies given the absence of basic agreement on concepts and definitions and the diversity in the quality and coverage of data elements in these systems. In response, CPO, BJS, and NIJ sponsored a project to conduct an inventory and assessment of more than 200 data elements in State and Federal corrections information systems. An advisory committee, including representatives of the State-Federal Committee, other corrections officials, corrections researchers, and representatives of the sponsoring agencies and the Urban Institute, was formed to guide the design of the inventory and to identify priority information areas for attention.

The Inventory of State and Federal Corrections Information Systems is built around the six priority information areas identified by the advisory committee: offender profile, internal order, program effectiveness, public safety, recidivism, and operational costs. The Inventory reports on the status of information systems in adult State and Federal departments of corrections. Its purpose is to provide a basis for improving the quality of corrections data, enhancing electronic sharing of information, and improving the capacity of corrections departments to provide comparable data for corrections performance measures and for cross-jurisdictional research.

Objectives of the study

The study has several objectives. The first is to determine what data elements departments of corrections collect and maintain in their adult prisoner information systems and whether most departments maintain data elements about a common core of information that is roughly comparable across jurisdictions. The second objective is to assess the capabilities of corrections information systems to generate and report statistical information about offenders. The third objective is to organize the many data elements that departments collect into substantive categories that describe major stages of corrections processing and to develop a set of dimensions that measure events in these major phases of the corrections process. The fourth objective is to describe the information that corrections departments maintain about facilities management, medical care, and costs and revenues.

Methodology

The Inventory uses surveys and interviews to ask two questions about information systems: What data on most adult sentenced prisoners do departments collect and maintain in electronic form? And to what extent can departments use these data to respond to requests for statistical information about groups of offenders? These questions are asked in two structured questionnaires about the data elements and usability of the data systems, and in a telephone interview collecting background information from key individuals to provide an overview of the architecture, capacities, and capabilities of each corrections information system.

Questionnaires. The first questionnaire, the "Inventory of Data Elements in State and Federal Corrections Information Systems," collects information on the data elements officials maintain in the information systems they use to manage adult, sentenced prisoners (Appendix A). The Inventory contains 242 questions about data elements and capacities of information systems. Of these, 207 are about offender-based data elements, 15 are about facilities, and 20 are about capacities to link data. For each of the 207 questions about offender-based data elements, the Inventory asks officials whether they maintain the data element and, if so, whether it is in electronic or paper form and the percentage of offenders for whom it is maintained.
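In practical terms, each Inventory response can be thought of as a small record: which department answered, which data element the question concerns, whether the element is maintained, in what form, and for what share of offenders. The sketch below is only an illustration of that structure; the class and field names are hypothetical and are not drawn from the project's actual database design.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ElementResponse:
    """One department's answer to one Inventory question (hypothetical structure)."""
    department: str                 # e.g., "Example State"
    element: str                    # e.g., "date of birth"
    maintained: bool                # does the department keep this element at all?
    form: Optional[str]             # "electronic" or "paper" when maintained, else None
    coverage_pct: Optional[float]   # percentage of offenders for whom it is maintained

# Example: an element kept electronically for nearly all offenders
response = ElementResponse("Example State", "date of birth", True, "electronic", 98.0)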
The second questionnaire, the "Survey of Retrieval and Query Capacities of Corrections Information Systems" (the Obstacles survey), collects information on the barriers or obstacles these officials encounter in producing statistical information in response to queries about offenders (Appendix B). This 25-question survey is organized into 5 categories of obstacles or barriers related to legislative and institutional matters, staffing, software, hardware, and data.

During January 1998, both questionnaires were mailed to information officers in the 50 State departments of corrections, the District of Columbia, and the Federal Bureau of Prisons (Appendix C). Urban Institute staff collected the survey responses, verified and coded them, prepared them for data entry, analyzed the data, and prepared the tables and analysis reported here. All 52 departments responded to the Inventory and telephone interviews, and 51 returned the Obstacles survey.

Telephone interviews. Telephone interviews were conducted with one to several staff members involved with each corrections information system; respondents included research staff, programmers, database administrators, system design specialists, and other personnel involved in designing, operating, or maintaining these systems (Appendix C). The interviews were open-ended but structured, focusing on the following themes: the institutional, political, and legal context of information systems; the nature of the technology used; information system structure and business functions; data collection procedures; data system linkages and data sharing; and an overview of how well systems meet current needs. Urban Institute staff conducted these telephone interviews during the summer of 1997 and were responsible for transcribing them and analyzing their content. In general, these discussions with departmental staff provided vital background information for the design of the Inventory questionnaires and also supplemented data on the capabilities of corrections information systems.

Content of the questionnaires

Both questionnaires were designed in collaboration with the advisory committee. The advisory committee identified six priority information areas for attention: offender profile, recidivism, program effectiveness, internal order, public safety, and operational costs. In developing the Inventory survey, the first five of the committee's priority areas were organized into four stages of offender processing through the corrections systems. These stages included (1) profiling offenders, (2) committing offenders, (3) managing offenders, and (4) supervising offenders. The sixth priority area of the committee, operational costs, was used to guide development of questions related to facility management issues. A final component of the Inventory survey was developed to describe the capabilities of corrections information systems to extract and link archived data electronically. The links between the committee's priority information areas and the development of the two questionnaires are described further below.
Profiling offenders. This priority area led to the development of the data elements in the first two stages of corrections processing: (1) profiling and describing offenders in corrections, and (2) committing offenders into prison. In the profiling offenders information area, the advisory committee identified a wide range of information concepts that included offenders' demographic characteristics and risk potential, as well as their offenses, criminal histories, sentences, types of admissions, and releases from prison. For clarity, this priority area was divided into one stage that describes the demographic and social characteristics of offenders (profiling offenders) and another that describes both the behaviors and decisions leading to commitment to prison and the assessment and placement decisions made upon entry into prison (committing offenders).

Program effectiveness and internal order. These two information areas were combined into the third stage of corrections processing: (3) managing offenders in corrections facilities. This stage includes such information concepts as program participation by offenders, treatment, medical problems (e.g., HIV and TB), and medical care, as well as information related to offender misconduct, violations of rules, safety considerations, use of restraint, and drug and alcohol use.

Recidivism and public safety. These two areas, in which the advisory committee identified concepts such as the re-arrest, re-conviction, and return to prison of released offenders, the harm to the public, and notification of victims, were combined into the fourth stage of corrections processing: (4) supervising offenders on release into the community.

Operational costs. This priority area includes non-offender-based data, such as those that measure staffing ratios, program effectiveness, and the costs of operating facilities. Since these questions are about information at a different level of analysis from the offender-based data elements, they are organized into the fifth area of the Inventory: facilities management.

Many of the concepts derived from the priority information areas relate to events that may be repeated through an offender's career in corrections. For example, a single offender may have several admissions into and releases from prison; a single offender may be involved in several hearings related to misconduct; a single offender may have several medical tests. The challenge is to capture the nature of these repeated events. They may be represented in information systems in several ways: each separate record of an event may be added to the information system as a new record; only information about the most recent event may be maintained in the information system; records of events may be appended as blocks or segments to the end of an individual's record; or past events may be archived on tape or another medium, with only the current event retained in the information system.

Regardless of the method used to record these events, an inventory of data elements alone cannot answer important questions about whether departments can provide data on the number of times events happened. To address this concern, the Inventory included 20 questions about whether departments maintain archival records of repeatable events and, if so, whether they have the capability to retrieve and link these records to current records by electronic means.
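The storage approaches just described differ mainly in whether an event history survives in the operational system. The sketch below contrasts the first two approaches (append every event versus keep only the most recent) for a hypothetical admissions record; the class names and fields are illustrative and are not drawn from any department's system.

from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class Admission:
    admit_date: date
    admission_type: str              # e.g., "new court commitment", "parole violation"

# Approach 1: every event is appended as a new record, so the number of
# admissions can be reported directly from the operational system.
@dataclass
class OffenderWithHistory:
    offender_id: str
    admissions: List[Admission] = field(default_factory=list)

    def number_of_admissions(self) -> int:
        return len(self.admissions)

# Approach 2: only the most recent event is kept (earlier events are overwritten
# or archived elsewhere), so "how many times" cannot be answered from this
# record alone.
@dataclass
class OffenderCurrentOnly:
    offender_id: str
    latest_admission: Optional[Admission] = None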
The Inventory questionnaire thus contains three sets of questions. The vast majority of the questions (207) apply to the four stages of offender processing and ask about the existence of offender-based data elements in corrections information systems. Fifteen questions ask about facilities management, medical care, and costs. Twenty questions ask about the capabilities of information systems to extract and link electronically archived data.

The Inventory project also addresses the reporting capabilities of information systems. Reporting capability is measured by the extent to which the information systems can report on requests for statistical information. The Obstacles survey asks departments to rate the severity of the obstacles they face in providing this information. It asks about five areas of potential problems: legislative and institutional, hardware, software, staffing, and data.

Organization of the Inventory items

In this report, the 207 offender-based data elements are organized into 4 stages of corrections processing. Each stage contains several dimensions, which are relatively homogeneous groupings of data elements that together define a given stage (table I). The stages and categories of dimensions include:

* Stage 1, profiling and describing offenders, contains dimensions that describe offenders' demographic characteristics, socio-economic status, and family characteristics and living arrangements;
* Stage 2, committing offenders, contains dimensions that describe offenders' commitment offenses, sentencing information, and assessment and confinement decisions;
* Stage 3, managing offenders, contains dimensions that describe routine offender management, methods of release from prison, and internal order and security; and
* Stage 4, supervising offenders, contains dimensions that describe offender behavior after release and details about new crimes committed and the victims of these crimes.

Measures used to analyze Inventory survey responses

The report uses several indicators to summarize responses to questions in the Inventory survey. The indicators provide information about the overall completeness of departments' corrections information systems and point to areas where gaps exist. The gaps may be at the level of a data element, of the extent of the population covered, or of the form in which elements are maintained (electronic versus paper). Identifying the gaps points to areas where improvements to information systems can be made, such as adding data elements, increasing the number of data elements maintained electronically, or expanding the coverage of populations. The indicators of corrections information system capacities are based upon the following concepts:

* High availability. This response shows that an information system has a data element in electronic form for more than 75% of offenders. This high percentage indicates extensive coverage of an element. The electronic form indicates that the data potentially can be extracted, linked, and easily shared electronically.
* Medium availability. This response shows that an information system has a data element in electronic form but for less than 75% of offenders. It indicates a medium level of availability because the scope of coverage is smaller. It also indicates that information about a comparatively large percentage of offenders is more likely to be missing than under the high-availability indicator.
* Low availability. This response indicates that a data element is available only in paper form. Data elements available in low-availability form cannot be extracted, linked, and shared electronically. For the purposes of using offender-based data elements to generate statistical information, low-availability data elements present large obstacles for departments' capacities.
* No availability. This response indicates that a department does not collect a data element in any form.
* Unknown availability. For some elements, departments indicated that they maintain the element but did not indicate in which form, or the scope of coverage.
* Missing. For some elements, departments did not answer, were not sure whether they collected the element, or indicated that it did not apply to their system.

Rationale behind the availability indicators

The indicators, from high availability to no availability, imply a continuum of availability. The distinctions between high, medium, and low availability reflect information system management priorities.

High availability. Maintaining data elements in a high-availability form indicates that the information may be used for ongoing and day-to-day management concerns, or that the information is used to produce regular reports about corrections systems.

Medium availability. The medium-availability form indicates a problem of scope of coverage: the electronically maintained data element does not apply to, or is not collected for, most offenders. There may be many reasons for this. For example, in decentralized systems in which data on offenders are collected at facilities and then submitted to a central information system, the scope of coverage may relate to the absence of elements in the local information system or to problems transmitting data. Partial coverage in electronic form may also reflect changes in sentencing policy that necessitate the introduction of new variables or data elements that apply only to specific classes of offenders. Or it may reflect the retirement of old data elements, which consequently cover smaller and smaller percentages of offenders. Regardless of the reasons for it, the partial coverage of the medium-availability format presents a problem if the objective is to create statistical information on commonly defined concepts. It is also a problem for which there is unlikely to be a single solution.

Low availability. The low-availability format indicates that departments do not collect the data element in electronic form. This suggests that departments may not consider the element among those needed for day-to-day management or for use in regular reports. But it does not imply that a data element is unimportant to the departments, or even that it is less important than a data element maintained in high-availability form. For example, parole decisions may be based on information that is maintained in a low-availability form. Such information is crucial for corrections management decisions, but it is not necessarily crucial for day-to-day decisions. Data elements are maintained in paper form for many reasons. Some, such as medical records, may not lend themselves to easy transcription and entry into computers. Others may be highly confidential. Still others may be used only intermittently in decisionmaking about individual offenders. Alternatively, important data elements may be stored in paper form because of information system deficiencies. For example, some departments must calculate average length of sentence or average time served manually, simply because the data elements are maintained only in paper form.
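Taken together, the availability definitions reduce to a simple decision rule over the two Inventory responses for an element: the form in which it is kept and the share of offenders it covers. The sketch below is one illustrative encoding of that rule, not code used by the project; the function name and inputs are hypothetical, and only the 75% threshold and the category labels come from the report.

from typing import Optional

def availability_level(maintained: Optional[bool],
                       form: Optional[str],
                       coverage_pct: Optional[float]) -> str:
    """Map one Inventory response to the report's availability categories (illustrative)."""
    if maintained is None:
        return "missing"        # no answer, unsure, or not applicable
    if not maintained:
        return "no"             # not collected in any form
    if form == "paper":
        return "low"            # paper only: cannot be extracted or linked electronically
    if form == "electronic":
        if coverage_pct is None:
            return "unknown"    # electronic, but scope of coverage not reported
        return "high" if coverage_pct > 75 else "medium"
    return "unknown"            # maintained, but form not reported

print(availability_level(True, "electronic", 98.0))   # -> high
print(availability_level(True, "paper", 100.0))       # -> low
print(availability_level(False, None, None))          # -> no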
No availability. Maintaining data elements at least in paper form indicates that they do exist in a given corrections information system. This means that the system has developed rules and procedures for defining, collecting, and maintaining the data element, putting that system at a distinct advantage over systems that do not maintain the element in any form. In the case of no-availability elements, departments have not even defined the element, let alone established rules and procedures for collecting and maintaining it. The no-availability format reflects the judgment that the system in question does not use the data element for making corrections processing or management decisions.

Full-availability scores and ratings

The concepts of high, medium, low, and no availability are used to create full-availability scores and ratings that measure the extent to which departments maintain data elements electronically for the large majority of offenders. In this report, full-availability scores and ratings are used to assess data availability in departments for groups of data elements, although they can also be used to assess availability for individual elements. A full-availability score is created by assigning a value to each data element based on the level of availability at which a department maintains it. The values of the individual elements are then summed across groups of elements to obtain a department's availability score for each group. The availability indicators are scored as follows:

* High-availability elements are given a score of 3 points;
* Medium-availability elements are given a score of 2 points;
* Low-availability elements are given a score of 1 point;
* Unknown-availability elements are given a score of 1 point; and
* No-availability elements are given a score of 0 points.

For example, a department with 10 data elements, of which 4 are maintained in high availability, 2 in medium availability, and 4 in no availability, has a full-availability score of 16: the 4 high-availability elements contribute 3 points each, the 2 medium-availability elements 2 points each, and the 4 no-availability elements 0 points each. The sum of the availability indicator scores for these 10 data elements equals the full-availability score of 16 points.

Full-availability scores are a function of the number of data elements in a group. Thus, a department's relative availability across groups of data elements is difficult to assess from its full-availability scores alone. For example, if one dimension has 8 data elements and another dimension has only 5 data elements, departments may receive a higher full-availability score for the 8-element dimension simply by virtue of the additional number of elements. To standardize for the differences in the number of elements among dimensions, a full-availability index and a full-availability rating are created. The full-availability index is the score that a department would receive if it maintained all of the data elements in a group in a high-availability form. A full-availability rating is the percentage of the full-availability index that a department achieves for a given group of data elements. For example, if a dimension of corrections processing contained 9 data elements, the full-availability index would equal 27 (3 full-availability points times 9 data elements). If a department received a full-availability score of 20 (from any combination of high-, medium-, low-, unknown-, and no-availability data elements), its full-availability rating would be 74% (20 divided by 27, times 100%).
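The scoring rules and the two worked examples above translate directly into a short calculation. The sketch below reproduces that arithmetic; the point values and the 10-element and 9-element examples come from the text, while the function and variable names are ours, and elements with missing responses are assumed to be set aside before scoring since the report assigns them no point value.

from typing import List, Tuple

POINTS = {"high": 3, "medium": 2, "low": 1, "unknown": 1, "no": 0}

def full_availability(levels: List[str]) -> Tuple[int, int, float]:
    """Return (score, index, rating in %) for one department over one group of elements."""
    score = sum(POINTS[level] for level in levels)
    index = 3 * len(levels)          # score if every element were in high-availability form
    rating = 100.0 * score / index
    return score, index, rating

# Ten-element example from the text: 4 high + 2 medium + 4 no -> score of 16
print(full_availability(["high"] * 4 + ["medium"] * 2 + ["no"] * 4))    # (16, 30, 53.3...)

# Nine-element dimension scoring 20 points (one possible combination) -> rating of about 74%
print(full_availability(["high"] * 6 + ["medium"] * 1 + ["no"] * 2))    # (20, 27, 74.07...)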
Tables of availability scores for each department are generated for each stage of corrections processing and for the common core of data elements (i.e., the dimensions) within each stage. Comparisons may be made between departments to obtain an indication of relative availability, although not the reasons for differences. Comparisons also may be made between stages or dimensions within departments to obtain an indication of the relative availability among information areas of a given information system.

Included in some of the tables of availability scores are the distributions of the number of data elements at each level of availability for each department. This permits some analysis of the reasons for the overall availability scores. For example, some departments may receive relatively higher scores than other departments because they maintain more data elements in a dimension, even though they maintain a smaller proportion of the data elements in a high-availability form. Colorado and Iowa, for instance, both rate relatively high on the full-availability rating (83% and 80%, respectively). Colorado receives most of its score from the 168 data elements that it collects in high-availability form. Iowa maintains fewer data elements in high-availability form (147), but it also maintains 55 data elements in paper form (low availability). The distribution of the number of data elements that a department maintains at each level of availability points out the strengths and weaknesses of a department's capacities for any group of data elements.

Framework for defining and using the common core

The framework for analysis, which largely follows the format of the Inventory and Obstacles questionnaires, aims to (1) identify commonly collected data elements; (2) measure the availability of data elements in a "high-availability format," that is, in electronic form for a large majority (more than 75%) of offenders; and (3) measure the availability of data in the areas where data are more commonly collected. The approach taken to identify commonly collected data elements relies on delineating the major dimensions of corrections processing that constitute each of the four stages of corrections outlined above.

The term "common core" of data refers to the data elements in specific dimensions of processing. To say that a common core of data exists for a given dimension does not mean that every data element is maintained in a high-availability format by every department. What it does mean is that, within a dimension, enough data elements are maintained by a majority of departments in a high-availability format to constitute a meaningful core of information measures for that dimension. Specifically, the common core consists of the data elements that are in core dimensions of corrections processing. Core dimensions are identified separately for each stage of processing; a core dimension is one in which more than an average number of departments maintain data elements in high-availability form. (Footnote: These are identified in the following manner. First, for each stage, the average number of departments that maintain data elements in high-availability form is calculated as the full-availability score divided by the number of departments reporting data for the stage. Second, a similar measure is calculated for each dimension within a stage. The dimensions for which the average number of departments collecting data in high availability exceeds the average for the stage are defined as the core dimensions. Finally, all data elements within core dimensions are included in the core.)
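The footnote's wording is compact, so the sketch below should be read as one plausible interpretation rather than as the project's actual procedure: for each dimension and for the stage as a whole, it averages, over data elements, the number of departments holding the element in high-availability form, and it flags as core those dimensions that exceed the stage-wide average. The counts used are hypothetical, not the report's figures.

from typing import Dict, List

def avg_high_departments(high_counts: List[int]) -> float:
    """Average, over data elements, of the number of departments that maintain
    the element in high-availability form."""
    return sum(high_counts) / len(high_counts)

def core_dimensions(stage: Dict[str, List[int]]) -> List[str]:
    """Dimensions whose per-element average exceeds the stage-wide average."""
    all_counts = [c for counts in stage.values() for c in counts]
    stage_avg = avg_high_departments(all_counts)
    return [dim for dim, counts in stage.items()
            if avg_high_departments(counts) > stage_avg]

# Hypothetical counts: for each element, the number of departments (of 52)
# maintaining it in high-availability form.
stage = {
    "demographics":   [50, 48, 45, 40, 38],
    "socio-economic": [20, 15, 12, 10],
    "family":         [30, 25, 14],
}
print(core_dimensions(stage))   # -> ['demographics']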
This approach is based on several assumptions. First, identifying the "common-core" areas in terms of dimensions recognizes a limitation of the Inventory: departments collect data elements that are similar to those identified in the Inventory but that may be defined in a somewhat different manner. Second, within any common-core dimension, some specific data elements may not be collected as commonly as others. Third, the common dimensions may obscure the fact that specific data elements are commonly collected even if the dimensions to which they belong are not. However, to the extent that corrections concepts are measured by several data elements, analysis of single data elements does not reveal the common information areas. Fourth, underlying the "common core" is the assumption that departments have the capability to use computer methods to generate statistical information about the common-core areas.

To use the common core or any data elements to generate comparable statistics or indicators of corrections performance, standard definitions for the statistics are needed. Departments may not be able to meet standard statistical definitions in all cases. For example, definitions of offense classes, types of offenders, and coverage of the corrections system may vary among departments. These variations may lead to differences in definitions of data elements and, consequently, in statistics. These differences need to be understood.

To begin to ascertain the reporting capabilities of departments, the study also describes the types of queries for statistical information to which departments are asked to respond; describes and analyzes the obstacles that departments face in responding to statistical queries; and describes departments' capabilities for sharing and linking data externally and linking data internally. These objectives are achieved by analyzing the data from the telephone interviews about queries for statistical information; by analyzing the Obstacles survey to identify and rank obstacles to producing data; and by analyzing the Inventory survey about capabilities to link data.

Chapter 1
Profiling and describing offenders

Highlights

* All 52 departments maintain data electronically on the race and sex of offenders entering prison; 51 do so for their date of birth.
* At least 40 departments collect data on the occupation, military discharge, marital status, and education level of offenders, but some of these maintain only paper records on this information.
* Twenty departments cannot report on whether inmates have and support children.
* Data describing offenders' demographic characteristics are more commonly collected than data describing socio-economic status or family relationships.

------------------------------------------------------------
Profiling and describing offenders
------------------------------------------------------------

The Inventory includes 29 data elements that can be used for describing and profiling offenders under correctional authority.
This stage comprises three dimensions, each containing several elements that describe:

* Demographic characteristics, such as age, race, and sex (11 elements);
* Socio-economic status, such as offenders' employment, education, and related experiences prior to prison admission (13 elements); and
* Familial relationships (5 elements).

------------------------------------------------------------
Demographic characteristics of offenders
------------------------------------------------------------

In the profiling offenders stage of corrections processing, data elements that describe the demographic characteristics of offenders are collected in any format, paper or electronic, by more departments than are the elements that describe offenders' socio-economic status or familial relationships. Most departments maintain data elements on demographic characteristics of offenders, such as their age, race, sex, Hispanic origin, and residence, in high-availability form (Footnote: High-availability format is defined as maintaining data electronically for more than 75% of offenders.) (table 1.1). Specifically, of the 52 departments reporting, 51 meet this criterion for maintaining data on the sex and race of offenders; 50 do so for the date of birth of offenders. In addition, 39 departments have high-availability data elements describing Hispanic origin, while 40 do so for data on State of birth and on country of birth.

Other demographic variables are less well represented according to this high-availability measure. Thirty-three departments collect data elements on citizenship at a high-availability level, and 31 departments do so on religious affiliation. Information on immigrant status (i.e., whether legal or illegal) is maintained in high-availability form by 18 departments, and 29 departments have high availability on offenders' residence (table 1.1).

------------------------------------------------------------
Socio-economic status of offenders
------------------------------------------------------------

Relatively few departments maintain, in high-availability form, data elements on aspects of military service, employment status, sources of income, or financial obligations at commitment (table 1.2). However, 29 departments maintain data in high-availability form on education level, and 23 do so for whether the offender served in the U.S. Armed Forces. For the most part, departments do not maintain data elements on income and financial obligations. For example, 17 departments do not collect employment data, and 40 do not collect data on the income of offenders at the time of commitment. Also, about three-quarters of the departments do not collect data on the type or amount of financial obligations.

---------------------------------------------
Familial relationships of offenders
---------------------------------------------

Many of the departments do not collect, or do not maintain in high-availability form, data elements on the family circumstances and living arrangements of offenders prior to admission (table 1.3). With the exception of the 35 departments that maintain data elements on marital status in high-availability form, no other data element about familial relations is maintained in a high-availability form by more than half of the reporting departments. Less than one-third of the departments have a high availability to provide data on children and dependents. In general, data elements on family and domestic circumstances are not collected at all.
Thirty departments do not collect data on the relationship of the offender to others in the household, although 11 departments collect such data in electronic form at varying levels of coverage of correctional populations. Similarly, most departments do not collect data that describe the residential status of offenders (e.g., whether they rent or own their residences, or are homeless).

---------------------------
Summary
---------------------------

Among the 29 data elements in the profiling offenders stage of corrections processing, the 11 that comprise the dimension of demographic characteristics of offenders have a higher concentration of elements maintained in common and in high-availability form than do elements in the other dimensions -- socio-economic status and family relations. All departments maintain data elements on race and sex, and all but one maintain elements on age and marital status. Data elements on education, employment, occupation, and military experience are maintained by a majority of departments. Data elements on residential status are maintained by very few departments. Each of the 29 elements in the offender profile category is collected by at least some of the departments, although some of these data elements are collected by only a few departments.

-------------------------------------------------------
Chapter 2
Committing offenders into correctional authority
-------------------------------------------------------

Highlights

* All 52 departments maintain data for the type and date of commitment into prison and the length of sentence imposed. More than 45 maintain several detailed data elements that describe offenders' commitment offenses and their expected release dates.
* At least 48 departments collect data about sentencing: date of sentencing, county of conviction, total length of sentences imposed, and whether sentences are concurrent or consecutive. A few maintain these elements in paper form only.
* Thirty-seven departments maintain data about the date of the criminal incident underlying the conviction; 27 do so about whether a weapon was involved; 21 about the number of victims in the incident; and 16 about victim injuries. In most cases, these departments maintain this information on criminal incidents in paper form.
* Most departments (between 39 and 47) maintain data electronically about offenders' needs, their security classifications, and the units in which they are housed.
* Data describing conviction offenses, sentences imposed, current commitment, expected time to be served, risk assessment, classification decisions, and confinement characteristics are more commonly collected than data describing other areas of committing offenders.

The Inventory includes 70 data elements that describe the second stage of corrections processing: committing offenders into correctional authority -- specifically, committing offenders into prison. This stage includes elements that describe the offenses and sentencing decisions leading up to the commitment into prison and elements describing the assessment and placement of offenders upon commitment. The 70 data elements in this stage are organized into three broad categories that provide information about the offenses leading to the conviction and sentences, about the sentences imposed by the court, and about the assessment and confinement decisions made by corrections officials upon receipt of an offender from the court or other authorities.
Among the categories, the data elements are further divided into 10 dimensions that describe more finely defined aspects of this stage. In describing the offenses leading to the conviction underlying a commitment, the 29 data elements are organized into 3 dimensions:

* Criminal incident underlying the conviction offenses, including the data elements that describe the victims of crimes (14 data elements);
* Conviction offenses (7 data elements); and
* Offenders' criminal histories and records of prior arrests, convictions, and criminal justice supervision status at the time the conviction offense was committed (8 data elements).

The 19 data elements about sentencing outcomes and the type of current commitment are organized into 3 dimensions that describe:

* Sentences imposed by the court (13 data elements);
* Type of current commitment (3 data elements); and
* Expected time to be served until release from prison, as of commitment (3 data elements).

Finally, the 22 data elements that describe the assessment, classification, and confinement decisions made by corrections officials are organized into 4 dimensions:

* Risk assessment -- data elements describing the characteristics of offenders leading to placement decisions (4 data elements);
* Needs assessment -- describing the needs of offenders for treatment or placement (6 data elements);
* Classification decisions -- including 9 data elements that describe an offender's security level at commitment; and
* Confinement characteristics -- describing the location and housing into which an offender is placed (3 data elements).

------------------------------------------------
Offenses leading to commitments
------------------------------------------------

Within this area, the data elements are organized into three dimensions: those that describe the criminal incident, the conviction offense, and the offender's criminal history. Few departments maintain any of the data elements on the criminal incident in a high-availability form (table 2.1). This includes data elements about the incident itself, victim-related information, and other damages. The exception is the data element on the date of the criminal incident, which is collected by 21 of the departments. About one-quarter of all departments reporting maintain data elements about the criminal incident in paper form. Few retain descriptive data about the victims of offenders' crimes. Overall, the majority of departments do not maintain data elements about victims.

Forty-six departments have conviction offense information in high-availability form (Footnote: High-availability format is defined as maintaining data electronically for more than 75% of offenders.) (table 2.2). Most of the departments (42) obtain their conviction offense information from court commitment orders and maintain it as high-availability data elements, and most (43) maintain high-availability data elements about the severity of the offenses (e.g., felony or misdemeanor). A majority of departments have specific information about criminal codes and written descriptions of offenses; 31 departments maintain some data elements with detailed offense descriptions in high-availability form.

More than half of the departments (31) maintain high-availability data on the criminal justice status of the offender at commitment (table 2.3). But a large majority of departments do not capture other data elements on criminal history in high-availability form.
Nearly half (25) of the departments maintain data elements on offenders' prior record of arrests and convictions in a high-availability form. Twenty-five maintain high-availability data elements on the severity level of prior offenses, and 24 departments do so on the actual number of prior convictions. Relatively few departments have a high availability to produce data on arrests, and a sizable number (23) do not maintain data elements on the number of prior arrests. Only 20 departments collect, in high-availability form, data elements that describe whether offenders were habitual offenders. Finally, many of the data elements about criminal history are collected in paper form, or in electronic form for smaller segments of the population. For example, data on the number of prior arrests are collected by an additional 20 departments either in electronic form for smaller segments of the population or in paper form.

-------------------------------------------------
Sentencing information
-------------------------------------------------

Most departments collect data elements about the sentences imposed by courts in the high-availability format (table 2.4). More than 44 departments maintain high-availability data elements on the date of sentencing, the number of sentences imposed, whether the sentences imposed were concurrent or consecutive, the length of sentence for each offense, and the total length of sentence. In addition, 47 departments maintain high-availability data elements on the county in which the offender was sentenced, and 34 maintain elements that identify the sentencing judge. The number of departments maintaining high-availability elements on mandatory sentences, combinations of sentences, and supervisory sentences is lower than the number having high-availability elements on the basic sentencing information, but a majority or near majority of departments maintain high-availability elements on these other aspects of sentencing (table 2.4).

In general, departments are less likely to have data elements about sanctions other than prison sentences than they are about the prison sentences themselves. More than half of the departments (30) have a high availability to produce data on the length of community supervision, but only 23 departments have this availability for monetary sanctions, and even fewer (19) have it for the amount of the monetary sanction.

All departments can provide data on the type and date of commitment in electronic form, and all but two can provide both of these data elements in high-availability form (table 2.5). Fewer departments (28) maintain data on the agency having the authority to release the offender. As with data elements about sentencing, most departments also maintain high-availability data elements about the expected release dates of offenders. Of the 52 reporting departments, 48 maintain high-availability elements on the expected date of release, 43 on the expected parole release date, and 45 on the date of expiration of sentence (table 2.6).

------------------------------------------------
Assessment and confinement decisions
------------------------------------------------

About half of the departments maintain, in high-availability form, the data elements that are used in assessing offenders' risk. More specifically, 28 departments maintain high-availability data elements on offenders' history of violence, 22 on the use of a weapon, and 16 on gang membership.
Additional departments maintain these data elements in medium-availability or paper form, but relatively sizable numbers of departments do not maintain elements on these aspects of offenders' behavior (table 2.7).

On the data elements that measure needs assessment, departments are split. For the data element on the type of needs that offenders have, 34 departments maintain it in high-availability form. For offenders' psychological and medical histories, 20 and 26 departments, respectively, maintain elements in high-availability form. Conversely, for program participation prior to imprisonment and drug testing upon admission, most departments do not collect data elements to measure these activities (table 2.8). Despite the interest in the medical conditions of offenders and the concern over the increasing incidence of tuberculosis and HIV infection in prison populations, only 26 departments maintain high-availability data elements on medical conditions. However, an additional 17 departments maintain data elements on medical conditions in some other format. In addition, 15 departments do not collect data on psychological history at the time of admission. Twenty departments have psychological history data for large segments of their populations.

In general, most departments maintain the data about classification decisions as high-availability elements (table 2.9). For example, 45 departments have a high availability to produce data on security level, 43 have a high availability on the date of initial classification, and 45 have a high availability to produce the date of classification change. The outcomes of these procedures, however, are produced in a high-availability range at slightly lower levels: 40 departments have data representing a classification index or score, and 30 departments collect data identifying the agency making the classification decision. Four departments (Alaska, District of Columbia, Idaho, and New Mexico) do not collect data on security level (not shown in a table). Relatively few departments have a high availability to produce data on various types of scores and indices related to classification. Only 18 departments collect data at a high-availability level on a psychological index or score, and 23 departments collect data at this level on a medical classification index or score.

Similarly, nearly all departments collect data on the type of facility in which the offender is placed at admission (table 2.10). Fifty departments maintain it as a high-availability data element, and only one (New Mexico) does not collect data on type of facility at placement (not shown in a table). Most departments also collect information about the type of housing into which offenders are placed at admission (table 2.10). Thirty-eight departments have a high availability to provide data on the type of housing unit in which the offender is placed, and 40 departments have a high-availability level on data on the type of special housing in which offenders are placed. Seven departments (Alaska, Indiana, Idaho, Louisiana, New Mexico, Minnesota, and Wisconsin) do not collect data on the type of placement housing at admission, and five (Alaska, District of Columbia, Idaho, Minnesota, and Wisconsin) do not collect data on special units housing the offenders.
-----------------------
Summary
-----------------------

Among the 70 data elements in the committing offenders stage of corrections processing, the 42 data elements that measure conviction offenses, sentences imposed, current commitment, expected time to be served, risk assessment, classification decisions, and confinement characteristics are the most commonly collected by the reporting departments. The data elements that measure the criminal incident leading to the conviction offense, which include data elements that describe victims of criminal incidents, are the least commonly collected by the departments. For the more commonly collected data elements, more than 45 departments maintain in high-availability form the data elements on the number and type of conviction offenses, county of sentencing court, date and length of sentence, whether sentences are imposed concurrently or consecutively, type and date of commitment, expected dates of release, type of confinement facility, and date of classification changes.

--------------------------------------------------
Chapter 3
Managing offenders in corrections facilities
--------------------------------------------------

Highlights

* All 52 departments maintain data electronically on offenders' types and dates of releases from prison and the dates and types of transfers between facilities, and most (up to 39) report data on the reasons for changes or adjustments to sentences and time to be served, including good time and other credits.
* Forty-two departments maintain data on offenders' participation in programs, and 28 of these do so at a high-availability level. In general, departments collect data on programs or medical care in paper format.
* Thirty-three departments report that they maintain data on drug tests since admission, but only 18 maintain this information electronically, and only 15 maintain data on the results of the tests electronically.
* Forty-seven departments maintain data about the most recent occurrence of misconduct in prison, and most do so electronically. More than 27 departments maintain detailed information about these incidents -- such as who was involved, whether drugs, alcohol, or weapons were involved, and injuries -- but most of these maintain the data in paper form.
* Forty-two departments maintain data on victim notification requirements.
* Data describing post-commitment movements, good time and other adjustments to sentences, offender registry, and releases from custody are more commonly collected than other areas of managing offenders.

The third major stage of corrections processing relates to managing offenders while they are in correctional facilities. This stage includes data elements that describe the movements of prisoners, the procedures and actions that corrections officials take to manage offenders in their custody, the behaviors of offenders leading to disciplinary actions, and official responses to misconduct. The Inventory includes 63 data elements about managing offenders. These elements are organized into 3 broad categories that describe routine management and program participation, the release of offenders from custody, and internal security matters.
To describe routine management activities, the 26 data elements are organized into 4 dimensions:

* Post-commitment transfers between jurisdictions and movements between and within facilities (7 data elements);
* Program participation by offenders (11 data elements);
* Drug testing since prison admission (2 data elements); and
* Medical care of offenders (6 data elements).

To describe how offenders are released from custody and the processes leading to adjustments to their time served in prison, the 18 data elements in this category are organized into 3 dimensions:

* Good time and other adjustments to sentences and length of stay, as well as the reasons for the changes (10 data elements);
* Method of release from custody (5 data elements); and
* Offender registry requirements (3 data elements).

Finally, in this stage of managing offenders, the 19 data elements related to internal order and security are organized into 3 dimensions that describe the behaviors of offenders and official responses to misconduct:

* Misconduct and infractions -- describing events leading to disciplinary actions (11 data elements);
* Responses to misconduct -- describing the immediate response to misconduct taken by corrections officials (3 data elements); and
* Proceedings against offenders -- describing the legal proceedings and outcomes taken in response to misconduct (5 data elements).

---------------------------------------------
Routine offender management
---------------------------------------------

Fifty-two departments maintain in high-availability form data elements that track the movements of prisoners between facilities and the transfer of offenders to other jurisdictions (Footnote: High-availability format is defined as maintaining data electronically for more than 75% of offenders.) (table 3.1). Forty-one departments maintain data elements that track internal movements in high-availability form. Slightly fewer departments maintain high-availability data elements about the reason for a transfer or internal movement (31) or the official who authorized the movement or transfer (9 and 16 departments, respectively).

In general, very few data elements on programmatic activities are collected at a high-availability level. Data elements on the types of programs are collected by 28 of the departments at a high-availability level. Twenty-eight departments collect data at a high-availability level on the date the offender began the program, and 26 departments do so on the date the offender ended the program (table 3.2). Data elements on programs tend to be collected electronically for more than 75% of the corrections population. About a fifth collect this information in paper format. About half of the departments do not collect data on the reasons for program participation or on the authorization for the program.

Of the 52 departments reporting, 14 departments have a high availability to produce data on drug tests of offenders since admission, and 12 departments can do so on the date of the last drug test (table 3.3). About a third of the departments do not collect either of these data elements, or collect these data in paper records only.

In general, departments maintain data elements on the medical care of offenders in paper records only. With the exception of the current medical condition of offenders, for which 18 departments have a high availability to produce data, less than one-third of the departments collect data in electronic format on the medical condition of offenders for large segments of their populations (table 3.4).
In addition, a few departments collect these data in electronic format for less than 75% of the offender population, and more than 10 do not collect medical data on offenders at all. Twenty-four departments report that they do not maintain data elements on the costs of medical treatment.

---------------------------------------------
Methods of release from prison
---------------------------------------------

Thirty or more departments maintain data elements in a high-availability format on whether sentences are modified, by what amount, and the dates and reasons for good time or other adjustments. More than half of the departments (28) have a high availability on data relating to changes in available good time credits, and only 18 departments maintain data elements in high-availability form on special credits (e.g., housing credits). Also, 29 departments do not collect data on these special credits (table 3.5).

High-availability data elements on the type and date of release from custody are maintained by all departments. Thirty-four departments can produce data at a high-availability level about the time served in custody, and 36 departments can do so on the type of facility to which the offender is released (e.g., community corrections facility, work release center, treatment facility). More than half of the departments have a high availability to provide data on the agency gaining jurisdiction over the offender on release. About a quarter of the departments do not collect data on time served, or on the jurisdiction or facility to which the offender is released (table 3.6).

About two-thirds of the departments have a high availability to provide data to comply with victim notification requirements (table 3.7). Less than one-half of the departments can provide data on whether an offender is required to register as a sex offender under Megan's Law or a similar statute. Only 14 departments can identify whether an offender actually registered as a sex offender under such statutes.

---------------------------------------------
Internal order and security
---------------------------------------------

With the exception of the type of misconduct and the date of the event, most departments do not collect data about misconduct and infractions in a high-availability form. Thirty-three departments have a high availability to provide data on the type of misconduct or infraction, and 34 can do so on the date of the event (table 3.8). Twenty-seven departments have a high availability to produce data on the history of offenders' behavior in custody. About half of the departments collect data in electronic form on the location of the event.

Overall, much of the data pertaining to internal security is collected in paper records (table 3.8). For example, more than a third of the departments collect data in paper records about who was involved in the event, who sustained an injury, the type of injury sustained, whether drugs or weapons were involved, and the amount of property damage. Substantial numbers of departments indicate that they do not collect data on certain aspects of misconduct. Twenty-two departments do not collect data on whether weapons were involved, and 24 do not collect data on whether drugs or alcohol were involved in the incident.

Only 12 departments maintain high-availability data elements on the responses to infractions (table 3.9). About one-third of the departments do not collect data at all on the type or date of the immediate response, although another third do so in paper form.
With the exception of the result of the immediate response (on which 21 departments report a high availability to produce data) relatively few departments produce these data elements in electronic form. Data elements about formal legal responses to violations of internal order are also not generally collected at high-availability levels. For example, 26 departments have a high availability to produce data about the disposition of the proceeding, but fewer than half can do so on charges filed against the offender (table 3.10). Also, 20 departments can produce data at a high-availability level on the date of legal procedure. Sixteen departments have a high availability on data relating to who initiated the response, and 21 departments have data at this level on the type of response. About a third of the departments report that they do not collect these data elements at all. ----------------------- Summary ----------------------- Among the 63 data elements that describe management of offenders, data elements that measure the form of release, good time and other adjustments to sentencing, post-commitment movements, and offender registry are commonly collected in high-availability form by reporting departments. Data elements on program participation and medical care are commonly collected but largely in paper form. Data elements on drug testing, offender misconduct and responses to misconduct are less commonly collected, but many of them are also maintained in paper form. Overall, many of the data elements pertaining to internal security and medical care are collected in paper records. ------------------------------------------------------------------------------ Chapter 4 Supervising offenders on release and maintaining public safety ------------------------------------------------------------------------------ Highlights * Forty of the 52 departments maintain data about the behaviors of offenders released into the community; 12 do not. * Thirty-eight of these departments record data on the reasons for termination of supervision; 32 report on the type of new crime committed by offenders under supervision; and 35 report data on when offenders return to prison after having been sentenced for a new crime. Most departments maintain this information electronically. * For crimes committed by offenders under supervision, 35 departments have data on the type of crime, but no more than 16 have data about victim-related elements of these crimes, and fewer still maintain data on the characteristics of victims; for those that do, most maintain victim information in paper form. * While 31 of 40 departments maintain data on the address of offenders released from prison, only 20 maintain data about released offenders' living arrangements and 17 about their employment; for many departments, this information is maintained on paper. * Data describing reasons for terminating supervision and the criminal justice response to violations of conditions of supervision are more commonly collected by these 40 departments than other areas of supervising offenders. The fourth stage of corrections processing in the Inventory relates to the supervision of offenders released from custody and the maintenance of public safety. The Inventory includes 45 data elements related to this stage. As with the second and third stages, this fourth stage is divided into broad categories which are then divided into dimensions. 
The data elements in this stage measure where offenders are in the community, what they are doing there, and whether they have a record of criminal activity after release. The data elements also address the behavior of offenders under supervision in the community, any new crimes committed, and the response to these crimes. Additional data elements focus on information about victims of crimes committed by offenders under supervision in the community. These 45 data elements fall into three categories: social integration, offender behaviors after release from custody, and new crimes and victims of crimes. Social integration includes one dimension of data elements about offenders' residence and employment status during release. Together, the data elements on social integration and on offender behaviors on release are organized into 3 dimensions:
* Residence and employment during release (7 data elements);
* Behaviors on supervision leading to reasons for terminating supervision (12 data elements); and
* Responses to new crimes and violations of conditions of supervision (10 data elements).
The other major category of elements relates to new crimes and victims of crimes. This category is organized into 2 dimensions:
* Details of new crimes committed on supervision (9 data elements); and
* Details about victims of crimes committed by offenders under supervision (7 data elements).
In the supervising offenders stage, at most 40 departments of corrections use their adult sentenced prisoner information systems to maintain data on offenders while they are under supervision in the community:
* Forty departments in tables 4.1, 4.2, and 4.3 reported having data about offenders under supervision; and
* Thirty-eight departments in tables 4.4 and 4.5 reported having detailed data about criminal incidents committed by released offenders.
------------------------------
Social integration
------------------------------
Relatively few of the 40 departments that report maintaining data elements about released offenders maintain the data elements on offenders' employment experiences and residence on release (table 4.1). With the exception of the address of the offender, which is collected in a high-availability form (Footnote: High-availability format is defined as maintaining data electronically for more than 75% of offenders) by 19 departments, most of the departments do not maintain the data elements that describe personal data about offenders on release.
------------------------------------
Offender behavior after release
------------------------------------
For the 40 departments that maintain data on offenders released into the community, a large percentage are able to report data in high-availability form. Most departments collect the key data elements on completion of release supervision in a high-availability form. Of the 40 departments reporting, 32 each have a high availability to produce data on type of supervision, on whether supervision was terminated, and on the date supervision was completed (table 4.2). Twenty-seven departments maintain high-availability data elements about whether an offender absconded while on release; 26 do so for the type of new crime that was committed; and 25 departments maintain data elements on the length of supervision in high-availability form. More than a third of the departments collect data in a high-availability form on the type of technical violation and on the dates related to the new crime or violation.
For the data elements related to the criminal justice response to new crimes and technical violations committed by offenders under supervision in the community, departments vary in their capacities to maintain information in electronic form. For the data elements that measure an offender's return to prison, such as date of return to prison and whether an offender was sentenced, most of the departments (32 and 27, respectively) have high-availability data elements (table 4.3). Twenty-one of the departments maintain high-availability data elements on offenders arrested and subsequently adjudicated for crimes committed while on release. Most departments that report these data elements obtain their data when offenders return to prison (28 departments), but 18 obtain data on the new crimes committed by offenders on release when the offenders are convicted, and 10 departments report that they obtain this information when offenders are arrested (not shown in a table in this chapter).
------------------------------------
New crimes and victims of crimes
------------------------------------
Although a relatively high number of departments can identify whether supervision terminated for reasons of a new crime (table 4.2), departments vary in their ability to maintain data elements that describe the new crimes committed by released offenders, the victims of the crimes, or the damages done by the offender (table 4.4). Of the 38 that report maintaining data elements about criminal incidents involving offenders under supervision, 31 departments maintain the supervision status of the offender in a high-availability format, and 27 departments do so for the type of offense associated with the new crime. However, only 2 departments maintain high-availability data elements about the victims of these crimes. Six maintain the data elements for the location of the event, and 11 maintain data elements about victim restitution in electronic form for more than 75% of the offender population. Most of the departments do not maintain these elements (table 4.4). Few departments collect detailed data in any form about victims in incidents committed by offenders on release. Thirteen report collecting data elements on the sex of the victim and whether the victim was a child; 16 maintain data on the victim's address. In addition, the comparatively few data elements about victims of crimes committed by offenders on release in the community are maintained primarily in paper format. For example, only 4 departments report that they maintain data elements on the sex of the victim in electronic form (table 4.5).
------------------------------
Summary
------------------------------
Twelve of the 52 departments in the survey report that they do not maintain data elements on the behaviors of offenders under supervision in the community. The other 40 departments report that they maintain these data elements. For the 40 departments that maintain data about offenders under supervision, the data elements that describe the behaviors of offenders leading to terminations of supervision (table 4.2) and the criminal justice response to these behaviors (table 4.3) are more commonly collected, in paper and in high-availability form, than data in the other areas. Most of the departments maintain data elements in high-availability form on the type of supervision, whether supervision was terminated, the length of supervision, and reasons for termination of supervision.
Most departments maintain few of the data elements about the employment status and living arrangements of offenders on release in the community.
----------------------------------------------------------
Chapter 5 Facility management information
----------------------------------------------------------
Highlights
* Forty-two departments maintain information about the type of in-prison programs available to offenders; 37 report maintaining data about assessments of these programs.
* At most, 25 departments maintain data about the quality and availability of medical staff in prison facilities, and most maintain these data in paper form.
* Thirty-four departments maintain some data about annual operating costs of facilities; less than half maintain data on whether facilities generate revenue and, if so, how much.
The Inventory contains 15 questions about correctional facilities, programs and their evaluation, medical care, staffing, revenues, and costs. In the area of program management, the Inventory includes 3 questions about the types of programs and their assessment. On medical services, it asks 3 questions about the number of medical staff, their qualifications, and their availability (Footnote: Data elements about medical services provided to offenders are included in Chapter 3, "Managing offenders in corrections facilities."). Finally, the Inventory includes 9 data elements about the number of facilities, staff, costs, and revenues. Departments were not rated on a full-availability measure for these data elements. Rather, their capability to maintain these data elements electronically is distinguished from their capability to maintain them on paper.
-----------------------------
Program management
-----------------------------
Program management data are not widely collected. About half of the departments (25) collect data in electronic form on types of programs offered by facilities under their jurisdiction. A third of the departments (17) collect information about programs in paper form only, and about one half of the departments (24) collect evaluation data on programs in electronic form. Most departments (36) do not collect data on dates of evaluation of programs (table 5.1).
----------------------------------------------------------
Medical services offered within facilities
----------------------------------------------------------
Data about medical services staff in facilities are not widely collected. One half of the departments (25) do not maintain data elements on medical services. About one third of the departments maintain data about medical services offered by facilities in paper format only. For example, 17 departments report having data in paper format on the number of staff who provide medical services; 21 departments report having data elements on the qualifications of medical staff; and 18 departments maintain data on the availability of medical staff (table 5.2). Few of these departments maintain data elements about medical services in electronic form. For example, only 8 departments collect data in electronic form on the number of medical staff who serve their facilities.
-----------------------------
Managing facilities
-----------------------------
The Inventory included 9 data elements about the management of facilities, including elements about staffing ratios, beds per facility, revenues generated by facilities, and costs to operate facilities.
Most of the departments (40) maintain data in electronic form on the total number of facilities, and 35 departments maintain data electronically on the number of beds per facility (table 5.3). More than half of the departments (29) maintain data electronically on the number of staff at facilities, and 25 departments maintain data elements electronically on the number of custodial staff at each facility. Up to a quarter of the departments report that they do not collect these data elements (table 5.3). About one half of the departments (25) do not maintain data elements on whether facilities generate revenue. In addition, 25 departments do not collect data on the amount of revenue generated by facilities. However, one quarter of the States collect this type of data on paper records. Almost half of the departments (22) maintain records about costs to operate facilities or on capital costs in paper format only. About one third of them (16) do not collect these data elements. Twelve departments collect cost data in electronic form. ----------------------------- Summary ----------------------------- In general, departments maintain limited information about programs, medical services, and facilities. However, with the exception of data elements about the number of facilities, beds per facility, and staff per facility, few departments maintain any of this information in readily accessible electronic form. Many departments maintain this information in paper form or do not collect it. This is especially the case for medical care and some of the data about facilities for which more than half of the departments do not maintain the data. -------------------------------------- Chapter 6 Reporting capabilities ----------------------------------- Highlights * The extent to which departments maintain all 207 offender-based data elements electronically for a large majority of offenders ranges from 85% to 16%. Thirty-two departments rate at or above 50% for all data elements on this rating of data availability. * Seven departments rate above 70% of full availability for the data elements in the profiling offenders stage. Twelve do so in the committing offenders stage, as do 10 departments in the managing offenders stage and 9 departments in the supervising offenders stage. * Departments' ability to provide statistical information about released offenders varies. All departments maintain the records of released offenders, and about half can electronically link and retrieve archived records of these offenders when they return to prison. * Only 40 departments maintain data about the behaviors of offenders under supervision in the community; and only 38 maintain data on the crimes they commit while under supervision. * Departments rate staffing and software problems as the most severe problems they must overcome in providing statistical information about offenders. Thirty of the 46 that rate staffing as a serious problem also rate software or data availability as a serious problem. Thus far, the report has focused on whether, and how, corrections information systems maintain data elements. This chapter shifts the focus of the report to how departments use data elements and to the obstacles and barriers they confront in providing statistical information about offenders, and sharing data electronically. 
--------------------------------------------------------------
Forms of statistical information provided by departments
--------------------------------------------------------------
Statistical information describes outcomes, activities, or events pertaining to groups of offenders or to a corrections system as a whole. Such information may be used for many purposes, such as profiling the composition of offender populations; developing management and budget plans; responding to inquiries from the press, academics, or lawmakers; and developing corrections performance indicators. Questions such as "How many offenders are in prison for robbery at yearend?" are commonly requested pieces of statistical information that profile offender populations. Answers to questions such as "How many offenders who were released from prison during 1995 returned to the prison from which they were released within one year of their release?" are often used for evaluative purposes, either implicit or explicit. Queries about "the proportion of all offenders who remained drug-free during the past year" or "the proportion of eligible offenders who were involved in prison work or training programs during the past year" often are asked as indicators of the degree to which a corrections system achieved a particular goal. Information officials report that departments receive many different types of requests for statistical information. The most common are for summary statistics about specific groups of offenders. In addition to internal departmental requests for information from corrections managers, departments also regularly provide statistical information to governors, legislators, and officials in other State agencies (e.g., State auditors, departments of education, mental health, or labor). Such summary information is used for a variety of purposes: for scheduling (courts), assessing suitability of offenders for placement (halfway houses), sentencing and criminal investigations (district attorneys' offices), locating "dead-beat dads" (social service agencies), forecasting prison populations (State planning agencies), and verifying benefits (Social Security Administration). Federal agencies regularly request summary statistical information from corrections departments. The Bureau of Justice Statistics requests summary data on several surveys of corrections populations: the National Prisoner Statistics (summary data on prison admissions, releases, and stocks), the Parole Data Survey (summary statistics on offenders on parole and other forms of post-incarceration supervision), and the Probation Data Survey (summary statistics on offenders on probation). For these particular surveys, departments are required to provide statistical information that is based on external standards or definitions. Thirty-eight departments, for example, provide data to the Bureau of Justice Statistics' National Corrections Reporting Program, which requires them to meet BJS definitional standards for counting offenders admitted into or released from prison, their form of admission, sentences imposed, and method of release. Most departments that submit data extracts to the BJS reporting program are able to meet these definitional standards. Departments that are unable to meet the definitions provide reasons why they cannot do so. Departments also respond to requests for data extracts that the requesters intend to analyze for their own purposes.
Such requesters include researchers, newspapers, commercial banking systems, and other private companies. Data extracts are provided on diskette, tape, or other medium, or via File Transfer Protocol. (For example, in Oregon, several companies purchase data tapes from corrections departments and resell them to other entities looking into the criminal histories of potential employees.)
-------------------------------
Availability of data
-------------------------------
Information systems cannot easily fulfill requests for information if the data are not readily available for analysis or for sharing with other jurisdictions. Maintaining data elements electronically for all (or most) offenders allows for greater data availability and facilitates responding to statistical inquiries. The Inventory rates data availability using an index that measures the extent to which departments maintain data elements electronically for a majority of offenders (more than 75%). The availability index ranges from 0% to 100%. A rating of 100% means that a department maintains all data elements in electronic form for the majority of offenders (full availability), while a rating of 0% indicates that a department does not collect any of the data elements being rated. To obtain a department's score on the availability index, each data element in a set of elements is given a value of 3, 2, 1, or 0, depending on how the department maintains the data element. High-availability data elements (maintained electronically for more than 75% of offenders) are given a value of 3. Medium-availability data elements (maintained electronically for less than 75% of offenders) are given a value of 2. Low-availability data elements (maintained in paper form only) receive a value of 1. Finally, no-availability data elements (a department does not collect the element) are given a zero (Footnote 1: This scoring system is described on pages 10-12 in the "Introduction" to this report.). After each element is scored, the sum of the values for a group of elements is computed. This sum, also known as a department's availability rating, is divided by the total number of points that would be obtained if all data elements were maintained as high-availability data elements and then multiplied by 100%. For example, for the 207 offender-based data elements, Colorado receives an availability index of 83% of full availability. Colorado receives a total of 518 points as its availability rating, out of a possible 621 points it would receive if it maintained all data elements as high-availability data elements. The rating of 518 is obtained from 168 high-availability elements (168 x 3 points = 504 points), 7 medium-availability data elements (7 x 2 points = 14 points), and 32 no-availability data elements (32 x 0 points = 0 points). The sum of the points, 504 + 14, equals Colorado's score of 518. Finally, 518 divided by the 621 possible points and multiplied by 100% yields the availability index of 83%.
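To make the arithmetic concrete, the short sketch below reproduces the Colorado example using the scoring rules just described. The function name and the hand-supplied list of availability codes are illustrative assumptions for this report's index, not part of any department's reporting system.

```python
# Illustrative sketch of the availability index described above.
# The counts reproduce the Colorado example; the availability codes for any
# other department's data elements could be substituted.

POINTS = {"high": 3, "medium": 2, "low": 1, "none": 0}

def availability_index(element_codes):
    """Return (rating, index) for a list of per-element availability codes.

    rating = sum of points over all data elements
    index  = rating as a percentage of the maximum possible points
             (every element maintained in high-availability form)
    """
    rating = sum(POINTS[code] for code in element_codes)
    maximum = 3 * len(element_codes)
    return rating, 100.0 * rating / maximum

# Colorado: 168 high-availability, 7 medium-availability, and 32 uncollected
# elements among the 207 offender-based data elements.
colorado = ["high"] * 168 + ["medium"] * 7 + ["none"] * 32

rating, index = availability_index(colorado)
print(rating)        # 518 points of a possible 621
print(round(index))  # 83 (percent of full availability)
```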
Ten departments receive a full-availability rating above 70% (table 6.1) for the entire set of 207 offender-based data elements. Nine of these departments are among the 40 that maintain data elements for all 4 stages. One of them is among the 12 that maintain data on 3 of 4 stages. Twenty departments are rated at less than 50% of full availability. Generally, a department's availability rating increases with the number of data elements collected. For example, Iowa collects all but 5 data elements and rates at 80% of full availability, and Arizona, which rates at 85%, collects all but 19 data elements (table 6.1). Many of the departments that rate below 50% of full availability collect less than half of the data elements. The two lowest rated departments, Alaska and the District of Columbia, do not collect a substantial number of data elements (157 and 156, respectively). Within the four stages of corrections processing, the availability of data among departments varies. While no department in any stage is rated at 100% of availability, some stages have greater availability than others. Within the profiling offenders stage, the availability index ranges from 80% to 30% (table 6.2). Seven departments have full-availability ratings above 70%, and 22 departments are at less than 50% of full availability. The committing offenders stage ranges from 92% (Iowa) to 16% (Alaska) of full availability, with 12 departments rating above 70%. One half of the departments are at 60% or more of full availability. Only 11 departments rate at less than 50% of availability. In the managing offenders stage, 10 departments rate above 70% of full availability, while 20 rate at 50% of full availability or below. The full-availability ratings for managing offenders data range from 94% (Missouri) to 11% (Alaska). Twelve departments in the supervising offenders stage do not maintain data about released offenders in their information systems, and 14 do not maintain data about new crimes committed by offenders under supervision (including the victims of these crimes). For the 40 departments that do collect data about either or both of these areas, the full-availability ratings for this stage range from 93% (Arizona) to 7% (District of Columbia). Only two departments receive a rating of 90% or more of full availability. Less than a third have full-availability ratings of more than 50%. Not only does the availability of data among stages vary, but the number of data elements maintained in high-availability form also differs. In the profiling offenders stage, no department has the capability to provide all 29 data elements in a high-availability form (Appendix G). Most departments maintain some data in electronic form. Thirteen States and the Federal Bureau of Prisons (BOP) have the capacity to provide all 11 data elements on demographic characteristics in a high- or medium-availability form (not shown in a table). Most departments maintain very few of the elements on socio-economic status of offenders in electronic form. Two departments (Georgia and the BOP) maintain all five data elements about family relationships in electronic form. Most of the departments with relatively high availability ratings in the committing offenders stage maintain a large number of data elements in high-availability form (Appendix G). However, some of these departments also maintain many data elements in a medium-availability form. For example, Ohio and Tennessee rate above 65% of full availability, and each maintains a relatively large number of data elements in medium availability. In general, departments maintain conviction, sentencing, and commitment data with high availability. In the area of sentencing information, departments generally have much higher capacities to produce all elements in electronic format.
Nine departments (Alabama, Colorado, Florida, Georgia, Missouri, North Carolina, Washington, Oregon, and South Carolina) maintain all 13 data elements on sentencing information electronically, and 26 departments have all three data elements on type of commitment in high-availability form (not shown in a table). No department maintains in high-availability form all of the 14 data elements about the criminal incident. In the managing offenders stage, 17 departments maintain a third or more of the data elements in paper, or low-availability, form (Appendix G). In general, these data describe program participation and outcomes, drug testing, medical treatment, and misconduct and infractions. Data elements that measure the form of release, good time and other adjustments to sentencing, post-commitment movements, and offender registry are maintained in high-availability form. Data about post-commitment transfers and methods of release from prison are maintained by all 52 departments, and a majority of departments have a high availability for data about movements, good-time adjustments, and victim notification requirements. No department in the supervising offenders stage maintains all the data elements in high-availability form (Appendix G). Many of the departments either do not collect sizable numbers of these data elements or maintain the data in low-availability form. For example, the District of Columbia does not collect 42 of the 45 data elements, and Wyoming maintains 43 of the 45 data elements in paper form. The data with the highest availability are those that describe offenders' behavior on release and the response by corrections to violations of conditions of supervision. Few departments maintain high-availability data about victims of crimes committed by released offenders. Departments are not rated on a full-availability measure for facility-based data elements such as program management, medical services, staffing, and facility costs. Rather, their ability to maintain these data elements electronically is distinguished from their ability to maintain them on paper. Fourteen departments maintain more than half of the 15 facility-based data elements electronically (Appendix G). But 26 other departments do not have at least 10 of the data elements. Many departments report that they do not maintain data electronically on program management, medical services and staffing, and costs of facilities.
-------------------------------------
Capacities to link and share data
-------------------------------------
To answer many statistical queries, departments need to link data from several databases or files, or to databases maintained by other sources. For questions related to offenders' histories, for example, departments need to link current records with the records of past behaviors. This may involve extracting archived records from tapes or other media and linking them to the existing case-management database. For questions about offenders' behaviors when they are outside the jurisdiction of correctional institutions, such as when they are released into the community, departments may have to link their records with those in an information system outside corrections, such as one maintained by a parole agency. This requires corrections departments to link parole records (which many do not record) to those in their database and to process the combined information to produce the desired statistic. Respondents expressed mixed views about the need to link data across jurisdictions.
Some maintain that information on inmate movements after release (such as police arrest data) is the only area of interest for sharing information across jurisdictions. Others either see no need to share information across jurisdictions or think the task is virtually impossible without a thorough understanding of the definitions and content of the information. Still others are more expansive in their views about the need to share information with departments in other States and the need to conduct comparisons across States. "We get tons and tons of questions from other States about the number of offenders who have some characteristic, and having data from those States would facilitate comparisons" is an example of this perspective. The types of linkages most frequently cited were those to counterparts in other corrections departments. While one official noted that electronic linkages to share data would be valuable, he stressed that direct contacts with the people who run other information systems were most crucial for him. He would like to have e-mail and telephone contacts with other information systems officials so that he could ask simple but very important questions about creating statistical information. Having contacts in other departments to discuss questions such as "What do you do?" or "How do you create that measure?" or "What data are in that other database?" or "Who is the best person to talk with in your system?" is extremely valuable in his view and in the view of some other respondents. Corrections departments link databases in a variety of ways. The most advanced types of linkages occur when different agencies share data systems. Some departments are decentralized but have some form of communication system to link systems across facilities. These links are generally through advanced communications systems such as LANs or WANs. But in some cases the connections among facilities involve sharing the most recent updates on diskettes or by fax. In a few departments, databases are linked by giving users from other departments query-only access. The primary links in the departments' information systems are for users at workstations in the system (correctional officers, counselors, and personnel in the business offices). These officials are given routine access to the database tracking offenders. A considerable number of departments (23) have no links to other agencies outside of the corrections system (secondary links). But a majority (28) have connections to at least some parts of other agencies' databases, typically on a query-only basis (not shown in a table). For example, beginning in 1993, the New Jersey Department of Law and Public Safety, together with the Office of Telecommunications and Information Systems (OTIS), the Administrative Office of Courts (AOC), and the Department of Corrections (DOC), implemented a plan to improve overall offender tracking whereby each could access the others' independent systems and routinely update selected parts of records (Footnote 2: New Jersey Department of Corrections. Computerized law enforcement systems in the Department of Corrections. Trenton, NJ: Department of Corrections, Office of Policy and Planning, 1994.). Most links to agencies, however, are not through electronic means but through hard copy reports or extracts of tapes.
Many corrections information officials stressed the importance of working toward a goal of integrating data from all criminal justice agencies (including corrections, probation and parole offices, the courts, and the police) into one comprehensive information system for users in all of these agencies. Officials also made many other recommendations for improving the capacities of corrections information systems to respond to statistical queries. These include creating common definitions, unique identifiers, and other standard formats for linking records across agencies; converting systems currently on mainframes (especially state-wide systems) into client-server, stand-alone systems; transforming departmental systems into more tightly centralized operations; and integrating all in-facility computer functions using one server or platform. Respondents stressed that existing systems have a good history of service. But they also think information systems need to be much more flexible if they are to respond adequately and efficiently to the volume of requests for data and information. Many asserted that corrections information systems need overall improvements, and better linkages, to meet the challenges of change in corrections and to keep pace with technological change.
--------------------------------------------------------------
Internal capacities to extract and link archival records
--------------------------------------------------------------
The number of times that groups of offenders behaved in certain ways is often an important focus of statistical questions. For example, questions about the number of infractions committed by offenders having certain characteristics may involve extracting historical records and linking them to the current case-management records. "What is the average number of disciplinary infractions committed during the first year of imprisonment for offenders who entered during 1990 and stayed at least one year, and how does that compare to the same statistic for offenders who entered during 1995, after a new reform was implemented?" is a concrete example. Information systems may be structured so that individual records of disciplinary actions are stored separately from the records related to current information. Before the averages can be computed to answer this question, the information about past disciplinary infractions for offenders who had more than one infraction needs to be extracted from the historical record and linked to the current records. A large number of important corrections events have relevant histories. These include offenders' commitments into corrections facilities, movements within a jurisdiction or transfers between jurisdictions, behaviors constituting misconduct or infractions, and behaviors on release in the community. Behaviors of offenders on release in the community are particularly important for assessing the impact of corrections policy on public safety. Departments vary in their capacities to store, retrieve, and link data about these events. Many keep all information about these repeatable events on-line for offenders currently under correctional authority. Others archive these data and have the capability to retrieve and link the archived information electronically. In general, for information about prison commitments, behavior in prison, and prison releases, most departments either store on-line histories of these repeatable events or have the capacity to link archived records of these events.
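As a minimal illustration of the record linkage involved, the sketch below assumes a hypothetical layout in which archived infraction records and current commitment records share an offender identifier, and it computes the average number of first-year infractions for an admission cohort along the lines of the example question above. The field names and records are invented for illustration and do not describe any department's actual system.

```python
# A minimal sketch, under an assumed record layout, of linking archived
# infraction records to current commitment records to answer the example
# question above. All identifiers and dates here are hypothetical.
from collections import defaultdict
from datetime import date

# Current case-management records: one per commitment.
commitments = [
    {"offender_id": 1, "admit_date": date(1990, 3, 1)},
    {"offender_id": 2, "admit_date": date(1990, 7, 15)},
    {"offender_id": 3, "admit_date": date(1995, 2, 10)},
]

# Archived disciplinary records: one per infraction, possibly many per offender.
infractions = [
    {"offender_id": 1, "date": date(1990, 5, 2)},
    {"offender_id": 1, "date": date(1991, 6, 9)},   # outside the first year
    {"offender_id": 2, "date": date(1990, 9, 30)},
    {"offender_id": 3, "date": date(1995, 4, 22)},
]

def first_year_infraction_average(commitments, infractions, admit_year):
    """Average number of infractions during the first year of imprisonment
    for offenders admitted in a given year."""
    by_offender = defaultdict(list)
    for rec in infractions:                 # link archived records by offender ID
        by_offender[rec["offender_id"]].append(rec["date"])

    cohort = [c for c in commitments if c["admit_date"].year == admit_year]
    counts = []
    for c in cohort:
        start = c["admit_date"]
        end = date(start.year + 1, start.month, start.day)
        counts.append(sum(start <= d < end for d in by_offender[c["offender_id"]]))
    return sum(counts) / len(counts) if counts else 0.0

print(first_year_infraction_average(commitments, infractions, 1990))  # 1.0
print(first_year_infraction_average(commitments, infractions, 1995))  # 1.0
```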
Forty-six departments maintain an on-line history of an offender's commitments into prison. Thirty-one archive commitment histories, and of these, 28 departments have the capability to retrieve and link the archived records electronically (table 6.3). With respect to information about an offender's post-commitment movements, almost all departments (49) maintain this information on-line, while about half of these also archive this information, and 22 of this group of 26 have the capacity to retrieve and link the archived data electronically. In other categories of repeatable events, many departments either store the information on-line or have the capacity to link archived records. All departments maintain records of previously released offenders, with the majority (44) keeping these data permanently available on-line. On data about behavior on release, 40 departments maintain data elements on the reasons for termination of supervision, and 38 departments obtain information about the new crimes committed by offenders who were released in the community. Of these 38, most (28) obtain this information about new crimes only after the offender returns to prison. The capacity of corrections information systems to store, retrieve, and link data on the supervising offenders stage may be related to the organization of corrections in particular States. Of the 12 departments that do not record information about crimes of offenders on release in the community, many are "prison only" systems (Footnote 3: Vital Statistics, American Correctional Association, 1994.). Other departments may be part of integrated corrections systems that use an information system other than the corrections information system to record information about offenders on release in the community. For example, in the State of Maryland, corrections is a division within the Department of Public Safety and Correctional Services. The corrections division maintains data elements on offenders in Maryland's prisons, while the Division of Parole and Probation maintains data elements on offenders released into the community.
--------------------------------------------------------------
Capacities to provide statistical information
--------------------------------------------------------------
In general, the overall ability of a corrections department to provide statistical information depends upon the capabilities of each of several components of its information system.
These capabilities are organized into five categories:
* Legislative and institutional
  * legal restrictions on access or use of data
  * legislative reforms that affect operation of the information system
  * institutional requirements
* Hardware, meaning the computer system that maintains software and data
  * storage capacity
  * capacity to process data
  * ability to access historical data
  * reliability (amount of downtime)
* Software, meaning programs that operate on the data (whether developed from standard programming languages, purchased off the shelf, or specific routines designed for specific tasks)
  * capability of existing software
  * capability of existing query language
  * ability to integrate data from separate files
  * ability to integrate data from separate databases
  * ability to structure data files
* Staffing, from data entry to management staff
  * number of current programming staff
  * lack of in-house programming staff
  * experience level of programming staff
  * ability to provide adequate training for staff
  * availability of funding to upgrade systems
* Data, including collection of data elements and the data stored on each
  * completeness of coverage for each data element
  * accuracy of data for each element
  * timeliness of data
(Footnote 4: A copy of the questionnaire is provided in Appendix B.)
Problems that arise in any one of these areas can affect the capabilities of information systems to provide statistical information. Conversely, strengths in one component of an information system may be used to overcome deficiencies in another. The Obstacles survey asked departments to rate each component on a scale ranging from 1 (not at all) to 5 (critical problem). The most severely rated obstacle to providing statistical information is the number of analysis and programming staff (table 6.4). Across the 52 reporting departments, it receives a mean score of 3.9 (on a scale of 1 to 5) and the least variation around its mean. Funding for systems upgrades, modifications, or staffing is the second most severe barrier. Departments tend to experience this obstacle as relatively severe, as reflected in a mean rating of 3.8 and relatively little variation around the mean. Other obstacles that departments rank as relatively severe problems include lack of in-house programming staff, inability to provide adequate training for staff, inability to integrate data from separate databases, and the low experience level of programming and analysis staff. Two additional obstacles, the accuracy of the data and integrating data from separate data files, present somewhat of a barrier. Another eight obstacles present less severe barriers. These items receive mean ratings of between 2.5 and 2.9, with greater variability around the means; they include data completeness, legislative reforms, the structure of data files, capability of the query language, data timeliness, statistical software capabilities, ability to access historical data, and institutional system requirements. Finally, four obstacles (legal restrictions on data, capacity to process data, storage capacity, and system downtime) present relatively minor barriers to departments, as reflected in their average rankings of 2.4 or less. The grouping of individual items into the five major obstacle categories is shown in table 6.5 with the mean category score and severity ranking of each.
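As a rough sketch of how such category scores can be derived, the example below averages item-level ratings on the 1-to-5 scale within each of the five categories. The individual ratings shown are invented for one hypothetical department and are not survey results.

```python
# A rough sketch of computing mean severity scores per obstacle category
# from item-level ratings on the 1 (not at all) to 5 (critical problem)
# scale. The ratings below are invented for illustration only.
from statistics import mean

# One department's hypothetical item ratings, grouped by category.
ratings = {
    "legislative/institutional": {"legal restrictions": 2, "legislative reforms": 3,
                                  "institutional requirements": 2},
    "hardware": {"storage capacity": 2, "processing capacity": 2,
                 "access to historical data": 3, "downtime": 1},
    "software": {"existing software": 3, "query language": 3,
                 "integrate files": 4, "integrate databases": 4,
                 "structure of data files": 3},
    "staffing": {"number of staff": 5, "in-house programmers": 4,
                 "staff experience": 4, "training": 4, "funding": 5},
    "data": {"completeness": 3, "accuracy": 3, "timeliness": 2},
}

category_means = {cat: mean(items.values()) for cat, items in ratings.items()}
for cat, score in sorted(category_means.items(), key=lambda kv: -kv[1]):
    print(f"{cat}: {score:.1f}")
# staffing: 4.4, software: 3.4, data: 2.7, legislative/institutional: 2.3, hardware: 2.0
```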
The group averages range from 3.6 for staffing-related obstacles, the most serious group, to 2.2 for hardware-related obstacles, the least serious. Software and data problems each average ratings of 2.9. Institutional arrangements (legislative reforms, requirements to use specific hardware, and legal restrictions on the use of data) are rated at 2.6, on average. Staffing-related issues present severe obstacles, as the 3.6 average score for these obstacles indicates. The individual items within the staffing group indicate that the number of staff (too few), the ability to provide for their adequate training, and the availability of funding for new staff all approach very severe levels for the departments. Software problems, such as the capacity of query languages or of statistical software, and data problems, such as the timeliness, completeness, and accuracy of data elements, also present relatively severe barriers. Staffing, software, and data are interrelated. Having access to sophisticated database software, query languages, and statistical packages is not enough if a department lacks staff trained in the use of these technologies. Staff who are knowledgeable in the use of these software tools but lack access to them cannot use their skills to produce statistical information. And having sophisticated software and data entry procedures is not enough if staff are not adequately trained in data collection, data entry, and other data preparation tasks. The reported deficiencies in the number and skills of staff, the lack of funding for system upgrades and modifications, and the software problems combine to suggest that the primary obstacle to overcome is a lack of resources. Additional resources would allow departments to address staffing shortages, training deficiencies, and system inadequacies as they see fit, with maximum impact on their overall capacities to provide statistical information.
----------------------------------------
Varying obstacles among departments
----------------------------------------
Departments vary in the severity of the obstacles they confront (tables 6.6 and 6.7). Eight departments rate the individual staffing obstacles at an average of 5, the critical level. An additional 20 departments rate staffing obstacles between 4 and 5 on average, indicating a very severe obstacle. Only 6 departments rate staffing obstacles at 2 or less on average. Software problems average slightly lower severity than staffing obstacles. Only two departments rate the 5 software obstacles at the critical level, and 12 departments rate them as very severe. An additional 20 departments rate them as a moderate obstacle. Four departments rate the 3 data obstacles at a critical level, on average, and an additional 8 departments rate them as very severe. Eighteen departments rate the severity of data obstacles as being very little to none. Legislative and institutional obstacles do not present major barriers, but they cannot be ignored either. No department rates these problems as critical, but five departments rate legislative and institutional obstacles as very severe and 25 rate them as a moderate obstacle. No department rates hardware problems at the critical level, and only one reports very severe hardware problems. Twenty-six of the responding departments rate hardware obstacles as having very little severity.
Staffing, software, and data problems tend to go together (Footnote 5: The clustering of obstacles within groups of departments was addressed in more detail in a preliminary analysis based on factor and cluster analyses. The results from these analyses confirm the observations here that staffing, software, and data problems tend to cluster together, and that different groups of departments experienced different degrees of each cluster of obstacles.). Of the 46 departments that rate staffing obstacles at an average of 3 or higher, most also rate software and data problems at average severity levels above 2 (table 6.6). Of the same 46 departments, only 16 rate either software or data problems at an average of 2 or lower. There are exceptions. In Florida and North Carolina, for example, staffing obstacles are reported to be more severe (averaging 3.6 and 3, respectively) than software, data, or hardware obstacles (2 or less on average). North Carolina recently completed a major redesign of its correctional information system to improve its capabilities. One of the models North Carolina used to redesign its system was the information system developed by the State of Florida. Both departments indicate that while staffing problems tend to be accompanied by software and data problems, information systems with the best designed software, good data, and advanced hardware can still confront major barriers in providing statistical information.
-------------------------------
Summary
-------------------------------
Officials in corrections information departments report that they routinely respond to requests for raw data and summary information on offenders. They also receive requests that require analysis and processing of data elements into specified formats to meet external definitions and standards. Departments use a variety of media to submit data, including hard copy, tape, diskettes, and file transfer protocols. In addition to corrections staff users, requesters of corrections data include Federal agencies, a wide range of State and local agencies, researchers, and private companies. The amount of data maintained in high-availability form varies widely among the departments and across all four stages of corrections processing. No department has all the data elements in high-availability form, nor does any department have all of the elements that correspond to each stage of corrections processing. The stages related to committing offenders and managing offenders have the most departments with relatively high data availability index scores. Twelve departments rate at 70% or above for all 72 data elements in the committing offenders stage, and 10 departments in the managing offenders stage rate above 70%. The supervising offenders stage has nine departments that collect data about released offenders rated above 70%, and seven departments in the profiling offenders stage score higher than 70% on the index. The information most commonly available in high-availability form includes data that describe offenders' demographic characteristics, conviction offenses, sentences imposed, current commitment, expected time to be served, risk assessment, classification and confinement decisions, post-commitment movements, good time and other sentence adjustments, releases from custody, reasons for terminating supervision, and the criminal justice response to supervision violations.
In general, the information with the lowest level of high availability includes data describing offenders' socio-economic status, family characteristics, the criminal incident, victim information, medical care, and employment and residence information about released offenders. To answer statistical queries, departments frequently need to construct links among internal databases or between them and databases of other agencies. Commonly mentioned obstacles in creating links to other data systems include the existence of several platforms of different ages and data formats, which makes interfaces between them complex; the lack of common definitions, unique identifiers, and other standard formats for linking records across agencies; and outdated systems that do not respond readily or flexibly to queries for information. Corrections staff also frequently noted the importance of working toward a goal of integrating all criminal justice agencies into one comprehensive information system that would be shared by all users. A key issue for corrections information systems is the extent to which departments can record data on event histories. For example, can an information system record the number of times an offender commits an infraction, or the number of times an offender enters and exits prison during a single prison term? Data on such "repeatable events" may be required information for measuring corrections system performance. Departments generally reported that their highest capacities for storing, retrieving, and linking archived data on repeatable events are for data elements related to commitments into prison, post-commitment movements, and releases from prison during a term. Departments report that they do indeed confront several obstacles in producing statistical information. These range from the need to reformat their data to comply with standards and formats of the requester, through hardware limitations that restrict the capabilities for executing queries and software limitations that require departments to create customized programs to generate reports and data, to shortages of experienced staff that prevent timely resolution of data requests. Responses to the Obstacles survey confirm that the most serious obstacles encountered by corrections departments in producing statistical information are staffing-related, including the number of analysis and programming staff, their experience, and the resources to further train them. Staffing-related obstacles are closely related in severity to software applications and data constraints. Forty-six departments rated staffing obstacles as a serious barrier to providing statistical information. All but 16 of these also rated either software or data as serious constraints. Relatively few departments rate either hardware or legislative and legal factors as serious barriers to producing statistical information.
----------------------------------------
Chapter 7 An empirical common core
----------------------------------------
Highlights
* Departments maintain a common core of data about 14 dimensions of corrections processing that contain 100 of the 207 offender-based data elements in this Inventory.
* For the 100 core elements, most departments (48) rate above 50% of full availability, that is, the extent to which they maintain core data elements electronically for more than 75% of offenders. Eight departments rate above 90% and 29 rate above 70%.
* Thirty-nine departments rate above 70% of full availability in the profiling offenders stage, as do 35 in the committing offenders stage, 22 in the managing offenders stage, and 18 out of the 40 that maintain data on offenders released into the community. * Within non-core dimensions, there are 15 data elements that more than 26 departments maintain in high-availability form. -------------------------------------------------------------------- Commonly maintained, high-availability data elements -------------------------------------------------------------------- Departments of corrections currently maintain a common core of data about 14 dimensions of corrections processing. These dimensions describe several aspects of who offenders are, what they have done, how they arrive in prison, how they are managed, and what happens to them upon release. The common core is based on the dimensions of corrections processing for which most departments maintain data in electronic form for most offenders. Each core dimension contains several data elements. All of the data elements in a core dimension are included in the common core of data elements. To say that a common core of data exists for a given dimension does not mean that every data element in that dimension is maintained in a high-availability form by every department. It means that, within a dimension, enough data elements are maintained by a majority of departments in high-availability form to constitute a meaningful core of information measures for a given dimension. Dimensions determine the core because many corrections concepts are best measured by several data elements. The high-availability standard (maintained electronically for more than 75% of offenders) reflects the form of data that can most readily be analyzed, shared electronically, and processed into the types of statistical information that measure corrections performance. Data elements in 14 of the 28 dimensions of corrections processing are included in the common core (table 7.1). Each stage of corrections processing has at least one dimension in the common core, and a total of 100 of the 207 offender-based data elements fall within the dimensions that comprise the core. By stage, the core dimensions include: * Stage 1, profiling and identifying offenders: demographic characteristics (11 elements); * Stage 2, committing offenders: conviction offenses, sentences imposed, current commitment, expected time to be served, risk assessment, classification decisions, and confinement characteristics (42 data elements); * Stage 3, managing offenders: post-commitment movements, good-time and other sentence adjustments, offender registry, and releases from custody (25 data elements); and * Stage 4, supervising offenders: behavior on supervision and responses to violations of conditions of supervision (22 data elements). The common core for stages 1, 2, and 3 is based on the responses from all 52 departments. The common core for stage 4 excludes the 12 departments that do not use their adult, sentenced prisoner information systems to collect data on offenders released into the community; it is based on the responses of the 40 departments that do maintain data elements on released offenders. The common core describes offenders' demographic characteristics such as age, sex, race, and country of origin. It contains data elements about commitments, convictions, sentences, and offenses that describe how and why offenders arrived in prison. 
The common core also describes classification, confinement, and risk assessment decisions, and it contains data elements that describe how long offenders can expect to stay in prison. In the managing offenders stage of processing, the core describes reasons for changes in sentences and changes in expected length of stay; it also describes offenders' movements and releases from prison. The core also includes limited information about victim notification requirements. Finally, the core dimensions of the supervising offenders stage include data elements that describe the form of supervision, the reasons for termination of supervision, whether a new crime was committed by an offender on supervised release, and, if so, whether the offender was arrested, convicted, sentenced, and returned to prison.

Of the 100 core data elements, 8 are collected by all departments, 60 are collected by more than 70% of departments, and only 9 data elements are collected by fewer than 50% of departments (not shown in a table). The common core data elements that are collected by all departments include transfer to another facility, date of transfer, type and date of release, sex and race of offender, and type and date of commitment. The 60 data elements that most departments collect include nearly all of the data elements that describe demographic characteristics, sentencing, time to be served, classification decisions, confinement characteristics, post-commitment movements, good time, and releases. The nine data elements that are less commonly collected are the offender's psychological index, who authorized a prison movement, amount of change to sentence, amount of change to expected release date, special credits to sentences, if offender actually registered as a sex offender, if and when a release violation was adjudicated, and the date a released offender is rearrested (not shown in a table).

-----------------------------------------------------
Full-availability ratings for common-core data
-----------------------------------------------------

Assessed on the extent to which they maintain all data elements included in the common core in high-availability form, 29 departments rate at greater than 70% of full availability, and only four departments rate at less than 50% (table 7.2). Colorado maintains all 100 core data elements in high-availability form, and seven other departments maintain more than 90% of their core data elements in high availability. Departments that rate less than 70% on the availability index generally maintain many core data elements in paper form, or do not collect a majority of this information.

For the 11 data elements in the common core of Stage 1 (profiling offenders)—the demographic characteristics of offenders—39 departments receive full-availability ratings of greater than 70%, and 9 have full-availability ratings of 100%. Only 4 departments rate at less than 50% of full availability (table 7.3). In Stage 2 (committing offenders), 35 departments receive full-availability ratings above 70%, and 2 have full-availability ratings of 100% for the 42 core data elements. Only 4 departments receive full-availability ratings of less than 50%. For the 25 core data elements in the third stage (managing offenders), 22 departments receive full-availability ratings greater than 70%. Twelve received full-availability ratings of less than 50%. Of the 40 departments that maintain the 22 core data elements in Stage 4 (supervising offenders on release), 18 have full-availability ratings greater than 70%.
Eleven departments rate at less than 50% of full availability.

Departments maintain core data elements in high-availability form. Within each stage of processing, relatively few core data elements are maintained in low-availability forms, and there are few data elements that are not collected. Only 12 departments maintain fewer than 7 of the 11 demographic data elements in high-availability form, while 9 have all 11 data elements in this form, and 20 collect all demographic core data elements (Appendix H). In the committing offenders stage, half of the departments maintain high-availability core data for at least 30 of the 42 data elements. Colorado and North Carolina maintain all core commitment data elements in high availability. In the managing offenders stage, 2 departments—Colorado and Ohio—maintain all 25 core data elements in high availability, and an additional 12 departments maintain at least 20 high-availability data elements. Four departments collect all core management data elements in some form. In the supervising offenders stage, 8 departments out of 40 collect all 22 data elements in high-availability form. Seventeen departments collect all core supervision data.

----------------------------------
Common definitions
----------------------------------

Many corrections departments have the data elements needed to generate statistical information on a core set of information issues. Of the 40 departments that collect data on all 4 stages of corrections processing, almost all are above 50% availability for the 100 core data elements (table 7.3). Colorado is at 100% availability for all core data elements. Departments maintain a large number of data elements, and they do so in a high-availability form that facilitates processing, analyzing, and sharing these data.

Not surprisingly, the common-core data reflect information issues that revolve around the day-to-day management concerns of corrections. These day-to-day management concerns include many important pieces of information that are related to corrections performance. For example, counts of offenders who enter prison, complete their stay without incident, leave prison for supervision, and complete supervision without incident are fundamental for corrections performance indicators. The ability to provide such counts for subpopulations of offenders and according to criminal justice processing variables (such as type of offense or length of stay) enhances the comparability of performance indicators. Further, the capability to measure duration of supervision and returns to prison for new crimes or violations of conditions of supervision underlies the development of measures of recidivism. This is true regardless of the controversies associated with interpreting a recidivism rate as a "good" or "bad" indicator of performance.

In short, the capacity of the departments to maintain a large volume of common data elements about offenders is quite high. This is particularly impressive given the wide variation in corrections organization, legal frameworks, and penal codes operating in the States, the Federal government, and the District of Columbia. Findings about common-core data elements do not necessarily imply that all departments define the elements in exactly the same way or collect exactly the same pieces of information in the elements.
For example, several of the Inventory questions asked departments to indicate the response categories (such as type of program in which an offender participated) for data elements they collect. For some data elements, departments collect different categories of information. But for other data elements (such as type of commitment or method of release from prison), there is greater agreement. For the latter, differences in response categories indicate a greater or lesser degree of precision rather than differences in scope or definition.

----------------------------------------------
Non-core high-availability data elements
----------------------------------------------

Within several core dimensions, there are some data elements that fewer than 26 departments maintain in high-availability form. Conversely, within several non-core dimensions there are data elements that a majority of departments maintain in high-availability form. Out of the 100 core data elements, 20 are not maintained in high-availability form by most departments; out of the 107 non-core data elements, 15 are maintained in high availability by most departments.

There is one data element in the core of the profiling offenders stage that a majority of departments do not maintain in high availability: offenders' citizenship status. On the other hand, there are two data elements in this stage that are not included in the core but which a majority of departments maintain in high-availability form: education level and marital status (table 7.4).

Among the common core in the committing offenders stage, there are 8 data elements that are maintained in high-availability form by fewer than 26 departments (table 7.5). These are:
* Charges on the indictment;
* The reason for a mandatory sentence;
* If monetary sanctions were imposed;
* Amount of sanctions imposed;
* If a weapon was used in the offense;
* Gang membership;
* Psychological index or score; and
* Medical index or score.
Also in this stage, there are three data elements that are not included in the core but that a majority of departments maintain in high-availability form: the criminal justice status of the offender, the medical condition of the offender at admission, and the types of need as determined by an assessment.

In the managing offenders stage, seven of the data elements included in the core are not maintained in high-availability form by a majority of departments:
* Who authorized the transfer;
* Who authorized the move;
* Amount of change to sentence;
* Changes to release date;
* Special credits;
* If offender is required to register as sex offender; and
* If offender actually registered as sex offender (table 7.6).
On the other hand, there are six data elements outside of the core for managing offenders that a majority of departments maintain in high-availability form. These are:
* Type of program participation;
* Date of program participation;
* Type of misconduct in prison;
* Date of misconduct in prison;
* History of misconduct in prison; and
* Disposition of the charges of a legal proceeding (table 7.7).

Within the supervising offenders stage, 4 of the data elements in the core are maintained in high-availability form by fewer than 26 departments. Three are in the responses to violations dimension: if offender was adjudicated, date adjudicated, and date the offender was arrested (table 7.7). The type of technical violation is also less commonly collected in high availability than the other elements in the dimension describing violations committed after release.
There are three non-core data elements in this stage that a majority of departments maintain in high-availability form: the address of the released offender, if the offender was under supervision when the crime was committed, and the type of the new offense committed.

----------------------------------
Data elements about facilities
----------------------------------

Fifteen of the Inventory's survey questions are about facilities, costs, and the availability of medical care. These data are not collected by most departments in electronic form and are not considered part of the core. They generally are maintained in paper form. The number of facilities and the number of beds per facility are the only data elements that are maintained electronically by more than 70% of the departments. Only 12 departments maintain data electronically about operational costs, annual capital costs, and facility-generated revenue. Fewer than half of the departments collect information in any form about the number, qualifications, or availability of their medical staff.

----------------------------------
Conclusions
----------------------------------

Of the 207 offender-based data elements in this Inventory, 100 are included in the 14 core dimensions of corrections processing. Most of the common-core data are found in areas related to committing offenders into prison (7 dimensions) and managing offenders in corrections facilities (4 dimensions). For the departments that collect data on released offenders, 2 dimensions are included in the common core. The profiling of offenders has one dimension in the core. Within these core dimensions, 20 of the 100 data elements are not maintained in a high-availability form by a majority of departments. Of the 107 non-core data elements, 15 are maintained in high-availability form by most departments. However, these high-availability non-core data elements do not measure an entire dimension of corrections processing, as do the data elements currently in the core.

Several implications for the objective of creating commonly defined indicators of corrections performance flow from these findings. First, the commonality with which departments collect and maintain data in high-availability form does not necessarily (and need not) imply that departments define these commonly collected data elements in exactly the same manner, or that the elements apply to exactly the same groups of offenders in every department. Nor should it be expected that the data elements be defined in exactly the same manner among all departments. State penal codes and criminal procedures vary. And a department's adherence to its jurisdiction's laws and regulations rightly takes precedence over defining a group of data elements in the same way that other departments define them, or applying them to the same groups of offenders.

Second, there are important areas in which no common-core data currently exist. Under-covered areas in Stage 1 (profiling offenders) include information about the social and economic characteristics and family relationships of offenders. These variables can be used to indicate the degree to which offenders maintain connections with mainstream institutions outside of prison. In Stage 2 (committing offenders), under-covered areas include the criminal incident leading to the conviction—especially victim-related information, criminal history, and offender needs assessments.
Within Stage 3 (managing offenders), the under-covered areas include program participation, drug testing, medical care, misconduct and infractions, and the corrections system's responses to them. Drug testing information is especially important for developing indicators of how well corrections institutions keep offenders engaged in positive behaviors. Finally, in Stage 4 (supervising offenders in the community), data elements about offenders' employment experiences, the new crimes they commit on release, and the victims of those crimes are not widely collected.

Although the methods to increase coverage of data elements may vary from stage to stage and dimension to dimension, there are two basic strategies to address the problem of under-covered data elements: (1) departments may collect the data themselves; or (2) departments may rely on other agencies to collect the data and then obtain the data, or access to them, in electronic format. If the strategy of relying on another agency to collect data elements is used, then departments must solve both the technical problems related to transferring and linking data and the substantive problems associated with defining the data elements. Obtaining data from other agencies may also increase the staffing requirements for corrections information systems departments, perhaps by requiring staff with a higher level of skill.

If departments rely on other agencies to collect data elements, they may have to develop linkages in both directions along the criminal justice process. For example, to obtain data on the offenses and victims leading to a conviction, corrections departments may have to develop links with prosecutors or the courts. Conversely, to develop additional capacities in Stage 4 (supervising offenders on release), departments may have to develop links with parole departments. In either case, the complications associated with using other agencies' data exist and may be compounded.

Third, there may be important data elements beyond those in this survey, and these may suggest areas to expand coverage of data elements. Fourth, the capacities of departments' information systems to provide statistical information are constrained by staff, software, and, to a lesser degree, data.

----------------------------------
Chapter 8
Using the Inventory report
----------------------------------

The purpose of the Inventory project is to provide a basis for improving the quality of corrections data and enhancing electronic sharing of information. This report identifies the capacity of corrections departments to provide comparable data for performance measures and for cross-jurisdictional research. It describes existing information systems, but does not recommend a model system for all departments or develop a strategy for future actions. The report identifies a common core of data elements that most or all departments collect; describes and analyzes the obstacles departments face in responding to statistical inquiries; and describes departments' capacities for sharing and linking data internally and externally. Additionally, the report provides a list of respondents (Appendix C) that may be used by departments or researchers to obtain information or assistance.

This report may be used--
* by departments for expanding data collection. Departments may use information about the availability of the common-core data elements to develop priorities for adding data elements and improving the availability of existing data.
* by departments to assist in their ongoing information system redesign activities.
Departments in the process of modifying their information systems may use the report to identify commonly collected data elements and to understand how departments differ in their capacities to maintain data in electronic form.
* by research directors and other corrections researchers to determine the availability of data elements in cross-jurisdictional studies. In designing comparative studies, researchers may use the report to identify the reporting capabilities of participating departments.
* by ASCA members to develop strategies for establishing performance measures. ASCA members may use the report to develop more specific priorities about measuring corrections performance, to identify indicators based on commonly collected data elements, and to decide what additional information is needed for these performance measures.

----------------------------
Expanding data collection
----------------------------

Departments may use information about the availability of data elements to develop priorities for expanding their data collections. Data collections may be expanded by adding data elements and by improving the availability of existing data elements -- that is, their storage in electronic form. Departments using the Inventory in developing priorities for expanding data collection may wish to consider several related issues.

An advisory committee established priority information areas. The 207 offender-based data elements in the Inventory were derived from the six priority information areas that the project's advisory committee identified. These six areas -- offender profile, recidivism, program effectiveness, internal order, public safety, and operational costs -- cover the scope of corrections processing and reflect important corrections management outcomes.

The Inventory shows what departments have. The Inventory shows which of the 207 offender-based data elements departments collect and how they maintain the data elements. It shows which data elements are more commonly collected and which are collected by fewer departments. It permits departments to compare their data collection with that of other departments.

The Inventory's common core is an experiential core. The common core of 100 data elements reflects what departments currently collect and not necessarily what they should collect. Departments wishing to use this experiential core in establishing priorities should recognize that expanding collections up to the existing core will increase the concentration of departments that collect core elements, but it will not necessarily expand the scope of the common core. To expand the scope, departments should consider the entire set of 207 data elements and the six priority information areas.

High-availability formats facilitate sharing data. Maintaining data electronically can facilitate sharing information. This important objective can be met by expanding collections to increase the number of data elements that are maintained in electronic format.

Cross-agency linkages may be a way to obtain additional data elements. Departments may wish to consider developing cross-agency linkages with other information systems as a method for adding data elements. In departments for which core data elements are beyond the scope of the information system used to manage adult sentenced prisoners, electronic linkages with other agencies may provide a relatively inexpensive method for gathering data or additional data elements.
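To make the cross-agency linkage option concrete, the following is a minimal, hypothetical sketch (in Python) of merging a department's records with another agency's records on a shared identifying number. The record layouts, field names, and values are illustrative assumptions only; they are not drawn from any department's system.

    # Hypothetical sketch: linking two agencies' records on a shared identifier.
    # All field names and values are invented for illustration.
    corrections_records = [
        {"state_id": "A1001", "commitment_date": "1997-03-12"},
        {"state_id": "A1002", "commitment_date": "1996-11-02"},
    ]
    court_records = [
        {"state_id": "A1001", "conviction_offense": "burglary"},
        {"state_id": "A1003", "conviction_offense": "fraud"},
    ]

    # Index the other agency's records by the common identifier, then merge
    # the fields of the records that match.
    court_by_id = {record["state_id"]: record for record in court_records}
    linked = [
        {**corrections, **court_by_id[corrections["state_id"]]}
        for corrections in corrections_records
        if corrections["state_id"] in court_by_id
    ]
    print(linked)
    # [{'state_id': 'A1001', 'commitment_date': '1997-03-12',
    #   'conviction_offense': 'burglary'}]

Without such a shared identifier, a merge of this kind would have to rely on combinations of name, date of birth, and other descriptors, which is less reliable and more sensitive to differences in definition among systems.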
------------------------------------
Redesigning information systems
------------------------------------

The Inventory may help departments establish priorities for upgrading their information systems. It shows areas where improvements are needed in data collection and maintenance, and where many corrections departments face problems in reporting statistical information. The results of the obstacles survey are no substitute for an internal audit of an information system, but they can point to areas where departments may want to concentrate MIS re-engineering efforts.

Establishing cross-agency linkages and offender tracking systems. As part of an MIS upgrade or independently of such efforts, departments may wish to consider developing more cross-agency linkages and better systems for tracking offenders. Such efforts may be undertaken in a variety of ways. At a simple level, data extracts can be shared on diskette, tape, or other physically transferable media. At a higher level, the capability to query another agency's database could be established. At a higher level still, agencies can participate in an offender-based tracking system (OBTS). An OBTS allows a participating agency direct access to the data for which it has collection responsibility, but it permits access to other data in the system only through a specific request to information systems staff or to staff of the agency with collection responsibility. At the highest level, agencies could participate in an integrated information system (ITS) that permits sharing of all automated data among all participating agencies. At the levels of sharing below those of the OBTS or ITS, the major problem lies not in sharing information per se but in linking it and ensuring that data elements are defined in the same way among information systems. Linking records is greatly facilitated if all the agencies involved use a common identifying number. If that is not feasible, other methods could be developed to link records.

----------------------------------
Cross-jurisdictional research
----------------------------------

Corrections researchers may use the Inventory to help identify research topics for, and potential barriers to, cross-jurisdictional research. In a survey of research units in departments of corrections, researchers identified several important topics for comparative research (Footnote 1: Association of State Correctional Administrators Subcommittee on Research. Cross-jurisdictional survey of correctional research offices, summary of findings: Final report, Vol. 1. September 1995. (Prepared by the Pennsylvania Department of Corrections, Office of Planning, Research, and Statistics.)). These include studies on recidivism, alternatives to prison, sentencing structures, and evaluations of corrections programs and policies. Many of these topics are reflected in the experiential core of data elements that currently are commonly collected. Researchers interested in topics that are not reflected in the core can use the Inventory to design research and plan data collection activities. Because the availability of data elements in electronic form and the resources needed to prepare extracts or research datasets pose potential problems for conducting research, the Inventory can show when and where these problems are likely to occur. Researchers can use this information to plan the scope of research and to learn about data systems.
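Because the availability ratings cited throughout chapters 7 and 8 summarize how much of a group of data elements a department keeps in electronic form for most offenders, a short worked sketch of that arithmetic may help readers interpret what a given rating implies. The sketch below is a minimal Python illustration based on the scoring rules defined in the glossary (Appendix I); the element names and availability levels are hypothetical, not taken from any department's response.

    # Worked illustration of the availability score and rating defined in
    # Appendix I. Element names and levels below are hypothetical.
    POINTS = {"high": 3, "medium": 2, "low": 1, "unknown": 1, "none": 0}

    def availability_rating(element_levels):
        """Percent of full availability for a group of data elements."""
        score = sum(POINTS[level] for level in element_levels.values())
        full_score = 3 * len(element_levels)  # every element at high availability
        return 100.0 * score / full_score

    example_dimension = {
        "type of release": "high",       # electronic, more than 75% of offenders
        "date of release": "high",
        "who authorized release": "medium",
        "facility released to": "none",  # not collected
    }
    print(round(availability_rating(example_dimension)))  # prints 67

Under the glossary's scoring rules, this hypothetical department earns 8 of a possible 12 points for the four-element group, or roughly 67% of full availability.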
----------------------------
Performance measures
----------------------------

The Association of State Correctional Administrators has expressed an interest in developing and using corrections performance indicators to describe, measure, and compare the management of corrections populations. For the following reasons, that goal is beyond the scope of this Inventory project.

First, performance indicators in general are tied to the mission, goals, and objectives of organizations. Comparative corrections performance indicators would have to consider the varying missions, legal structures, and organizational arrangements of corrections departments throughout the country. The standardization of measures that take such factors into account is extremely complex.

Second, standard or traditional approaches to measuring corrections performance, such as those based on the crime-rate-related concepts of recidivism, deterrence, and incapacitation, yield indicators that are difficult to measure and interpret. More importantly, these indicators establish a standard for corrections performance that is based on what happens outside of prison or beyond the scope of corrections supervision. For example, an offender on release in a community is subject to many factors that are beyond the control of corrections. Even when such an offender commits a crime while under supervision, the measured recidivism rate depends in part on the performance of the police, prosecutors, and judges in apprehending, convicting, and sentencing offenders.

Third, alternative approaches to measuring corrections performance, such as those proposed in several papers in the Bureau of Justice Statistics volume Performance Measures for the Criminal Justice System, provide a useful starting point for developing corrections indicators that are tied to specific and shared corrections goals. These alternatives limit the mission and goals of corrections to the activities and outcomes that are within the scope of control of corrections. For example, in his article on "Criminal Justice Performance Indicators for Prisons," Charles Logan (Footnote 2: Logan, Charles H., Ph.D., "Criminal Justice Performance Indicators for Prisons," in Performance Measures for the Criminal Justice System. Washington, DC: Bureau of Justice Statistics, NCJ-143505, 1993: pp. 19-59.) develops a series of measures for prisons that are tied to a confinement model of prisons. In this model, Logan identifies the mission of prisons as "keeping prisoners," keeping them in, safe, in line, healthy, and busy, and doing this without undue suffering and as efficiently as possible (Footnote 3: Logan, p. 25.). From this, he derives measures of performance that are related to security, safety, order, care, activities, justice (as fairness), conditions (without undue suffering), and management. Each indicator can be tied to the effort of corrections officials.

Similarly, in her article about community corrections in the same volume, Joan Petersilia echoes many of Logan's sentiments. Petersilia argues that performance indicators for community corrections should be based on: (1) an articulate mission statement for community corrections; (2) a clear statement of the goals contained within the mission statement; (3) specific methods or activities that address each goal; and (4) measurable indicators of performance for each goal. (Footnote 4: Petersilia, Joan, "Measuring the Performance of Community Corrections," in Performance Measures for the Criminal Justice System.
Washington, DC: Bureau of Justice Statistics, NCJ-143505, 1993: pp. 61-85.) She also stresses that the performance and success of community corrections should "reflect only activities that occur while the offender is formally on community corrections status, not beyond" [emphasis original] (Footnote 5: Petersilia, p. 74.).

While much of the work related to developing corrections performance indicators must be done by a deliberative body that can address the complexities described above, the Inventory may be useful in developing indicators in several ways.

The Inventory points to areas of commonality. The Inventory results show that for many important areas of corrections processing, many departments collect roughly comparable data. This is reason for optimism. If many or most departments have the raw material needed to develop performance indicators, then embarking on an effort to measure and compare performance could be reasonably successful.

The Inventory shows that the common core reflects experience. The common core of data is based on what departments currently collect, and performance indicators may be developed from these experiential core data elements. While the experiential core may show what departments can measure more easily, indicators that are measured by data elements falling outside of the common core can also provide departments with guidance in expanding data collection.

The Inventory points to the need for precise definitions. While many departments collect roughly comparable data in many important areas, departments may still define data elements differently or use different categories to record data about offenders. Comparative performance indicators need to be defined precisely, and the differences in definition of data elements assessed.

The Inventory points to the need to look at sources of non-comparability. While there is much commonality in what is collected, there are sources of non-comparability in corrections data. These derive primarily from differences in definition, scope of coverage, and methods for counting and classifying offenders. For example, definitions of a prisoner may differ between departments that include offenders in halfway houses or jails and those that exclude them. And differences in defining sentences confound simple comparisons of time served or the percent of sentence served. Further, differences in methods for classifying offenders (e.g., by offense category, method of commitment, or other classes of offenders) need to be considered when interpreting comparative indicators. Any set of comparative corrections performance measures that is developed would have to be assessed empirically in relation to these and other sources of non-comparability in measurement among departments.

(See the printed version or Acrobat file for a copy of the Inventory of State and Federal Corrections Information Systems survey.)

----------------------------------------
Appendix C
List of respondents

Table C. List of respondents
Department Respondent's name Address Contact information Programmer/Analyst Alabama Jake Jacobs Department of Corrections (334) 242-9187 P.O. Box 30150 Montgomery, AL 36130 Alaska Annette Smith Department of Corrections (907) 465-3313 DP Manager P.O. Box 112000 Juneau, AK 99811 Arizona Chet Homan Department of Corrections (602) 542-4527 EDP Project Manager 1601 W. Jefferson Mail Code 310 Rich Camine Phoenix, AZ 85007 EDP Programmer Analyst Arkansas Bob G. King Department of Corrections (501) 247-6341 Senior Project Leader P.O.
Box 8707 Pine Bluff, AR 71611 California Judy Metz Department of Corrections (916) 323-4062 Chief, Correctional Case 1515 S Street Records Services Peter Lai Sacramento, CA 94283 Colorado Gary Saddler Department of Corrections (719) 540-4775 Director of Information Systems 2862 S. Circle Drive Suite 400 Jerry Hunter Colorado Springs, CO 80906 Data Base Administrator Connecticut Edmund Hayes Department of Correction (860) 692-7667 Andrew P. Shook 24 Wolcott Hill Road Research Analyst Wethersfield, CT 06106 Delaware Rodney Gibbons Department of Corrections (302) 739-5601 Ed Babowski Central Admin. Building 80 Monrovia Ave Smyrna, DE 19977 District of Thomas Hoey Department of Corrections (202) 673-2300 Columbia Steve Fezuk 1923 Vermont St. NW Washington, DC 20001 Federal Bureau Meredith Barosso Federal Bureau of Prisons (202) 307-3065 of Prison Supervisory, Computer Specialist 320 First St. Washington, DC 20534 Florida Paul Mauer Department of Corrections (904) 488-5963 2601 Blair Stone Road Tallahassee, FL 32399 Georgia Fred Radford Department of Corrections (404) 656-4609 Douglas Engle Two M.L. King Jr., Drive, SE Systems Manager Atlanta, GA 30334 Hawaii Ken Hashi Department of Public Safety (808) 587-1237 Research Statistician 919 Ala Moana Blvd Mike Mamitsuka Honolulu, HI 96814 Judy Yamada Computer Programmer Idaho Craig Potcher Department of Corrections (208) 332-8298 John R. Hofland 500 South 10th Statehouse Mail John Hoffman Boise, ID 83720 Illinois Mike Noga, Supervisor Department of Corrections (217) 522-2666 Offender Systems Report 1301 Concordia Ct. Management P.O. Box 19277 Springfield, IL 62794 Indiana Robert W. Hughes Department of Corrections (317) 232-6930 (317) 233-5400 Jeanne McFarland Government Center S. Systems Analyst Supervisor 302 W. Washington St Indianapolis, IN 46204 Iowa John Baldwin Department of Corrections (515) 281-4807 Capitol Annex 523 E. 12th St Des Moines, IA 50319 Kansas Patricia Biggs Department of Corrections (913) 296-5515 Director, Research & Planning 900 S.W. Jackson St. 4th Floor Jeff Lewis, Carlos Usera, Topeka, KS 66612 Cathy Clayton Programming & Analysis Supervisor Kentucky Louis Smith Department of Corrections (502) 564-4360 State Office Building 5th Floor Frankfort, KY 40601 Louisiana Walt Worley Department of Public Safety & Corrections (504) 342-8770 (504) 342-8782 Project Leader P.O. Box 94304 Terry Clair Capitol Station Baton Rouge, LA 70804 Maine Lita Cunningham Department of Corrections (207) 287-4343 Jerry Steeves State House Station 111 Michael Hughes Augusta, ME 04333 Maryland Edmen Tausendschoen, Program Analyst Division of Correction (410) 764-4107 Lawrence Zamarski, 6776 Reisterstown Road Suite 311 Program Analyst Baltimore, MD 21215 Massachusetts Lisa Sampson Department of Corrections (617) 727-8857 617) 727-2106 Systems Analyst Technical Services Route 1A Curt Wood, Robert Hughes Norfolk, MA 02056 Michigan Terrence Murphy Department of Corrections (517) 335-1383 Steve Paddock P.O. Box 30003 Manager of MIS Lansing, MI 48909 Jeff Anderson Projection Specialist Minnesota Dan Storkamp Department of Corrections (612) 603-0194 612) 642-0301 Director, Office of Planning 1450 Energy Park Dr. Suite 200 and Research St. Paul, MN 55108 Mark Evenson, Database Administrator Mississippi Audrey McAfee Department of Corrections (601) 359-5608 Applications Analyst Manager 723 N President St. Jackson, MS 39202 Missouri David Schulte Director, Department of Corrections (573) 526-6452 Information Systems PO. 
Box 236 Deborah Stegman Manager, Jefferson City, MO 65102 Applications Development Montana Mike Cronin Department of Corrections (406) 444-4907 Dewey Hall 1539 11th Avenue Research Specialist Helena, MT 59620 Nebraska Steve King Department of Correctional Services (402) 479-5767 Judy Egger P.O. Box 94661 Applications Analyst Lincoln, NB 68509 Nevada Glen Whorton Department of Prisons (702) 887-3277 Chief, Classification P.O. Box 7011 & Planning Carson City, NV 89702 New Hampshire Mary Keniston Department of Corrections (603) 271-5609 Supervisor, BIS 105 Pleasant St. P.O. Box 1806 Concord, NH 03302 New Jersey Stan Repko Department of Corrections (609) 984-4587 Hank Pierre Chief, Whittlesey Road Bureau of CICS (OBCIS) Trenton, NJ 08625 Jim Atkins Karen Hughes Administrative Analyst (CMIS) New Mexico Pashella Reynolds-Forte Corrections Department (505) 827-8631 Robert Sego Management Analyst P.O. Box 27116 Ralph Casados Santa Fe, NM 87502 IS System Supervisor New York G. Ronald Courington Department of Correctional Services (518) 457-2540 Bob Meidenbauer Manager, State Office Building, #2 DP Services 1220 Washington Ave. Albany, NY 12226 North Carolina Bob Brinson Department of Corrections (919) 733-5711 2020 Yonkers Road Raleigh, NC 27604 North Dakota Patrick Foley Department of Corrections & Rehabilitation (701) 328-6607 Program Coordinator 3303 E. Main David Huhncke Bismarck, ND 58502 Cathy Jensen Inmate Records Ohio Peggy Ritchie-Matsumoto Department of Rehabilitation & Corrections (614) 752-1271 Jerry Holloway Systems Analyst 970 Freeway Drive North Umang Nanda, Columbus, OH 43229 Information System North James Jeyarag , Ed White Oklahoma Jim West Administrator, IS Department of Corrections (405) 425-2546 Bill Chown Administrator, 3400 M.L. King Jr. Ave Research Manager Oklahoma City, OK 73136 Oregon Jean Hill Department of Corrections (503) 945-0920 Vickie Ross 2575 Center St. Steve McDowell Salem, OR 97310 Randall Ireson, Research Manager Pennsylvania Andy Keyser Department of Corrections (717) 730-2732 Ron Peters P.O. Box 598 Heather Yates Camp Hill, PA 17001 Special Projects Assistant Rhode Island Steven Chianesi, Department of Corrections (401) 464-3901 Associate Director 40 Howard Ave. Kevin Major, MIS Manager Cranston, RI 02920 South Carolina Dr. Lorraine Fowler Department of Corrections (803) 896-1748 Division Director, Resource & P.O. Box 21787 Information Management Columbia, SC 29221 South Dakota Laurie Feiler Department of Corrections (605) 773-3478 Perry Delzer 115 East Dakota Ave. Pierre, SD 57501 Tennessee Tim Beck Department of Corrections (615) 741-0900 Gary A. Lukowski, PhD Director, 320 Sixth Ave., North 4th Floor Planning, Research Nashville, TN 37243 and Management Texas Linda Burney Department of Criminal Justice (409) 294-6391 Ann Christian P.O. Box 99 Application Project Manager Huntsville, TX 77342 Utah Gae Lynn DeLand Department of Corrections (801) 265-5508 (801) 265-5597 Christine Mitchell, 6100 South Fashion Blvd. Valerie Stagg Murray, UT 84107 Information Analyst Vermont John Perry Department of Corrections (802) 241-2307 (802) 241-2293 R. Barre Davis State Complex 103 South Main St. Waterbury, VT 05676 Virginia Frank Zera Department of Corrections (804) 674-3497 MIS Director P.O. Box 26963 Richmond, VA 23261 Washington Dale Putnam , Steve Collins Department of Corrections (360) 586-6396 Application Service Capitol Center Bldg R. Peggy Smith, PhD Planning P.O. 
Box 41101 & Research Manager Olympia, WA 98504 West Virginia Henry Lowery Records Supervisor Division of Corrections (304) 558-2036 William K. Davis Commissioner 112 California Ave. Building 4 Carl Graves Computer Charleston, WV 25305 Project Team Wisconsin Mark Loder Department of Corrections (608) 266-8718 System & Application P.O. Box 7925 Development Chief Madison, WI 53707 Wyoming John Lighty, Department of Corrections (307) 777-7405 Jerry Pieper, Herschler Building, Suzanne Rauth 1st Floor East 122 W. 25th St. IT Specialist St. Cheyenne, WY 82002 --------------------------------------- Appendix F Data notes Chapter 1. Profiling and describing offenders Michigan Age at commitment can be calculated. Employment status prior to arrest can be inferred from occupation data element. New Mexico For education level prior to admission, high school dropout and high school graduate can be calculated. North Dakota As of January 1998, the country of birth is not recorded, but will be added soon. Oregon Education level prior to admission is recorded on a separate, non-integrated database. Pennsylvania Age at commitment can be calculated. Employment status prior to arrest is recorded only for the last six months before arrest. Most specific employment status information is recorded on paper. South Dakota Length of employment prior to commitment is recorded for the last employment, which may be just prior to commitment or long before commitment. Vermont Age at commitment can be calculated. West Virginia Age at commitment, citizenship, and illegal alien status can be derived indirectly. Chapter 2. Committing offenders into correctional authority Alaska Alaska records up to five conviction offenses. Only previous incarcerations are recorded as criminal history. Data relating to sentences imposed is incomplete. The only monetary sanctions recorded are fines and restitution. Classification is recorded, but initial classification is not identified. California California does not record the agency with the authority to release the offender from custody because of their sentencing practices. The amount of restitution due to victims of crimes is recorded only if the victim contacts the Department of Corrections. Habitual offender information contains convictions only. Delaware The State of Delaware does not record parole release data because they do not have parole. Indiana Information about the criminal incident is found in the police report in the offender's packet. Criminal history information is stored electronically for priors resulting in DOC custody; manual data storage is used for less serious, very old, or out of state arrests. Kansas Criminal justice status of offenders at arrest is recorded only for Kansas offenders. Massachusetts Criminal justice status is recorded for offenders who were on probation for split releases only. Type of commitment for probation violators is recorded only for those who were serving a split sentence and returned. Risk assessment is only performed on offenders within 1 to 6 years of their release date, and the score is calculated upon commitment to prison. Needs assessment is not done upon admission, but rather upon admission into a program. Medical condition assessment is done upon commitment to prison. Michigan Address of victim is recorded only if the victim requests to be notified under Crime Victims Rights Act. The number of conviction offenses can be calculated. Offense severity level is not applicable because prison houses only felony offenders. 
Criminal history includes only felony convictions, and priors can only be calculated for offenders with Michigan prison sentences. Offenders currently in prison at time of arrest (criminal justice status) can be calculated. The number of sentences imposed, the total sentence length, and whether a sentence was a mandatory minimum can be calculated. Returned from bond or appeal, transferred from another jurisdiction, returned escapee, and returned AWOL/absconder are not commitment types. The agency having the authority to release the offender from custody is not applicable. The date of expiration of sentence includes credits. The date of sentence expiration without credits applied can be calculated. For medical conditions at admission, those conditions requiring chronic care or psychiatric follow-up are identified. Drug testing at admission is recorded, but not all offenders are tested. Initial classification information, including classification index and risk assessment index, is retained only until the offender's classification changes. Mississippi The offense severity level is not recorded because they only record felony convictions. They are in the process of expanding automated sentence length computation. New Hampshire The medical condition of offenders at admission is recorded, but not for all offenders, especially out-of-state inmates. The special unit housing the offender is recorded on paper for offenders in special treatment programs, (e.g., drug or alcohol programs). Nevada Type of commitment does not include returned escapees, absconders, or detainees. This is considered criminal justice status, not a commitment. Criminal history includes only felony offenses. New Mexico Only firearms are recorded for weapon involved in criminal incident. Up to seven offenses are recorded for number of conviction offenses and number of sentences imposed. All values of an offender's criminal justice status at time of arrest can be calculated. Offender criminal history contains up to six previous offenses. If the offender is a habitual offender is recorded based on the six most recent offenses. A history of escape/AWOL can be calculated for risk assessment. Oregon The total length of sentence imposed can be calculated. Criminal history includes Oregon offenses only. The agency responsible for classification of offenders is not recorded because it is done by the Department of Corrections itself. Pennsylvania The number of conviction offenses is not currently recorded electronically, but the DOC is developing electronic multiple sentencing. A written description of the offense and the offense severity level is only recorded for Pennsylvania offense code (for current and prior offenses). Location of incident only contains the committing county. The number of victims and address of victims is recorded only when victims are registered with the Office of Victim Services. Victim registration is voluntary. The length of community supervision of the sentence is not recorded because Pennsylvania has indeterminate sentencing. information about expected time to be served is recorded for predominant sentences only. For classification, security level at admission is the custody level, and the classification index is based on custody/housing level. A psychological index is the offender's stability level. A risk assessment index is used for housing placement; community risk is not part of the index. A Bed Management System is being developed which will record the type of housing unit in which offender is placed. 
South Dakota Location of criminal incident includes only the county. The age of the victim is recorded only for victims of sex offenders. Criminal justice status at time of arrest is recorded only if offender is parole violator. For criminal history, the number of felony convictions is recorded electronically, but the specific crime is stored on paper. Specific medical conditions at admission are not recorded, but grouped for needs assessment. Tennessee For whether sentence was a mandatory minimum sentence, Tennessee uses a sentencing grid. For example, LWOP is 85% - 100% of sentence; life is a minimum of 25 years. Vermont Changes in classification information can be calculated. West Virginia The type of conviction offense can be derived from other data elements. For type of facility housing the offender, those in county jails are recorded in a different system. Chapter 3. Managing offenders in corrections facilities Alabama The medical staff has the information about conditions acquired before and after commitment, but it is not in the offender-based system. Alaska Good time credit data are incomplete. For type of facility released to, only transfers or furloughs are recorded. For behavior in custody, only offenders' infractions are recorded; misconduct is not. Idaho Offender registry will be in their system by July 1998. Indiana Program participation includes work assignments. However, since almost everyone is eligible for work assignments, offender eligibility evaluations are not done. Evaluations are done for voluntary programs, such as substance abuse, anger management, etc. Drug test tracking (a 5% monthly sample of offenders) is recorded electronically at the aggregate level by the Central Office, and manually at each facility, by offender. Not all programs in which an offender participated can be identified. Special good-time credit is given for education. Death is recorded as a release type, but not the specific cause of death. Massachusetts Program participation is based on risk assessment and is voluntary. Only a substance abuse residential program is recorded as ongoing in-prison program. A new medical information system is operating that collects data electronically. Prior to this system, data was in paper form. To identify HIV/AIDS cases developed in custody, offenders must consent to be tested. Special good time credits are not applicable. Michigan The total amount of good time credits available is not recorded. It can be calculated, but requires extensive calculations. The change in good time credits can be calculated. The reason for change in good time credits is recorded only for infractions and misconduct, not for new crimes. The type of medical treatment recorded is for mental health or chronic physical problems needing treatment. The date medical treatment started is recorded only if offender is hospitalized or placed in treatment facility. Medical conditions developed in custody can be calculated. Death is recorded as a release type, but not the specific cause of death. Time served in custody can be calculated, but with difficulty. If offender is required to register as a sex offender can be calculated. The result of official response to misconduct in custody is recorded as the offender's status pending a hearing. Only hearings and appeals are recorded for the type of legal procedure against offenders as a result of misconduct. Montana The entity who authorized an internal movement is monitored by each facility. Montana does not have good time. 
Nebraska Drug testing since admission is random drug testing only. Nevada Escape is not considered a type of release. Victim notification is recorded when the data are provided by the victim. Recording the medical care of offenders started recently. New Hampshire The reason for transfer/internal movement is recorded electronically for a change in security level, protective segregation, a medical condition other than HIV/AIDS, or a psychiatric referral; but a transfer/internal movement for a parole board hearing, HIV/AIDS cases, or a request by an offender is recorded on paper. Releases from custody due to death of offender are recorded on paper. New Jersey Releases from custody do not include work release and offenders who abscond or go AWOL. Time served in custody can be calculated. New Mexico Death is recorded as a release type, but not the specific cause of death. The type of facility released to is recorded by general category. Victim notification upon the offender's release is recorded for some offenses. North Dakota None of the program participation information is recorded electronically. Oregon The type of program in which the offender participated is recorded only since June 1996. Information about the medical care of offenders is maintained in a separate, confidential database. Death is recorded as a release type, but not the specific cause of death. History of misconduct/infractions is recorded electronically only for the last year; previous years' information is stored on paper. Pennsylvania Pennsylvania is developing a Transportation System that will record who authorized transfers to another facility. They are also developing a Bed Management System to record movements within a facility and the date of the movements. These systems will eventually record information for 100% of offenders. The reason for a transfer is currently recorded, but the reason for an internal movement is not. Medical care of offenders is recorded only for tuberculosis. For conditions developed in custody, hepatitis and chronic medical conditions are recorded in paper format. Good time credit information is not recorded because Pennsylvania does not have good time. Natural death is recorded as a type of release, but illness is not recorded. South Carolina There is limited access to the data about conditions developed in custody. Death is recorded as a release type, but not the specific cause of death. South Dakota For change in sentence length due to modifications, the offender's record is adjusted to reflect sentence modifications. Death is recorded as a release type, but not the specific cause of death. The State Police are responsible for an offender actually registering as a sex offender. The date of a misconduct/infraction event is recorded as the date of the disciplinary hearing. Utah Good time credit information is not recorded because Utah does not have good time. West Virginia Changes to sentence length and expected release date can be calculated. Time served in custody can be calculated. Chapter 4. Supervising offenders on release and maintaining public safety Alaska Data elements about released offenders and crimes committed by offenders in the community are maintained only if they are returned to prison. They do not distinguish between probation and parole as types of supervision. The date an offender absconded is the date the record was entered into the system. 
If an offender on release commits a new crime, the date of criminal incident can be a series of years - if the event occurred over a period of years. They do have a 'date of occurrence'. Georgia The Parole Board records residence information about offenders on release. Idaho Information about the victims of new crimes is confidential, and is available only to the Parole Commission. Only felonies are recorded for adjudication of a violation/new crime. Indiana Employment on release information is tracked by an agent. Responses to release violations is recorded electronically for felonies and manually for misdemeanors. Information about the new crimes and the victims of new crimes is found in the police report in the offender's packet. Kansas Information about violations/new crimes committed by offenders on release is recorded only for Kansas felonies. Massachusetts Information about violations of release conditions and the responses to release violations are collected for a study on recidivism. The date the offender is returned to prison is collected for all offenders. Michigan Most information recorded on released offenders is for parolees only. The actual time in supervision can be calculated. The type and date of a new crime can be calculated. Only if offender was sentenced to a new crime is recorded, not violations. The same information is recorded about the new criminal incident as was for the original prison sentence. South Carolina The address of a victim of a crime is recorded if the victim chooses to register in a notification program. West Virginia Parole is the only type of supervision recorded. The actual length of time in supervision can be calculated. Chapter 5. Facility management information California The medical care of offenders is recorded only if the medical treatment is provided by a contract agency outside the prison. Indiana Not all program assessments are recorded electronically. Massachusetts Program assessment is not done for all programs. Michigan The number of facilities can be calculated. Mississippi The number of offenders in a program is recorded electronically. The number of program staff, program accreditation, program cost, and source of funds is recorded on paper. Pennsylvania For types of programs offered to offenders, most information is recorded in paper format; only the number of offenders in a program is recorded electronically. The availability and storage of program assessment information depends upon the individual program. Wisconsin For types of programs offered to offenders, only the number of offenders in program is recorded electronically. Chapter 6. Reporting capabilities Alaska Only offenders' infractions are maintained on-line. California Records of previously released offenders since 1977 are permanently available on-line. Only paper records of these offenders are archived. Kansas Kansas is currently implementing a pilot program/software application for collecting and maintaining electronically information on released offenders. Massachusetts The Department of Correction collects data on offenders released by parole or certificate of discharge to the street. The follow-up period is one year from release date. Data are collected for probationers with split sentences. Michigan Michigan only collects data on crimes committed by offenders under supervision if a new prison sentence is imposed. Mississippi The system maintains records of previously released offenders if they were in the custody of Mississippi DOC. 
It is uncertain if the system maintains data on crimes committed by offenders under supervision in the community. Nebraska Data are collected on crimes committed by offenders under supervision only if the offender is convicted and sentenced. New Jersey The State Department of Probation and Parole tracks released offenders, and shares some information with the Department of Corrections, but the information is not considered part of the DOC's information system. Nevada If an offender is incarcerated, his entire disciplinary history is on-line. Prison records are maintained in the system for all paroled offenders. Some information is maintained for every discharged offender, e.g., name, ID number, demographics. Only paper records of released offenders are archived. Information about offenders after they are released from prison is not collected. North Dakota The system maintains an on-line history of offender's misconduct/infractions only for loss of good time. Rhode Island Paper records are archived, but not electronic records. South Carolina All misconduct/Infraction records are kept on-line for the current commitment. Previous commitments' infraction history can be linked to current commitment. The system maintains data elements on released offenders and collects data on crimes committed by offenders only for supervised furlough releases. South Dakota The system maintains records on offenders released to the community and offenders that are returned to custody. Virginia The system maintains an on-line history of external transfers, but not internal movements. West Virginia The system maintains an on-line history of transfers, but not internal movements. General information Michigan Michigan did not include in their questionnaire any data elements that are stored in paper format. Their responses only reflect data elements in their automated systems. Their paper files contain data elements that apply to the State of Michigan. New Jersey New Jersey has two offender-based information systems, CMIS and OBCIS. They completed a questionnaire for each system. We combined the data from both questionnaires to represent the highest degree of capacity. For example, if a data element was not collected by the CMIS system, but was collected by the OBCIS system, we entered 'yes', the data element was collected. If both systems collected a data element but for a different percent of offenders, or stored in different formats, we entered the response with the greatest percent of offenders or the one stored in electronic format. Wyoming Wyoming maintains three separate information systems: one for male offenders; one for female offenders; and one for probationers and parolees. ------------------------------------- Appendix I Glossary Availability index — a measure of the extent to which departments maintain high-availability data elements (electronically for more than 75% of offenders). The index ranges from 0% to 100%. A rating of 100% means that a department maintains all of the data elements in a group in high-availability form (is at full availability); while a rating of 0% means a department does not collect any of the data elements in the group. Availability indicator — a measure of the extent to which information systems maintain data elements electronically for most offenders. High availability — a data element is in electronic form for more than 75% of offenders. This high percentage indicates extensive coverage on an element. 
High availability — a data element is in electronic form for more than 75% of offenders. This high percentage indicates extensive coverage of an element, and the electronic form indicates that the data can potentially be extracted, linked, and easily shared electronically.
Medium availability — a data element is in electronic form but for less than 75% of offenders. It indicates a medium level of availability because the scope of coverage is narrower; information about a comparatively large percentage of offenders is more likely to be missing than for a high-availability element.
Low availability — a data element is available only in paper form. Data elements in low-availability form cannot be extracted, linked, and shared electronically. For the purpose of using offender-based data elements to generate statistical information, low-availability data elements present large obstacles to departments' capacities.
Unknown availability — a data element is maintained by the information system, but the form in which it is maintained or its scope of coverage was not indicated on the survey.
No availability — a data element is not maintained in the information system in any form.
Availability rating — the percentage of full availability on the availability index that a department achieves for a group of data elements. The rating is derived by dividing a department's availability score for the group by the group's full-availability score.
Availability score — a value assigned to a group of data elements based on their levels of availability. High-availability elements receive 3 points, medium-availability elements receive 2 points, and low- and unknown-availability elements each receive 1 point; no-availability elements receive zero. The values of the individual elements in a group are summed to produce a department's availability score.
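To make the scoring arithmetic concrete, the short Python sketch below computes an availability score and rating for a hypothetical group of data elements. It only illustrates the definitions above; the element values, variable names, and functions are not part of the Inventory.

# Minimal sketch of the availability score and rating arithmetic defined above.
# The group of elements at the bottom is hypothetical, not taken from the Inventory.

POINTS = {
    "high": 3,      # electronic form for more than 75% of offenders
    "medium": 2,    # electronic form, but for less than 75% of offenders
    "low": 1,       # paper form only
    "unknown": 1,   # maintained, but form or coverage not reported
    "none": 0,      # not maintained in any form
}

def availability_score(levels):
    """Sum the points for every data element in a group."""
    return sum(POINTS[level] for level in levels)

def availability_rating(levels):
    """Express the score as a percentage of full availability (3 points per element)."""
    full_score = 3 * len(levels)
    return 100.0 * availability_score(levels) / full_score

# A hypothetical group of four data elements for one department.
group = ["high", "high", "medium", "none"]
print(availability_score(group))             # 8  (3 + 3 + 2 + 0)
print(round(availability_rating(group), 1))  # 66.7  (8 of a possible 12 points)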
Common core of data — dimensions of data elements that are maintained electronically for more than 75% of offenders by a majority of departments. The common core consists of 14 dimensions of corrections processing containing 100 data elements. Each of the four stages of processing has at least one dimension in the common core. (An illustrative sketch of this criterion appears at the end of this glossary.)
Dimensions of corrections processing — 28 subsets of information contained within the four stages of corrections processing. They are relatively homogeneous groupings of data elements that measure events in the four major phases of the corrections process. Stage 1, profiling offenders, has 3 dimensions; Stages 2 and 3, committing offenders and managing offenders, have 10 dimensions each; and Stage 4, supervising offenders, has 5 dimensions.
Full availability — a rating of 100% on the availability index, or the score a department would receive if it maintained all of the data elements in a group in high-availability form (electronically for more than 75% of offenders).
Priority information areas — six important information areas identified by the Inventory's advisory committee: offender profile, recidivism, program effectiveness, internal order, public safety, and operational costs. These areas were used to guide the development of the Inventory. The first five areas were organized into four stages of offender processing through corrections systems. The sixth area, operational costs, was used to develop data elements related to facility management.
Offender profile — a priority information area that covers a wide range of information about offenders' demographic characteristics and risk potential, as well as their offenses, criminal histories, sentences, types of admission, and releases from prison. For clarity, this priority area was divided into two stages of corrections processing: one that describes the demographic and social characteristics of offenders, and another that describes both the behaviors and decisions leading to commitment to prison and the assessment and placement decisions made upon entry into prison.
Recidivism — a priority information area that includes data describing the rearrest, reconviction, and return to prison of released offenders. This area was combined with public safety into the fourth stage of corrections processing, supervising offenders.
Program effectiveness — a priority information area that includes program participation by offenders, treatment, medical problems, and medical care. This area was combined with internal order into the third stage of corrections processing, managing offenders.
Internal order — a priority information area that includes information related to offender misconduct, violations of rules, safety considerations, use of restraint, and drug and alcohol use. This area was combined with program effectiveness into the third stage of corrections processing, managing offenders.
Public safety — a priority information area that includes data describing the harm to the public from released offenders, the crimes they commit, and information about the victims of these new crimes. This area was combined with recidivism into the fourth stage of corrections processing, supervising offenders.
Operational costs — a priority area that includes non-offender-based data, such as those that measure staffing ratios, program effectiveness, and the costs of operating facilities.
Stages of corrections processing — the method of organizing the 207 offender-based data elements, contained in the six priority information areas identified by the advisory committee, into substantive categories that describe major stages of corrections processing, beginning with the intake of offenders into the system and ending with the supervision of released offenders in the community.
Stage 1. Profiling offenders — the first stage of corrections processing. It contains 29 data elements that describe offenders' demographic characteristics, socioeconomic status, family characteristics, and living arrangements.
Stage 2. Committing offenders — the second stage of corrections processing. It includes 70 data elements that describe the offenses and sentencing decisions leading up to commitment to prison and elements describing the assessment and placement of offenders upon commitment. The data are organized into three broad categories that provide information about the offenses leading to conviction and sentencing, about the sentences imposed by the court, and about the assessment and confinement decisions made by corrections officials upon receipt of an offender from the court or other authorities.
Stage 3. Managing offenders — the third stage of corrections processing. It contains 63 data elements that describe the movement of prisoners, the procedures and actions that corrections officials take to manage offenders in their custody, behaviors of offenders leading to disciplinary actions, and official responses to misconduct. The data are organized into three broad categories that describe routine management and program participation, the release of offenders from custody, and internal order.
Stage 4. Supervising offenders — the fourth stage of corrections processing. It contains 45 data elements that describe where offenders are in the community, what they are doing there, and whether they have a record of criminal activity after release. The data are organized into broad categories that describe the supervision of offenders released from custody and details about new crimes committed by released offenders.
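As a companion to the availability sketch above, the following Python sketch states the common-core criterion operationally. It is only an illustration of the glossary definition: the test is assumed to apply item by item (to a data element or dimension), and the departments and survey responses shown are hypothetical.

# Minimal sketch of the common-core test defined in this glossary: an item
# (data element or dimension) is treated as part of the common core when a
# majority of departments maintain it in high-availability form, that is,
# electronically for more than 75% of offenders. All responses below are hypothetical.

def is_high_availability(form, pct_offenders):
    """High availability: electronic form covering more than 75% of offenders."""
    return form == "electronic" and pct_offenders > 75

def in_common_core(responses):
    """True when a majority of the responding departments report high availability."""
    high = sum(1 for form, pct in responses if is_high_availability(form, pct))
    return high > len(responses) / 2

# Hypothetical responses for one item: (storage form, percent of offenders covered).
responses = [
    ("electronic", 100),   # high availability
    ("electronic", 90),    # high availability
    ("electronic", 60),    # medium availability
    ("paper", 100),        # low availability
    ("electronic", 80),    # high availability
]
print(in_common_core(responses))   # True: 3 of the 5 departments report high availability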