
NIOSH Publication No. 2005-106:

Mixed Exposures Research Agenda - A Report by the NORA Mixed Exposures Team

December 2004


2 Analysis of Research Needs

Identifying Mixture Hazards

Identifying and characterizing mixed-exposure hazards have always been challenging. However, as we continue to shift away from a manufacturing-based economy, the challenge will be even greater. The National Academy of Sciences report entitled Safe Work in the 21st Century—Education and Training Needs for the Next Decade’s Occupational Safety and Health Personnel [Institute of Medicine 2000] identifies several trends and projections related to the U.S. workforce.

Given these trends, chronic occupational disease is likely to be of growing importance as the age span of the workforce increases. We know from existing surveillance data that exposures encountered at an early age can have life-threatening consequences well after exposures have ceased. The deaths of two sandblasters from silicosis in 1998, each with less than 5 years of employment as sandblasters, serve as a tragic illustration of this fact [CDC 1998]. One of the deceased workers was employed as a sandblaster in his twenties between 1984 and 1988. He died at age 36. The second worker was only 30 when he died. He had worked as a sandblaster between 1986 and 1990, starting at about age 18. Fortunately, most young workers will live long enough to pursue a variety of occupations. However, the lifetime exposure profiles of such workers will be increasingly complex as younger workers cycle into and out of different jobs and the job materials and processes change. Without better control of health hazards, the risk of chronic disease will also increase as workers enter the workforce at an earlier age and leave at a later age.

To get a clear view of exposure potential within the workforce and prioritize research areas, we must draw on the knowledge of those doing the work. Occupational hygienists, epidemiologists, toxicologists, and occupational health professionals will continue to be of critical importance in the development of exposure assessment strategies and analysis of data. However, understanding how work is done, what materials are used, and which mixed-exposure hazards pose the greatest concern in the workplace makes up the foundation on which all other research rests. Therefore, central to our research agenda should be the development and use of research methods and partnerships that involve workers more actively in the research process. Two methods that have been successfully used toward this end are participatory research methods and research that involves the training and use of workers as shop-floor gatherers of both qualitative and quantitative exposure data.

Participatory research involves a co-learning process in which research subjects and professional researchers become active partners in the process of identifying occupational health problems and interventions that are likely to take hold within the group being studied. The general approach involves more emphasis on developing a system for addressing problems as they arise in a real-world context [Rosecrance and Cook 2000; Schurman 1996]. Such methods hold great promise in tackling the very complex and idiosyncratic problem of mixed exposures.

The practicing occupational hygienist in the field has long known the value of talking to workers to understand a given work process. In addition, Occupational Safety and Health Administration (OSHA) and Mine Safety and Health Administration (MSHA) standards such as the Hazard Communication Standard or other agent-specific standards and well-known diseases such as asbestosis have increased worker awareness of and interest in occupational health. However, the complexity of process materials and problems with hazard communication tools such as material safety data sheets (MSDSs) limit worker knowledge of mixed-exposure hazards. A study carried out by Worksafe Australia and the New South Wales Technical and Further Education (TAFE) involving apprentice painters describes the limitations of MSDSs in the context of solvent thinners [Winder and Ng 1995]. The review of MSDSs for 20 paint thinner products found 83 chemical solvent ingredient names listed as hazardous ingredients. These 83 ingredients were reported as present in wide ranges (as opposed to a percentage of the product formulation), making precise product formulation unclear. In addition, the 83 trade-specific or generic thinner components could be reduced to 32 solvents or 6 classes of solvents. The authors emphasize the importance of product formulation in characterizing exposure risk. They also conclude that information about mixed-solvent exposure is lacking and that educational programs are needed to help workers understand various chemical formulation risks.

Although the above research focused on paint thinners, the painting trades in general offer a rich illustration of the complexity of mixed-exposure risk. Industrial painters, for example, routinely begin their jobs by blasting steel surfaces covered with coatings (usually lead-based). Abrasives such as silica sand, steel grit, and copper or coal slag are used to blast old paints off steel and concrete surfaces using high pressures (about 100 pounds per square inch [psi]). The dust generated as a result will consist of an array of hazardous agents (including fibrogenic dusts or metals), depending on the abrasive and substrate. The individual metals and dusts can cause a wide range of serious health effects; however, the combined effects of these agents have not been studied. Paint systems that are applied to the freshly prepared surface produce paint mists and solvent vapors that are even more complex mixtures of hazards.

The construction industry serves as a good model for mixed exposures. In this industry, both old and new sources of exposure pose a threat to a wide range of trades and occupations. For example, pipe fitters may be exposed to the fumes of new high-nickel alloy welding rods while working on construction of new semiconductor facilities, and on another job they may be exposed to asbestos applied to process piping a generation ago. Masonry workers repairing old mortar joints are exposed to dust containing silica as well as to hazardous materials mixed into the mortar decades ago. Regardless of the source of mixed-exposure hazards, they translate into the potential cause for future occupational disease.

The mining industry also serves as an excellent model for mixed exposures. In this industry, miners may be exposed to particulate matter released from diesel engines combined with an irritant gas such as nitrogen dioxide and an asphyxiant such as carbon monoxide. Miners may also be exposed to mixtures of solvents in cleaners or to metals and thermal degradation products generated during welding. And as in other industries, these mixed exposures translate into a potential cause for both present and future occupational disease.

The historical one-chemical-at-a-time approach to occupational health is inadequate. Safety and health practitioners using substance-by-substance or hazard-by-hazard approaches generally draw conclusions about worker risk or lack of risk without sufficient caveats about the inability to evaluate additive or synergistic effects. However, the problem of understanding the true health effects of real-world mixed exposures is mind-boggling unless systems are in place for clarifying research priorities within major occupational groups. Workers and the organizations that represent or train them are essential building blocks for developing such systems. To accurately assess mixed-exposure potential, professional researchers must collaborate with workers, the organizations that train and represent them, and the entities that influence how work is done. Collaboration among scientists, engineers, management, workers, and others is needed for identifying and ranking exposure hazards. Use of such approaches has been described in the literature [Anderson-Murawski et al. 2002; Feron et al. 1995b].

Screening methods may be used to systematically evaluate multiple exposures. First, known occupational exposures are ranked by frequency of occurrence. Then combinations are ranked by the frequency with which two or more exposures occur together. Finally, the resulting list of combinations is reviewed to identify those for which present knowledge suggests that interactions may occur.
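A minimal sketch of this screening logic, assuming simple job-exposure records (the agent names and data below are hypothetical, for illustration only):

```python
from collections import Counter
from itertools import combinations

# Hypothetical job-exposure records: each entry lists the agents
# documented for one job or task.
records = [
    {"silica", "lead", "noise"},
    {"silica", "noise"},
    {"toluene", "xylene", "noise"},
    {"silica", "lead"},
]

# Step 1: rank single exposures by frequency of occurrence.
singles = Counter(agent for rec in records for agent in rec)

# Step 2: rank combinations by how often two or more exposures occur
# together (pairs shown here; higher-order combinations follow the
# same pattern).
pairs = Counter(pair for rec in records
                for pair in combinations(sorted(rec), 2))

# Step 3: review the top-ranked combinations against existing toxicology
# to flag those for which interactions are plausible.
for pair, count in pairs.most_common():
    print(pair, count)
```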

A key criterion for identifying mixed-exposure research priorities is the anticipated outcome. Ideally, the NORA Mixed Exposures initiative would result in a much-improved ability to predict important potential mixed-exposure threats in materials before they enter into commerce. Of course, this presumes that new combinations of such stressors would be known and reviewed in advance, and that employers would act on such knowledge. The likelihood of this scenario depends on the degree to which workers, occupational safety and health professionals, and those in industry responsible for selecting work processes and materials are involved in the hazard prioritization process.

Another gap is the need to provide information about common mechanisms of toxicity. The American Conference of Governmental Industrial Hygienists (ACGIH) specified that for chemical mixtures, one should total the exposure-dose contributions from multiple agents that affect the same organ system. Only recently (1998) has the ACGIH published the Critical Effects associated with each chemical in its threshold limit value (TLV®) booklet [ACGIH 2003]. However, when more than one critical effect is listed, it is unclear on which effect(s) the TLV was based. In addition, the mechanism of action is not specified in the TLV booklet. Thus, the documentation for each TLV must be consulted for this information. Likewise, the Food Quality Protection Act of 1996 (Public Law 104-170) requires the U.S. Environmental Protection Agency (EPA) to consider all nonoccupational sources of exposure, including drinking water and exposure to other pesticides with a common mechanism of toxicity, when setting tolerances. These approaches create a need for additional knowledge about toxicity mechanisms in addition to which organ system might be affected.
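For agents assumed to act additively on the same organ system, the ACGIH mixture rule takes the conventional form

\[
\sum_{i=1}^{n} \frac{C_i}{T_i} \le 1,
\]

where \(C_i\) is the measured airborne concentration of component \(i\) and \(T_i\) is the corresponding TLV; the mixture limit is considered exceeded when the sum is greater than 1.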

New Knowledge Needs

Potential for Intervention

Effects Studies

Experimental and epidemiological research on mixed exposures addresses two interrelated issues: the health effects of specific mixed exposures, and generalized approaches for predicting interactions across a variety of exposures.

Knowledge gained in both areas is used to improve risk assessment and mitigation measures. Many research tools and strategies address both issues. For example, recent advances in mechanistic models and cellular and molecular research tools (such as genomics, proteomics, and bioinformatics) have enhanced our ability to address questions about possible health effects in the areas of both specific and generalized mixed exposures.

Other methods and tools focus more on one type of mixed-exposure issue. For example, research targeted at specific mixed exposures commonly employs basic toxicological strategies such as cellular, animal, and human studies to focus on exposure patterns, metabolism, response mechanisms, biomarkers, susceptibility, and health outcomes. Such research typically relies on exposure-dose-response relationships to identify and characterize interactions. It tends to be retrospective, typically addressing existing or historic exposures. But such research can also be applied to anticipated or hypothesized exposures. On the other hand, research targeted at generalized application typically focuses on developing more rapid, effective, and inexpensive ways to predict interactions across a variety of exposures. It often involves cellular models, physiologically based pharmacokinetic (PB/PK) and physiologically based pharmacodynamic (PB/PD) models, chemical structure-activity relationships (SARs), and mathematical tools and data analyses for generalizing relationships across classes of stressors and exposures that lead to different classes of effects. Such research tends to be prospective, although known interactions may serve as a starting point. In either case, it is important to consider multiple effects beyond the critical (most sensitive) effect to evaluate possible combined responses to multiple stressors and exposures.

Experimental Approaches

Mixed exposures are too complex and variable to prescribe any single approach as most appropriate for understanding related health effects. Research in this area must offer a way to focus on key health issues considering an overwhelming number of permutations of types and sequences of exposure to multiple stressors in the workplace, as well as interactions among workplace and nonworkplace exposures. The fundamental methods and tools used for this research build on those used for other basic and applied research, and improvements will certainly continue in concert with advances in those research fields. However, the uniqueness of mixed exposures warrants targeted research to understand and predict combined human responses. This research strategy will involve both adaptations to existing approaches and the development of new methods and tools for characterizing joint health effects and their key contributors and modifiers.

Experimental and epidemiological research opportunities span a wide range of biological levels and study species—from molecular, cellular, and tissue studies to whole-animal studies, and from microorganisms and standard test animals to human subpopulations and populations. These studies, together with mathematical and visualization approaches, can be used to evaluate and predict possible human responses to multiple stressors and exposures. For example, physiologically based mathematical and statistical approaches such as PB/PK and PB/PD models are especially useful for mixed-exposure research. These represent integrated exposure-dose-response relationships, including response surfaces, and support approaches that group classes of interactions, toxicity endpoints, stressors, and exposure types.

Laboratory Research

Cell Models

Cell culture systems using either established cell lines or primary cultures are attractive because of their simplicity and low cost. A wide range of cellular response phenomena can be observed through assays of cell function, general cytotoxicity (that is, survival, multiplication, surface adhesion, confluency, etc.), phenotypical changes, gene expression, and protein production. Cell models are also useful for studying interactive mechanisms such as P450 interactions in PB/PK models [Olin 2004]. They may also be useful as bioassays, assessing whole mixtures. The teaming of cell models with rapidly developing molecular biology investigative tools is expected to play a significant role in research on mixed exposures. Large-scale studies on the nature of chemical interactions, studies aimed at lumping responses among chemical classes, and high-volume prospective screening of chemicals for interactions will probably rely more on using cultured cells than on intact animals.

Cell models have limitations as well as advantages. Immortalized cell lines are necessarily altered from their source cells. Primary cultures have limited life spans. No culture technique fully mimics the in vivo environment. The value of information derived from cells in culture corresponds directly to the level of confidence that in vitro responses reflect the in vivo responses and dose-response relationships. Although intracellular phenomena can be identical in the two settings, responses that are influenced by other cells in vivo may not be reflected well in cultures of a single cell type. This issue places a premium on validating cell responses against in vivo responses to avoid generating large databases on mixed exposures of uncertain applicability to humans.

Animal Models

The use of intact animals allows adverse effects and interactions among exposures to be evaluated in the presence of the integrated responses from all organ and tissue systems. Although responses occur in single cells, adverse health effects in intact persons seldom, if ever, occur without the participation of multiple cells, tissues, and organs in pathogenic, defensive, or reparative responses. Animals can be studied using nearly the full range of morphological, physiological, and biochemical assays applied to humans. In many cases, large databases are available for comparison, although caution must be exercised to consider differences among the animal strains and experimental designs used in different studies. Intact animals are probably the only model adequate for evaluating mixed stressors (other than chemicals), such as physical stressors (for example, extreme cold or heat, exercise, radiation), personal factors (for example, nutritional deficiencies, aging, etc.), hormonal changes (for example, menstrual cycles, pregnancy), biological stressors (for example, infectious agents), and psychological stressors. Intact animals are also required to study reproductive (for example, fertility, teratological) and postnatal development and growth phenomena, although the latter is seldom a workplace issue.

Statistical Tools

To address mixed-exposure issues, the research tools must be deployed in experimental designs tailored to the question being asked or the hypothesis being tested. The large number and complexity of potentially important mixed exposures place a great premium on the design of fundamental research strategies aimed at understanding and predicting the effects of combined exposures. No single research strategy will meet the need. A number of different strategies have been used in the past [Mauderly 1993], and continued development of both research strategies and research tools is needed. This section briefly describes common fundamental strategies for mixed-exposure research. These examples are illustrative but not exhaustive. All of these strategies have both strengths and limitations, and any of them could be the best, depending on the question being asked and the resources at hand. In general, the same fundamental strategies can be applied whether the issue is combinations of dissimilar exposures (for example, physical and chemical) or similar exposures (for example, complex chemical mixtures in a single exposure medium).

Research Organized Around Prioritized Lists of Exposures

One strategy is to prioritize a larger but still limited list of exposures and use a range of experimental protocols to verify the assumptions of additivity or independence of effects of the exposure combinations [Feron et al. 1995b]. This is primarily a prioritization strategy; most, if not all, of the other strategies described in this section could be applied to the list of exposures. This strategy is useful for focusing research on key components of highly complex mixtures. However, it denies the complexity of the real exposures, relies on foreknowledge of the most important components, and faces an overwhelming, even if finite, number of permutations.

Studies of Sequences of Exposures

Mixed-exposure issues include sequences of multiple exposures as well as multiple simultaneous exposures. Several experimental designs can be used to study sequences, but the unifying feature is the administration of exposures at different times [Mauderly 1993]. The experiments may involve simply reversing the order of two exposures, or they may incorporate a factorial design in which single exposures, simultaneous combined exposures, and the two (or more) sequential exposures are administered. Like any factorial design, sequential experiments become intractable when the number and thus the possible sequences of exposures increase.
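The factorial growth that makes sequential designs intractable is easy to illustrate; the sketch below (with hypothetical agents A through D) counts only the orderings, before single-exposure arms, simultaneous-exposure arms, and controls are added:

```python
from itertools import permutations

agents = ["A", "B", "C", "D"]  # hypothetical exposures

# Each ordering of n exposures is a distinct experimental arm, so the
# number of sequences grows factorially with the number of agents.
for n in range(2, len(agents) + 1):
    n_sequences = len(list(permutations(agents[:n])))
    print(f"{n} agents -> {n_sequences} sequences")
# 2 agents -> 2 sequences
# 3 agents -> 6 sequences
# 4 agents -> 24 sequences
```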

Dissection of Effects of Complex Exposures

This strategy focuses on apportioning causation among multiple (typically many) components of mixed exposures [Mauderly 1993]. It begins with a known effect and a known or assumed combination of exposures, and it attempts to identify the causal factors. This strategy has proved useful in many cases—and especially for mixtures for which the mechanism of action is similar (for example, mutations). However, the strategy depends on the ability to reproduce the exposure (that is, the mixture) and its isolated components. This strategy can also be resource-intensive if the number of components is large, the separation difficult, or the biological test system complex. Experimental designs that aim to determine the components of a mixture responsible for an effect are often termed bio-directed fractionation [Schuetzle and Lewtas 1986; Kleinman et al. 2000; Rudell et al. 1999].

Multivariate Analysis of Variable-Exposure Versus Response Databases

This strategy focuses on statistical analysis of matrices of exposure-response data in which identical measures of response are applied to multiple mixed exposures that differ in composition. The approach takes advantage of differences among the exposures (for example, in the composition of the exposure material) to identify the components most strongly associated with the effect(s). In a sense, this strategy is similar to bio-directed fractionation, except that it depends on variations in the complex exposure rather than on dissection of a constant complex exposure. The biological response system can range from simple (for example, a single mutation in cultured cells) to complex (for example, multiple health outcomes in animals or humans). This strategy can address complex exposures and can use either epidemiological or laboratory data. However, its success depends on the consistency of populations and response measures across exposures, the accuracy with which individual exposures are known, and the degree of detail and similarity with which the different exposures are characterized [Eide et al. 2002; McDonald et al., in press].

Epidemiology

In research on mixed exposures, appropriate roles will exist for studies involving humans or data collected from humans, as is the case for research on single occupational and environmental exposures. Valuable data on subclinical responses can be obtained from humans exposed experimentally (that is, clinical studies) to concentrations and combinations of physical and chemical exposures considered to be without significant risk—with appropriate precautions and involvement of institutional review boards. Perhaps the most common human research is epidemiology, in which adverse health outcomes are linked to exposures retrospectively or prospectively.

All epidemiological studies deal with mixed exposures, along with other contributing factors. The challenge to our research methods is to identify the exposures that contribute to the disease process and distinguish them from exposures that do not contribute. This is particularly difficult for epidemiological studies with mixed exposures because detailed exposure data are usually lacking, the relative concentrations of the mixture components vary over time, and the effects often involve chronic disease endpoints with long latency periods.

Epidemiological studies can be broadly categorized by purpose as descriptive or analytical. Descriptive studies involve observation of demographic, secular, or geographic trends in the occurrence of health outcomes. Analytical studies attempt to determine the relationship between outcomes and exposures or other risk factors and require clear specification of the outcomes and risk factors of interest. These risk factors include individual occupational exposures but, more frequently, combinations that constitute mixed exposures. Epidemiological studies are generally either cohort or case-control, distinguished by whether the populations are first defined by exposure (cohort) or by health outcome (case-control). Cross-sectional and prospective studies can be especially suitable for examining mixed exposures. In a cross-sectional study, exposure and outcome measures are by design assessed simultaneously. Direct access to the study subjects and to the environments in which they may be exposed to potential causal factors allows detailed assessment of the components of mixed exposures.

Modeling Approaches

In mixed-exposure research, it is important to obtain quantitative information about the time-course fate and locations of chemicals and metabolites in the body (that is, PK) and the time-course of receptor interactions and toxic responses (that is, PD). Such information is important for understanding the mechanistic basis for interactions among chemicals and thus predicting interactions and extrapolating from cell and animal studies to humans. Mixed-exposure research must include studies integrating computational technology and mathematical/statistical modeling with mechanistically based, time-course toxicology studies. This field is rapidly evolving and lends itself well to taking advantage of recent advances in cellular/molecular biology, mathematical models of biological responses, and mathematical lumping strategies for aggregating responses to classes of chemicals. The potential for significant advances in this field has been described by Yang et al. [1998].

A principal aim of dose-response modeling is to develop predictive tools for health risk assessment—that is, to be able to extrapolate likely biological effects observed in experimental situations to realistic human exposure situations (for example, to low doses, different species, routes of exposure, etc.). Such extrapolation is possible only with a quantitative understanding of the underlying mechanisms of absorption, distribution, metabolism, and elimination. The importance of understanding mechanisms for the effects of mixtures is twofold: mechanisms indicate whether and how the components of a mixture are likely to interact, and they provide the basis for extrapolating from cell and animal studies to realistic human exposures.

Over recent years, perhaps the most important advance in this area has been the development of methodologies for PB/PK modeling of chemical behavior in the body that take into account the underlying physiology of the species of concern.

Physiologically Based Pharmacokinetics (PB/PK)

PB/PK modeling is an approach that attempts to predict biological effects from the perspective of the entire biological system; it allows for development of a biologically accurate toxicokinetic description of an experimental model that incorporates flow and dose relationships, realistic tissue volumes, solubility parameters for individual species and chemicals, and metabolic pathways with measured kinetic parameters. PB/PK models take the known pharmacodynamics of the chemicals (identified through in vitro studies or studies in other species) and information about possible interactions and use that known information to predict the overall toxic effect level of any dose or ratio of the mixture. These models take into account all the processes of a cell that could influence toxicity, including transport processes, diffusion exchanges, metabolic and eliminatory clearances, and receptor binding.

This technique can be used to extrapolate data between chemicals or to generate predicted chemical interactions that may be tested in the laboratory. This approach can be more resource-efficient than traditional testing for multiple possible mixtures, many of which may prove not to have any relevant interactions [Bond and Medinsky 1995]. Chemical-specific factors such as blood-tissue and tissue-tissue partition coefficients, elimination rate constants, and metabolic rate constants are determined in vitro and then used to create the predictive model.

PB/PK models can be adapted to make toxicokinetic predictions for specific organisms or target organs, and the model can be used in some circumstances to predict the concentrations necessary for a toxic effect. PB/PK modeling is also useful in making flow and dose predictions. Although modeling does not replace well-planned laboratory experiments, it is a useful tool that can facilitate experiment planning, optimize the use of laboratory data, and help design cost-effective studies [Blancato 1994].

PB/PK models have been developed for several components of the jet fuel JP–8 (such as benzene, xylene, toluene, and nonane). In addition, models of the interactions of up to five component mixtures of chemicals have been studied [Haddad et al. 1999; Tardif et al. 1997]. A key result of these studies is that a complete description of the interactive processes can be obtained by simultaneously tracking all the binary interactions in the mixture (that is, interactions of one chemical with another). Higher-order interactions are automatically taken into account in this way. Analysis of the blood kinetic data suggested that competitive metabolic inhibition of P450 2E1 was the most likely interaction mechanism for these compounds, and the metabolic inhibition constants for each binary interaction were determined. These results can be generalized to an arbitrary number of similarly acting mixture components (such as the hydrocarbon components of fuels) by considering such complex mixtures as pseudo-binary systems consisting of the compound of interest plus a single interacting complex vehicle with well-defined composite properties. Such composite properties (such as inhibition constants in the present example) are model-based statistical averages of the values for each interacting component. Such pseudo-binary systems could be investigated by modifying the techniques developed for true binary interactions such as response surface analysis (see Mathematical Modeling Tools below).
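A minimal sketch of the competitive-inhibition term at the heart of such binary-interaction PB/PK models (all parameter values below are hypothetical placeholders; a full model of the kind cited above embeds this term in a multi-compartment physiological description):

```python
# Hepatic metabolism of two co-inhaled solvents, A and B, whose clearance
# competes for the same P450 isoform. Michaelis-Menten kinetics with
# mutual competitive inhibition:
#   v_i = Vmax_i * C_i / (Km_i * (1 + C_j / Ki_j) + C_i)

def hepatic_rates(c_a, c_b,
                  vmax_a=5.0, km_a=0.5, ki_a=0.3,   # hypothetical constants, A
                  vmax_b=3.0, km_b=0.4, ki_b=0.2):  # hypothetical constants, B
    v_a = vmax_a * c_a / (km_a * (1.0 + c_b / ki_b) + c_a)
    v_b = vmax_b * c_b / (km_b * (1.0 + c_a / ki_a) + c_b)
    return v_a, v_b

# Co-exposure slows the clearance of each solvent relative to exposure to
# that solvent alone, which is the interaction the blood kinetics revealed.
print(hepatic_rates(1.0, 0.0))  # A alone: metabolized at ~3.33
print(hepatic_rates(1.0, 1.0))  # A with B competing: ~1.25
```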

Qualitative or Quantitative Structure-Activity Relationships (QSARs)

Approaches using QSARs attempt to predict the effects of a chemical mixture by making analogies with other similar compounds. They are useful for chemical mixtures for which limited dose-response data are available. SARs identify a common substructure or similarity in form among compounds with similar modes of toxic action; they then use the presence of this substructure in another compound of unknown toxicity to predict that compound's toxicity. QSARs attempt to define quantitative structure parameters that correlate with an experimental concentration that produces an identical effect, such as an LD50 (the lethal dose of a compound for 50% of the animals exposed). QSAR techniques are used to predict vital chemical parameters for unknown compounds such as partition coefficients, metabolic rate constants, and elimination constants as well as possible pharmacodynamic parameters such as binding affinities and maximum turnover velocities for target enzyme systems. QSAR techniques can also help to determine parameter values for PB/PK models (for example, partition coefficients and penetration coefficients), especially those to which the model output is not particularly sensitive. Thus the need for experimental determinations is considerably reduced. QSAR techniques are limited by the availability of the underlying structure-based data used by the models.
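As a toy illustration of the QSAR idea, a one-descriptor regression of the classic Hansch type is sketched below; the descriptor values and toxicity data are hypothetical, and real QSARs rest on curated data sets and validated descriptors:

```python
import numpy as np

# Hypothetical training set: octanol-water partition coefficient
# (log Kow) versus observed toxicity expressed as log(1/LC50).
log_kow = np.array([1.0, 1.8, 2.5, 3.1, 3.9])
log_inv_lc50 = np.array([0.6, 1.3, 1.9, 2.4, 3.1])

# Fit the one-descriptor model: log(1/LC50) = a * logKow + b
a, b = np.polyfit(log_kow, log_inv_lc50, 1)

# Predict the toxicity of an untested compound from its structural
# descriptor alone.
print(a * 2.8 + b)
```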

Lumping Analysis

The technique known as lumping analysis is borrowed from the petroleum industry, in which chemicals with defined similarities are lumped together into pseudo-components that represent the entire group, making analysis scientifically manageable. This technique allows modeling approaches to be applied to very large mixtures (20 or more components). The sheer number of mathematical manipulations required for such mixtures would otherwise prove overwhelming to most computer systems. The complexity of the analysis is reduced by treating compounds with similar structures or mechanisms of action as one chemical.
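A toy sketch of the lumping idea (hypothetical components and parameter values): compounds sharing a structural class are collapsed into one pseudo-component whose concentration is the class total and whose kinetic parameter is a concentration-weighted average:

```python
# Hypothetical mixture components with a structural class, a
# concentration, and one kinetic parameter (km) per component.
components = [
    {"name": "n-octane", "cls": "alkane",   "conc": 2.0, "km": 0.50},
    {"name": "n-nonane", "cls": "alkane",   "conc": 1.0, "km": 0.40},
    {"name": "toluene",  "cls": "aromatic", "conc": 0.5, "km": 0.20},
    {"name": "xylene",   "cls": "aromatic", "conc": 0.5, "km": 0.25},
]

# Collapse each class into a single pseudo-component.
lumped = {}
for c in components:
    p = lumped.setdefault(c["cls"], {"conc": 0.0, "km_x_conc": 0.0})
    p["conc"] += c["conc"]
    p["km_x_conc"] += c["conc"] * c["km"]

for cls, p in lumped.items():
    print(cls, "conc:", p["conc"], "km:", round(p["km_x_conc"] / p["conc"], 3))
```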

A new approach called structure-oriented lumping has been developed to model the composition and chemistry of complex mixtures at a molecular level. The central concept is to represent an individual molecule or a set of closely related isomers as a mathematical construct of certain specific and repeating structural groups. A complex mixture such as petroleum can then be represented as thousands of distinct molecular components, each having a mathematical identity. This enables the automated construction of large, complex reaction networks with tens of thousands of specific reactions for simulating the chemistry of complex mixtures. Furthermore, the method provides a convenient framework for incorporating molecular physical property correlations, existing group contribution methods, molecular thermodynamic properties, and the structure-activity relationships of chemical kinetics in the development of models [Quann 1998].

Mathematical Modeling Tools

Factorial Design Studies

The most common strategy for studying mixed exposures is to determine the effects of combinations of two particular exposures using a simple factorial exposure matrix. This strategy focuses on whether the combination of two exposures yields effects greater or less than additive compared with the single exposures. An appropriate biological system is exposed to exposure A, exposure B, or the combination of A+B, often accompanied by a sham control exposure. This is the most direct approach for testing hypotheses about specific two-factor interactions and mechanisms of interactions. A simple factorial design can be applied in experiments using response models ranging from very simple, short-term assays (for example, cells) to long-term, complicated models (for example, life-span carcinogenicity or noncancer animal bioassays). However, this strategy is seldom used for exploring interactions among more than two or sometimes three exposures. When toxicity studies become too complex, a step-wise approach may be used. For example, if whole response surfaces are to be studied for mixtures of five or more chemicals, factorial designs become too complex. To manage this situation, simplified statistical designs (for example, fractional factorial designs) can be used as a starting point to study deviations from additivity. Thus, the response surface analyses can be economized [Groten et al. 1997].
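The additivity question a 2x2 factorial design answers reduces to a simple contrast of group means; a minimal sketch with hypothetical response values:

```python
# Mean responses from a hypothetical 2x2 factorial experiment.
control, a_alone, b_alone, combined = 1.0, 3.0, 2.5, 7.0

# Under simple additivity, the combined response equals the control plus
# the sum of the two single-exposure effects above control.
expected_additive = control + (a_alone - control) + (b_alone - control)
interaction = combined - expected_additive

print(expected_additive)  # 4.5
print(interaction)        # 2.5 > 0: a greater-than-additive effect
```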

Isobole Analysis

An isobole is a contour line that represents equivalently effective quantities of two agents or their mixture; the concept was used as early as 1870. A straight line represents additivity, and concave lines (upward for synergism and downward for antagonism) represent interactions. The isobole approach is widely used to evaluate the effects of binary mixtures. This method is tedious and tends to produce large standard deviations. The approach also requires large data sets, the precise doses of each of the components in the mixture, and the existence of extensive studies with the single compounds to yield reliable results. In addition, the analysis can be done only when clear effect levels are observed [Cassee et al. 1998].
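In the conventional Loewe formulation, the additivity isobole for a given effect level is the straight line

\[
\frac{d_1}{D_1} + \frac{d_2}{D_2} = 1,
\]

where \(d_1\) and \(d_2\) are the doses of the two agents given in combination and \(D_1\) and \(D_2\) are the doses of each agent alone that produce the same effect; combinations whose sum falls below 1 indicate synergism, and those above 1 indicate antagonism.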

A similar approach constructs theoretical dose-response curves for dose-additive and independent combinations of components and compares them to the observed response. Although this method requires less data and is more straightforward than the isobole approach, it still requires complete dose-response curves with fixed concentrations and statistical interpretations [Cassee et al. 1998]. A related approach is response surface analysis, which sometimes allows more rapid analysis of the toxicological effects of mixtures with many fewer animals.

Zero-interaction response surfaces describe dose-response relationships for which no interactions between multiple exposure concentrations exist. They define zero interaction according to a particular criterion throughout the complete dose range. This means that they can replace the tedious experimental determination of dose-addition isobolograms (isoboles are specific cross sections of response surfaces). They predict expected combination effects from single-agent dose-response relations but not combination effects that are not zero-interactive. Response surface methods have been incorporated into a number of commercially available computer programs, such as CombiTool [Dressler et al. 1999].

In addition to their application in the design of experiments (see above), response surface techniques can be used as a visualization tool to elucidate the kind and extent of component interactions. Experimental data can be directly compared with zero-interaction response surfaces to assess the likelihood and direction of possible interactions, depending on whether they lie above or below the surface. In addition, response surfaces can be constructed to take into account an interaction mechanism (hypothesis) and can thus be used for exploring the validity of hypotheses for mixtures interactions.

A major limitation of response surfaces is that they readily represent the combined effect of only two compounds or classes of compounds, although the possibility exists of using similar methods to visualize interactions of a particular chemical with the rest of the (complex) mixture as a whole.

Other Mathematical and Statistical Tools

Other approaches are also being developed to identify the structural classes of chemicals and combinations of chemicals within complex mixtures that contribute most strongly to biological effects. Several are adaptations and variations of multivariate analyses [Eide 2001]. For example, regression modeling of mutagenicity data from highly complex, petroleum-derived mixtures of polycyclic aromatic hydrocarbons (PAHs) using partial least squares projection to latent structures was found useful for associating mutagenicity with a limited number of chemicals and predicting responses to other mixtures [Eide et al. 2001].
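A minimal sketch of this kind of analysis using a generic PLS implementation (the data below are synthetic and the component indices arbitrary; the published analyses used measured PAH compositions and mutagenicity assays):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic exposure-response matrix: rows are mixture samples, columns
# are component concentrations; y is the observed response of each
# whole mixture.
rng = np.random.default_rng(0)
X = rng.random((12, 6))  # 12 mixtures x 6 components
y = 2.0 * X[:, 1] + 0.5 * X[:, 4] + 0.1 * rng.random(12)

pls = PLSRegression(n_components=2).fit(X, y)

# Large coefficients flag the components most strongly associated with
# the response across the varying mixtures (here, components 1 and 4).
print(pls.coef_.ravel())
```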

Rapid development is occurring in the modeling of highly complex biological data, driven in part by the tremendous volume and complexity of data produced by contemporary genomic/proteomic technologies. The bioinformatics field is developing rapidly, and many of the resulting data displays and analytical strategies aimed at identifying response associations (that is, cluster analyses) will be useful in research on mixed exposures. Many of these techniques use visualization of graphical response surfaces.

New Knowledge Needs

Potential for Intervention

Exposure Analyses

Risk is estimated by integrating a health assessment and an exposure assessment, making high-quality exposure assessments essential. Achieving this requires improvements in, and integration of, exposure methods, measurements, and models. Research in these areas has been recommended or is under way for individual stressors [NIOSH 2002], and in some cases the findings can be applied directly to mixed exposures. However, in many cases, mixed exposures present a unique challenge. Exposure analyses need to be incorporated into epidemiological studies to obtain robust associations or to eventually achieve knowledge of cause-effect relationships. Knowledge of exposure needs to be applied to the design of more realistic animal toxicological studies. Better exposure analysis is needed to estimate the number of people exposed to mixtures, the agents to which they are exposed, and the duration and time-course of the exposure. From these estimates, dose-response models for complex mixtures may predict adverse effects. Exposure analyses also identify the sources and pathways most likely to contribute to risk, facilitating efficient and effective interventions.

Methods

Methods enable measurements, which makes methods of fundamental importance. When contemplating what methods are needed, the goal of the exposure analysis must be kept in mind because this drives the approach. For example, is the goal associated with routine monitoring to identify releases or low-level frequent events? Is it part of an epidemiology study, and if so, how quantitative does it need to be? Is it part of a survey, and does the source need to be identified? Does the exposure analysis need to be highly quantitative, or are surrogate indicators or questionnaire items adequate?

Complex Chemical Mixtures

Measuring single chemicals and measuring complex mixtures of chemicals involve similar elements. However, complex mixtures present additional challenges, especially in collecting samples, preparing and extracting samples for measurement, and measuring the compounds present. The first step, collecting a representative sample, is quite difficult, primarily because mixtures are inherently variable and heterogeneous (chemically, physically, spatially, and temporally). For example, a sampling method optimal for an aerosol mixture rich in nonreactive hygroscopic materials differs from one optimal for a mixture rich in nonreactive volatile organic compounds. This problem calls for the development of new sampling technologies capable of sampling mixtures with improved precision and accuracy and in a way that is as representative of the original environment as possible. Obtaining a representative assessment of a worker's exposure is complicated further when the potential exists for absorption through the skin. Ultimately, a form of biological monitoring (for example, of blood, urine, or exhaled breath) may be required to estimate a worker's total exposure by all routes.

The next stage is preparing the collected sample for measurement. Analytic procedures for pure chemicals in a pure water or simple organic matrix are highly developed. However, environmental and occupational matrices are complex. For example, a procedure that works well with a breath sample will likely not work well for an aerosol sample. Consider the case of extracting a mixed sample containing chemicals that vary widely in solubility and other physicochemical properties.

The final step is to take the measurement. Newer technologies, whether biologically based (as in biosensors or biomarkers) or chemically based (as in more advanced mass spectrometry technologies growing out of traditional analytic chemistry approaches), can also be quite difficult to apply to mixtures. In spite of major advances, some classes of substances are still difficult to identify quantitatively. For example, only a very small fraction of the organic components of ambient aerosols have been chemically identified. Thus, it may be possible to measure only a portion of the mixture. If this portion causes the health effects, this is acceptable. However, false negatives might result, weakening reliance on such procedures.

The analysis of complex mixtures of substances is still a daunting task. High-resolution chromatography and mass spectrometry are examples of current techniques commonly applied to the analysis of complex mixtures. Combining such technologies to form multidimensional techniques provides even more powerful analytical tools. For example, gas chromatography-mass spectrometry (GC-MS) is well established for characterizing and quantifying the various volatile chemicals that constitute mixtures such as petroleum distillates. Techniques for characterizing and quantifying the nonvolatile, polar, and thermally labile components of complex mixtures are less well developed and need to be elaborated.

The broad application of sophisticated analytical technologies to the analysis of complex mixtures is hindered by the expense of the equipment and the need for highly skilled operators. Also, the time required for chromatographic analysis, which is based on differences in partitioning of the various sample components between a mobile and stationary phase, increases with increasing mixture complexity. The development of new technologies, such as microsensor arrays, holds the promise of providing rapid, specific responses to a variety of significant endpoints, such as those based on electrochemistry and immunochemistry.

Measurements/Monitoring

Various approaches can be used to measure exposures. Emphasis first needs to be on deciding the exposure metrics. This includes identifying the full spectrum of stressors being measured (for example, inhaled chemicals, noise), the time frame (for example, exposures to chemicals having acute effects or peak-exposure effects should be measured over short averaging times), organization-of-work issues (such as extended or novel work schedules), and data quality objectives (for example, the precision, accuracy, and sample size needed). However, given the state of the science, these elements are often difficult to determine. Without more knowledge of exposure variability, it can be difficult to decide on an optimal measurement strategy. For example, is it better to measure a large number of workers once or a small number of workers frequently? Another key question is whether a stationary monitor can adequately represent worker exposures or whether personal exposure measurements are required. Most likely, the answer depends on the exposure scenario. This must be determined in advance so that the optimal approach can be chosen.

Exposure Modeling

Not all exposure scenarios can be measured, for reasons such as limited finances and limited availability of measurement methods. Hence, exposure modeling is necessary. Optimal models are built using a combination of measurement data and theoretical information and are evaluated with measurement data. Modeling becomes even more important with mixtures because of the difficulty (and in some cases the impossibility) of measuring complex mixtures. For example, exposures could be better predicted if a complex exposure model were available based on the chemicals in the environment of interest; the physicochemical properties of the chemicals; the relevant fate, transformation, and distribution characteristics under realistic conditions; and the activity patterns of the potentially exposed people. With a scientific basis to estimate the number of people likely to have exposure to other stressors (for example, noise, certain pharmaceuticals), the total exposure would be better understood as input into health models for eventual risk assessment.
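As a point of departure, a minimal sketch of the standard well-mixed ("one-box") room model is shown below; the parameter values are hypothetical, and the complex mixture models called for above would layer component properties, fate and transformation, and activity patterns on top of such a base:

```python
import math

# Well-mixed room model: dC/dt = (G - Q*C) / V
G = 10.0  # contaminant emission rate, mg/min (hypothetical)
Q = 2.0   # ventilation rate, m3/min (hypothetical)
V = 50.0  # room volume, m3 (hypothetical)

def concentration(t, c0=0.0):
    """Airborne concentration (mg/m3) at time t (min), analytic solution."""
    c_ss = G / Q  # steady-state concentration, here 5 mg/m3
    return c_ss + (c0 - c_ss) * math.exp(-Q * t / V)

print(concentration(30.0))  # approach to the 5 mg/m3 steady state
```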

A need exists for research on mixed exposures that (by virtue of their chemical or physical properties) react in the work environment (before entering the body), producing a more hazardous chemical or making it easier for the agent to enter the body. For example, a mixed exposure involving ultraviolet light and certain chlorinated hydrocarbons can produce the toxic agent phosgene [Ng et al. 1985; Wang et al. 2002]. Another example is a mixed exposure involving fine particles and radon gas, which can increase the lung burden of alpha and beta radiation emitters.

New Knowledge Needs

Potential for Intervention
