
Technical Report No. 2

Structure-Activity Relationships in Predictive Toxicology

A Report of the CAAT Technical Workshop of June 21-22, 1990

Editor: Shelly S. Sehnert


The Workshop on Structure-Activity Relationships in Predictive Toxicology conducted at the Johns Hopkins University School of Hygiene and Public Health was the third in the series of scientific workshops held under the auspices of the Johns Hopkins Center for Alternatives to Animal Testing (CAAT). The goal of this workshop was to disseminate state-of-the-art information to scientists concerned with reducing the number of animals used in testing, minimizing stress and discomfort to laboratory animals, and maximizing the scientific data obtained from in vivo experimentation.

The philosophy of the Johns Hopkins Center for Alternatives to Animal Testing is that modern research involving the use of animals must be conducted with responsibility and respect as scientists endeavor to reduce, refine, and replace current in vivo testing practices. The workshop on Structure-Activity Relationships in Predictive Toxicology focused on two aspects of this perspective: computational chemical modeling to predict efficacy or toxicity, and alternate experimental techniques that permit bioeffects to be predicted accurately in vitro.

Toxicology in the last decade of the twentieth century is a rapidly evolving science that supplements classical toxicological approaches with technological advancements. Development of accurate, predictive databases that will guide the toxicologist in assessing the potential risks associated with exposure to environmental or therapeutic pharmaceutical agents depends on knowledge of the molecular mechanisms that govern the interactions between cells and xenobiotics.

The Johns Hopkins Center for Alternatives to Animal Testing depends on the financial support of public and private institutions to conduct its activities. This project would not have been possible without the sponsorship of the U.S. EPA, Mary Kay Cosmetics, Inc., The Procter and Gamble Company, and Hoffmann-LaRoche, Inc. The foresight and cooperation of these institutions is gratefully acknowledged.

Finally, organization of the workshop and preparation of this report was made possible through the competence and dedication of Ms. Marilyn Principe, Special Assistant to CAAT, whose assistance is greatly appreciated.

Shelly S. Sehnert, Ph.D.
October 1990


Attention has recently been given to the quantitative relationships between the structure of xenobiotics and their bioactivity for three main reasons: to characterize the physicochemical properties of chemicals relevant to their biological effect; to explain the mechanism of toxic response; and to predict the relative toxicity of a large number of compounds. Traditionally, in vivo toxicological testing has been the hallmark of the latter effort; however, the increasingly large number of chemicals that must be assessed for toxic effects presents the toxicologist with a task that whole-animal testing alone cannot address.

In vitro toxicity testing is a technically complicated activity, and many pitfalls and sources of error exist which can lead to erroneous conclusions. A critical factor in the successful development of new testing methodologies and their practical application is proper and adequate training of individuals participating in these activities. To this end, it is recommended that personnel involved in all aspects of in vitro toxicity testing be adequately trained in cell culture procedures and basic toxicology.

Recent advances in computational science have made it possible for scientists to begin to make correlations between the physicochemical properties of molecules and their biological effects. Coupled with a rapidly expanding database of the structures and functions of biological macromolecules and receptors, the ability of researchers to develop comprehensive, predictive models for toxicological assessment has been greatly enhanced since Hansch and co-workers first developed the concept of Quantitative Structure-Activity Relationships (QSAR) in 1969.

QSAR studies involve the graphical entry and storage of structures, generation of three-dimensional molecular models, generation of molecular structure descriptors, analysis of the descriptors, and development of quantitative relationships between descriptors and biological responses using multivariate statistical or pattern recognition methods. Classes of structural descriptors include topological, geometrical, electronic, and physicochemical representations of the molecules. New descriptors recently devised and tested encode the surface area and partial charge information simultaneously.
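The regression step of this workflow can be sketched as follows. The descriptor values and biological responses are hypothetical, and the solver is a generic ordinary least-squares fit via the normal equations, not any particular QSAR package.

```python
# Minimal sketch of the QSAR modeling step: hypothetical descriptor values
# (e.g., log P and a topological index) are related to a biological response
# by ordinary least-squares regression.

def least_squares(X, y):
    """Solve the normal equations (X^T X) b = X^T y by Gaussian elimination."""
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):                      # forward elimination with pivoting
        pivot = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[pivot] = A[pivot], A[i]
        b[i], b[pivot] = b[pivot], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * p
    for i in reversed(range(p)):            # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, p))) / A[i][i]
    return coef

# Hypothetical training set: [1 (intercept), log P, descriptor D] -> response
X = [[1, 0.5, 2.1], [1, 1.2, 3.0], [1, 2.0, 2.5], [1, 2.8, 4.1], [1, 3.5, 3.3]]
y = [1.1, 2.0, 2.6, 3.9, 4.2]
coef = least_squares(X, y)
print("fitted coefficients:", [round(c, 3) for c in coef])
```

The fitted coefficients would then be used to predict the response for untested compounds from their calculated descriptors.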

To quote Enslein, QSAR methodologies "...have been applied to a substantial variety of toxicological endpoints, including carcinogenicity, mutagenicity, teratogenicity, skin irritation, rat oral LD50, inhalation LC50, maximum tolerated dose, fathead minnow LC50, and Daphnia magna EC50. Models currently under development include equations for subchronic and chronic NOEL, NOAEL, LOEL, and LOAEL, and for Chinese hamster ovary chromosome aberrations."

Enslein summarized some of the major concerns in computational predictive toxicology:

Unavailability of data - The applicability of every QSAR model is limited by the database from which it was developed. Publicly available predictors are based on unrestricted data. There are, however, far more data on every toxic endpoint in company files. While it is understandable that companies may not want to release results on chemicals for which there are commercial possibilities, it would seem practical to release information on compounds that are no longer of interest to an organization. The entire chemical and toxicology community would benefit from the inclusion of these data in QSAR models; unnecessary replication of tests would also be reduced.

Data quality - A QSAR model can only be as good as the data from which it is derived. A limitation lies in the fact that most data points are not replicated, and thus, are accepted as truth, when they may, in fact, be in error. This problem is coupled with the fact that some biological "assays" are inherently inaccurate.

Mixtures - Current QSAR models cannot handle mixtures of chemicals, since insufficient data exist for the development of models based on mixtures. We are currently limited to the assumption of non-interaction of components. It is possible to design experiments that address this question.

Non-systematic testing plans - Ideally, one would want entire series of compounds tested with rather inexpensive tests, except in some rare instances. This is not, and cannot be, the case. This limitation results in an inability to make predictions for compounds with certain substructures.

Model validation - This is the subject of ongoing controversy. Of the two methods - internal and external validation - external validation, on the surface, would appear to be the preferable, less biased choice. Because of the relatively higher cost of external validation, it is important to resolve this controversy.

Estimate validation - While some of the existing prediction systems include methods for internal estimate validation, there will always be some chemicals for which prediction will be in error irrespective of the overall accuracy of a given model. The most serious part of this problem is that it will not be possible to know with certainty for which predictions this will be the case. All one can hope to do is to minimize the probability of such events occurring.

Mechanisms - The mode of toxic action is not known for many of the endpoints currently modeled. Considerably better models could be developed were such information available.

In summary, structure-activity models have historically been based on empiricism. New advances using quantum mechanical calculations and molecular graphics will allow the design of models that more closely approximate the effects observed in vivo. Enhanced understanding of the relationship between chemical structure and toxicological effect based on quantifiable physicochemical parameters has become a valuable asset in predictive toxicology.


Predicting Toxicity From Structure-Activity Data: Past, Present, and Future
Dr. Kurt Enslein
   Health Designs, Inc.

Although the concept of relating chemical structure to biological activity is not new, it was not until the introduction of readily available digital computers that the application of this concept became practical. The earliest published work is that of Free and Wilson (1964), who calculated QSAR equations for the rat oral LD50 of indanamine derivatives. The Free-Wilson method requires the solution of a set of simultaneous equations, with as many equations as there are chemicals for which biological data exist and as many terms as there are substituent positions on the parent molecule. The least-squares solution of that system of equations would have been difficult without digital computers.
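The Free-Wilson idea can be illustrated with a small sketch: activity is modeled as a parent-compound contribution plus additive contributions from the substituent present at each position, encoded as 0/1 indicator variables and fit by least squares. The compounds, substituents, and activity values below are hypothetical.

```python
# Hedged sketch of the Free-Wilson additive-substituent model. Each row is a
# compound; indicator columns mark which substituent occupies each position
# (dummy-coded against H). All values are hypothetical.
import numpy as np

# Columns: [intercept, R1=CH3, R1=Cl, R2=CH3]
compounds = np.array([
    [1, 0, 0, 0],   # R1=H,   R2=H
    [1, 1, 0, 0],   # R1=CH3, R2=H
    [1, 0, 1, 0],   # R1=Cl,  R2=H
    [1, 0, 0, 1],   # R1=H,   R2=CH3
    [1, 1, 0, 1],   # R1=CH3, R2=CH3
    [1, 0, 1, 1],   # R1=Cl,  R2=CH3
])
activity = np.array([1.0, 1.4, 1.9, 1.3, 1.7, 2.2])  # e.g., -log LD50

# Least-squares solution of the (generally overdetermined) system:
contrib, *_ = np.linalg.lstsq(compounds, activity, rcond=None)
for name, c in zip(["parent", "R1=CH3", "R1=Cl", "R2=CH3"], contrib):
    print(f"{name}: {c:+.3f}")
```

Summing the fitted contributions for any untested substituent combination gives its predicted activity, which is the predictive use of the method described above.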

Hansch and co-workers in 1969 introduced the linear free energy relationship (LFER) method for relating activity to chemical structure as applied to drug design. Wishnok et al. in 1976 and 1978 applied these principles to QSAR models of nitrosamine carcinogenicity. Hansch et al. in 1980 applied the LFER principles to the mutagenicity of substituted (o-phenylenediamine)platinum dichlorides. These models all used least-squares regression techniques. It should be noted that the above examples were based on congeneric series of chemicals.

Starting a completely different line of development, Hodes in 1977 used atom-centered fragments to build models of antileukemic activity to reduce the number of candidate chemicals that needed to be tested. He had access to a database of over 100,000 chemicals and generated tens of thousands of descriptors. Due to the size of the data matrix, he had to use relatively ad hoc statistical methods to derive his equations. In 1981, Tinker used an adaptation of Hodes' program to derive models for mutagenicity based on heterogeneous data. Klopman and Rosenkranz in 1984 began to apply atom-centered fragments to various toxic endpoints, including carcinogenicity and mutagenicity, again using heterogeneous databases. Both Tinker and Klopman applied methodology developed for very large databases to much smaller collections of compounds.

Jurs and co-workers in 1979 began to apply adaptive network techniques (sometimes called iterative least squares) to toxicological problems. These methods had been developed in the late 1950s. In the late 1960s and early 1970s, researchers in chemistry, including Isenhour, Kowalski, and Jurs, applied these techniques to a variety of problems such as carcinogenicity and mutagenicity. There has recently been a revival of an enhanced form of this methodology.

Enslein and Craig in 1978 published a QSAR model of rat oral LD50 based on a heterogeneous data set, using a mixture of continuous and dichotomous descriptors with least-squares regression. This was a simple generalization of the use of binary parameters by Free and Wilson and by Cramer et al. (1974), combined with continuous descriptors, and would not have been practical without the use of computers.

It is interesting to note that while the early QSAR toxicity models dealt with closely related compounds and the later ones with heterogeneous data sets, more recent models employ more closely related compounds, such as aromatics, aliphatics, or alicyclics, etc. The refinement has become possible with the increasing size of databases.


The hydrophobic nature of a drug or toxicant may be represented by the logarithm of the partition coefficient determined from the distribution of a chemical between two immiscible solvents, one polar and one non-polar. Since its development by Hansch (Hansch, C., Maloney, P.P., Fujita, T., Muir, R.M. [1962], Nature 194: 178-180), the 1-octanol/water partition coefficient has evolved into a common reference system for determining lipophilicity in medicinal chemistry and toxicology.
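The quantity in question is simply the base-10 logarithm of the solute's equilibrium concentration ratio between the two phases. A trivial illustrative calculation, with hypothetical concentrations:

```python
# Illustrative log P calculation: the base-10 logarithm of a solute's
# equilibrium concentration ratio between 1-octanol and water.
# The concentrations below are hypothetical.
import math

def log_p(conc_octanol, conc_water):
    """log10 of the 1-octanol/water partition coefficient."""
    return math.log10(conc_octanol / conc_water)

# A solute measuring 5.0 mM in the octanol phase and 0.05 mM in water:
print(log_p(5.0, 0.05))  # ~2.0 -> moderately lipophilic
```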

Quantitative Structure-Activity Relationships
Dr. Corwin Hansch
   Pomona College

Predictions of biological potency of xenobiotics may be obtained on the basis of correlations with generalized classes of compounds, or determined from fundamental physicochemical parameters, yielding more precise estimates for smaller numbers of chemicals. A parameter more general than the Hammett-Taft sigma constant, used to account for electronic variation associated with alterations in molecular structure, has been developed. The quantum chemical approach to electronic effects has the potential to be more flexible and useful. Hydrophobicity, as estimated by experimentally determined 1-octanol/water partition coefficients, plays a major role in determining biological potency and may be a more useful predictor than the electronic character of a compound.

A general QSAR for correlating mutagenicity of aromatic and heteroaromatic nitro compounds has been developed. This set of compounds is relevant to environmental health since they are common intermediates in industrial synthetic processes and have been implicated as carcinogens present in diesel exhaust and emissions from combustion sources. A review of the literature yielded data on over 200 nitroaromatic and heterocyclic compounds showing mutagenicity in the Ames test (TA98). From these data, a quantitative structure-activity relationship was derived based on 173 congeners. This study concluded that the main correlates of mutagenicity of these chemicals are hydrophobicity (modeled by 1-octanol/water partition coefficients) and the energies of the lowest unoccupied molecular orbitals calculated using the AM1 method.


The statistical, mathematical, and computational techniques used in developing and testing a QSAR model have been widely discussed. Several approaches and applications were presented.

Application of an Expert System To Predict Toxicological Activity
Dr. Herbert S. Rosenkranz
   Case Western Reserve University

CASE is an artificial intelligence system that has demonstrated capability for predicting the biological activity of molecules as well as providing clues relating to mechanism of toxicity. CASE selects its own descriptors automatically from a learning set composed of active and inactive molecules. The descriptors are easily recognizable single, continuous structural fragments that are embedded in the complete molecule. CASE was designed to handle non-congeneric compounds, unbiased with respect to the descriptors, which consist of either activating (biophore) or inactivating (biophobe) fragments. Each of these fragments is associated with a confidence level and a probability of activity that are derived from the distribution of these biophores and biophobes among active and inactive molecules. Once the training set has been assimilated, CASE can be queried regarding the predicted activity of molecules of unknown activity. Thus, entry of an unknown molecule will result in the generation of all the possible fragments ranging from 2 to 10 atoms accompanied by their hydrogens. These will be compared to the previously identified biophores and biophobes. On the basis of the presence and/or absence of these descriptors, CASE predicts activity or lack thereof.
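The biophore/biophobe idea can be illustrated with a highly simplified sketch: fragments are tallied across active and inactive training molecules, and those occurring predominantly in one class are flagged. The real CASE system generates connected structural fragments of 2 to 10 atoms; the string "fragments" and molecules below are toy placeholders, not CASE's actual representation.

```python
# Toy sketch of biophore/biophobe identification: count each fragment in
# active vs. inactive training molecules and classify by the fraction of
# occurrences in actives. Fragments and molecules are hypothetical.
from collections import Counter

actives = [{"NO2-aryl", "C=C"}, {"NO2-aryl", "C-Cl"}, {"NO2-aryl"}]
inactives = [{"C=C"}, {"C-OH", "C=C"}, {"C-Cl", "C-OH"}]

act, inact = Counter(), Counter()
for mol in actives:
    act.update(mol)
for mol in inactives:
    inact.update(mol)

labels = {}
for frag in sorted(act.keys() | inact.keys()):
    a, i = act[frag], inact[frag]
    p_active = a / (a + i)                     # crude probability of activity
    labels[frag] = ("biophore" if p_active > 0.75
                    else "biophobe" if p_active < 0.25 else "ambiguous")
    print(f"{frag}: {a} active / {i} inactive -> {labels[frag]}")
```

In CASE proper, each fragment additionally carries a confidence level derived from the statistical distribution of the fragment among the training molecules, and the fragments of an unknown molecule are scored against this dictionary to yield a prediction.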

CASE fully documents the basis of the predictions and also has the capacity for comparing the structural basis of different endpoints, thus allowing a determination to be made of whether different biological endpoints (e.g., mutagenicity in the Ames assay and carcinogenicity in rodents) operate through a common mechanism. This capability of CASE is useful in the design of molecules with increased beneficial properties and decreased toxicity.

Using peer-reviewed databases such as those of the National Toxicology Program, the CASE methodology has been able to resolve the structural basis of mutagenicity in Salmonella, of cell transformation, of DNA reactivity, of cancer causation in rats and mice, and of the induction of sister chromatid exchanges and chromosomal aberrations. Surprisingly, CASE was able to discern the structural basis for the carcinogenicity of non-mutagenic substances. More recently, the structural features associated with induction of α2u-globulin nephropathy were also elucidated. Application of an expert system such as CASE to other toxicological endpoints requires the availability of peer-reviewed databases. The accuracy of the computational predictions is, in turn, limited by the accuracy of the database.

Computer Assisted Structure-Activity Studies - Acridines
Dr. Peter C. Jurs
   Pennsylvania State University

The use of charged partial surface area (CPSA) descriptors for the study of acridines is a new approach to computational chemical modeling. These newly devised and tested descriptors encode the surface area and partial charge information simultaneously. When the biological activity or physicochemical property being studied is expressed quantitatively (e.g., LD50), the calculated descriptors are submitted to multiple linear regression analysis, which yields quantitative predictive models. When the biological activity or physicochemical property being studied is expressed qualitatively (e.g., membership in classes), the calculated descriptors are analyzed by pattern recognition methods. In either case, the resultant models or discriminants can be used later for predictive purposes. Such SAR studies can be carried out with the ADAPT (Automated Data Analysis and Pattern Recognition Toolkit) computer software system.

A specific example of this approach to SAR studies involves the DNA binding activity of acridine-4-carboxamides. DNA binding is a necessary, but not sufficient, condition for antitumor activity as measured by several in vivo and in vitro assays. A data set containing 52 acridine derivatives along with their DNA binding constants for the poly (dA-dT) and poly (dG-dC) sites of DNA was obtained from the literature. The descriptors that were calculated and ultimately used in the regression models were designed to encode a specific binding site model that was previously proposed in the literature. Since it is believed that the acridines interact with DNA by intercalation, surface area and charge descriptors were calculated for regions of the acridine backbone that are theorized to be in close proximity to the DNA structure. The ability of these descriptors to successfully model the DNA binding constants for this data set supports the binding model that was previously proposed. Statistically valid models were generated for both the poly (dA-dT) and poly (dG-dC) binding. For the AT site, an 8-descriptor model was generated with R=0.872, s=0.255, n=51, where 7 of the 8 descriptors were CPSA descriptors. For the GC site, a 9-descriptor model was generated with R=0.878, s=0.267, n=50, where 6 of the 9 descriptors were CPSA descriptors.

Figure 1. The acridine data set. Basic structure and substituents found in the data set. Characteristics of the reported biological activity for binding to poly (dA-dT) and poly (dG-dC).

Quantitative Structure-Activity Relationships Based on Statistical Design
Dr. Maria-Livia Tosato
   Istituto Superiore di Sanità

In order to construct a model that may provide reliable predictions of toxicity, certain conditions should be fulfilled because of the constraints that arise from the theoretical foundation of QSARs and the current limited knowledge about the mechanisms of toxicity. Traditional QSARs are statistical models, based on a series of compounds, that relate structural properties (the independent variables) to biological activity (the dependent variable). The relationship between the two variables can be used to predict similar relationships for compounds whose biological effects are unknown. The accuracy of a QSAR is predicated upon appropriate selection of the initial set of compounds (the training set) and the rigor used to validate the model. Typical independent variables for a series of compounds may include physicochemical parameters (e.g., spectroscopic data, quantum mechanical indices, substituent constants, and reactivity constants); therefore, the predictive accuracy of a QSAR is limited by the accuracy of the physicochemical data. Cross-validation of the model reduces the probability of chance correlation of variables.
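The cross-validation step mentioned above can be sketched with leave-one-out (LOO) cross-validation: the model is refit n times, each time predicting the one held-out compound, and the result is summarized as a q² statistic. The one-descriptor data set below is synthetic and hypothetical.

```python
# Sketch of leave-one-out cross-validation for a QSAR regression model.
# Each compound is held out in turn, the model is refit on the rest, and
# q^2 = 1 - PRESS/SS summarizes predictive ability. Data are hypothetical.
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated q^2 for an ordinary least-squares model."""
    n = len(y)
    press = 0.0
    for i in range(n):
        keep = [j for j in range(n) if j != i]
        coef, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        press += (y[i] - X[i] @ coef) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(0)
logP = rng.uniform(0, 4, 20)
X = np.column_stack([np.ones(20), logP])          # intercept + one descriptor
y = 0.8 + 1.1 * logP + rng.normal(0, 0.1, 20)     # activity with small noise
print(f"q^2 = {loo_q2(X, y):.3f}")
```

A q² close to 1 indicates that the model predicts held-out compounds well; a model fit on chance correlations tends to collapse toward (or below) zero under this test.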

Once the data set has been established, statistical data analysis, the model validation process, and the selection of a training set of compounds become key factors. Selection of an appropriate training set is a crucial step in a QSAR study, since the range and reliability of a model are strictly dependent upon the chemicals used for calibrating the model. The basic elements required to perform a design-aided selection, illustrated in Figure 2, are the definition of the design variables to be spanned in the selection and the application of a statistical design (e.g., a factorial design) for identifying a small set of compounds that together map the domain of the class.

Figure 2. Outline of the strategy for a QSAR based on statistical design. The compilation of QSAR-compatible classes from inventories of industrial organics cannot be assisted a priori from fundamental theory. The systematic classification of organic chemicals, coupled with knowledge of the molecular mechanisms of toxicity, provides a sound starting reference point. The structural range covered by a QSAR will vary with the chemical mechanism involved in the toxic endpoint, thereby including several chemical classes when the mechanism is non-specific or limited ranges of structural variability for highly specific endpoints. In all cases, the assessment of the actual range of a QSAR will be an a posteriori result of data analysis (Step 5) and model validation (Step 6).
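The design-aided selection of a training set can be sketched as follows: each candidate compound is placed in a space of scaled design variables, and for each corner of a two-level factorial design the nearest compound is chosen, so that the small training set spans the class. The compounds and descriptor values below are hypothetical.

```python
# Sketch of training-set selection by a 2-level factorial design: pick the
# compound nearest each corner of the (scaled) design-variable space.
# Compound names and descriptor values are hypothetical.
import itertools, math

# Hypothetical compounds: name -> (scaled log P, scaled electronic descriptor)
compounds = {
    "A": (0.1, 0.2), "B": (0.9, 0.1), "C": (0.2, 0.9),
    "D": (0.8, 0.8), "E": (0.5, 0.5), "F": (0.3, 0.6),
}

training_set = set()
for corner in itertools.product((0.0, 1.0), repeat=2):   # 2^2 factorial design
    nearest = min(compounds, key=lambda n: math.dist(compounds[n], corner))
    training_set.add(nearest)
print("design-selected training set:", sorted(training_set))
```

Compounds sitting in the interior of the space (E and F here) are left out of the calibration set and become candidates for validating the model instead.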

The various components of a QSAR study based on statistical design can be integrated after inclusion of additional components in the framework. In essence, the strategy is based on the assumption that if chemicals are grouped into homogeneous classes it may be possible to identify in each class, regardless of its size, a small set of representative compounds. If so, test data (toxicity-related and environmental fate-related) for such sets of compounds will provide adequate databases for constructing reliable models. Once validated, this model may permit prediction of the missing data for all the compounds in the class. Approaches must be developed to remove current hindrances to the development of QSARs relevant to risk assessment so that QSARs can become proper vehicles to extrapolate data from compound to compound in the screening of "old" and "new" chemicals.

3-D QSAR of Angiotensin-Converting Enzyme Inhibitors
Dr. Scott A. DePriest
   Washington University

The active site of an enzyme exhibits three-dimensional binding specificity toward particular substrates. This is an aspect that cannot be addressed with traditional QSAR approaches.

The key to linking the functional group contributions determined by traditional QSAR and the pharmacophore geometry provided by the Active Analog Approach can be found in a method called Comparative Molecular Field Analysis, or CoMFA. The underlying assumption in a CoMFA analysis is that the biological activity of a molecule can be related to the steric and electrostatic fields presented to the active site. The coupling of the Active Analog Approach, with its ability to determine the bound geometry of drugs at a receptor, and the 3D-QSAR technique CoMFA, which predicts the affinity of a drug for that receptor, provides a new avenue for future drug development.

Angiotensin-converting enzyme (ACE) is an enzyme that cleaves the C-terminal dipeptide, His-Leu, from angiotensin I to produce the octapeptide angiotensin II, a potent vasoconstrictor. Inhibition of ACE has become an important target for the treatment of hypertension. The search for potent ACE inhibitors has produced a variety of classes of compounds including captopril (Squibb) and enalapril (Merck). Previous structure-activity studies of ACE inhibitors have defined the structural requirements for ACE inhibition to be: a C-terminal carboxyl group; an amide carbonyl; and various zinc binding functional groups including carboxylate, sulfhydryl, phosphate, and phosphoramide.

Using this model combined with the Active Analog Approach and systematic conformational search, Mayer et al defined a unique geometry (using a 10° scan factor and a resolution of 0.25 Å) for binding to the ACE active site based on 28 structurally different inhibitors. The original data set of 28 compounds has been expanded to include 7 new inhibitors having affinities in the nanomolar range. Using a more efficient search algorithm, the Mayer model has been refined to locate a single geometry capable of binding all 35 compounds using a 4° scan factor and a resolution of 0.10 Å.

The defined active site geometry of the ACE inhibitors is, in essence, an alignment rule, and thus, a basis for comparison of a non-congeneric series by CoMFA. Initial studies of the mode of ACE inhibition using CoMFA have given correlations with high predictive value for test compounds from a database of over 30 classes of inhibitors. The model was derived using 68 molecules that spanned the diversity of the database and consisted of five components (cross-validated r2 = 0.76). This model was used to predict the activity of an additional 20 inhibitors having a measured affinity range of 1.6 x 10-4 to 2.2 x 10-9 M. The predictive r2 for the series of 20 molecules was 0.79. Considering the variation in physicochemical properties among these structures, the explanation of approximately 80% of the variance in affinities is remarkable. However, close inspection of the correlation equation from this CoMFA analysis shows that it does not explicitly consider the nature of the zinc-ligand bond. Recent CoMFA analyses of the original 68 inhibitors have given an r2 = 0.90 (using 8 components). Current analyses incorporating traditional QSAR parameters such as 1-octanol/water partition coefficients, dipole moments, and molar refractivities, in addition to CoMFA, as well as the addition of descriptors for the zinc binding ligand, have produced models of varying predictive power.
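The "predictive r2" statistic reported above can be sketched as follows: the model is built on a training set, then scored on compounds never used in the fit, with the sum of squared deviations taken about the training-set mean (a common CoMFA convention). All affinity values below are hypothetical.

```python
# Sketch of the predictive r^2 statistic: 1 - PRESS/SD on a held-out test
# set, with SD taken about the training-set mean. Values are hypothetical.
import numpy as np

def predictive_r2(y_test, y_pred, y_train_mean):
    """1 - PRESS/SD for test-set compounds never used in model fitting."""
    press = np.sum((y_test - y_pred) ** 2)
    sd = np.sum((y_test - y_train_mean) ** 2)
    return 1.0 - press / sd

# Hypothetical pIC50 values for held-out inhibitors:
y_test = np.array([4.2, 5.8, 7.1, 8.5])
y_pred = np.array([4.6, 5.5, 7.4, 8.1])
print(round(predictive_r2(y_test, y_pred, y_train_mean=6.0), 3))  # -> 0.953
```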

The abundance of structural variation and experimental data for the ACE inhibitors makes this series of molecules an excellent one for methodological development, and it should continue to yield insights into the utility of CoMFA. In the absence of structural data for angiotensin-converting enzymes, a competent model of ACE inhibition would allow for rapid screening of compounds that might potentially inhibit the enzyme. Additionally, the model would dramatically reduce the number of pharmacological tests involved, which would, in turn, reduce the usage of animals required for pharmacological testing.

Computer Modeling Studies of Enzyme Inhibitor Interactions
Dr. Regine S. Bohacek

X-ray crystallographic data for a protein or macromolecule that is of potential interest as a toxicological or therapeutic target provides a unique opportunity for rational drug design before synthesis, thus reducing the number of compounds that must be screened for efficacy or toxicity, resulting in the need for fewer whole animal tests. X-ray crystallographic data can provide new insights into the interactions between enzymes and inhibitors and can delineate some of the properties of molecules that increase potency. Study of the relationship between inhibitors and their binding sites has led to the development of a quantitative method based on hydrogen bonding and hydrophobic contact for assessing interaction. This method has been validated using x-ray and neutron diffraction data on the packing of proteins. Enzyme-inhibitor interactions may be studied using energy calculations based on molecular mechanics, providing information on other important factors such as ligand strain energy that are not apparent in the complementary model.

In order to apply these methods to the design of novel compounds, it is necessary to create a three-dimensional model of the compound interacting with the enzyme active site, beginning with a template approximation based on known inhibitor data. The optimum orientation and conformation of the compound can be determined by Monte Carlo conformational searching with energy minimization. The minimum energy complex is then evaluated for complementarity and strain energy.
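A toy illustration of the Monte Carlo conformational search step: a single torsion angle is perturbed at random, moves are accepted by the Metropolis criterion, and the lowest-energy conformation found is kept. The one-dimensional energy function below is a hypothetical stand-in for a real molecular-mechanics force field.

```python
# Toy Monte Carlo conformational search over one torsion angle with
# Metropolis acceptance. The torsional potential is a hypothetical stand-in
# for a molecular-mechanics energy function.
import math, random

def energy(phi):
    """Hypothetical torsional potential (kcal/mol) with minima near 60, 180, 300 deg."""
    return 1.5 * (1 + math.cos(math.radians(3 * phi)))

random.seed(42)
phi, best_phi = 0.0, 0.0
e = best_e = energy(phi)
kT = 0.6                                   # roughly room temperature, kcal/mol
for _ in range(5000):
    trial = (phi + random.uniform(-30, 30)) % 360
    e_trial = energy(trial)
    if e_trial < e or random.random() < math.exp((e - e_trial) / kT):
        phi, e = trial, e_trial            # Metropolis acceptance
    if e < best_e:
        best_phi, best_e = phi, e
print(f"lowest-energy torsion found: {best_phi:.1f} deg, E = {best_e:.3f}")
```

In a real application the "move" perturbs many internal coordinates at once, each accepted conformation is relaxed by energy minimization, and the minimum-energy complex is then scored for complementarity and strain as described above.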


Prediction of toxicity ideally must include molecular alterations and bioeffects resulting from normal enzymatic processes that metabolize xenobiotic compounds to more electrophilic forms. The process of metabolism can alter the structure of the ultimate molecule that interacts with the cellular target, in addition to modifying the pharmacokinetics, biodistribution, and lifetime of a xenobiotic.

Metabolism-Directed Drug Design
Dr. Thomas A. Baillie
   University of Washington

One of the most significant developments to occur in the field of toxicology over the past two decades has been the realization that many drugs, industrial chemicals, pesticides, pollutants, and other foreign compounds are not toxic per se, but elicit their adverse effects only after metabolism to chemically-reactive intermediates that interact with endogenous components of the cell. These reactive intermediates, which may be produced by oxidation, reduction, or conjugation processes, typically are electrophilic in nature and bind covalently to a variety of nucleophilic cellular constituents including proteins and nucleic acids. In some cases, the consequences of such xenobiotic-macromolecule interactions are lethal to the host, and result in either cellular necrosis or the initiation of neoplastic states.

In other cases, however, less dramatic consequences ensue. For example, the enzymes that catalyze the formation of reactive metabolites from innocuous foreign compounds themselves may serve as targets for either covalent modification or metabolic complex formation, resulting in irreversible and reversible enzyme inhibition, respectively. Such phenomena represent the basis for several well-known drug-drug interactions where elimination of a second drug is impaired as a result of inhibition of a key enzyme (e.g., cytochrome P450) by the first drug. Whatever the nature of the adverse effect associated with a given foreign compound, it is important to gain a detailed understanding of the metabolic fate of the compound in order that the potential role of reactive metabolites as mediators of the toxic response may be evaluated. Such information is not only of fundamental importance in the compilation of structure-toxicity databases, but often is of direct, practical value in providing "leads" for the rational design of new drugs with enhanced therapeutic indices.

Two drugs that undergo metabolism to chemically reactive intermediates provide examples of exploitation of the understanding of the responsible metabolic pathways in order to search for analogs with decreased toxicity. N-methylformamide (NMF) is a candidate antineoplastic agent that has been shown to possess activity against a variety of murine tumors but which causes liver necrosis in both laboratory animals and human subjects.

Studies on the metabolic fate of NMF have indicated that oxidation of this formamide leads to an electrophilic intermediate that is either trapped by reaction with glutathione or binds covalently with cellular macromolecules and causes hepatic injury. Interestingly, the glutathione conjugate produced from NMF has been found recently to be a potent inhibitor of the growth in vitro of TLX5 lymphoma cells, suggesting that this or related adducts may contribute significantly to the desirable antitumor response attributed to NMF. Consequently, derivatives of this glutathione conjugate may prove to be effective antineoplastic agents while exhibiting decreased hepatotoxicity relative to NMF.

A second example is stiripentol, a novel antiepileptic agent being developed in France. Despite possessing a favorable anticonvulsant profile, stiripentol acts as an inhibitor of cytochrome P450 enzymes, and thereby causes a range of drug-drug interactions, a feature that is likely to limit effective use of this drug in polytherapy. Metabolic investigations with stiripentol have indicated that oxidative attack again leads to the formation of a reactive intermediate; in this case, however, the electrophilic species binds tightly to the prosthetic heme moiety of cytochrome P450, and thereby causes non-competitive inhibition of the enzyme. Based on this finding, analogs of stiripentol have been designed that either cannot undergo metabolism to such a reactive intermediate or that incorporate an alternative functionality at which metabolic attack would be predicted to occur more readily, yielding unreactive products.

Metabolic information can afford a unique insight into the origin of foreign compound-induced toxicities, and can provide a basis for the rational design of new, safer therapeutic agents.


New developments in in vitro toxicology are permitting rapid assessment of the toxicity of chemicals without the use of classical in vivo whole animal studies. Research leading to understanding of the molecular mechanisms of tumor promotion, cell-cell recognition, and intracellular signalling is in the forefront of modern toxicological science. In vitro cell culture systems are being supplemented with organ slice techniques that greatly reduce the number of animals required while including aspects of cell culture systems previously only available in whole animal models.

Estrogen Activity and Carcinogenicity
Dr. James D. Yager
 Johns Hopkins University

The clinical observation that women with a history of long-term oral contraceptive use were at increased risk for developing liver neoplasms led to examination of the hypothesis that the synthetic steroidal estrogens found in oral contraceptives are promoters of hepatocarcinogenesis. These studies demonstrated that mestranol and ethinyl estradiol (EE) are strong promoters of hepatocarcinogenesis that stimulate liver DNA synthesis, and that tamoxifen is also a promoter of hepatocarcinogenesis.

In subsequent studies to determine the mechanism(s) of stimulation of DNA synthesis, it was found that EE has both indirect and direct effects associated with increased liver growth. On the one hand, in vivo, EE treatment enhanced the levels of a serum/plasma growth factor(s), possibly in the epidermal growth factor (EGF)/transforming growth factor-alpha (TGF-alpha) family, with activity stimulatory for hepatocyte DNA synthesis. On the other hand, using rat hepatocytes in primary culture, EE pretreatment dramatically enhanced their DNA synthetic response upon subsequent treatment with EGF; EE alone had only a small stimulatory effect on DNA synthesis. Current research has been focused in two areas: metabolism of EE in cultured rat hepatocytes and the mechanism through which EE enhances EGF responsiveness in cultured hepatocytes.

The mechanism behind the effect of EE on EGF receptor binding protein half-life is an area of active research, beginning with determination of the structure-activity relationships of various estrogens, including the synthetic non-steroidal diethylstilbestrol (DES) and several naturally occurring non-steroidal estrogens, including the zearalanols. Additionally, studies are in progress using the estrogen-responsive human liver tumor cell line HepG2 as a possible cell culture model with which to study the mechanisms of estrogen growth stimulation in liver-derived cells. In RPMI 1640 medium supplemented with insulin and 2.5% stripped serum, the HepG2 cells grow to saturation, at which point growth plateaus. In contrast, in the presence of 10 µM mestranol, growth continues throughout the 8-day culture period. EGF also stimulates cell growth, and experiments are in progress to determine whether mestranol and EGF interact to enhance growth beyond that which occurs with each agent alone. EE, alpha-zearalanol, and diethylstilbestrol (a known hepatocarcinogen/promoter) also stimulate the growth of HepG2 cells. These data suggest that the HepG2 cell line may represent an appropriate model for the study of the mechanism of the growth effects of estrogens on liver cells. Success with this cell line may provide a better system than primary cultures of rat hepatocytes with which to study the mechanisms and structure-activity relationships of the estrogen-EGF interaction.

Structure-Activity in Organ Cultured Liver Slices
Dr. Klaus Brendel
 University of Arizona

Development of an in vitro method for screening chemical toxicity that incorporates more of the actual properties of whole organs than is possible with isolated cell culture systems would be highly desirable. The greatest advantage of such a method would be the inclusion of the range of cell-cell interactions that are by necessity absent in traditional in vitro techniques, while still permitting observation of minor alterations in normal cellular physiology that may be induced by xenobiotic compounds. This type of model could provide an interface between traditional isolated cell culture systems and whole animal studies.

Organ slices have been used in biochemistry for many years; some of the basic pathways of intermediary metabolism were elaborated with their help. Since gas and nutrient exchange in slices depends on diffusion, which is a function of spatial parameters, it is important to have slices of identical size if toxicological comparisons are to be made. The production of identical slices using manual techniques is problematic and dependent on the experience of the operator. In the past, slices were incubated simply by linear shaking of suspensions; due to clumping, attachment to the vessel, and the mechanical injury introduced by techniques employed to avoid these problems, this procedure allows only short-term maintenance. Development of equipment to precision-cut tissue has led to improved techniques for maintaining such mechanically cut slices under conditions of minimal mechanical stress for longer periods of time. Maintenance of precision-cut liver, kidney, heart, lung, pancreas, and brain slices in dynamic organ culture is now possible. This newly developed technology includes a mechanical slicing apparatus with which it is possible to produce slices of nearly identical dimensions in a controlled environment with minimal tissue trauma.

Liver slices kept in dynamic organ culture maintain many physiological and biochemical parameters measurable in liver tissue for at least 24 hours. All functions are somewhat depressed immediately after the slicing process, but recover in culture. ATP levels return to normal after 3-4 hours and are retained at that level for up to 16 hours. Intracellular K+ levels return to control values after 2 hours and are maintained for 20 hours. Ca++ ions are actively prevented from entering cells. Cytosolic enzymes do not leak into the supernatant culture medium past the initial preincubation period. Protein synthesis and secretion of albumin are maintained at levels comparable to those in the perfused organ. Levels of cytochrome P450 do not seem to decrease significantly over time in culture. Ureagenesis and gluconeogenesis are similar to levels found in vivo. Histology of organ-cultured slices, together with histochemical, enzyme histochemical, and immunohistochemical approaches coupled with morphometric techniques, adds to the biochemical procedures used for viability assessment.

Changes in the above parameters over time can be used to assess the viability of the cultured slices. If these changes result from the presence of a toxicant, they are taken as indicators of toxicity. Series of organic compounds of different structure can then be evaluated for structure-toxicity relationships, and since this technology is applicable to many different tissues and to tissues from different species, such relationships can be compared across tissues and species. Measurement of intracellular and extracellular events makes it possible to probe for potential mechanisms of toxicity. Modification of culture conditions and pretreatment of the animal donors of the tissue allow for additional insights.

Three series of compounds illustrate the potential power of this new in vitro system for structure-toxicity evaluation. The first of these studies deals with the hepatotoxicity of dichlorobenzene isomers in rat liver slices: o-dichlorobenzene (o-DCB), m-dichlorobenzene (m-DCB), and p-dichlorobenzene (p-DCB) were tested for their cytotoxic effects on precision-cut rat liver slices. The slices were maintained in dynamic organ culture for up to 6 hours. Toxicity was evaluated by intracellular K+ content, LDH leakage, and protein synthesis. Incubation for 2, 4, or 6 hours with 1.0 mM of any DCB did not result in toxicity in liver slices from control rats. When liver tissue obtained from phenobarbital-induced Fischer 344 rats was incubated with 1.0 mM of either o-DCB or m-DCB, toxicity was seen on all parameters, whereas p-DCB exhibited less toxicity. The toxicity of o-DCB was blocked by metyrapone but not by SKF 525-A. Conversely, m-DCB toxicity was blocked with proadifen (SKF 525-A) but not with metyrapone. These results indicate the possible involvement of different cytochrome P-450 isoenzymes in the metabolism and toxicity of o- and m-DCB and demonstrate the utility of such a system for studying the toxicity and metabolism of structurally related isomers.

The hepatotoxicity of o-substituted bromobenzenes in rat liver slices has also been studied using these techniques. Analogs of bromobenzene substituted in the ortho position with a series of substituents varying from electron-withdrawing (cyano) through those of intermediate character (bromo) to electron-donating groups (methyl and methoxy) were evaluated in liver slices from untreated and phenobarbital-pretreated animals. Dibromobenzene is the most toxic in this series, followed by the bromobenzonitrile and bromobenzene itself. Bromobenzenes substituted with electron-donating groups are less toxic than the parent bromobenzene. Metabolism of the substituted bromobenzenes also shows a clear dependence on the type of substituent and on how the resulting molecule is converted to the epoxide, phenols, and secondary metabolites.

Certain branched-chain fatty acids that have antiepileptic properties are also teratogenic and hepatotoxic. A number of putative metabolites of dipropyl acetic acid were compared with the parent molecule in regard to their relative toxicity. Hepatotoxicity is a common phenomenon in this series; dipropyl acetic acid is probably toxic by itself rather than through a toxic metabolite.

Homologous series of other compounds have been studied in organ slices other than the liver, and these studies have led to the same conclusion: the intact nature of the system closely simulates the in vivo situation. Hepatic parameters such as selective toxicity due to oxygen gradients resulting from physiological perfusion of the liver, tissue oxygen depletion, and phenomena connected with biliary secretion can easily be studied in liver slices.

Protein Kinase C and Inflammation Modeling Relationships
Dr. Peter Blumberg
   National Cancer Institute

The classical in vivo assay used to screen compounds for tumor-promoting ability was the mouse ear skin inflammatory assay. Although this assay detects many compounds with tumor-promoting ability, it is arduous and mechanistically nonspecific. Short-term in vitro cell culture assays, which also use the inflammatory response as a predictor of promotional activity, identify chemicals that are promoters but cannot differentiate tumor promoters from chemicals that elicit the inflammatory response without being promoters. Similarly, in cell culture assays non-specific binding could not be differentiated from specific binding.

Just as simplified in vitro assays can be more useful than whole animal assays in elucidating structure-activity relationships, biochemical assays may be better than cell systems for identifying structure-receptor binding relationships. Protein kinase C is an example of the utility of biochemical assays. It is a Ca++ and phospholipid-dependent enzyme that is activated by diacylglycerol. Extracellular signals trigger the transient production of diacylglycerol from inositol phospholipids. Tumor-promoting phorbol esters can substitute for diacylglycerol when the promoter intercalates into the cell membrane, thus permanently activating protein kinase C. Diacylglycerol, when added to cells in culture or painted on skin, typically does not activate protein kinase C, since its lipophilicity prevents transfer through the aqueous phase to reach the enzyme. In the biochemical assay, these problems can be overcome by the addition of organic modifiers.

Potency can depend on the site of action of the molecule, such that the local concentration of the ligand in the lipid may be the critical concentration. Biochemical competitive binding assays permit rapid screening of compounds to identify those that work via similar molecular mechanisms, thereby classifying molecules with the same biological activity but different structures. Compounds may be inflammatory but not promoting, since there are multiple receptors, some involved in promotion and some in inflammation.

The effects of inhibitors on intact enzymes, specific regions, and competitive binding as a function of the structure of a tumor promotor can be studied using biochemical assays, and help to elucidate the molecular mechanisms of inhibitory action.


Databases and In Vitro Toxicology
Dr. John M. Frazier
   Johns Hopkins University

The toxicological response may be viewed as a series of events developing from the interaction of the toxic agent with its molecular target, which initiates a biological response. The chain of events begins at the molecular level and propagates through the system, producing cellular and organ responses that result in population effects and ecological responses. Classical toxicological testing using whole animals includes not only the processes associated with molecular interaction of the agent in the setting of exposure, but cellular, genetic, and adaptive responses as well. The predictive power of in vivo tests relies on the accuracy of the extrapolation from the animal to the human condition. All methods of predicting toxic effects may be confounded if multiple molecular mechanisms exist or if toxicokinetics modify the dose at the cellular target.

The predictive power of any QSAR method depends on the breadth and accuracy of the database. Databases are used in the development of predictive techniques, such as the use of in vivo ocular irritation testing data to develop a predictive structure-activity relationship. Databases are used for validation of new techniques, for example, LD50 in vivo data are used to validate an in vitro cytotoxicity test.

A priori, the confidence that can be placed in QSAR predictions depends on: 1) the strength of the correlation between the predicted values and those obtained in vivo; and 2) the probability that a totally new chemical will fall into the class of chemicals used to develop the correlation. The first factor is evaluated during validation studies by applying various statistical techniques to a large set of chemicals for which QSAR predictions are compared to the in vivo toxicity data to be predicted. Frequently, different chemical classes will exhibit different predictive relationships; it is therefore necessary to place a new chemical in the appropriate chemical classification. Uncertainties in proper classification contribute to an overall reduction in confidence in QSAR predictions. Assistance in classification provided by additional tests enhances the reliability of this assignment. Databases must be integrated in order to optimize utility and accuracy.
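The first factor above — the strength of the correlation between predicted and observed values — is typically summarized by a correlation coefficient computed over the validation set. The sketch below is a minimal illustration, with purely hypothetical predicted and observed potency values, not data from any study described here:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical validation set: QSAR-predicted vs. observed in vivo
# log(1/LD50) values for five chemicals (illustrative numbers only).
predicted = [2.1, 3.4, 1.8, 4.0, 2.9]
observed = [2.3, 3.1, 1.6, 4.2, 3.0]

r = pearson_r(predicted, observed)
print(f"validation correlation r = {r:.3f}")
```

A validation study would of course use a far larger chemical set, and would report the correlation separately for each chemical class, since, as noted above, different classes frequently exhibit different predictive relationships.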

Figure 3. Safety/Hazard Evaluation. All available data concerning toxicity of product under evaluation are taken into consideration prior to final toxicological decision to market product.


The QSARs of the future will help scientists approach the problems of including metabolic activation, toxicokinetics, and receptor structures into predictive models using statistical methodologies such as neural networks, regression techniques, and classification trees. The predictive power of a model built on the relationship between structure and biological effect, whether theoretical or in vitro, relies on the availability of accurate data, a current limitation of all toxicological modeling concepts.
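Of the statistical methodologies listed above, regression is the simplest and the oldest: a classical Hansch-type linear free energy relationship (LFER) fits a biological potency, log(1/C), against a physicochemical descriptor such as the partition coefficient log P. The following sketch shows an ordinary least-squares fit of this kind; all numerical values are hypothetical illustrations, not measured data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical series of five congeners: hydrophobicity (log P)
# and toxic potency (log 1/C). Illustrative numbers only.
log_p = [0.5, 1.2, 2.0, 2.8, 3.5]
potency = [1.1, 1.9, 2.6, 3.5, 4.1]

slope, intercept = fit_line(log_p, potency)

def predict(lp):
    """Predict log(1/C) for a new congener from its log P."""
    return slope * lp + intercept

print(f"log(1/C) = {slope:.2f} * logP + {intercept:.2f}")
```

Neural networks and classification trees extend this idea to nonlinear and class-structured relationships, but the same caveat applies to all three: the fitted model is only as reliable as the toxicity data behind it.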

The relationship between molecular structure and toxicokinetics needs expansion, so that the significance of absorption, biodistribution, metabolism, metabolic activation, and excretion may be incorporated into predictive toxicological models. As the molecular mechanisms of toxicological effect are elucidated, development of models that include the simultaneous action of more than one agent can be explored.

The utility of any model lies in the accuracy with which extrapolation from the model predicts relative risk to human health. The use of theoretical models complements, but cannot entirely replace, other existing forms of risk assessment. The integration and enhancement of epidemiological, in vitro, and computational toxicological data will continue to greatly reduce the number and type of whole animal studies performed while augmenting the toxicologist's ability to protect the public health.


Dr. Thomas A. Baillie
Dept. of Medicinal Chem.
School of Pharmacy
University of Washington
Seattle, WA 98195
Dr. Regine S. Bohacek
556 Morris Avenue
Summit, NJ 07901
Dr. Klaus Brendel
Dept. of Pharmacology
School of Medicine
University of Arizona
Health Sciences Center
Tucson, AZ 85724
Dr. Arthur Doweyko
400 Farmington Avenue
Farmington, CT 06032
Dr. John M. Frazier
Assoc. Director of CAAT
Div. of Toxicological Sci.
Johns Hopkins University
School of Hygiene and Public Health
615 N. Wolfe St.
Baltimore, MD 21205
Dr. Corwin Hansch
Chemistry Department
Pomona College
645 N. College Avenue
Claremont, CA 91711
Dr. John Kapeghian
556 Morris Avenue
SEF 2006
Summit, NJ 07901
Dr. James Stevens
P.O. Box 18300, F-2037
Greensboro, NC 27419
Dr. Maria-Livia Tosato
Istituto Superiore di Sanità
Viale Regina Elena, 229
00161 Roma, Italy
Dr. Martin Bernstein
444 Saw Mill River Road
TRAC Dept. L1 243
Ardsley, NY 10502
Dr. Peter Blumberg
Building 37, 3B 25
National Cancer Institute
Bethesda, MD 20892
Dr. Scott A. DePriest
Center For Molecular Design
Washington University
Lopata Hall, Box 1099
1 Brookings Dr.
St. Louis, MO 63130
Dr. Kurt Enslein
Health Designs Inc.
183 E. Main Street
Rochester, NY 14604
Dr. Alan M. Goldberg
Director of CAAT
Assoc. Dean of Research
Johns Hopkins University
School of Hygiene
615 N. Wolfe St.
Baltimore, MD 21205
Dr. Peter Jurs
Chemistry Department
Pennsylvania State University
152 Davey
University Park, PA 16802
Dr. Herbert S. Rosenkranz
Dept. of Env. Health Sciences
School of Medicine
Case Western Reserve University
Cleveland, OH 44106
Dr. James D. Yager
Div. of Toxicological Sciences
Johns Hopkins University
School of Hygiene and Public Health
615 N. Wolfe St.
Baltimore, MD 21205


ADAPT- automated data analysis and pattern recognition toolkit
AM1- Austin Model 1
CASE- computer-assisted structure evaluation
CoMFA- comparative molecular field analysis
LFER- linear free energy relationship
LOAEL- lowest observable adverse effect level
LOEL- lowest observable effect level
NOAEL- no observable adverse effect level
NOEL- no observable effect level
QSAR- quantitative structure-activity relationship