SULI
CCI
PST
FaST

Student Abstracts at LBNL:

3D Simulation for the ATLAS Education and Outreach Group. BRIAN AMADIO (Rensselaer Polytechnic Institute, Troy, NY, 12180) MICHAEL BARNETT (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

ATLAS is a particle detector under construction at the Large Hadron Collider facility at the CERN laboratory in Geneva, Switzerland. The project will be one of the most ambitious physics experiments ever attempted. The ATLAS Education and Outreach Group was started to inform students and the general public about the importance of this project. A three-dimensional interactive simulation of ATLAS was created that allows users to explore the detector. This simulation, named the ATLAS Multimedia Educational Laboratory for Interactive Analysis (AMELIA), allows users to view detailed models of each part of the detector, as well as to view event data in 3D. A related project, ATLANTIS, allows users to examine events in only two dimensions. Currently ATLANTIS supports more sophisticated analysis of events. AMELIA will provide similar functionality, but in a more intuitive way that is more accessible to the public.

A Comparison of DNA Damage Probes in Human Mammary Epithelial Cells with 150 kVp X-Rays. CHRISTY WISNEWSKI (University of California, Davis, Davis, CA, 95616) ELEANOR BLAKELY AND KATHLEEN BJORNSTAD (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

In this study we investigated 53BP1 and H2AX, two DNA damage markers, to examine the genetic mechanisms underlying responses to radiation insult. Two human mammary epithelial cell (HMEC) lines were investigated: 184, which has a finite lifespan, and S1, which has an infinite lifespan. The aim was to research the role of immortalization in DNA damage marker expression. Cells were irradiated with 50 cGy, fixed after 1 hour with 4% paraformaldehyde, and processed for immunofluorescence. Cells were imaged with a fluorescence microscope and captured digitally using Image Pro Plus software; 8-bit images were analyzed and counted using ImageJ. The 184 cells showed a stronger positive response within the irradiated samples than the S1 samples. The S1 cells had previously shown a peak response time of 30 minutes with an alternative DNA damage probe, which could explain the decrease in signal for S1 with both probes used in this research. We also noted that the H2AX response was more punctate in the 184 cells, whereas the 53BP1 response was punctate in both cell lines. We hope to expand the dose and time course studied in order to broaden the knowledge obtained from these preliminary data. It is important to understand whether the process of transformation to immortalization compromises the DNA damage sensing and repair process.

A Miniature Quartz Crystal-based Device for Particulate Matter Monitoring with Real-time Data Acquisition. ZHUO HUANG (Sacramento State University, Sacramento, CA, 95819) MICHAEL APTE (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Exposure to particulate matter (PM) through inhalation has been associated with adverse health effects. Accurate monitoring of the mass concentration and chemical composition of PM is necessary for exposure assessment. Many instruments currently in use require complex operation and labor-intensive work to obtain the data needed for studies, or involve costly systems when monitoring a large population. One feasible way to address these drawbacks is to develop low-cost, compact, miniaturized real-time devices. The miniature system for particle exposure assessment (MSPEA) developed at Lawrence Berkeley National Laboratory (LBNL) is one such approach available to the aerosol research community. MSPEA detects PM mass using a quartz crystal microbalance (QCM). All of the components for this system, including the quartz crystals, are off-the-shelf items and can be easily obtained. The particle deposition mechanism used by the device is thermophoresis, while particles are retained on the sensors by van der Waals forces. The QCM is constructed from an unexposed reference crystal oscillator, a PM-exposed sensing crystal oscillator, and a mixing circuit that combines the oscillators' outputs into a beat frequency signal. A computer currently handles data acquisition, but a microprocessor can eventually replace the computer to miniaturize the device for low-cost personal monitoring. Another monitoring feature of the MSPEA system, employing ultraviolet and near-infrared optics, is briefly discussed in this paper.
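
A QCM converts deposited mass into a measurable frequency shift. As a minimal sketch of that conversion, the standard Sauerbrey relation for rigid thin films could be applied as follows; the crystal frequency, electrode area, and measured shift below are hypothetical values for illustration, not MSPEA's actual parameters:

```python
import math

# Standard AT-cut quartz material constants
RHO_Q = 2.648e3      # density of quartz, kg/m^3
MU_Q = 2.947e10      # shear modulus of quartz, Pa

def sauerbrey_mass(delta_f_hz, f0_hz, area_m2):
    """Mass deposited on the sensing crystal (kg) from the measured
    frequency decrease, via the Sauerbrey equation:
        delta_f = -2 * f0**2 * dm / (A * sqrt(rho_q * mu_q))
    """
    return -delta_f_hz * area_m2 * math.sqrt(RHO_Q * MU_Q) / (2 * f0_hz**2)

# Hypothetical example: 10 MHz crystal, 0.2 cm^2 electrode,
# and a 5 Hz frequency decrease after PM deposition.
m = sauerbrey_mass(-5.0, 10e6, 0.2e-4)   # a few nanograms
```

In the instrument described above, the frequency decrease would come from the beat signal between the exposed and reference oscillators, which cancels common drift such as temperature effects.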

A New GUI for Global Orbit Correction at the ALS Using MATLAB. JACOB PACHIKARA (University of Texas at Arlington, Arlington, TX, 76019) GREGORY J. PORTMANN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Orbit correction is a vital procedure at particle accelerators around the world, and it is important to have a user-friendly application for it. The orbit correction routine currently used at the Advanced Light Source (ALS) is cumbersome, and this paper describes a new graphical user interface (GUI) developed for global orbit correction using MATLAB. The correction algorithm uses singular value decomposition (SVD) to calculate the corrector magnet changes required to correct the orbit. The application has been successfully tested at the ALS. The GUI displays important information about the orbit, including the orbit errors before and after correction, the change in corrector magnet strength, and the standard deviation of the orbit error as a function of the number of singular values used. Using more singular values resulted in better correction of the orbit error, but at the expense of very large corrector magnet strength changes. The results showed an inverse relationship between the peak-to-peak values of the orbit error and the number of singular values used. The plots on the GUI help ALS physicists and operators understand specific behavior of the orbit. The application is convenient to use and is a substantial improvement over the previous orbit correction routine.
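
The trade-off reported above (more singular values give a smaller residual orbit error but larger corrector strengths) can be illustrated with a truncated-SVD pseudo-inverse on a toy response matrix. The matrix dimensions and data here are synthetic, not ALS values:

```python
import numpy as np

def correct_orbit(response, orbit_error, n_sv):
    """Corrector strengths from a truncated-SVD pseudo-inverse of the
    corrector-to-BPM response matrix. Keeping fewer singular values
    gives a gentler correction at the cost of a larger residual."""
    U, s, Vt = np.linalg.svd(response, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:n_sv] = 1.0 / s[:n_sv]     # invert only the n_sv largest values
    pseudo_inverse = Vt.T @ np.diag(s_inv) @ U.T
    return -pseudo_inverse @ orbit_error

rng = np.random.default_rng(0)
R = rng.normal(size=(40, 20))         # toy: 40 BPM readings, 20 correctors
x = rng.normal(size=40)               # measured orbit error at the BPMs

weak = correct_orbit(R, x, 5)         # few singular values
strong = correct_orbit(R, x, 20)      # all singular values

res_weak = np.linalg.norm(x + R @ weak)
res_strong = np.linalg.norm(x + R @ strong)
# res_strong <= res_weak, but norm(strong) >= norm(weak): better orbit,
# bigger magnet strength changes.
```

This monotonic trade-off is exactly why a plot of orbit error versus number of singular values, as in the GUI described, is useful to operators.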

A Novel Approach to Estimating Thermal Conductivity. GORDON WU (University of California, Berkeley, Berkeley, CA, 94720) TIM KNEAFSEY (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Scientists at Lawrence Berkeley National Laboratory (LBNL) are currently researching natural gas recovery from gas hydrates in the hope that this will one day become a viable source of energy. Natural gas hydrates are water crystals containing methane gas, found below permafrost and in submarine environments. Heat flow through a hydrate-bearing reservoir must be understood, and thermal conductivity is the fundamental property of a material that indicates its ability to conduct heat. The technique for estimating thermal conductivity calls for applying a temperature change and using thermocouples to accurately measure the rate of temperature change. The thermal data were analyzed using Microsoft Excel and iTOUGH2. iTOUGH2 computes a best fit to the measured data by optimizing the thermal conductivity through automatic model calibration, estimating the conductivity from previous output values and given parameters. The four materials used were dry sand, polyvinyl chloride (PVC), high-density polyethylene (HDPE), and Pyrex borosilicate glass; they were chosen because their thermal conductivities are close to that of hydrate-bearing sand (2.7 W/m·K). Good matches were obtained between the simulations and the measured data, showing the validity of the technique. It is important to realize that the substances we used can vary in thermal conductivity depending on the temperature, the porosity of the particular substance, and the composition of the sample. Now that the technique is validated, it can be used in other experiments to measure thermal conductivities.
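
iTOUGH2 performs full inverse modeling; as a simplified illustration of the same idea, the classic line-source approximation recovers conductivity from the slope of temperature rise versus log time. The heater power and temperature data below are synthetic, not measurements from this study:

```python
import numpy as np

# Infinite line-source model: at late times the temperature rise near the
# heater grows as dT = (q / (4*pi*k)) * ln(t) + c, so the conductivity k
# follows from the slope of dT versus ln(t).
def fit_conductivity(t_s, dT_K, q_W_per_m):
    slope, _ = np.polyfit(np.log(t_s), dT_K, 1)
    return q_W_per_m / (4 * np.pi * slope)

# Synthetic data for a material with k = 2.7 W/(m K), the hydrate-bearing
# sand value quoted in the abstract, heated by a 2 W/m line source.
k_true, q = 2.7, 2.0
t = np.linspace(10, 600, 50)                      # seconds
dT = q / (4 * np.pi * k_true) * np.log(t) + 0.5   # exact model + offset
k_est = fit_conductivity(t, dT, q)                # recovers ~2.7
```

A full inverse model such as iTOUGH2 generalizes this least-squares idea to arbitrary geometries and heat-flow physics rather than a single analytic curve.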

A Search for Cerium Doped Lanthanum Oxide Scintillators. LATORIA WIGGINS (North Carolina A&T State University, Greensboro, NC, 27411) YETTA PORTER-CHAPMAN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The need for new and improved scintillators, the materials at the heart of radiation detectors, is at an all-time high due to advances in detection needs. Commonly used scintillators such as BGO and LSO have undesirable properties such as low luminosity and slow decay times. Discovering new scintillators requires literature searches, synthesis, and characterization of compounds. The research at hand concentrated on cerium(III)-doped lanthanum oxides. Compounds were synthesized using solid-state chemistry techniques such as ceramic and hydrothermal methods. Characterization consisted of x-ray diffraction, fluorescence spectroscopy, and pulsed x-ray measurements. Several new inorganic scintillators were found; however, findings concerning lanthanum oxide synthesis warrant further investigation of the compound.

A Statistical-Based Analysis of Cloud Properties at Various Locations Across the Globe. PARMINDER SINGH (State University of New York at Buffalo, Buffalo, NY, 14214) SURABI MENON (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The impact of anthropogenic activities (fossil-fuel combustion, biomass burning, and transportation) on climate includes contributions from both greenhouse gases (GHGs) and aerosol particles. The warming associated with GHGs is well known, but the negative forcing associated with aerosol-cloud interactions, also referred to as the aerosol indirect effect (AIE), is more difficult to evaluate. This forcing includes changes to cloud droplet number and cloud optical depth from an increase in aerosols, which reduces droplet sizes and increases cloud reflectivity, and the impact on cloud water and cloud lifetime, which again increases cloud reflectivity. To understand these effects we use data from the Moderate Resolution Imaging Spectroradiometer (MODIS) to analyze signatures of aerosol-cloud interactions over several regions across the globe for July 2000. The specific products we examined were aerosol optical depth, cloud droplet effective radius, cloud optical depth, cloud top pressure, cloud top temperature, and liquid water path (derived from cloud droplet effective radius and cloud optical depth). We chose 20 different ocean regions based on proximity to continental areas. Regions closer to a continent generally have higher aerosol loads (more polluted) than those farther from the source (cleaner regions). We use cluster analysis, log-linear regressions, correlation coefficients, probability density functions, and means to understand cloud response to aerosols at different locations. While features of both indirect effects were observed at most locations, cloud property changes in cleaner regions were more susceptible to aerosol effects. In more polluted regions, the presence of dust or of aerosols not conducive to cloud formation, along with liquid water variability, may mask the signal we expect. Meteorological analysis of air mass origin and an independent measure of liquid water path could better constrain future studies that use these data to evaluate model representations of the AIE.
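
One common way to quantify the first indirect effect from products like these is the log-log slope of droplet effective radius against aerosol optical depth, a form of the log-linear regression the abstract mentions. The values below are synthetic, not the study's data:

```python
import numpy as np

# First-indirect-effect metric: -d ln(r_e) / d ln(AOD). A positive slope
# means droplets shrink as aerosol loading rises (more, smaller droplets).
def aerosol_cloud_slope(aod, r_eff_um):
    slope, _ = np.polyfit(np.log(aod), np.log(r_eff_um), 1)
    return -slope

aod = np.array([0.05, 0.1, 0.2, 0.4])   # synthetic aerosol optical depths
r_eff = 12.0 * aod**-0.15               # synthetic radii: shrink with AOD
s = aerosol_cloud_slope(aod, r_eff)     # recovers the 0.15 exponent
```

In practice such regressions are computed per region after sorting scenes by liquid water path, which is one reason the abstract calls for an independent liquid water path measurement.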

Adsorbent-Coated Polyurethane Foam as a Denuder and Size-Selective Inlet for Ambient Air Samplers. JEFF DUARTE (University of California, Davis, Davis, CA, 95616) LARA GUNDEL (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

For practical use, ambient air samplers for assessing human exposure to airborne particles must decrease in size and cost. One major step toward this change is replacement of large, adsorbent-coated glass denuders with smaller, cheaper adsorbent-coated polyurethane foam (PUF) denuders for capturing semi-volatile organic compounds (SVOC). The purpose of a denuder is to capture SVOC from the sampled air on its extractable adsorbent coating while allowing particles to pass through for collection on a filter. The purpose of this project was to determine whether PUF could meet the latter requirement. It was hypothesized that PUF denuders could pass PM2.5 (particles with an aerodynamic diameter of 2.5 µm or less) and match or even exceed conventional glass denuders in SVOC capture, because they have more surface area and are more compact. A bifurcated particle sampler that excluded particles larger than 2.5 µm was used on multiple 24-hour sampling runs in several configurations. Data were collected first without any denuders, then with glass denuders on both sides to determine the variability of particle capture between the two columns, and finally with glass denuders on both sides plus a PUF denuder on one side. The PUF denuder was placed downstream of the glass denuder so that SVOC capture was normal, allowing the focus to rest solely on whether the PUF was passing PM2.5. In earlier work with PUF denuders by M. T. Minjares, the filter downstream of the PUF was found to have one third less mass than the filter with no PUF. Minjares had no upstream glass denuders, so her result was thought to be caused by either PM2.5 collection by the PUF or SVOC adsorption by the Teflon filter. In this follow-up experiment, upstream SVOC was collected by the glass denuders. The average PM2.5 concentration difference between the filters in the two columns in the non-denuded configuration was 8.6%. The average PM2.5 concentration difference between the two filters in the configuration with glass denuders on both sides and the PUF denuder on one side was 10.2%. With a mass measurement uncertainty of 3.6%, the difference between these two results is insignificant. The conclusions from this project are that 1) the PUF does pass PM2.5 well and 2) the Teflon filter adsorbed SVOC, which was causing the artifact that Minjares observed. The latter finding is contrary to the prevailing belief that Teflon does not measurably adsorb SVOC.

AMELIA: ATLAS Multimedia Educational Lab for Interactive Analysis. DAVID MEDOVOY (Columbia University, New York, NY, 10027) MICHAEL BARNETT (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

AMELIA is an educational software program designed to allow the public to view, on a home computer, a 3D model of the ATLAS detector at CERN, as well as visualizations of recorded particle tracks. Models of the detector's geometry, created in 3ds Max, are loaded by the software, which is written in C++ using the Irrlicht visualization engine. Particle track data in the JiveXML file format are loaded and displayed simultaneously. The use of the standard JiveXML format allows file-type compatibility with other software, such as the 2D visualization tool ATLANTIS. The camera is fully movable by the user, and custom cutaway views can be created based on the camera's position to facilitate viewing the interior parts of the detector, as well as the particle tracks within. Tracks are color-coded by particle type and will soon be individually selectable. Programs exist to visualize particle track data in 3D, and to simplify scientific data for outreach purposes, but only AMELIA is designed for both. Further, AMELIA is the only project of its kind designed to take advantage of technology developed for video games. An early 2007 public release is anticipated.

Amplification and Tagging of Sulfolobus solfataricus Genes for Recombinant Expression. STEPHANIE PETERSON (Del Mar College, Corpus Christi, TX, 78404) STEVEN M. YANNONE (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

In organisms that thrive at moderate temperatures, many biological processes such as DNA repair occur through transient protein interactions. Understanding these interactions and the temporary protein complexes they form is vital to understanding how cells function, especially how they repair damaged DNA. Protein interactions within hyperthermophiles like Sulfolobus solfataricus may be stabilized at moderate temperatures. The work presented in this study provides the initial steps toward thermally trapping otherwise short-lived protein complexes. Genes associated with DNA repair were selected from the S. solfataricus P2 genome and modified, using gene-specific primers, with a directional tag on the N-terminus and a six-histidine (6x-His) tag on the C-terminus. Genes were amplified, cloned into entry vectors, and transformed into E. coli cells. Colonies were then selected and grown in liquid culture. Plasmid DNA was isolated using the alkaline-lysis extraction method, and constructs were confirmed by restriction digestion. Sixteen of twenty-nine constructs were confirmed by restriction digestion and fragment pattern on 1% agarose gels. These constructs will be studied further in two different expression systems: E. coli and S. solfataricus. E. coli expression should provide insight into independent protein structure and function. Native expression will not only provide information about structure and function but will also identify obligate protein partners in the native organism. In addition, this approach will identify the root causes of difficulties that arise in recombinant expression.

An Evaluation of Vista Performance on Current Windows Computer Systems. JAMES ARRIAGA (Big Bend Community College, Moses Lake, WA, 98823) CHARLIE VERBOOM (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

With the impending release of Windows Vista in January 2007, the Information Technology Division at Lawrence Berkeley National Laboratory faces a decision about the future transition of computers from the Windows XP operating system to Windows Vista. Research involved a review of benchmarking tools, the selection of a standard set of tests, and the application of those tests to workstation hardware in use at Berkeley Lab. To evaluate the impact of installing the new operating system, performance benchmarks were run on three desktop systems, both in a standard configuration and with selected hardware upgrades. This provided a comparative analysis of the effects of Vista with the Authentic, Energetic, Reflective, and Open (AERO) graphical user interface, and of how Vista performs on systems commonly used at Berkeley Lab relative to the hardware specifications published by Microsoft. The enhanced user interface built into Vista requires more powerful hardware than many current computers running XP can provide. Comparison of the minimum and recommended hardware requirements for Vista indicated that legacy systems are more likely to be replaced than upgraded. Computers manufactured within the last two years that meet the published guidelines should run Vista with or without the AERO interface.

Arsenic Removal from Ground Waters: An Investigation of the Effects of Temperature. MARIA MELISSA QUEMADA (University of California, Berkeley, Berkeley, CA, 94720) ASHOK J. GADGIL (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Arsenic, a naturally occurring element, is a major contaminant in ground waters. Approximately 40 million people in Bangladesh and tens of millions more in neighboring countries are being poisoned by arsenic in their drinking water. The World Health Organization has set a standard of 10 µg/L arsenic in drinking water, while the Bangladesh standard remains 50 µg/L. In California, approximately 600,000 households use water with arsenic concentrations higher than the required standard. Lawrence Berkeley National Laboratory has developed a technology for arsenic removal using coal ash coated with ferric hydroxide (the media). This process enables the arsenic to bind to the iron oxide complex coated around the ash particles, thus lowering the arsenic concentration in drinking water. The technology is highly effective and very cost-effective. My goal in this project was to test the performance of this technology on U.S. waters over a range of temperatures, using media made from coal fly ash of a type commonly found in the US. The temperatures investigated were 4 and 35 degrees Celsius. A series of experiments established the time at which the process reaches equilibrium at each of these temperatures. Once the equilibrium time was established, the process was repeated to obtain adsorption isotherm curves at the two temperatures. The equilibrium times found were 4 and 16 hours for 4°C and 35°C, respectively. The arsenic removal capacity was analyzed using an arsenic field test kit (Quick Test©), and the results were confirmed by inductively coupled plasma mass spectrometry analysis.

Arsenic Removal Using Ferric Hydroxide Coated Coal Ash. CLETE READER (MiraCosta College, Oceanside, CA, 92056) ASHOK GADGIL (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Chronic ingestion of arsenic (As), usually through contaminated drinking water, leads to serious health problems, including cancers, neural disorders, and failure of the kidneys and liver. The present project aims to address the crisis in Bangladesh, where an estimated 40 million people are consuming water with arsenic levels greater than the World Health Organization's recommended maximum of 10 µg As per liter of water (10 ppb). The technology is also being extended for applications in U.S. regions of low population density, where small municipal water systems make conventional approaches to arsenic removal impractical. The technology relies on the documented ability of Fe(III) to immobilize arsenic. A coating of the iron complex on coal ash, which has a surface area of 600 m²/g, provides abundant active sites for adsorption to occur. This study investigates the effect of pH on the arsenic removal capacity of the coated coal ash media, with the goal of accurately predicting the media's performance across a range of pH levels and arsenic concentrations. It was first necessary to determine the time-to-equilibrium for the reaction at different pH levels. This was achieved by mixing the media with synthetic ground water spiked with 50 ppb As(V) and analyzing the As concentrations in the water at times ranging from 0.5 to 15 hours using inductively coupled plasma mass spectrometry. Adsorption data for various starting concentrations of As were generated in a similar manner, with the time-to-equilibrium determined by the above experiments. In the range of pH levels tested, equilibrium occurred within 4 hours. Adsorption data were obtained for starting As concentrations of 50 and 150 ppb. Further experiments at additional concentrations will be conducted to develop an adsorption curve that allows accurate prediction of performance. Preliminary results suggest that the removal capacity of the media is highest at mildly acidic pH (5.5). The low-cost, low-input nature of the technology makes it a viable alternative to other removal technologies; characterizing the effect of pH will allow optimization across a range of pH levels and arsenic concentrations.
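
Once adsorption data at several starting concentrations are in hand, an isotherm model can be fitted to predict removal capacity. The abstract does not name a model; as an illustrative assumption, the commonly used Freundlich isotherm fits as a straight line on log-log axes. The concentrations and uptake values below are synthetic:

```python
import numpy as np

# Freundlich isotherm q = K * C**(1/n): log10(q) is linear in log10(C),
# so a degree-1 fit yields 1/n (slope) and log10(K) (intercept).
def fit_freundlich(c_eq_ppb, q_ug_per_g):
    slope, intercept = np.polyfit(np.log10(c_eq_ppb), np.log10(q_ug_per_g), 1)
    return 10**intercept, 1.0 / slope   # K, n

c = np.array([5.0, 20.0, 60.0])   # hypothetical equilibrium concentrations, ppb
q = 1.8 * c**0.7                  # synthetic uptake with K = 1.8, 1/n = 0.7
K, n = fit_freundlich(c, q)       # recovers K ~ 1.8, n ~ 1.43
```

With parameters fitted at each pH, such a curve would give the accurate performance prediction across pH levels and concentrations that the study aims for.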

Automatic Detection of CRISPR elements. KYNDALL BROWN & MICHAEL LOWE (Jackson State University, Jackson, MS, 39217) NIKOS KYRPIDES (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Due to the growing interest in and importance of CRISPR elements in the scientific community, tools that detect CRISPRs quickly and efficiently are in demand. Initially their detection relied on tandem repeat finding tools such as PatScan, Piler, and Reputer. Unfortunately, this process required hours of manual post-processing because the software was inaccurate at distinguishing CRISPR elements. To solve this problem, the research team at Lawrence Berkeley National Laboratory (LBNL) created the CRISPR Recognition Tool (CRT) to detect CRISPR loci quickly and efficiently. CRT was implemented in Java, an object-oriented programming language. The algorithm uses Boyer-Moore search and skip techniques to locate CRISPRs. The program starts by scanning for a small pattern of bases that appears within the search window. Assuming that the pattern is part of a CRISPR, it will recur within a range determined by the sizes of the spacer and repeat. Once the range is determined, the program searches for the pattern using Boyer-Moore search and the skip method. When a match is found and the full repeat length is determined, the region is deemed a CRISPR candidate. The program then passes the candidate through three filters with user-specified parameters. Once the candidate passes all requirements, it is confirmed as a CRISPR, and the program repeats the process until all CRISPR elements are found. To determine the effectiveness of CRT, its speed and accuracy were tested against PatScan and PilerCR. The tests proved CRT to be the fastest and most accurate of the three tools. In conclusion, the algorithm efficiently detects CRISPR elements and will be a useful tool for the scientific community.
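
The scan-and-skip idea can be sketched as follows. CRT itself uses Boyer-Moore matching and its own user-tunable parameters; the seed length, gap bounds, and toy sequence here are illustrative only:

```python
# Take a short seed from the current position, look for its next exact
# occurrence within a CRISPR-like distance (repeat + spacer lengths),
# and record the pair as a repeat candidate.
def find_repeat_candidates(seq, seed_len=8, min_gap=20, max_gap=70):
    candidates = []
    i = 0
    while i + seed_len <= len(seq):
        seed = seq[i:i + seed_len]
        # Search only the window where the next repeat copy could start
        j = seq.find(seed, i + min_gap, i + max_gap + seed_len)
        if j != -1:
            candidates.append((i, j))
            i = j          # jump to the match, as CRT's skip technique does
        else:
            i += 1
    return candidates

# Toy locus: an 8-base repeat separated by two 24-base spacers.
repeat = "GTTTTAGA"
spacers = ["ACGTACGTACGTACGTACGTACGT", "TTGCATGCATGCATGCATGCATGC"]
toy = repeat + spacers[0] + repeat + spacers[1] + repeat
hits = find_repeat_candidates(toy)   # [(0, 32), (32, 64)]
```

A real detector would then extend each seed match to the full repeat length and apply filters on repeat count, length, and spacer similarity, corresponding to the three filters the abstract describes.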

Bacterial Diversity in Soil and Sediments From a Former Bombing Range (Vieques, PR). ERNIE PEREZ (University of Puerto Rico at Mayaguez, Mayaguez, PR, 680) TERRY C. HAZEN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Bombing sites used for military training activities can contain considerable amounts of contaminants and pose significant risks to people and the environment. Until 2003, the eastern part of Vieques, Puerto Rico, was used as a bombing range by the US Navy. Since then, leaching of explosive compounds from unexploded ordnance has represented a serious threat to the marine ecosystem. The contribution of microbial populations, including sulfate-reducing bacteria (SRB), to natural attenuation of explosives has been demonstrated in soils, but little is known about their contribution in marine environments. Characterization assays were employed to assess the effects of explosive compounds (TNT, RDX, HMX) on Desulfovibrio vulgaris Hildenborough and five novel SRB isolates from marine sediments in coastal waters adjacent to the former military facilities. Pure cultures were combined with media in a covered 96-well microplate, and the optical density was monitored in real time as the bacteria grew in a temperature-controlled plate reader. A dose-response curve was used to estimate minimum inhibitory concentrations (MICs) for TNT, RDX, and HMX in 0, 1.5, and 3.0% (w/v) NaCl. Some of the bacterial isolates grew better in explosive-containing media than in regular media. The chemotactic response to nitrocompounds was evaluated for D. vulgaris using a Palleroni chamber; D. vulgaris responded positively to TNT, but not to RDX or HMX. Elucidating the diversity and behavior of SRB exposed to explosives in tropical sediments could help us understand the role of these microbial populations in explosive-contaminated marine environments.
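
Growth-curve readouts like these can be reduced to an MIC estimate in several ways; one simple, hypothetical approach (the threshold rule and all values below are illustrative, not the study's method or data) flags the lowest tested dose at which growth is suppressed relative to an untreated control:

```python
# Estimate the minimum inhibitory concentration as the lowest tested dose
# whose final optical density stays below a chosen fraction of the
# no-explosive control. The 10% threshold is an assumption.
def estimate_mic(doses_mg_per_l, final_od, control_od, threshold=0.1):
    for dose, od in sorted(zip(doses_mg_per_l, final_od)):
        if od < threshold * control_od:
            return dose
    return None   # no inhibition observed at the tested doses

doses = [1, 5, 10, 25, 50]            # hypothetical explosive doses, mg/L
od = [0.92, 0.85, 0.40, 0.05, 0.02]   # hypothetical endpoint optical densities
mic = estimate_mic(doses, od, control_od=0.95)   # -> 25
```

Fitting a full dose-response curve (e.g. a logistic model) gives a smoother estimate, but the threshold rule above captures the basic logic of reading an MIC off microplate growth data.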

Bacterial Diversity in Soil and Sediments From a Former Bombing Range (Vieques, PR). NATALIA RAMOS (University of Puerto Rico, Mayaguez, PR, 727) TERRY C. HAZEN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Bombing sites used for military training activities can contain considerable amounts of contaminants and pose significant risks to people and the environment. Until 2003, the eastern part of Vieques, Puerto Rico, was used as a bombing range by the US Navy. Since then, leaching of explosive compounds from unexploded ordnance has represented a serious threat to the marine ecosystem. The contribution of microbial populations, including sulfate-reducing bacteria (SRB), to natural attenuation of explosives has been demonstrated in soils, but little is known about their contribution in marine environments. Characterization assays were employed to assess the effects of explosive compounds (TNT, RDX, HMX) on Desulfovibrio vulgaris Hildenborough and five novel SRB isolates from marine sediments in coastal waters adjacent to the former military facilities. Pure cultures were combined with media in a covered 96-well microplate, and the optical density was monitored in real time as the bacteria grew in a temperature-controlled plate reader. A dose-response curve was used to estimate minimum inhibitory concentrations (MICs) for TNT, RDX, and HMX in 0, 1.5, and 3.0% (w/v) NaCl. Some of the bacterial isolates grew better in explosive-containing media than in regular media. The chemotactic response to nitrocompounds was evaluated for D. vulgaris using a Palleroni chamber; D. vulgaris responded positively to TNT, but not to RDX or HMX. Elucidating the diversity and behavior of SRB exposed to explosives in tropical sediments could help us understand the role of these microbial populations in explosive-contaminated marine environments.

Characterization of Long Cosmic Ray Muon Tracks in the IceCube Detector at the South Pole. DANIEL HART (Southern University, Baton Rouge, LA, 70813) AZRIEL GOLDSCHMIDT (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

IceCube is an experiment in high-energy neutrino astronomy. It uses an array of optical modules to detect the faint Cerenkov light produced by muons, which result from neutrino interactions with matter. Using the information received from data acquisition systems at the South Pole, software was developed in the C language to read this data and reconstruct possible muon paths. Events were filtered by placing cuts on the calculated paths: paths had to pass through the full geometry of IceCube, have a velocity within 5% of the speed of light, and be of low multiplicity. The resulting path-distance distributions showed an exponential decay in the fraction of modules receiving light as a function of distance, and the derived probability curves followed the same trend. However, the distance distributions were not as smooth as expected, and the selection of paths considered as neutrino candidates behaved similarly. These irregularities are interpreted as the superposition of multiple muons in a single event. Further studies will not only add data from additional days but also employ more sophisticated methods for separating true events from those produced by multiple muons.
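The event selection described above can be sketched as a set of boolean cuts; the `Track` fields and example values are hypothetical, with only the cut criteria (full-geometry containment, speed within 5% of c, low multiplicity) taken from the abstract:

```python
from dataclasses import dataclass

C = 0.299792458  # speed of light, m/ns


@dataclass
class Track:
    path_length_m: float    # reconstructed path length through the array
    travel_time_ns: float   # time between first and last hit on the path
    contained: bool         # True if the path crosses the full detector geometry
    n_tracks_in_event: int  # reconstructed track multiplicity of the event


def passes_cuts(t: Track, speed_tol: float = 0.05, max_multiplicity: int = 1) -> bool:
    """Apply the containment, speed, and multiplicity cuts to one track."""
    if not t.contained:
        return False
    beta = (t.path_length_m / t.travel_time_ns) / C  # v/c
    if abs(beta - 1.0) > speed_tol:
        return False
    return t.n_tracks_in_event <= max_multiplicity


tracks = [
    Track(800.0, 2670.0, True, 1),   # near speed of light, contained, single track
    Track(800.0, 4000.0, True, 1),   # too slow: rejected by the speed cut
    Track(800.0, 2670.0, False, 1),  # not contained: rejected by the geometry cut
]
selected = [t for t in tracks if passes_cuts(t)]
```

Only the first example track survives all three cuts.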

Cloning of DNA Repair Genes from a Hydrothermal Vent Worm. ANABEY CORNEJO (Contra Costa College, San Pablo, CA, 94806) JILL O. FUSS (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The primary structure of DNA is highly reactive with molecular by-products of metabolism as well as with UV radiation from the sun, and these reactions alter and damage the human genome. Alterations to DNA structure and chemistry result not only from natural physical agents but also, to a lesser extent, from man-made chemicals. If these alterations are not detected and either corrected or removed, a mutation can become fixed in the genome, potentially leading to cancer and aging. Through evolutionary adaptation, cells have developed a series of mechanisms that allow them to remove the damage and restore the normal nucleotide sequence and DNA structure. One such mechanism is Nucleotide Excision Repair (NER), which removes oligonucleotides, short nucleotide segments that contain damaged bases. NER is further categorized by the location of the repair: GG-NER, or global genome repair, refers to NER taking place in DNA not undergoing transcription, while TC-NER, or transcription-coupled repair, refers to NER occurring in the transcribed strand of active genes. If the NER pathway is compromised, a number of human genetic diseases may result, such as Xeroderma pigmentosum (XP), Cockayne syndrome (CS), and Trichothiodystrophy (TTD), which are characterized by premature aging or a predisposition to cancer. To investigate the human mechanisms of DNA repair, we are studying an organism highly genetically homologous to humans, Alvinella pompejana, the Pompeii worm. The Pompeii worm inhabits geysers found along underwater volcanic mountain ranges known as hydrothermal vents, which release jets of water reaching temperatures as high as 300°C, and much of its protein activity occurs at temperatures as high as 80°C. The worm thus serves as a useful model for comparative studies of DNA repair not only because of its genetic similarity to the human genome but also because of its extreme heat tolerance, which implies that its proteins will be fairly stable at room temperature, allowing extensive in vitro study. More specifically, we are investigating DNA repair genes involved in NER by cloning these genes from Alvinella pompejana, constructing plasmid vectors for recombinant protein expression, purifying proteins, studying protein-protein interactions, transfecting cultured human cells with expression vectors followed by assays for reporter gene expression, and analyzing particular proteins as a function of the cell cycle in mammalian cells.

Compact Nanosecond High Current Pulser Design. MICHAEL MALLO (University of Oklahoma, Norman, OK, 73019) SOREN PRESTEMON (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

A pulser is an electronic circuit which generates a high-voltage or high-current pulse with a very short pulse width. Pulsers can be implemented using various topologies, such as Marx generators, capacitor banks, coaxial transmission lines, helical lines, striplines, and Blumleins. The goal of this project was to review basic pulser theory and to design, test, and compare several pulsers using various topologies. The final design should deliver a repeatable pulse greater than 1 kA with a 10 ns or shorter pulse width to a 1 Ω inductive load (a high-field microcoil) and be small enough to allow for insertion into an ultra-high-vacuum accelerator environment. The pulser topologies tested were a capacitor bank, a coaxial transmission line, a stripline, and a parallel-plate Blumlein. The capacitor bank produced an output voltage of 289 V with a ringing frequency of 17.9 MHz, corresponding to a positive voltage pulse width of 28 ns; the load impedance of this circuit is unknown. The coaxial transmission line was expected to produce a 500 V output pulse with a pulse width of 13.2 ns; the actual output was 500 V, but with a pulse width of 11.8 ns. The stripline was expected to produce a 1 kV, 4 ns voltage pulse through a 1 Ω inductive load, and the parallel-plate Blumlein a 1 kV, 1.2 ns voltage pulse through the same load. However, the stripline and Blumlein both produced far less voltage than anticipated, with voltage pulse widths of just over 10 ns. Three factors may have led to this inconsistency between predicted and measured pulse widths. First, the diagnostic tool used to measure the stripline and Blumlein voltage pulses was a Tektronix P5102 1 kV RMS, 100 MHz, 10x high-voltage probe; the 100 MHz bandwidth prevents the probe from accurately measuring pulse widths shorter than 10 ns. Second, the short lengths of these lines may have led to a greater prominence of end effects, or variations in the electric and magnetic fields at the ends of the transmission lines, in the output pulses. Third, the low 1 Ω load impedance combined with stray inductances may have caused longer-than-expected pulse rise times. The latter two factors warrant further investigation to better understand what electrical and geometric properties lead to end effects and long rise times, and to what extent they affect the output pulse.
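For reference, the output pulse width of a charged transmission line discharged into a matched load follows from the round-trip transit time, τ = 2L/v with v = c/√εr. A quick calculation (the line length and dielectric constant below are illustrative, not the actual hardware):

```python
import math

C = 2.998e8  # speed of light in vacuum, m/s


def pulse_width_ns(length_m: float, eps_r: float) -> float:
    """Pulse width of a charged line into a matched load: tau = 2L/v, v = c/sqrt(eps_r)."""
    v = C / math.sqrt(eps_r)          # propagation velocity in the dielectric
    return 2.0 * length_m / v * 1e9   # round-trip transit time, in ns


# hypothetical 1.3 m polyethylene-insulated coax (eps_r ~ 2.25)
tau = pulse_width_ns(1.3, 2.25)  # ~13 ns
```

This illustrates why a line of order one meter yields the ~13 ns pulses quoted above, and why much shorter pulses require proportionally shorter lines, where end effects become more prominent.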

Comparison of Intrabeam Scattering High Energy Approximations and Equilibrium. ROBERT OWENS (North Carolina A&T State University, Greensboro, NC, 27411) MIGUEL FURAN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The International Linear Collider (ILC) is a particle accelerator being designed to explore particle energies that have never previously been accessible. Two of the major components of the ILC are the electron and positron damping rings, which serve the purpose of shrinking the emittances of the beams. There are several competing processes that affect the beam emittances. Synchrotron radiation damping serves to decrease the emittances. A major contributor to emittance growth is a phenomenon called Intrabeam Scattering (IBS), wherein particles within a single bunch Coulomb scatter off one another, thereby causing the beam emittance to increase. The IBS emittance growth rates are calculated using computer codes, and it is often too time consuming to use the full theory of IBS. In order to calculate IBS growth rates efficiently, several high-energy approximations to the full theory have been developed for the energy regime of the ILC, and it is important to find the most accurate one. We analyzed three approximations of IBS using the software package Mathematica: Bane's approximation, a new diagonal-matrices approximation, and a recent CIMP one-log approximation, while attempting to develop a better two-log approximation to the CIMP formulas. We also analyzed the equilibrium emittances of the beams at different charges to determine whether the transverse emittances, bunch length, and energy spread would meet the necessary requirements for the ILC. After comparing the various approximations, the diagonal-matrices approximation proved to be the closest to the full theory of IBS.

Construction of the La-Bi-O Phase Equilibria: The Search for Inorganic Scintillators. STEVEN VILAYVONG (North Carolina A&T State University, Greensboro, NC, 27411) YETTA PORTER-CHAPMAN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Scintillators have a long history, but the demand for them has risen sharply since World War II. This need for viable scintillators for the detection of ionizing radiation has spurred research around the world. The New Detector Group of the Department of Functional Imaging in the Life Sciences Division of Lawrence Berkeley National Laboratory conducts systematic searches of various compounds to find the most effective scintillator. Much of their work focuses on compounds that contain bismuth (III) and lanthanum (III) ions. Bismuth (III) ions can be luminescent, sometimes providing intrinsic scintillation as seen in the commonly used scintillator Bi4Ge3O12 (BGO). Phases containing lanthanum (III) ions are also investigated so that their sites can be doped with cerium (III), another luminescent ion. In this work, various molar ratios of La2O3 and Bi2O3 were reacted by solid state chemistry techniques to find phases that may exhibit scintillation. To determine whether a phase is a good scintillator, it must be characterized by x-ray diffraction (XRD), fluorescence spectroscopy, and pulsed x-ray measurements. Four La-Bi-O phases (La0.176Bi0.824O1.5, La0.12Bi1.88O3, La4Bi2O9, and an unknown phase) were found; however, none are good scintillators.

Creating Code for Automated Demand Response. ARRAN BLATTEL (Stanford University, Stanford, CA, 94305) MARY ANN PIETTE (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Energy efficiency, conservation, and peak load management are important approaches to protecting the power grid, saving consumers money, and reducing impact on the environment. Demand Response (DR) is an attempt to curtail energy demand during the handful of days each year when the grid is strained. The local electric company, Pacific Gas & Electric (PG&E), created a voluntary program called Critical Peak Pricing (CPP), in which participants are asked to curb energy use on 12 independent summer days that PG&E deems most likely to strain the grid. Participants receive lower electric rates on non-CPP days, but much higher rates during peak hours on CPP days, creating an incentive to reduce demand. Automated Demand Response (Auto-DR) is a novel approach focused on fully automating buildings participating in the CPP program, so that during a CPP event a building reduces its energy demand without any human interaction. The system works by using a computer program to continuously monitor the CPP status posted on PG&E's server. When the program detects a CPP event in progress, it triggers pre-programmed energy-saving strategies to take effect in the building, such as dimming lights and reducing air-conditioning use. For buildings that are currently automated, post-event surveys are conducted to measure occupants' response to the environmental changes from load reduction. This research also gives feedback to participants as soon as possible so they can see the correlation between their buildings' energy-saving actions and their electrical shed. The main focus of this research is studying the electrical demand of participating buildings and evaluating how much they reduced their energy consumption. Over the past three years, this research has shown that Auto-DR is a viable form of dynamic energy conservation, consistently providing load sheds during CPP days. Because it requires no manual labor to operate and does not rely on personnel being present, Auto-DR may prove to be more efficient and cost-effective than manual DR for certain buildings.
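The monitoring loop described above can be sketched as follows; the status URL, status strings, and shed actions are hypothetical placeholders, not PG&E's actual interface:

```python
import time
import urllib.request

# hypothetical status endpoint; PG&E's real interface is not specified here
CPP_STATUS_URL = "http://example.invalid/cpp-status"


def action_for_status(status: str) -> str:
    """Map the server's status string to a building action (assumed vocabulary)."""
    return "shed" if status == "EVENT" else "normal"


def fetch_cpp_status(url: str = CPP_STATUS_URL) -> str:
    """Poll the utility server for the current CPP status string."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode().strip()


def run(poll_seconds: int = 60):
    """Continuously poll and trigger pre-programmed load-shed strategies."""
    while True:
        if action_for_status(fetch_cpp_status()) == "shed":
            # placeholder for real strategies: dim lights, raise AC setpoints
            print("CPP event active: executing load-shed strategies")
        time.sleep(poll_seconds)
```

The polling interval, status vocabulary, and strategy hooks would all be configured per building in a real deployment.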

CRISPR Recognition Tool (CRT): A Tool for Automatic Recognition of CRISPR Elements. CHARLES BLAND (Jackson State University, Jackson, MS, 0) CHARLES BLAND (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Repetitive sequences make up a significant portion of the genomes of both prokaryotes and eukaryotes. The human genome, for example, is known to be composed of as much as 50% repeating patterns. These repeats come in various forms and sizes and may be found dispersed throughout a genome, clustered near each other, or occurring contiguously. The identification of repeats has proven to be of great consequence, as they have been connected to a number of human diseases, including fragile-X mental retardation, Huntington's disease, myotonic dystrophy, and muscular atrophy. Repetitive sequences also have various functional roles, such as gene regulation and immune system development. Furthermore, they are a useful tool to scientists for DNA fingerprinting and genome alignment. This research focuses on a recently recognized family of repeats that has attracted considerable interest. Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) are composed of short direct repeats ranging in size from 21 to 51 base pairs. CRISPR are unique in that their repeats are interspaced by non-repeating sequences of similar size and are found only in the genomes of prokaryotes. Because of the importance of repetitive sequences, it is essential to develop fast and accurate methods for their detection. Several tools are available for identifying various forms of repeats; however, because the focus on CRISPR elements is recent, no published tools are yet available for their automatic discovery. Their detection currently relies on generic repeat-searching tools (e.g., Patscan) and requires considerable manual post-processing. In this study, we present a tool for reliable, fast, and automatic detection of CRISPR elements. This software program, the CRISPR Recognition Tool (CRT), uses a fast linear search method for their detection. The accuracy and speed of CRT were determined by analyzing its performance on finished microbial genomes available in the IMG version 1.5 database. Additionally, CRT was compared to Patscan and to a recently developed, unpublished CRISPR detection program, PilerCR, an offshoot of the Piler program. We found CRT to be superior to both Patscan and PilerCR in terms of both accuracy and speed.
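As an illustration of the detection problem (not CRT's actual algorithm, which is more sophisticated and tolerant of mismatches), a naive greedy scan for an exact k-mer recurring with spacer-sized gaps might look like this:

```python
def find_crispr_like(seq: str, k: int = 21, min_gap: int = 18,
                     max_gap: int = 55, min_repeats: int = 3):
    """Greedy sketch: find an exact k-mer repeated with similar-size spacers.

    Scans each position, treats its k-mer as a candidate repeat, and chains
    occurrences whose gaps fall within [min_gap, max_gap]. Returns
    (start_positions, repeat) for the first array found, else None.
    Exact matching and no filtering make this a toy, not a real detector.
    """
    n = len(seq)
    for i in range(n - k):
        repeat = seq[i:i + k]
        positions = [i]
        pos = i
        while True:
            lo = pos + k + min_gap            # earliest allowed next start
            hi = pos + k + max_gap            # latest allowed next start
            nxt = seq.find(repeat, lo, hi + k)
            if nxt == -1:
                break
            positions.append(nxt)
            pos = nxt
        if len(positions) >= min_repeats:
            return positions, repeat
    return None


# toy array: 21-bp repeat, 30-bp spacers, three copies
repeat = "A" * 21
toy = repeat + "C" * 30 + repeat + "G" * 30 + repeat
hit = find_crispr_like(toy)
```

On the toy sequence the scan reports repeat starts at positions 0, 51, and 102.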

Data Encryption for Windows PCs. BRICE LUCERO (Big Bend Community College, Moses Lake, WA, 98837) CHARLIE VERBOOM (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

In today’s society the use of mobile technology is becoming increasingly popular, and with it grows the demand for securing sensitive data safely and easily on Windows operating systems. This project involved an analysis of present and future methods of data encryption for Windows operating systems: XP and the soon-to-be-released Vista. The project scope included the Encrypting File System (EFS) in Vista and XP, along with BitLocker, a new feature that will be included in Vista. In addition, two third-party programs, Safe Disk and TrueCrypt, were reviewed. Tests were run to measure the time each method took to encrypt as well as the impact each had on data transfer performance. Ease of use, data recovery methods, and known vulnerabilities were also taken into consideration during the review. It was determined that Windows EFS and BitLocker were effective, easy-to-use methods for data encryption that have reliable recovery methods when managed through a domain. Safe Disk is protected by one centralized password and has an interface that was easy to learn, but it offered no recovery method and is not a recommended method. TrueCrypt had limited options available for data recovery and a choice of password or key files for authentication; its interface was slightly more complicated than Safe Disk and the native Windows encryption methods, but TrueCrypt is still a recommended method for data encryption.
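A sketch of how such timing tests can be reduced to a comparable throughput number; the workload below is a stand-in (an identity copy of a random buffer), not one of the encryption products reviewed, which would be exercised by copying files onto an encrypted volume:

```python
import os
import time


def throughput_mb_s(transform, data: bytes) -> float:
    """Time how fast `transform` processes `data`, reported in MB/s."""
    t0 = time.perf_counter()
    transform(data)
    dt = time.perf_counter() - t0
    return len(data) / dt / 1e6


# stand-in workload: copying a 16 MB random buffer
payload = os.urandom(16 * 1024 * 1024)
baseline = throughput_mb_s(lambda b: bytes(b), payload)
```

Measuring a plain copy first gives a baseline, so each encryption method's overhead can be reported as a slowdown relative to it.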

Design, Fabrication and Measurement of Nb/Si Multilayers and Niobium Transmission Filters. SUNEIDY LEMOS FIGUEREO (University of Puerto Rico, Rio Piedras, Rio Piedras, PR, 00931) DAVID ATTWOOD & ERIK GULLIKSON (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The extreme ultraviolet (EUV) region of the electromagnetic spectrum is being used in multilayer optical systems to develop technology projected for use in the fabrication of nano-electronics. Multilayer optical systems with high reflectivity have been produced in the soft x-ray and EUV regions of the spectrum, with applications ranging from EUV lithography to synchrotron radiation. Because Nb/Si optical systems are not yet well understood, our research group fabricated and measured Nb/Si multilayers and Nb transmission filters for the soft x-ray and EUV regions. The films were deposited using dc magnetron sputtering in the Center for X-Ray Optics at the Lawrence Berkeley National Laboratory, and reflectivity and transmission measurements were performed at Advanced Light Source beamline 6.3.2. The Nb/Si multilayer mirrors fabricated have a reflectivity of approximately 65% in the extreme ultraviolet region, which makes these systems practical for applications where high reflectivity is required, such as astronomy and instrumentation development. Transmission measurements of up to 90% were observed in the soft x-ray and EUV regions as well. Future work in the research group includes the design and fabrication of an Nb/Si multilayer with a B4C interface; the Nb/B4C/Si optical systems are expected to have a higher reflectivity than Nb/Si systems.

Detection of Ionizing Radiation Based on Metastable States of Polymer Dispersed Liquid Crystals. TIMOTHY PHUNG (University of California, Berkeley, Berkeley, CA, 94720) CARL HABER (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Polymer dispersed liquid crystals (PDLCs) may be a suitable medium for future tracking detectors used in particle physics. Such a detector would work in analogy with the bubble chamber by exploiting the metastable states of LC materials. A PDLC cell was fabricated and its optical transmission measured as a function of the voltage applied across the cell and the temperature. The optical transmission of the PDLC was found to be temperature dependent below the nematic-isotropic phase transition temperature when a field was applied across the cell, as reported in the literature. When no field was applied, the cell's transmission was strongly temperature dependent near the nematic-isotropic phase transition temperature, which also agrees with previous results. Future research in this area will focus on the metastable phenomena that exist near phase transitions of LC materials and on the use of electric fields to shift the transition temperature.

Determination of a Role for Cellular XPG in Repair of Oxidative Damage to DNA. EMILY FOX (City College of San Francisco, San Francisco, CA, 94112) HELEN BUDWORTH (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Mutation of XPG can cause the debilitating diseases Xeroderma Pigmentosum (XP) and Cockayne’s Syndrome (CS), which result from a deficiency in DNA repair. XPG cuts 3’ to DNA lesions during nucleotide excision repair (NER), as well as performing the non-catalytic roles of recognizing stalled RNA polymerase II and binding transcription-sized bubbles in transcription-coupled repair. Through in vitro tests with purified proteins, XPG has been found to stimulate hNth1, which removes oxidized pyrimidines in the base excision repair (BER) pathway. In this study, whole cell extracts from XPG-deficient cells obtained from patients with XP-G/CS were found to be defective in incision of 5,6-dihydrouracil (DHU). This defect was corrected by the addition of purified XPG, suggesting that the mutated XPG in XP-G/CS cells is unable to stimulate hNth1. In addition, cells from XP-G/CS patients were found to be slightly sensitive to X-rays and hydrogen peroxide, as determined by colony formation survival assays. shRNA against XPG was used to knock down XPG expression in normal cells in order to provide another model for XPG deficiency in which the only variation from control cells is a reduced level of XPG.

Determining the Effect of Aerosol Composition on the Accuracy of Aethalometer Real-Time Measurements of Black Carbon. SRYAN RANGANATH (University of California, Berkeley, Berkeley, CA, 94709) THOMAS W. KIRCHSTETTER (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Black carbon (BC), a main component of soot, is studied for its associated climatic and health effects. Filter-based light-transmission instruments are commonly used for measuring properties of black carbon; among them, the aethalometer performs real-time measurements of black carbon concentration. Previous studies indicate that measurements produced by light-transmission instruments, and the aethalometer specifically, are affected by the enhancement of particle light absorption due to light scattering within the filters used to collect the particles. While the extent of this enhancement varies with particle loading and particle composition, the aethalometer algorithm does not account for these effects, which could compromise the quality of BC concentration measurements made with the instrument. This behavior was studied in the laboratory using controlled generation of BC and light-scattering aerosols. An inverted diffusion flame produced BC aerosols with steady physical characteristics, and a nebulizer produced salt particles that were mixed with BC from the flame. These particles were diluted with filtered air prior to sampling. The aethalometer sampled pure BC aerosols and BC + NaCl in separate experiments. In both cases, the aethalometer reported a decreasing concentration despite sampling a constant BC concentration; however, different decreasing trends were observed depending on the composition of the aerosols sampled. This difference in instrument response means that different empirical corrections would be required for different aerosol compositions, which is not a practical solution to the problem. Continued investigation with aerosols of different composition is the next step; these results may be a first indication that a single empirical correction for the aethalometer is not practical.
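For context on why filter loading matters, the aethalometer derives BC concentration from the change in filter attenuation over each sampling interval, commonly written BC = A·ΔATN/(100·σ·Q·Δt). The sketch below implements that standard relation; the mass attenuation cross section and example numbers are assumptions for illustration, not the instrument settings used in this study:

```python
import math


def attenuation(i: float, i0: float) -> float:
    """Filter attenuation, ATN = -100 ln(I/I0)."""
    return -100.0 * math.log(i / i0)


def bc_concentration_ug_m3(d_atn: float, spot_area_cm2: float,
                           flow_lpm: float, dt_min: float,
                           sigma_cm2_per_ug: float = 0.166) -> float:
    """BC concentration from the attenuation change over one interval.

    BC = A * dATN / (100 * sigma * Q * dt). The default sigma (0.166 cm^2/ug,
    i.e. 16.6 m^2/g) is an assumed value near the commonly cited 880 nm
    aethalometer default; it is exactly this fixed sigma that ignores the
    loading and composition effects discussed above.
    """
    mass_ug = spot_area_cm2 * d_atn / (100.0 * sigma_cm2_per_ug)
    volume_m3 = flow_lpm * 1000.0 * dt_min * 1e-6  # L/min -> cm^3, then m^3
    return mass_ug / volume_m3


# e.g., dATN = 1 over a 0.5 cm^2 spot at 5 L/min for 1 min
bc = bc_concentration_ug_m3(1.0, 0.5, 5.0, 1.0)
```

Because σ is fixed, any loading- or composition-dependent change in the optical enhancement maps directly into an apparent (and spurious) change in reported BC.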

Development of a Beam Profile Diagnostics Device for the VENUS ECR Ion Source Beam Line. CARY PINT (University of Northern Iowa, Cedar Falls, IA, 50614) DANIELA LEITNER (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

This work describes the design and development of the instrumentation for a beam profile diagnostics unit for the low energy beam transport line of the superconducting Electron Cyclotron Resonance (ECR) ion source VENUS (Versatile ECR ion source for Nuclear Science). VENUS is currently being commissioned at LBNL and serves as the prototype ECR injector source for next-generation heavy ion accelerators. In order to enhance simulations of beam transport from extraction in VENUS, a measurement device (called a harp) consisting of a grid of thin conducting wires was placed into the beam line, directly downstream from extraction, to measure the beam profile. Utilizing the diagnostics unit developed and described in this work, the first measurements of the beam profile for a simple helium beam are presented. By changing the Glaser current to focus the ion beam onto the harp, the helium beam profiles illustrate that the extracted beam has the same symmetry as the plasma surface from which it is extracted, and not the uniform circular symmetry that is assumed in most simulation models. These results give quantitative insight into the improved initial conditions needed for simulations to give a physically accurate description of beam transport from extraction of an ECR source.
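As an illustration of how harp wire currents can be reduced to a profile summary, current-weighted moments give the beam centroid and RMS width along one wire plane; the abstract does not specify the analysis actually used, so this is a generic sketch with made-up readings:

```python
def profile_moments(positions_mm, currents):
    """Current-weighted centroid and RMS width of a harp wire-grid profile.

    positions_mm: wire positions along one axis; currents: the charge
    collected on each wire (arbitrary units).
    """
    total = sum(currents)
    mean = sum(x * i for x, i in zip(positions_mm, currents)) / total
    var = sum(i * (x - mean) ** 2 for x, i in zip(positions_mm, currents)) / total
    return mean, var ** 0.5


# hypothetical five-wire readout, symmetric about the axis
center, rms = profile_moments([-2, -1, 0, 1, 2], [0.1, 0.6, 1.0, 0.6, 0.1])
```

Repeating this for both wire planes, and at several Glaser focusing currents, is one way the profile symmetry discussed above could be quantified.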

Development of a Multi-Pollutant Personal Sampler (MPPS). MARIA MINJARES (Our Lady of the Lake University, San Antonio, TX, 78207) LARA GUNDEL (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The effects of indoor and outdoor air pollutants on human health have long been a concern to health care workers, environmental scientists, and citizens alike. Previous work has consisted of developing methods for separating and trapping particulate matter (PM) and gaseous pollutants. Currently, the multi-pollutant personal sampler (MPPS) consists of a denuder made of polyurethane foam (PUF) coated with a ground sorbent, XAD-4, followed by a filter to collect PM smaller than 2.5 µm in diameter (PM2.5). Indoor and outdoor air sampling was conducted at Lawrence Berkeley National Laboratory to determine how much PM2.5 the polyurethane foam would retain. The results obtained from sampling indoor ambient air support our hypothesis that PM2.5 passes through the 80 pores-per-linear-inch (ppi) XAD-4-coated PUF. However, the 80 ppi XAD-coated PUF retained 30% of PM2.5 in its structure during outdoor air sampling. Further experimentation is needed to improve the MPPS geometry so that > 95% of PM2.5 passes through the XAD-coated PUF to the filter.

Development of Charge-Coupled Devices for Precision Cosmology and the Supernova Acceleration Probe Satellite. JESSICA WILLIAMSON (University of Alabama in Huntsville, Huntsville, AL, 35899) DR. NATALIE ROE (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Dark energy, believed to be a gravitationally repulsive cosmic energy density that does not appear to cluster in galaxies, has been invoked to account for the recent measurement that the rate of the universe’s expansion is accelerating. To better understand these phenomena, scientists use type Ia supernovae as calibrated candles. Lawrence Berkeley National Laboratory (LBNL) is developing the Supernova Acceleration Probe (SNAP), a space-based telescope that will be used to identify and measure supernovae. The SNAP focal plane will consist of an innovative camera that integrates two cutting-edge imaging sensor systems, one of which is the LBNL high-purity charge-coupled device (CCD) for the visible light range. We report on the development of a novel technique for extending the spatial and photometric fidelity performance of the LBNL CCDs. Presented are our results obtained from measurements using a 10.5 µm pixel pitch, 1.4k×1.4k format, p-channel CCD fabricated on high-resistivity silicon at LBNL. The fully depleted device is 300 µm thick and backside illuminated. Measurements of the device’s transverse diffusion of charge carriers, pixel-to-pixel uniformity, and intrapixel uniformity will be reported. Also presented are new, preliminary results from the first implementation of CCD phase dithering, a novel technique for achieving sub-pixel spatial resolution in undersampled, pixelated image data such as will be obtained by the SNAP satellite.

Development of Sixth Grade Decomposition Curriculum to Meet National Science Education Standard: Science as Inquiry (Alternative Project). AMY MORRIS (Vanderbilt University, Nashville, TN, 37235) MARGARET TORN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

A curriculum on decomposition was developed to meet the National Science Education Standard: Science as Inquiry by helping students acquire the abilities necessary to do scientific inquiry and understandings about scientific inquiry. Additionally, the curriculum was developed to collect data on changing decomposition rates across the nation for scientists studying climate change. To meet the Science as Inquiry Standard, the students will study the carbon cycle and feedback effects in relation to climate change to identify questions that can be answered through a scientific investigation of decomposition; conduct a scientific investigation to determine the decomposition rates of local leaf litter and a common substrate; use balances, ovens, probes, and computers to gather, analyze, and interpret data; use mathematics to make data tables, graphs, and equations to describe their data; use evidence to develop explanations and predictions; and present a final project to the class. To provide data on decomposition rates for scientists studying climate change, sixth grade classrooms across the nation will follow a standard experimental protocol. The protocol will be repeated annually to provide data on how decomposition rates are changing. This summer, the experimental protocol was developed, as well as a timeline for executing the curriculum throughout the school year, sample worksheets to supplement the protocol, and a method for assessing students’ abilities to do scientific inquiry and understandings about scientific inquiry. Further development of activities to teach content and experimental skills to students is needed. Students will benefit from a year-long project that emphasizes the Science as Inquiry Standard and will be able to help their world by collecting data for professional scientists to use to study climate change.
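If the protocol reports litter mass remaining after a fixed interval, a first-order decay constant is one common way to express a decomposition rate; the model choice and the numbers below are illustrative, not part of the curriculum protocol:

```python
import math


def decay_constant_per_year(initial_mass_g: float, final_mass_g: float,
                            years: float) -> float:
    """Decomposition rate constant k, assuming first-order decay:
    m(t) = m0 * exp(-k t), so k = -ln(m/m0) / t."""
    return -math.log(final_mass_g / initial_mass_g) / years


# e.g., a litter bag starting at 10 g of dry leaf litter weighing 7 g a year later
k = decay_constant_per_year(10.0, 7.0, 1.0)  # ~0.36 per year
```

Computing k the same way every year, from every classroom, is what would let scientists compare decomposition rates across sites and over time.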

Electrochemical Remediation of Arsenic Contaminated Groundwater. SCOTT MCLAUGHLIN (University of California, Berkeley, Berkeley, CA, 94720) ASHOK GADGIL (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Arsenic in drinking water affects 100 million people worldwide, 50 million of whom are in danger of severe poisoning. The most dire situation is in Bangladesh, where 45 million cases of arsenic-related poisoning make it the largest mass poisoning in human history. Available methods of treating arsenic are too expensive, not effective enough, and often difficult to implement, making them inadequate for a poor, largely undeveloped country such as Bangladesh. Electrochemistry promises an innovative, effective, and inexpensive method for arsenic remediation of drinking water. The method is an improvement upon a known approach that uses Fe3+ to remove arsenic: the Fe3+ combines with As(V), forming an insoluble complex which can then be easily filtered out. The electrochemical step allows control over the amount of Fe3+ produced, as well as electrochemical oxidation of As(III) into the reactive As(V) anion [H2AsO4]-, making the method far more effective. Tests on a simple laboratory setup show a drastic improvement in arsenic removal efficiency compared to arsenic removal based on simple rusting of metallic iron. Application of a 70 mA current over 10 minutes in our electrochemical cell reduced the arsenic concentration in 850 mL of synthetic groundwater from 1000 ppb to less than 5 ppb, even without system optimization. By comparison, a similar setup with a rusted iron coil and no applied current reduced the concentration only to 250 ppb in an entire hour. We completed a major goal for this summer in understanding the effects of experimental conditions on the system, so that reproducible and consistent results can be obtained. Currently, tests are being performed at various current densities and durations to find the optimal electrochemical parameters for efficient oxidation of Fe into Fe3+ and effective removal of arsenic. Once the process is well understood, the method can be applied efficiently in a water filter for areas with arsenic-contaminated groundwater.
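The iron dose implied by the 70 mA, 10 minute treatment can be estimated from Faraday's law; the sketch below assumes anodic dissolution as Fe → Fe2+ + 2e− (n = 2, with subsequent oxidation to Fe3+), an assumption not stated in the abstract:

```python
def iron_dose_mg(current_a: float, seconds: float, n_electrons: int = 2,
                 molar_mass_g: float = 55.845, faraday: float = 96485.0) -> float:
    """Mass of iron released by anodic dissolution, from Faraday's law:
    m = I * t * M / (n * F). Assumes Fe -> Fe2+ + 2e- at the anode."""
    moles = current_a * seconds / (n_electrons * faraday)
    return moles * molar_mass_g * 1000.0  # g -> mg


dose = iron_dose_mg(0.070, 600.0)  # 70 mA for 10 min -> ~12 mg Fe
conc_mg_l = dose / 0.850           # dissolved in the 850 mL cell -> ~14 mg/L
```

Under these assumptions, about 12 mg of iron (roughly 14 mg/L in the cell) was dosed to take 1000 ppb arsenic below 5 ppb, which is the kind of back-of-the-envelope check useful when optimizing current density and duration.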

Enhancing the Target Chamber for the Second Phase of the Neutralized Drift Compression Experiment. GUILLERMO GARCIA (University of Southern California, Los Angeles, CA, 90089) MATTHAEUS LEITNER (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The objective of a controlled fusion power plant for worldwide energy production has driven the Neutralized Drift Compression Experiment (NDCX) to investigate characteristics of ion-beam manipulation. This report focuses on enhancing the diagnostic target chamber for the second phase of the NDCX project. A target capsule, loading dock, robotic arm, and target housing were developed to prepare the diagnostic target chamber for integrated compression and focusing experiments, in which a 500 MW, 1 ns ion beam deposits its energy to heat the target to roughly 1 eV. Each component was developed to meet the design constraints established by the diagnostic target chamber. A LabVIEW program was created to monitor and control movement of the robotic arm. The diagnostic target chamber was assembled, and calibration of the robotic arm demonstrated successful interaction between the LabVIEW program and the newly developed components.
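The beam parameters quoted above imply a modest total pulse energy; the check below is our arithmetic (power times duration), not a figure from the report:

```python
# Energy carried by a single 500 MW, 1 ns ion-beam pulse.
power_W = 500e6      # 500 MW
duration_s = 1e-9    # 1 ns
energy_J = power_W * duration_s
print(f"Energy per pulse: {energy_J:.2f} J")
```

Half a joule delivered in a nanosecond to a small focal spot is what makes the sub-centimeter focusing and target handling described here demanding.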

Environmental Education Unit Plan. AMY WEST (Lesley University, Cambridge, MA, 02138) MARY CONNELLY (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Research has shown that incorporating environmental education into a school's curriculum greatly affects students' attitudes toward and motivation for learning. Further research into different methodologies will help determine which techniques are most beneficial for elementary school students learning about the environment. Studying several different types of environmental education camps at the Lawrence Hall of Science and the UC Botanical Garden provides a strong background as to which methodologies work best in an elementary school setting. As a result of observing and participating in two environmental camps at the Lawrence Hall of Science, one camp at the UC Botanical Garden, and one camp in the Sierra mountains, this research concluded that hands-on, inquiry-based learning is the most beneficial methodology for teaching elementary school students about environmental science.

Evaluating Changes in Black Carbon Concentrations from California Diesel Emissions. JEFFERY AGUIAR (University of the Pacific, Stockton, CA, 95211) THOMAS KIRCHSTETTER (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

In this paper we show how changes in diesel fuel properties influenced black carbon (BC) concentrations and emission factors over the past 37 years. In our analyses we use data from the San Francisco Bay Area, where diesel traffic can be considered the primary source of BC aerosol. We estimate BC concentrations from archived Coefficient of Haze (COH) data, collected routinely since 1967 at a number of Bay Area locations. COH values are a measure of the attenuation of light by the collected filter deposit and are proportional to BC concentrations measured by commonly used optical methods. Our analyses of monthly and annual COH-derived BC concentrations show that the Bay Area BC mass concentration is an order of magnitude greater in winter than in summer. This seasonality is caused by roughly constant monthly diesel BC emissions modulated by synchronous area-wide changes in inversion heights. Our estimated diesel BC emission factors fell from about 10 g kg-1 in the late 1960s to less than 1 g kg-1 in 2000. Despite the continuous increase in diesel fuel consumption, annual area-wide BC concentrations decreased from 3.5 µg m-3 in 1967 to about 0.9 µg m-3 in 2000. We attribute the BC concentration decrease to the changes in diesel technology and fuel composition, particularly sulfur content, that occurred over this period. BC emission factors may be influenced more by fuel properties than by engine technology. The intention of the diesel sulfur reduction was to reduce emissions of sulfur oxides, which lower the effectiveness of exhaust particle control devices. The observed BC reduction is, therefore, an unintended benefit of the fuel sulfur reduction and of steady improvements in diesel technology.
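The two reported trends can be combined in a back-of-envelope calculation (ours, not the authors'): if ambient BC scales as the product of emission factor and fuel burned, and meteorology is assumed unchanged on average, the numbers imply how much diesel consumption grew over the period:

```python
# Back-of-envelope: if ambient BC ~ (emission factor) x (fuel burned),
# the reported trends imply the growth in Bay Area diesel fuel use, 1967-2000.
conc_1967, conc_2000 = 3.5, 0.9      # ug/m3 (from the abstract)
ef_1967, ef_2000 = 10.0, 1.0         # g BC per kg fuel (approximate endpoints)
conc_ratio = conc_2000 / conc_1967   # concentrations fell ~4x
ef_ratio = ef_2000 / ef_1967         # emission factors fell ~10x
fuel_growth = conc_ratio / ef_ratio
print(f"Implied growth in diesel fuel use: ~{fuel_growth:.1f}x")
```

An implied ~2.5x growth in fuel use is consistent with the abstract's remark that BC concentrations fell despite continuously increasing diesel consumption.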

Gene Cloning and Expression in the Hyperthermophile Sulfolobus solfataricus. MEGAN HOCH (Del Mar College, Corpus Christi, TX, 78404) STEVEN M. YANNONE (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Sulfolobus solfataricus is a hyperthermophilic archaeon that lives in a very extreme environment, the acidic hot springs of Yellowstone National Park, and is therefore considered an extremophile. S. solfataricus has become a model system for studying human DNA repair. The protein interactions involved in DNA repair are sometimes transient. Interactions that are transient at the high temperatures where these Archaea live might be more stable at room temperature, which would allow a closer look at, and a better understanding of, the protein complexes involved. Also, it is generally accepted that Archaea are more closely related to humans than bacteria are, and the DNA replication and translation machinery of Archaea and humans is very similar. A group of genes was selected for cloning, and primers were designed for each gene. The genes were amplified using the polymerase chain reaction (PCR), and the PCR products for each gene of interest were cloned into a directional topoisomerase I (TOPO®) cloning vector. These vectors were transformed into E. coli cells from Invitrogen. The cells were then plated on Luria-Bertani (LB) agar plates using sterile technique. Clones were picked from the plates and cultures were grown overnight. Plasmid DNA was separated from the cells by alkaline lysis. Restriction enzyme digests were set up to confirm that the correct gene was inserted into the vector, and each digest was visualized on a 1% agarose gel. This study successfully cloned sixteen of the original twenty-nine genes selected. Some of the clones that grew on the LB plates did not contain any gene at all, only the vector. After several restriction enzyme digests, there were about eight genes whose digests were not clear enough to confirm the presence of the correct insert. These constructs will need to be digested with different restriction enzymes to confirm the gene. Methods are currently being developed for expressing these genes in E. coli and S. solfataricus. In future studies, proteins expressed from S. solfataricus will be characterized to understand the protein-protein interactions that occur.

High-Throughput X-ray Protein Crystallography. BINH NGUYEN (Contra Costa College, San Pablo, CA, 94806) DR. MINMIN YU (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Proteins play an integral role in cells, so understanding them is critical, especially in medical research. Structural analysis of proteins is particularly important to the field of structural genomics, which aims to study protein molecules in nature to provide a fundamental understanding of biology. Knowing the three-dimensional structures of proteins enables the grouping of fold patterns and protein families, which can provide clues to how the proteins work. High-throughput protein crystallography allows structural determination of a protein via X-ray diffraction: proteins are crystallized, and the resulting crystals are analyzed with X-rays to produce diffraction patterns from which three-dimensional structures can be determined. The first important step in this process is subjecting the proteins to a series of crystallization matrices to find the right crystallization condition. We utilized various crystal screens from Hampton Research and Emerald Biosystems. Initial crystal hits (crystal formation) lead to further optimization of the crystallization solution so as to obtain crystals of reasonable diffraction quality. Our crystals are analyzed mostly with the synchrotron at the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory (LBNL). From protein 3-D structures, the folding topologies and local conformations of the proteins can be analyzed. The Li-Wei Hung Lab is currently analyzing various proteins for the Integrated Center for Structure and Function Innovation (ISFI) and the TB Structural Genomics Consortium (TBSGC). These proteins are derived from targeted DNA sequences of various sources, mostly species that cause human disease. Solved protein structures are deposited in our ISFI/TBSGC databases (www.tbgenomics.org, http://techcenter.mbi.ucla.edu) and the Protein Data Bank (http://www.pdb.org).

Increasing the Durability and Reliability of Radiation Detectors used in Radiopharmaceutical Chemistry. SIMARJIT KAUR (Contra Costa College, San Pablo, CA, 94806) JAMES P O'NEIL (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Radiation detectors are a necessary part of radiopharmaceutical synthesis, used to determine the quantity of radioactivity throughout the synthesis. They serve not only to track the progress of the chemical reactions and the yield at each step, but also to ensure the safety of the personnel involved in the process. Radiation detectors are usually installed in places where the potential for chemical exposure and general physical abuse is quite high. To make the radiation detectors more robust and reliable, a very easy and cost-effective method of “epoxy potting” was devised. The radiation detector is placed in a mold of appropriate dimensions, which is then filled with epoxy (3M Scotch-Weld DP270, black). After the epoxy cures, the radiation detector is protected within a solid, light-resistant block. This particular epoxy was chosen because it is chemically inert and provides both electrical and mechanical insulation of the detector components from the harsh surroundings of the hot cell.

Inquiry-Based Learning at Its Best. SARAH BAUM (Lesley University, Cambridge, MA, 02138) MARY CONNOLLY (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The Lawrence Hall of Science (LHS) is one of the leading forces in the inquiry-driven, direct-experience approach to science and mathematics instruction for grades K-12. LHS has developed several inquiry-based curriculum projects that are used throughout the United States. This public science center also provides educational exhibits and classes, year-round outreach programs, and diverse summer camp programs for children of the region. The investigative approach stems from human beings' natural curiosity to explore what they see on a daily basis. This strategy, when applied to a classroom, helps students connect concrete ideas to their own experiences through open-ended investigations and discussions. My goal throughout the summer was to observe how instructors use guided inquiry techniques with a variety of age groups to delve into life, physical, and earth science. My research reflects an exploratory sample of age groups and content areas. Working alongside LHS instructors has allowed me to study inquiry-based education by observing, comparing, analyzing, and applying themes and elements central to the process.

Investigating the Use of a Diffusion Flame to Produce Black Carbon Standards for Thermal-Optical Analysis of Carbonaceous Aerosols. DIANA ORTIZ MONTALVO (University of Puerto Rico, San Juan, PR, 00931) THOMAS W. KIRCHSTETTER (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Combustion-generated particles impact climate and public health: they scatter and absorb solar radiation and alter cloud properties, and they are small enough to be inhaled and deposit in the lungs, where they may cause respiratory and other health problems. Specific concern is focused on particles that originate from the combustion of diesel fuel. Diesel particles are composed mainly of carbonaceous material, especially in locations where diesel fuel sulfur is low. Diesel particles are black due to the strongly light-absorbing nature of their refractory carbon components, appropriately called black carbon (BC). This research project focuses on the uncertainty in the measurement of BC mass concentration, which is typically determined by analyzing particles collected on a filter with a thermal-optical analysis (TOA) method. Many studies have examined the accuracy of the commonly used variations of the TOA method, which differ in their sample heating protocol, carrier gas, and optical measurement. These studies show that BC measurements are inaccurate due to the presence of organic carbon (OC) in the aerosols: OC may co-evolve with BC, or char to form BC during analysis, both of which make it difficult to distinguish OC from BC in the sample. The goal of this study is to develop the capability of producing standard samples containing known amounts of BC, either alone or mixed with other aerosol constituents, and then to evaluate which TOA methods accurately determine the BC amounts. An inverted diffusion flame of methane and air was used to produce particle samples containing only BC, as well as samples of BC mixed with humic acid (HA). Our study found that HA particles are light absorbing and catalyze the combustion of BC during TOA. Both of these attributes are expected to challenge the ability of TOA methods to distinguish between OC and BC, particularly the simple two-step TOA method, which relies solely on temperature to make this distinction. The samples prepared in this study were analyzed using two TOA methods to compare their estimates of BC concentration. Future work will focus on preparing a variety of BC standards and comparing measurements of the prepared samples across a range of TOA methods.

Investigation of a Rhodium Catalyst in the Reduction of Carbon Dioxide and Pyruvate for Future Use in a Direct Electrochemical Methanol Production Cell. LINDSAY DIERCKS (University of Iowa, Iowa City, IA, 52242) JOHN KERR (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

In a world with dwindling oil reserves and increasing energy demands, reduction of carbon dioxide to methanol using solar-generated electricity is a promising and environmentally conscious solution. This research is specifically concerned with one part of the carbon dioxide-to-methanol process: the reduction of carbon dioxide to formate. The stereoselectivity of lactate dehydrogenase (LDH) was investigated in the reduction of pyruvate to lactic acid, as was the reduction of carbon dioxide to formate with only a rhodium catalyst. Reactions were run in a glass electrochemical cell, and the products were analyzed by capillary electrophoresis (CE). The pyruvate reaction showed the presence of lactic acid; however, it was not certain whether pyruvate was also present. The absence of pyruvate in the CE analysis could mean that there was one hundred percent conversion of pyruvate to lactic acid, but that result has yet to be reproduced. The carbon dioxide reaction shows the presence of formate, but not oxalate, another possible product of carbon dioxide reduction. It has yet to be determined whether formate is the sole product of the reduction of carbon dioxide with a rhodium catalyst.

Isolation of Independent Spontaneous Thymidine Kinase-Deficient Mutants and an Estimation of the Mutation Rate at the Thymidine Kinase Locus in a Human B-Lymphoblast Clone. LAWRENCE CHYALL (University of California, Berkeley, Berkeley, CA, 94720) AMY KRONENBERG (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Programmed cell death, or apoptosis, is tightly regulated by signals originating both from within the cell and from its surroundings. The BCL-2 family of proteins helps modulate the balance between the life and death of cells. BCL-XL (a BCL-2-like protein) helps limit apoptosis by titrating the concentration of pro-apoptotic proteins through the formation of heterodimers. TK6-Bcl-xL gly-159-ala #38 is a TK6 human B-lymphoblast cell line engineered to express BCL-XL gly-159-ala, a mutated form of the BCL-XL protein that lacks anti-apoptotic activity. A fluctuation experiment was used to estimate the mutation rate of TK6-Bcl-xL gly-159-ala #38 cells. The mutation rate was found to be closer to the historical results for TK6-neo #1 cells than to those for cells expressing the wild-type BCL-XL protein, TK6-Bcl-xL #4. The plating efficiency of TK6-Bcl-xL gly-159-ala #38 cells was found to be the same as historical results for TK6-neo #1 and TK6-Bcl-xL #4 cells. Thirty-four early-arising and sixty-three late-arising spontaneous TK1-deficient mutants of the TK6-Bcl-xL gly-159-ala #38 cell line were isolated, and DNA from each of these mutants was extracted for future analysis.
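The abstract does not state which estimator was used; purely as an illustration, one common way to estimate a mutation rate from a fluctuation experiment is the Luria-Delbrück P0 method, where the expected number of mutations per culture is -ln(fraction of cultures with no mutants). A minimal sketch with made-up numbers:

```python
import math

def p0_mutation_rate(n_cultures_total, n_cultures_no_mutants, final_cells_per_culture):
    """Luria-Delbruck P0 estimator: m = -ln(P0); rate = m / N_final."""
    p0 = n_cultures_no_mutants / n_cultures_total
    m = -math.log(p0)                   # expected mutations per culture
    return m / final_cells_per_culture  # mutation rate per cell per division

# Hypothetical numbers, for illustration only (not from this study):
# 48 parallel cultures, 20 with no TK1-deficient mutants, 2e5 cells each.
rate = p0_mutation_rate(48, 20, 2e5)
print(f"Estimated mutation rate: {rate:.2e} per cell per division")
```

The P0 method only uses the fraction of mutant-free cultures, which makes it robust to the jackpot cultures that dominate mutant counts in fluctuation experiments.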

Measurements of Gasoline and Diesel Vehicle Pollutant Emissions in the Caldecott Tunnel. JOHN MCLAUGHLIN (University of California, Berkeley, Berkeley, CA, 94720) THOMAS KIRCHSTETTER (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Motor vehicles contribute significantly to air pollution, including carcinogenic soot and ozone precursors. Past measurement studies in the Caldecott tunnel in the San Francisco Bay Area from 1994-2001 demonstrate decreasing emissions from light-duty (LD) vehicles over the last decade, due to fleet turnover and changes in fuel composition. Continued observation of LD vehicle emissions, and establishment of trends for heavy-duty (HD) diesel trucks, especially for particulate matter, are desired. In summer 2006, pollutant concentrations were measured at each end of the Caldecott tunnel; the difference yielded the pollutant emissions from the vehicles traveling through the tunnel. The study measured nitrogen oxides, carbon monoxide, carbon dioxide, hydrocarbons including aldehydes, ammonia, and particulate matter (PM), including fine particulate mass (PM2.5) and black carbon (BC). The PM was characterized by size, number, and optical properties. An aethalometer gave real-time BC concentrations, whereas time-integrated BC concentrations were determined from analysis of quartz filters. The filters were analyzed using a thermal-optical analysis (TOA) method, which measures the amount of sample carbon that evolves with increasing temperature, producing a thermogram. In TOA, the changing optical properties of the filters help to distinguish between BC and organic carbon. Preliminary results, based on only a portion of the data, indicate a continued decreasing trend in LD emission factors. From 1999 to 2006, light-duty NOx, CO, and BC emission factors per mass of fuel burned decreased from 6.55 to ~3.2 g/kg fuel, 50 to ~23 g/kg fuel, and 35 to ~22 mg/kg fuel, respectively. Further analysis of HD emission factors will be used to set a baseline for PM and NOx emissions prior to stricter regulations on diesel engines in 2007.
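Tunnel studies typically convert the measured concentration differences into fuel-based emission factors with a carbon balance: the pollutant excess is divided by the total carbon excess (CO2 plus CO) and scaled by the carbon content of the fuel. A sketch of that calculation (our illustration, with hypothetical tunnel numbers):

```python
# Carbon-balance emission factor, as commonly used in tunnel studies:
# EF_P (g/kg fuel) = dP / (carbon mass in dCO2 + dCO) * w_C * 1000,
# where dX are background-subtracted concentrations and w_C ~ 0.85 g C
# per g of fuel (assumed value).
def emission_factor(dP_ug_m3, dCO2_ppm, dCO_ppm, w_c=0.85, T_K=298.15, P_Pa=101325.0):
    """Fuel-based emission factor in grams of pollutant per kg of fuel."""
    mol_air_per_m3 = P_Pa / (8.314 * T_K)          # ~40.9 mol air per m3
    ugC_per_ppm = mol_air_per_m3 * 12.011          # ug carbon/m3 per ppm CO2 or CO
    dC_ug_m3 = (dCO2_ppm + dCO_ppm) * ugC_per_ppm  # carbon co-emitted with P
    return dP_ug_m3 / dC_ug_m3 * w_c * 1000.0

# Hypothetical excesses (illustration only): 8 ug/m3 BC against
# 150 ppm CO2 and 4 ppm CO above background.
ef_bc = emission_factor(8.0, 150.0, 4.0)
print(f"BC emission factor: {ef_bc*1000:.0f} mg/kg fuel")
```

Normalizing by co-emitted carbon is what makes the reported factors independent of tunnel ventilation and traffic volume.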

Mechanizing the Photoelectron Spectrometer at HERS. XIORANNY LINARES (University of California, Berkeley, Berkeley, CA, 94720) ZAHID HUSSAIN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Superconductors have been a major topic of scientific research since the discovery of superconductivity in 1911. In the attempt to understand superconductors, researchers later discovered the cuprates. The discovery of these "high-Tc" superconductors raised the possibility of developing room-temperature superconductors, which would increase the efficiency of electricity generation, transmission, distribution, and utilization; this would reduce the required generating capacity and thus decrease greenhouse gas emissions to the atmosphere. However, the complexity of high-Tc superconductors and their failure to follow the theories of conventional superconductors have so far prevented the creation of room-temperature superconductors. At Lawrence Berkeley National Laboratory, the research group led by Zahid Hussain tests high-Tc superconductors to describe their properties and how they work. They study electron systems using a state-of-the-art High-Energy-Resolution Spectrometer (HERS) for angle-resolved photoemission experiments. In these tests, X-rays illuminate a sample, and electrons are emitted via the photoelectric effect. The emitted electrons are analyzed by a state-of-the-art angle-resolved electron-energy analyzer (Scienta SES-200) that rotates about the sample. The analyzer measures the angle and kinetic energy of the electrons, which, through conservation of energy and momentum, determine their velocity, scattering rates, and energy. This information is used to create a graph of the electrons' momentum versus kinetic energy, which reveals the Fermi surface of the material. The Fermi surface is the surface of constant energy in momentum space that, at absolute zero, separates the unfilled orbitals from the filled orbitals. Its shape determines the electrical properties of the metal, since current is due to changes in the occupancy of states near the Fermi surface. In this way scientists can determine the properties and composition of the material. The group's testing process is long and time-consuming, since the analyzer must be rotated manually to obtain results. To improve the process, I will build a system that rotates the analyzer automatically and collects data without the need for constant supervision. I will research the most efficient way to rotate the analyzer, make accurate drawings of the system, fabricate the needed parts, assemble the system, test it, and integrate it for future use.
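The conversion from measured angle and kinetic energy to momentum mentioned above follows the standard angle-resolved photoemission relation k_parallel = sqrt(2 m E_k)/hbar · sin(theta). A quick numerical sketch (the example energy and angle are our own):

```python
import math

def k_parallel(E_kin_eV, theta_deg):
    """In-plane electron momentum (1/Angstrom) from ARPES kinematics:
    k_par = sqrt(2 m E_k) / hbar * sin(theta)."""
    # 0.5123 = sqrt(2 m_e)/hbar expressed in (1/Angstrom) per sqrt(eV)
    return 0.5123 * math.sqrt(E_kin_eV) * math.sin(math.radians(theta_deg))

# e.g., a 20 eV photoelectron emitted 30 degrees off the surface normal:
print(f"k_par = {k_parallel(20.0, 30.0):.3f} 1/Angstrom")
```

Sweeping the analyzer angle while recording kinetic energy therefore maps out the momentum-resolved band structure, which is why automating the rotation directly speeds up Fermi-surface mapping.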

Multilayer Mirrors for Extreme Ultraviolet Lithography. JONNY RICE (Norfolk State University, Norfolk, VA, 23504) DAVID ATTWOOD (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Extreme Ultraviolet (EUV) lithography promises to be an efficient way to manufacture faster and more powerful microchips. Using light of wavelength 13-14 nm instead of the currently used ~200 nm makes it possible to produce smaller feature sizes for the silicon network that makes up the microchip. However, conventional lithography optics cannot be used at this wavelength, because nearly all materials absorb EUV light strongly and a single surface reflects almost nothing. The answer is multilayer mirrors: mirrors coated with alternating layers of two optical materials, chosen so that the reflections from the many layer interfaces add in phase and produce usable reflectance. Even the most efficient multilayer optics reflect only ~70% of the incident light, so the beam loses 30% or more at each of the several multilayer optics in the system; a multilayer mirror must therefore have high reflectance and be durable enough to withstand the intense energy required for the process. Current testing for EUV lithography involves measuring peak reflectance and uniformity of reflectance of mirror samples, as well as reductions in reflectance caused by prolonged exposure to radiation. Future research will involve testing new combinations of multilayer materials as well as testing coating layers designed to lengthen the lifespan of the multilayer optics.
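The importance of per-mirror reflectance follows from simple compounding: system throughput is the single-mirror reflectance raised to the number of mirrors. A short illustration (the mirror counts are examples, not figures from the abstract):

```python
# With ~70% reflectance per multilayer mirror, throughput falls off
# geometrically with the number of optics in the imaging train.
reflectance = 0.70
for n_mirrors in (4, 6, 10):
    throughput = reflectance ** n_mirrors
    print(f"{n_mirrors} mirrors: {throughput:.1%} of source power survives")
```

With ten ~70% mirrors, under 3% of the source power survives, which is why even small reflectance losses from radiation damage matter so much.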

Minimizing Thermal Fluctuations and Vibration Effects on a High Resolution Beamline. ALFREDO TUESTA (University of Notre Dame, Notre Dame, IN, 46556) NICHOLAS KELEZ (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Thermal fluctuations from the environment and vibrations from surrounding equipment pose a threat to the performance of high-resolution beamlines. An optimal solution is unknown because methods for addressing these issues have never been empirically tested. This research focuses on the support structures, or stands, of MERLIN (a meV-resolution beamline), but the results can be applied to any beamline. The team measured ground vibrations at several locations at the Advanced Light Source (ALS) where MERLIN will be installed. Two stands, one empty and one filled with Zanite® polymer composite (Zanite), were also tested to determine the amplification of vibrations from the ground to the top. The temperature change at the top and middle of the empty stand was also recorded, along with the change in ambient and floor temperatures, over two and a half days. The ground measurements show little change when nearby equipment is turned off, and the RMS displacement is lower than the team's goal of 0.5-1 micron. The empty stand shows voltage peaks at 40, 60, and 120 Hz, the latter two of which are damped in the stand filled with Zanite. The air temperature at the ALS changed by 1.2°C, making the stand temperature fluctuate by approximately 0.8°C and yielding a 9.8 micron axial deflection. This suggests that the stand must be insulated or filled with a material that increases its thermal mass in order to decrease its deformation. Zanite should provide this thermal mass; however, more time was necessary to confirm this. Due to experimental error and equipment failure, some of the ground vibration measurements are unclear or erroneous. More time was required to rectify these issues and to continue evaluating other methods for mitigating thermal fluctuation and vibration effects at the ALS.
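The reported deflection is consistent with simple linear thermal expansion. The check below is ours, assuming a roughly 1 m tall steel stand with a typical expansion coefficient (neither figure is given in the abstract):

```python
# Sanity check: axial growth dL = alpha * L * dT for a support stand.
alpha_steel = 12e-6   # 1/degC, typical carbon steel (assumed)
height_m = 1.0        # assumed stand height; not stated in the abstract
dT_degC = 0.8         # measured stand temperature fluctuation
dL_um = alpha_steel * height_m * dT_degC * 1e6
print(f"Predicted axial deflection: {dL_um:.1f} um")  # ~9.6 um vs 9.8 um reported
```

Because the deflection scales linearly with the temperature swing, doubling the stand's effective thermal mass (e.g., with a Zanite fill) and insulating it directly attenuates the micron-scale drift.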

Multimodality Nanoparticle for Targeted in Vivo Imaging with Xe NMR and Fluorescence. LESLEY LARA (Contra Costa College, Richmond, CA, 94805) FANQING FRANK CHEN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

One of the major challenges for antibody-based therapeutics is the lack of sensitive and convenient methods for in vivo imaging that track the distribution, metabolism, and movement of the drug delivery system and provide an effective means to monitor treatment efficacy. This lack of sensitivity has also made early detection of cancerous tumors unrealistic. Currently, radiolabels are the most sensitive labeling technology; however, they are undesirable for large-scale use due to the harmful effects of ionizing radiation on both technicians and patients. Current-generation MRI contrast reagents work only in a very high concentration range of several millimolar, and there is a high false-positive rate. To solve this problem, we have constructed a novel class of imaging reagent based on near-infrared CdSe nanocrystals. The nanocrystals are clustered with Gd-based MRI contrast reagents for conventional MRI imaging, or with a novel zero- to low-field MRI agent. This dual-modality nanoparticle composite is detectable with both deep-tissue near-infrared in vivo imaging and MRI/zero-field MRI. To target breast cancer, the nanoparticle also carries a single-chain antibody against ErbB2, a protein in the EGFR family that is overexpressed in 15% to >50% of breast cancers, depending on the stage of the disease. The nanoparticle is highly fluorescent with a high quantum yield, and the loading of the Gd-chelating compound or zero-field MRI agent is demonstrated to be at least 500 per nanoparticle. This new class of nanoparticle-based imaging solution can be applied to diagnostic and monitoring imaging of other cancers, or even other diseases.

Nanoplasmonic Molecular Ruler for DNA-Protein Interaction. YUVRAAJ KAPOOR (Contra Costa College, San Pablo, CA, 94609) FANQING FRANK CHEN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

One of the major challenges of quantitative biochemistry and molecular biology is monitoring enzymatic activity within a femtoliter volume in real time. We have constructed a novel nanoscale plasmonic molecular ruler that can perform label-free, real-time, and sensitive monitoring of DNA length during nuclease reactions. The bionanoplasmonic molecular ruler was fabricated by tethering specifically designed double-stranded DNA to single Au nanoparticles. Nuclease activity was tracked via the evolution of the plasmon signal of a single Au-DNA nanoconjugate, which reflects DNA size changes introduced through site-specific DNA digestion by an endonuclease. The scattering spectra of individual Au-DNA nanoconjugates are measured continuously in real time during nuclease incubation. They show a blue-shift of the plasmon resonance wavelength, a decrease in intensity, and a time-resolved dependence on the reaction dynamics. With a series of enzymes that incise DNA at different sites, the shifts of the plasmon resonance wavelength correlate closely with the positions of the nuclease-targeted sites on the DNA, demonstrating axial resolution along the DNA with nanometer precision (5 nm of wavelength shift per nm of DNA length change, or 1.4 nm of wavelength shift per base pair difference). DNA length differences as small as 2 nm (6 base pairs) after nuclease digestion are differentiated by the corresponding plasmon resonance shifts of the Au-DNA nanoconjugate. Based on the mapping between DNA length and the plasmon resonance wavelength of the nanoconjugate, we further develop the nanoparticles into a new DNase footprinting platform. This DNase footprint mapping is demonstrated through the binding of the DNA repair enzyme XPG to DNA bubbles. This work promises a molecular ruler that can monitor nuclease reactions with single-particle sensitivity in real time. It suggests the possibility of developing ultra-high-density nanoarrays for parallel enzyme activity measurements in functional proteomic studies, or biofunctional nanoprobes for intracellular enzymatic studies.
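The reported calibration can be used directly to predict signals. The sketch below converts a digestion length into an expected blue-shift using the abstract's ~1.4 nm-per-base-pair figure (the ~0.34 nm/bp helical rise is a standard B-DNA value we assume):

```python
# Converting the reported calibration into expected plasmon shifts:
# ~1.4 nm of resonance-wavelength shift per base pair removed
# (the abstract equivalently quotes ~5 nm of shift per nm of DNA).
SHIFT_PER_BP_NM = 1.4    # from the abstract
RISE_PER_BP_NM = 0.34    # B-DNA helical rise (assumed standard value)

def expected_blue_shift(base_pairs_removed):
    return SHIFT_PER_BP_NM * base_pairs_removed

bp = 6   # the smallest length difference resolved in the study
print(f"{bp} bp (~{bp * RISE_PER_BP_NM:.1f} nm of DNA): "
      f"~{expected_blue_shift(bp):.1f} nm blue-shift")
```

An ~8 nm spectral shift for a 6 bp cut is comfortably resolvable in single-particle scattering spectra, which is what gives the ruler its base-pair-scale sensitivity.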

Nitride Membranes: Surface Debris Prevention and Strength Testing. PATRICK BENNETT (University of California, Berkeley, Berkeley, CA, 94720) ERIK ANDERSON (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Fresnel zone plate lenses focus soft X-rays using a series of alternating zones of opaque and transparent material. Nitride membranes act as the support upon which zone plates are manufactured. The membranes are created by coating a silicon wafer with nitride and then etching away the silicon with potassium hydroxide (KOH) to form a window. Since window strength is an important factor in throughput, it would be desirable to identify and optimize the production factors that affect strength. To analyze window strength, an apparatus was designed and constructed that increases pressure on a nitride membrane until breakage, allowing for comparative analysis between different production steps. During testing, it was found that the windows being produced were contaminated with debris. To reduce contamination, the sources were identified by a process of elimination. By replacing the contaminating production steps, surface debris was greatly reduced, and it was eliminated completely using a solution of hydrogen peroxide and sulfuric acid. This is not an ideal solution, however, as it is hazardous and its effects on window strength are unknown. Preliminary results from pressure testing indicate that membrane strength is dependent on mount orientation. As these results were unexpected, more testing is needed to determine the nature of this relationship. Most likely, window strength will be related both to the absolute window size and to the size of the window relative to the surrounding silicon support frame.

Numerical Simulations of Electric and Magnetic Fields for the Pulse Line Ion Accelerator. SAMUEL PEREZ (Contra Costa College, San Pablo, CA, 94806) ENRIQUE HENESTROZA (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The Pulse Line Ion Accelerator (PLIA) is a slow-wave helical structure whose purpose is to accelerate charged particles. It uses a voltage pulse to generate a traveling electric field along its length. To efficiently produce a working PLIA and learn how to use it, a simulation of the PLIA was created, and the electric and magnetic fields it produces were plotted using the electromagnetic field solver CST Microwave Studio. In addition to the plots offered by Microwave Studio, more specialized plots were created using the particle code WARP. The geometry is simple: a helix embedded in dielectric material inside a conducting box, with a cylindrical vacuum through its center. The voltage pulse is applied on a loop around one end of the helix, coupling inductively to the helix. A coarse mesh was used at first so that adjustments could be made quickly. Once the simulation correctly produced a traveling wave, the helix was changed to its final dimensions and a finer mesh was used to generate sufficient data. The plots created by CST Microwave Studio show that the electric field travels the length of the PLIA at about 1/60 the speed of light, the expected value given the permittivity of the dielectric material, the helix dimensions, and the beam pipe radius. The field does not begin to move until after the voltage pulse has ended. There are actually two field regions, one positive and one negative, with the negative field traveling ahead of the positive field. The field is also defocusing in the transverse direction, so that positive charges are pushed outward into the walls of the PLIA. To use the PLIA successfully, the beam will have to enter the structure so that it interacts only with the positive field, and powerful solenoids around the PLIA are needed to focus the beam. Continued work on the PLIA could establish it as an inexpensive slow-wave accelerator for use in fusion research.

Optimization of an HPLC Method for Determination of Carbon-11 Specific Activity in [C-11] Methyl Iodide. NATALIA SHAROVA (Contra Costa College, San Pablo, CA, 94806) JAMES P. O'NEIL (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

We have created a “standard mass concentration curve” for the determination of very small concentrations of methyl iodide in carbon-11 labeled methyl iodide, for further calculation of carbon-11 specific activity. The stock solution of methyl iodide was prepared by weighing and volumetric dilution, with several precautions such as sealing the vials with Teflon-faced septa. This solution was then diluted with water to prepare 5 standards spanning a methyl iodide concentration range of 3.0 – 0.03 nmol per injection. All standards were injected into the HPLC in triplicate, and the responses were analyzed by the PeakSimple data system. The collected data allowed us to create a “standard mass concentration curve” and calibrate PeakSimple for the specific methyl iodide components. In the course of the experiment we concluded that standard dilutions for this experiment could be done with water; however, water as a diluent had disadvantages that limited the minimal achievable concentrations of methyl iodide and increased the uncertainty of the results. In order to increase the reliability of the standard curve, the experiment should be conducted within a time period that does not exceed 12 hours, and all standard samples have to be stored in small sealed glass vials at low temperatures (~ −5 °C) while not in use.
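As a sketch of the calibration step, a least-squares line can be fit to detector response versus injected mass and then inverted to read off an unknown amount; the data points below are made up for illustration and are not the measured standards:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def mass_from_response(area, slope, intercept):
    """Invert the standard curve: peak area -> nmol injected."""
    return (area - intercept) / slope

# Hypothetical standards: nmol injected vs. HPLC peak area (arbitrary units)
nmol = [0.03, 0.1, 0.3, 1.0, 3.0]
areas = [4.1, 13.8, 41.5, 139.0, 418.0]
slope, intercept = fit_line(nmol, areas)
print(mass_from_response(100.0, slope, intercept))  # ~0.72 nmol
```

A real calibration would also report the fit residuals to confirm linearity across the 3.0 – 0.03 nmol range.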

Optimization of the CLAIRE Ion Beam Extraction and Transport System Using Computer Simulations. NAN XU (University of California, Berkeley, Berkeley, CA, 94720) DAMON S. TODD (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The Center for Low Energy Astrophysics and Interdisciplinary Research (CLAIRE) is a proposed nuclear astrophysics facility under design at the Lawrence Berkeley National Laboratory. The facility will measure cross sections relevant to stellar burning, namely 3He(4He,γ)7Be, a reaction which is one of the leading sources of uncertainty when correlating solar neutrino data with theoretical solar models. A beam line concept has been developed to extract and transport a tightly focused (sub-centimeter), high current (100 mA), low energy (50–300 keV) 3He+ ion beam to a high density gas jet target. The beam is first extracted from a plasma ion source and is then focused by two solenoid lenses. An acceleration column is placed to accelerate the beam to a higher energy. The envelope of the accelerated beam is kept as small as possible by another lens before going through a 60° analyzing magnet for filtering. The last focusing solenoid lens produces the desired beam size on the target. An extensive simulation program was employed to optimize the extraction and the transport of a 100 mA, 3He+ beam at 50 keV. The source extraction electrodes will have to undergo further shaping, including rounding of corners, but they provide a preliminary source configuration that can be used to design the remainder of the beam transport system. Initial work was done on the acceleration column to ensure that accelerated beams can arrive at the target with similar qualities, but further modifications of the simulation are needed and further fine tuning must take place for the final design. The detailed analysis of this simulation will be shown and discussed.
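As an illustrative aside, the bending strength the 60° analyzing magnet must supply scales with the beam's magnetic rigidity, Bρ = p/q. A non-relativistic, order-of-magnitude estimate for a 50 keV 3He+ beam (constants rounded, so treat the number only as a sanity check):

```python
import math

def rigidity_T_m(kinetic_eV, mass_amu, charge_e):
    """Non-relativistic magnetic rigidity B*rho = p/q, in tesla-meters."""
    amu_eV = 931.494e6  # atomic mass unit in eV/c^2
    c = 2.998e8         # speed of light, m/s
    p_eV_per_c = math.sqrt(2.0 * mass_amu * amu_eV * kinetic_eV)
    return p_eV_per_c / (charge_e * c)

print(rigidity_T_m(50e3, 3.016, 1))  # ~0.056 T*m for 50 keV 3He+
```

At this rigidity, a modest dipole field bends the beam through 60° in a short arc, which is consistent with using the magnet as a low-energy mass/charge filter.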

Preparation and Characterization of Self-Assembled Molecular Films. KYAL WRIGHT (Norfolk State University, Norfolk, VA, 23504) MIQUEL SALMERON (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Industrial laboratories and semiconductor manufacturing companies have implemented an initiative to investigate the structure-property relationships, specifically the electrical and mechanical property relationships, of island structures (islands) of organic ultrathin films. Before these properties can be studied, reproducible methods of island formation must be developed to enable controlled structure-property studies. This project studied island formation from alkanethiol self-assembled monolayers (SAMs) and conducted preliminary mechanical property tests. Many parameters affect the formation of SAM islands, including the surfactant concentration in solution, the solution temperature, the substrate cleanliness, the substrate’s grain sizes, and the duration of deposition. This project used Atomic Force Microscopy (AFM) to investigate the relationships between solution concentration, substrate deposition time, and substrate grain size for the control of SAM island domains. To test the effects of deposition time and material concentration on the formation of the SAM islands on gold substrates, alkanethiol molecules in ethanol solution were used in 2 µM, 5 µM, 10 µM and 20 µM concentrations. Experiments dependent on deposition time used time periods ranging from 30 seconds to 6 minutes. Preliminary tests on the effects of flame-annealed gold substrates resulted in the use of substrates annealed for 45 passes at one pass per hertz for the tests involving solution concentration and deposition time. The results of the preliminary substrate investigation and structural investigations support that SAM island domains larger than 100 nm can be formed on annealed gold substrates when alkanethiol solutions of greater than 10 µM concentration are deposited on the substrate for between 45 seconds and one minute.
Results from the mechanical property investigation indicated that the islands formed from 20 µM solution at a deposition time of 45 seconds are quite robust: they survived a maximum load, corresponding to 8 volts from the AFM system, applied to a 150 nm region. While the results from all the investigations support the theory that the first phase of SAM island formation can be controlled, further investigations and trials for each experiment are still needed to confirm this claim.

Proton Recoil Detectors and Fission Ionization Chambers for Neutron Dosimetry. BRENT WILSON (Merced College, Merced, CA, 95340) PEGGY MCMAHAN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Neutrons are ionizing particles that damage human cells (a particular hazard for astronauts) and electronics. They are difficult to detect directly because of their neutral charge; however, there are several ways to build a neutron detector that measures neutron flux indirectly. This research involved the creation and development of two neutron detectors: prototypes of a proton recoil detector and a fission ionization chamber. The intention was to measure neutrons of 5 to 30 MeV, but since the neutron beam was not available, a proton beam was used instead. A proton recoil detector is composed of a solid-state detector, which electronically counts how many protons are recoiled out of a plastic medium to determine the incident number of neutrons. In this particular prototype, a 50.0 MeV proton beam was used to show that the prototype worked for the first phase of testing. The next phase of testing will include neutron beams with energies between 5 and 30 MeV for actual proton recoil, and a plastic medium containing an ample amount of hydrogen, like polyethylene terephthalate (Mylar). A fission ionization chamber indirectly counts the number of neutrons by means of a gas-filled chamber and a fissile foil (e.g., thorium), in which fission fragments produce ion charges, so that measuring the total charge collected leads to a calculation of neutron flux. In this particular prototype, an americium-beryllium source was used as a neutron emitter for testing the ion charge collection. The chamber was filled with nitrogen gas at one atmosphere of pressure and contained two electrodes, one biased to −325 V and the other used for ion collection by the electronics. The prototype fission ionization chamber has just been completed, and tests of functionality are currently being conducted. The next prototype of the fission ionization chamber will include evaporation of the 232Th onto a window and in-beam neutron tests from 5 to 30 MeV.
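As a sketch of how a fission chamber count rate maps to neutron flux: the expected fission rate is flux × (number of fissile atoms) × (fission cross section), and inverting that gives the flux. The foil mass and cross section below are hypothetical placeholders, not the actual chamber parameters:

```python
def neutron_flux(count_rate_per_s, foil_mass_g, molar_mass_g, sigma_barn):
    """Estimate neutron flux (n/cm^2/s) from a fission-fragment count rate,
    assuming every fission is detected and a constant cross section."""
    N_A = 6.022e23                      # Avogadro's number
    atoms = foil_mass_g / molar_mass_g * N_A
    sigma_cm2 = sigma_barn * 1e-24      # 1 barn = 1e-24 cm^2
    return count_rate_per_s / (atoms * sigma_cm2)

# Hypothetical: 1 mg of 232Th, 0.2 b fast-fission cross section, 10 counts/s
print(neutron_flux(10.0, 1e-3, 232.0, 0.2))
```

In practice the fast-fission cross section of 232Th varies with neutron energy over the 5-30 MeV range, so a real analysis would fold the cross section with the beam's energy spectrum.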

Quantifying the Effect of Temperature on Chlorophyll Fluorescence. JESSICA STONE (California State University Fresno, Fresno, CA, 93630) WILLIAM STRINGFELLOW (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

A daily change in chlorophyll concentration was observed in field (San Joaquin River) algae samples. A laboratory study of algae growth at various temperatures was needed to determine whether the change in chlorophyll concentration was indeed due to temperature changes in the environment, or whether it was not a true change in concentration but instead a temperature effect, an instrument artifact, or some other factor affecting the algae. A Hydrolab sonde was used to measure the temperature (°C) and chlorophyll fluorescence (volts) every minute for 2 hours and 15 minutes. The algae culture, grown for 4–17 days in single source carbon (SSC) media and diluted to ~1 volt, was cooled with an ice water bath, then heated with a hot water bath, followed by another ice water cooling cycle. The results showed an inverse relationship between chlorophyll fluorescence and temperature: cooler temperatures resulted in higher chlorophyll fluorescence readings. Thus, temperature has an effect on chlorophyll fluorescence. Further experiments need to address any possible differences between using a young culture and an old culture. Also, the sonde used (Hydrolab unit #41) needs to be compared with data acquired from the Shimadzu spectrofluorophotometer and chlorophyll extraction methods.
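One way to separate a temperature artifact from a real concentration change is to apply a linear temperature compensation to the raw fluorescence, referring every reading to a common temperature. The compensation coefficient below is a hypothetical value; in practice it would be fit from the bath-cycling data:

```python
def compensate(fluor_v, temp_c, rho_per_c=-0.012, t_ref_c=25.0):
    """Refer a raw fluorescence reading (volts) to a reference temperature,
    assuming the response varies linearly with temperature:
        F_measured = F_ref * (1 + rho * (T - T_ref))
    rho < 0 encodes the observed inverse relationship."""
    return fluor_v / (1.0 + rho_per_c * (temp_c - t_ref_c))

# Two readings of the same water mass at different bath temperatures:
print(compensate(1.10, 15.0))  # cold reading, corrected downward
print(compensate(0.95, 32.0))  # warm reading, corrected upward
```

If compensated readings agree across the ice-bath and hot-bath cycles, the diel signal in the field data is a temperature effect rather than a change in chlorophyll concentration.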

Removal of Arsenic from Contaminated Water by Means of Electrochemistry. KRISTIN KOWOLIK (Santa Monica College, Santa Monica, CA, 90405) ASHOK GADGIL (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Millions of people worldwide do not have access to clean water. This problem is especially severe in Bangladesh, where water is severely contaminated with arsenic. Chronic arsenic exposure has devastating health effects: cardiovascular diseases, cancers, and eventually death. Many methods of arsenic removal have been studied, but most of these are too expensive and impractical to be implemented in poor countries such as Bangladesh. This project investigates electrochemistry as an affordable means of removing arsenic. Experiments are performed using a galvanic cell with iron and copper electrodes. These are immersed in synthetic groundwater spiked with an arsenic concentration of 1 ppm. Current is applied to the system and iron metal is oxidized to Fe(III). As an ionic species, iron will bind free arsenic in solution. After the electrochemical process, the treated water is filtered by means of vacuum filtration. One of the major tasks of the project was to develop an experimental protocol (methods, measurement techniques, experimental conditions) to obtain reproducible results, so that this process can be investigated further. We showed that if certain conditions are met, such as (1) optimal current density, (2) a sanded iron electrode, and (3) a specific amount of current and time, consistent results are obtained. An initial arsenic concentration of 1 ppm can be reduced to a final concentration as low as 5 ppb or less in 1 L of water by application of a 90 mA current for 10 minutes. These results are very encouraging and suggest that electrochemistry is a powerful and, most importantly, affordable tool for the remediation of arsenic from contaminated groundwater.
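The iron dose delivered by the cell follows from Faraday's law: the charge passed fixes the moles of iron oxidized. A back-of-the-envelope version of the 90 mA, 10 minute treatment, taking three electrons per iron atom per the oxidation to Fe(III) stated above (the effective electron count in a real cell is an assumption here):

```python
def iron_dissolved_mg(current_A, time_s, n_electrons=3):
    """Mass of iron oxidized (mg) for a given charge, via Faraday's law."""
    F = 96485.0    # Faraday constant, C/mol
    M_Fe = 55.845  # molar mass of iron, g/mol
    moles = current_A * time_s / (n_electrons * F)
    return moles * M_Fe * 1000.0

print(iron_dissolved_mg(0.090, 600.0))  # ~10 mg Fe released into 1 L
```

Roughly 10 mg of iron per liter is ample, on a molar basis, to bind the ~1 mg of arsenic initially present, which is consistent with the observed reduction to a few ppb.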

Sequence-Based Identification of Yeast. EBONY GRIFFIN (Jackson State University, Jackson, MS, 39217) DR. TAMAS TOROK (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Fungi are eukaryotic organisms that can survive in various environments. They are responsible for human diseases like athlete’s foot and ringworm. Fungi also produce powerful antibiotics that are used to fight off bacteria. Yeasts, an integral part of the fungal kingdom, are widely known for their use in biotechnology; their most common use is fermentation, the conversion of sugar to alcohol. This report focuses on a process of identifying yeast through DNA sequencing and analysis. To understand yeasts and their role in the environment, they must first be identified taxonomically. Before the experiment began, the yeasts were collected from various environments. The yeasts were grown in pure culture and the total genomic DNA was extracted from each sample. The polymerase chain reaction (PCR) was used to amplify the D1/D2 domains of the 26S rRNA gene. Agarose gel electrophoresis was used to analyze the amplicons. After gel analysis, DNA sequencing was done at the University of California Berkeley DNA Sequencing Facility. Raw sequences were edited using FinchTV Version 1.3.1. The edited sequences were compared using BLAST against a national database hosted by the National Center for Biotechnology Information (NCBI) for preliminary identification. A total of thirteen species were found to be new, and seventeen had already been described.

Sequence-Based Identification of Yeast. MARTHA JOHNSON (Jackson State University, Jackson, MS, 39217) DR. TAMAS TOROK (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The study of microorganisms is important due to the rapid advances in microbiology and its substantial contribution to the commercial, scientific, and medical aspects of life. Fungi offer molecular biologists well-developed genetic systems to use as eukaryotic model organisms and have contributed to our understanding of human genetics. Fungi also have significant potential for bioremediation: the use of fungi to recycle nutrients through biodegradation will help with green chemistry and polluted soils. Yeasts are single-celled fungi that have either aerobic or anaerobic metabolic capabilities. Some of the most common uses of yeasts are in bread making and in beer and wine fermentation. Researchers at Lawrence Berkeley National Laboratory (LBNL) are investigating the possibility of using specific biomarker DNA sequences to detect and identify yeasts. This technique, applied in a high-throughput DNA microarray format, will provide researchers with a cutting-edge tool for the rapid identification of yeast species. The current project focuses on the analysis of the D1/D2 domain sequences of the 26S ribosomal RNA genes of a large number of yeast species that were isolated from various environments. The yeasts were grown on potato dextrose agar (PDA). Following genomic DNA extraction, the D1/D2 domains of the 26S rRNA genes were amplified using the polymerase chain reaction (PCR). After amplicon cleanup and sequencing, the edited sequences were compared to sequences of known yeasts available in a public database hosted by the National Center for Biotechnology Information (NCBI). Ten of the 34 yeasts examined showed DNA sequence homology of less than 98% when compared to DNA sequences of known yeast species. Additional experiments will be performed to examine deletions, insertions, and other nucleotide sequence differences for any evolving alterations. These alterations could lead to greater knowledge of how yeasts adjust to a given habitat.
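The <98% homology screen can be sketched as a simple percent-identity check over an aligned pair of D1/D2 sequences. A real comparison would use a proper alignment tool such as BLAST rather than the position-by-position matching below, and the sequences here are invented for illustration:

```python
def percent_identity(seq_a, seq_b):
    """Position-by-position identity of two equal-length aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

def possibly_novel(identity_pct, threshold=98.0):
    """Flag isolates whose best database match falls below the threshold."""
    return identity_pct < threshold

known = "GATTACAGATTACAGATTAC"    # hypothetical database hit
isolate = "GATTACAGTTTACAGATAAC"  # hypothetical isolate sequence
pid = percent_identity(known, isolate)
print(pid, possibly_novel(pid))
```

For the D1/D2 domain, identity below roughly 98-99% to the nearest known species is the conventional signal that an isolate may represent a new species, which is the criterion applied to the ten divergent isolates above.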

Structural Study of the DNA Repair Protein XPG using Constructed Proteins with Individual Domains. JONATHAN ROYBAL (University of California, Berkeley, Berkeley, CA, 94720) SUSAN TSUTAKAWA (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Cockayne syndrome and xeroderma pigmentosum are two distinct genetic disorders, characterized by sun sensitivity and severe developmental disorders in the former, and by extreme sun sensitivity and skin carcinogenesis in the latter. Paradoxically, both are caused by different defects in many of the same DNA repair proteins. One such protein, XPG, is thought to be particularly important among these proteins. Structural analysis of XPG would greatly elucidate the mechanistic basis for its distinct functions. However, XPG has proven difficult to crystallize. Two proteins, XFX delta exon 15 and GST-M4, were constructed from parts of XPG and other proteins to permit structural studies of the two main catalytic regions of XPG, the “N” and “I” domains, and part of its regulatory region (the R domain). Purification was attempted for each protein via its own protocol using affinity and ion exchange chromatography. GST-M4 was purified to ~80% homogeneity, then cleaved with PreScission Protease to give separated GST and M4. Conditions to produce homogeneous M4 alone are still being determined. We obtained 5 mg of purified XFX delta exon 15, which was shown to be monodisperse as dimers by dynamic light scattering, found to be active with DNA, and analyzed via small angle X-ray scattering. Screens for crystallization conditions were then run on purified XFX delta exon 15 samples, yielding crystal showers; optimization of conditions to produce larger crystals will be done in the near future.

Synthetic Biology: Widespread Use Is Closer Than You Think. KEHINDE O'DUNIKAN (University of the Pacific, Stockton, CA, 95211) DR. PETER LICHTY (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Synthetic biology is the synthetic redesign of biological systems. Synthetic biologists’ goal is to make, from scratch, programmable organisms that do not currently exist in nature. Possible applications include the development of bioengineered microorganisms that can produce medicine, identify harmful chemicals, break down environmental pollutants, destroy cancerous cells, and repair flawed genes; the production of hydrogen fuel; and the development of bioengineered organisms to diagnose and treat certain illnesses. Synthetic biology consists of specifying the DNA that is inserted into an organism and thereby determining its function in a predictable way. With talk about this new science in the media, the public has developed some concerns about it. People worry about dual use and terrorism, about who will regulate research, about the ethics of synthetic biology, about the field becoming privatized and monopolized, and about whether something will go wrong in current experiments. Also, some religions see synthetic biologists as trying to play God. Synthetic biology is controversial enough that national conferences have been held. The second of its kind, the Synthetic Biology 2.0 (SB2) conference, was held in Berkeley, California, in May. There, scientists met to talk about the latest advances and applications of the science, as well as the security concerns. After the three-day conference they developed a community declaration, which will serve as a form of scientific self-governance, outlining the way researchers and companies should act to ensure the openness and security of synthetic biology research. Many of the everyday products that we take for granted would not be here without developments in the field of biotechnology, and some day we may say the same thing about synthetic biology.
Biotechnology is used to produce everyday food products such as yogurt, cheese, and beer; to produce vaccines; to produce laundry detergents and dishwashing liquids; to perform DNA fingerprinting for forensic investigations and paternity tests; to genetically engineer crops; and to create human therapeutics and proteins. The possible applications of synthetic biology can potentially save millions of lives in the near future and tremendously improve the environment. With the current research of many universities, pharmaceutical companies, and even laboratories in our own communities, like the Lawrence Berkeley National Laboratory, widespread use of synthetic biology is closer than you think.

Systematic Search for Lanthanum or Bismuth Oxide Scintillators. ALEISHA BAKER (North Carolina Agricultural & Technical State University, Greensboro, NC, 27411) STEPHEN DERENZO (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

More than ever, there is a need for new or improved scintillators to keep up with advancements in radiation detection technology. Known scintillators decay slowly, have low light output, or can be difficult to manufacture. In an effort to discover new, ideal scintillators, researchers must search for, synthesize, and characterize compounds that exhibit luminescent traits. This research focused on 12 bismuth and lead compounds. Candidates were synthesized by means of a solid-state chemistry technique known as the ceramic method. The products were characterized using X-ray diffraction, fluorescence spectroscopy, and pulsed X-ray measurements. The compounds studied did not show the attributes of a bright scintillator. However, since several of the compounds have band gaps greater than 3.5 eV, they may be modified to form semiconductor scintillators in the future.
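The 3.5 eV remark connects to emission wavelength through the standard relation λ = hc/E, or λ(nm) ≈ 1239.84 / E(eV), so gaps above 3.5 eV correspond to photons in the near ultraviolet, where common photodetectors are still sensitive:

```python
def gap_to_wavelength_nm(band_gap_eV):
    """Photon wavelength corresponding to a band-gap energy (lambda = hc/E)."""
    return 1239.84 / band_gap_eV  # hc ~ 1239.84 eV*nm

print(gap_to_wavelength_nm(3.5))  # ~354 nm, in the near ultraviolet
```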

The Absence of Transforming Growth Factor-β1 Perturbs Mammary Gland Development in Mice. DEREK FRANCIS (Carnegie Mellon University, Pittsburgh, PA, 15289) HELLEN A. OKETCH-RABAH (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Transforming Growth Factor-β1 (TGF-β1) is a pluripotent cytokine that has proliferation-inhibitory properties and can induce apoptosis. TGF-β1 is very important to breast cancer research due to its ability to regulate the proliferation of mammary epithelial cells. Tgfβ1 heterozygotes have <10% of the TGF-β1 of wildtype mice. Our group has recently shown that the proliferation rate of Tgfβ1 heterozygote mammary epithelial cells was increased twofold in quiescent tissue and fourfold in proliferating tissue, but ducts and alveoli remained histologically normal. Using a transplant method adopted from DeOme, it has been shown that the Tgfβ1 null mouse epithelium matures fully at 4 weeks, as compared to the wild type at 10 weeks. An immunofluorescence staining technique was used with Tgfβ1 null mice and TβRII antisense mice. We found that the frequency of the proliferation marker Ki67 was increased 3.25-fold in Tgfβ1 null epithelium as compared to Tgfβ1 wild type, indicating an enormous increase in proliferation in Tgfβ1 null mammary epithelium. Furthermore, the frequency of proliferating ER+ (estrogen receptor positive) cells was also increased in Tgfβ1 null epithelium, by 2.29-fold. Thus complete depletion of TGF-β1 results in an enormous increase in the proliferation of ER+ cells. However, contrary to our expectation, the overall frequency of ER+ cells decreased by 1.5-fold in the Tgfβ1 null epithelium. Interestingly, the same was observed in the TβRII antisense mouse epithelium as compared to wildtype, further supporting the conclusion that blockade of TGF-β1 signaling, whether by complete depletion of the ligand (the TGF-β1 protein) or by interference with the signaling pathway through blocking of the receptors, has the same result: increased proliferation of ER+ cells.
Finally, the Tgfβ1 null epithelium and the antisense epithelium both produce milk, indicating that blockade of TGF-β1 signaling results not only in accelerated development of the gland but also in accelerated maturation and terminal glandular differentiation. Taken together, our data show that TGF-β1 is an important regulator of mammary gland development and that removing it from the picture results in increased proliferation of ER+ cells but, more importantly, in a dysregulation of mammary development, with the gland maturing prematurely and lactating in the virgin state where this function is not required.

The Development of a Calibration System for Pulsed Laser Tests of Silicon Pixel Detectors for the International Linear Collider. KATHERINE PHILLIPS (North Carolina State University, Raleigh, NC, 27607) MARCO BATTAGLIA (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The International Linear Collider is the next large-scale facility in accelerator particle physics, designed to study the Higgs sector and new physics beyond the Standard Model. Efforts are underway to develop silicon pixel detectors for use in the vertex tracker. These detectors must meet strict specifications, including the requirement that the chip's thickness not exceed about 50 microns. In order to assess the feasibility of meeting this requirement with monolithic CMOS pixel sensors, a number of prototype chips, named MimosaV, have been back-thinned from 450 to 50 and 39 microns. The MimosaV chip response is tested with electrons, radioactive sources, and lasers before and after back-thinning. The laser consists of a solid-state diode coupled to an optical fiber terminated in a collimating lens doublet. The goal of this project is to set up a laser calibration system to ensure that possible variations in chip response are due to the chip post-processing and not to laser source instabilities. The experimental setup consists of a 1060 nm laser, a photodiode to read the laser's output, readout electronics, and a data acquisition (DAQ) analog-to-digital converter board. Laser pulses of 100 ns length are converted to voltages by the photodiode, and the signal is then sent to the readout electronics, which amplify, shape, and lengthen the pulse. The signal is then sent to the DAQ board, which samples the input signal every 10 microseconds. A LabVIEW program stores peak values and computes the average peak value, the standard deviation, and the error. Statistical analysis removes peaks due to noise. It was observed that the setup takes twenty minutes to warm up before it gathers precise and consistent peak values. However, after this warm-up period, the peak values are stable to better than 1%, well within the tolerance of the measurements. When the pulse length is shortened, the peak value decreases in a non-linear fashion.
The stability could be improved by adding reset functionality to the readout electronics. More investigation is needed into possible causes of the warm-up time. It will also be necessary to develop a more consistent way to align and secure the laser in place. The present laser calibration system will be added to the chip test setup and will provide calibration for all future tests of monolithic pixel chips.
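The peak-value bookkeeping described above (mean, spread, and rejection of noise peaks) can be sketched in a few lines. The 3-sigma cut is an assumption; the abstract does not state the rejection criterion the LabVIEW program actually uses:

```python
import math

def peak_stats(peaks, n_sigma=3.0):
    """Mean, standard deviation, and standard error of laser peak values,
    after discarding peaks more than n_sigma from the initial mean."""
    n = len(peaks)
    mean = sum(peaks) / n
    sd = math.sqrt(sum((p - mean) ** 2 for p in peaks) / (n - 1))
    kept = [p for p in peaks if abs(p - mean) <= n_sigma * sd]
    mean_k = sum(kept) / len(kept)
    sd_k = math.sqrt(sum((p - mean_k) ** 2 for p in kept) / (len(kept) - 1))
    return mean_k, sd_k, sd_k / math.sqrt(len(kept))

# 20 stable laser peaks (synthetic) plus one spurious noise peak
peaks = [1.0 + 0.001 * ((i % 5) - 2) for i in range(20)] + [0.35]
mean, sd, err = peak_stats(peaks)
print(mean, sd, err)  # relative spread well under 1% once noise is cut
```

On this synthetic data the noise peak is rejected and the remaining spread is a small fraction of a percent, mirroring the better-than-1% stability reported after warm-up.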

The Effect of Fire on Soil Carbon Storage in a Prairie Ecosystem: Applications to Global Climate Change and Ecosystem-Climate Feedbacks. RYAN SMITH (California State University Fresno, Fresno, CA, 93740) MARGARET TORN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The increasing concern about global warming makes it important to ensure that climate change models are accurate. A potentially important omission from current models is CO2 feedback between climate and soils. Natural fire and managed fire both play an important role in maintaining the prairie ecosystem. If warming increases the frequency of fire, it will be important to know the effect fire has on the storage of carbon in soil. The goal of this project was to investigate the effect of fire on soil carbon stocks in a prairie of the Southern Great Plains. Our null hypothesis was that there would be no change in the carbon content of grassland soil due to the fire treatment; the alternative hypothesis was that the carbon content of the soil would decrease due to the fire treatment. The USDA Grazing Research Laboratory in Oklahoma collaborated in this project and provided soil samples for analysis. Ten soil cores, 1 meter deep, were collected from two adjacent prairie fields in March 2005. Shortly after collection, the north field was treated with fire, while the south field was left unburned to serve as the experimental control. In August 2005, 10 more cores were collected from each of the two fields. Each core was divided into 10 cm depth increments from 0–50 cm and 25 cm increments below that, to look at the change in carbon content with depth. One half of each core was used to determine bulk density; the other half was used to test for carbonates and determine carbon content. No core tested positive for carbonates. For carbon analysis, the soil was homogenized and roots were removed. The carbon analysis showed that the carbon content decreased with depth and that the greatest variability among cores was in the top 10 cm. Between March and August, the south field (unburned control) lost 0.31 kg carbon/m2 (p < 0.14 for the difference). In contrast, the north field showed an average total carbon stock loss of 1.1 kg carbon/m2 after the fire treatment.
The loss of soil carbon in the burned field was highly significant (one-tailed p < 0.034). Assuming that the lost soil carbon was released to the atmosphere as CO2, these results suggest that there could be a strong positive feedback effect if warming increases fire frequency in prairies. Continued research on the rates of carbon inputs to and outputs from the soil, and the effect of fire on them, is needed to draw complete conclusions.
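The per-core stock arithmetic is a sum over depth increments of bulk density × carbon fraction × increment thickness. A sketch with made-up increment values (not the Oklahoma data):

```python
def carbon_stock_kg_m2(layers):
    """Total soil carbon stock (kg C/m^2) from per-increment measurements.

    Each layer is (bulk_density_g_cm3, carbon_fraction, thickness_cm).
    g/cm^3 * cm gives g/cm^2; multiplying by 10 converts to kg/m^2.
    """
    return sum(bd * frac * thick * 10.0 for bd, frac, thick in layers)

# Hypothetical core: carbon-rich topsoil, declining with depth
core = [
    (1.2, 0.020, 10),  # 0-10 cm
    (1.3, 0.012, 10),  # 10-20 cm
    (1.4, 0.008, 10),  # 20-30 cm
]
print(carbon_stock_kg_m2(core))  # kg C/m^2 in the top 30 cm
```

Comparing such totals field-by-field and date-by-date is what yields the 0.31 and 1.1 kg C/m2 loss figures reported above.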

The Efficiency of Stripe Removal from a Galactic Map. CAROLYN MELDGIN (Harvey Mudd College, Claremont, CA, 91711) GEORGE SMOOT (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The Galactic Emissions Mapping project (GEM) aims to isolate the radiation emitted by the Milky Way and other galaxies from the cosmic microwave background radiation. Two-dimensional scans of the sky at different frequencies allow separation of the cosmic microwave background from the galactic signals. Signals are most informative in the microwave spectrum, where the frequency of the light is in the 0.5 to 5 GHz range, because the intensity of the galactic signal relative to the cosmic microwave background changes most rapidly there. In recent years, widely available technology such as cell phones and microwave ovens has increased terrestrial noise, making the galactic signal extremely difficult to analyze. GEM uses data taken in 1999, when terrestrial sources of microwaves were less common. These scans were taken at 2.3 GHz in Cachoeira Paulista, Brazil. The scans have striped flaws caused by microwaves from a nearby radio station diffracting over the top of the shielding around the antenna. This paper describes the use of filtering techniques involving Fourier transforms to reduce or remove the striped flaws. It also describes a metric, based on the Principle of Maximum Entropy, for determining the efficiency of different stripe-removal filters.
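A minimal one-dimensional version of a Fourier stripe filter: transform the scan, zero the bins at the stripe frequency, and transform back. The real maps are two-dimensional and the stripe frequency must first be identified, so this is only a sketch of the mechanism:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(n^2), fine for a demo)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def notch(x, stripe_bin):
    """Zero the positive- and negative-frequency bins of the stripe."""
    X = dft(x)
    X[stripe_bin] = 0.0
    X[len(x) - stripe_bin] = 0.0
    return idft(X)

n = 64
sky = [math.sin(2 * math.pi * 2 * t / n) for t in range(n)]          # signal
stripe = [0.5 * math.cos(2 * math.pi * 8 * t / n) for t in range(n)]  # interference
scan = [s + r for s, r in zip(sky, stripe)]
cleaned = notch(scan, 8)
# cleaned recovers the sky signal to floating-point precision
```

The efficiency metric then asks how much of the stripe power the notch removed while leaving the underlying sky signal untouched, which is where the maximum-entropy comparison comes in.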

The Occasional Appearance of Carbon in Type Ia Supernovae. JEROD PARRENT (University of Oklahoma, Norman, OK, 73071) R. C. THOMAS (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Recent observations made by the Nearby Supernova Factory reveal C II absorption features below 14,000 km/s in the early photospheric spectra of the Type Ia supernova (SN Ia) 2006D. The largest of these features is attributed to C II λ6580, which dissipates before maximum brightness. Only a handful of SNe Ia have been observed to contain the C II λ6580 feature. This is the largest carbon feature observed in an SN Ia thus far and is additional evidence that carbon, and hence unburned material, can be detected at early times in SN Ia spectra. Presently, certain 3D hydrodynamical SN Ia deflagration models contain unburned carbon and oxygen below the W7 (1D deflagration model) cutoff of 14,000 km/s. These observations support explosion models that contain unburned material at low velocities, such as 3D deflagrations and certain delayed detonations. The sporadic nature of observed carbon features in SNe Ia and its implications for explosion models will be discussed. We also emphasize the importance of obtaining spectropolarimetry data in order to test for asymmetries in the supernova.
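The 14,000 km/s figure maps to an observed wavelength through the non-relativistic Doppler formula, λ_obs ≈ λ_rest (1 − v/c), which is how a blueshifted C II λ6580 absorption is located in the spectrum:

```python
def blueshifted_wavelength(rest_angstrom, velocity_km_s):
    """Observed wavelength of an absorption feature formed in ejecta
    moving toward the observer (non-relativistic Doppler shift)."""
    c_km_s = 2.998e5  # speed of light, km/s
    return rest_angstrom * (1.0 - velocity_km_s / c_km_s)

print(blueshifted_wavelength(6580.0, 14000.0))  # ~6273 Angstroms
```

So carbon at or below the 14,000 km/s cutoff produces absorption at roughly 6273 Å or redward of it, a region that must be disentangled from the nearby Si II λ6355 feature in real spectra.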

Thermophoresis and its Thermal Parameters for Aerosol Collection. ZHUO HUANG (Sacramento State University, Sacramento, CA, 95819) MICHAEL APTE (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

The optimal particle collection efficiency of a prototype environmental tobacco smoke (ETS) sampler based on thermophoresis was determined with a particle-phase “titration” in a 24 m3 environmental chamber. The sampler’s heating element was made of three sets of thermophoretic (TP) wires, 25 µm in diameter, suspended across a channel cut in a printed circuit board. Two collecting surfaces were mounted, one on each side, to form a flow channel 1 mm high with the TP wires suspended in the middle, 500 µm from each surface. The separation between the heating element and the room-temperature collection surface was determined in a numerical simulation based on the Brock-Talbot model. Other thermal parameters of this TP ETS sampler were predicted by the Brock-Talbot model for TP deposition. The theoretical thermal parameters were examined and used to characterize the TP ETS sampler’s collection mechanism. In addition, by heating the wires we determined their temperature-resistance relationship. From the normalized results, the optimal collection ratio was expressed in terms of applied voltage and filter mass. We raised the operational voltage from 1.0 to 3.0 V and found that the collection efficiency increased by a factor of five, in both theory and experiment.
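For a metal wire, the temperature-resistance relationship mentioned above is close to linear, so wire temperature can be inferred from measured resistance. The reference values and coefficient below are hypothetical stand-ins for the calibrated ones:

```python
def wire_temperature_c(resistance_ohm, r0_ohm=50.0, t0_c=20.0, alpha=0.0039):
    """Invert R = R0 * (1 + alpha * (T - T0)) for the wire temperature.

    alpha is the temperature coefficient of resistivity (per deg C);
    0.0039 is typical of pure metals and is only a placeholder here.
    """
    return t0_c + (resistance_ohm / r0_ohm - 1.0) / alpha

print(wire_temperature_c(65.0))  # a hotter wire reads a higher resistance
```

In operation this lets the applied voltage serve as a proxy for the wire-to-surface temperature gradient that drives the thermophoretic deposition.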

Upgrade of Insertion Device Magnetic Measurement System: Motion Control and Position Measurement. JUSTIN HSU (University of California, Berkeley, Berkeley, CA, 94720) STEVE MARKS (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

An Elliptically Polarizing Undulator (EPU) is a storage ring insertion device that generates intense light for use in scientific experiments at synchrotron radiation facilities like the Advanced Light Source (ALS). An EPU has four individual mechanical beams, composed of permanent magnets arranged in a deliberate pattern of alternating pole orientations. By adjusting the position of the beams, one can steer an electron beam into a sinusoidal or helical path, inducing a desirable phase-coherent superposition that increases the intensity of the produced light by many orders of magnitude. Because the electron beam path depends directly on the EPU’s magnetic field, errors in the magnetic field translate into out-of-phase beam trajectories, leading to degraded brightness. To prevent this, a precision magnetic measurement system is used to measure magnetic field as a function of position. This device assesses optical phase errors and provides a basis for correcting the EPU’s magnetic field. Because it had been several years since the magnetic measurement bench at the ALS had been upgraded, many of its components were obsolete. To maintain compatibility with newer equipment and software, a complete overhaul of the device was necessary. My task involved upgrading key components of the system: replacing laser interferometers with linear encoders, mechanically mounting the linear encoders, installing an updated motor controller, building a convenient user interface to display all data, planning and implementing the system’s electrical wiring, updating the computer’s software and operating system, and writing C/C++ code to facilitate communication within the entire system. Because the project is still ongoing, experimental results have yet to be determined. Future work may involve further investigation and code modification to expand system functionality.

Using Microarray Technology to Compare Bacterial Diversity Within Different Horizons of Contrasting Soils. DHARSHINI VENKATESWARAN (California State Polytechnic University, Pomona, Pomona, CA, 91768) GARY ANDERSEN (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

There is emerging interest in understanding the linkages between aboveground and belowground microbial communities. As part of a large DOE-funded climate change project, the Andersen lab is studying the microbiological responses of Californian grassland ecosystems. Various mesocosm conditions were set up to replicate the effects of climate change (modified temperatures and precipitation) on plant and microbial communities. Since climate change influences the function of these ecosystems, we compared bacterial diversity within different horizons of contrasting soils. The two soils compared were from Hopland, in Northern California, and Sedgwick, in Southern California. Since the pattern of growth in plant roots is directly dependent on water supply, climate change such as increased or decreased rainfall may indirectly affect the microbial communities found in different horizons, depending on the growth of the roots in response to water. There are several approaches to profiling soil microbial diversity. One is phospholipid fatty acid analysis (PLFA), which analyzes membrane lipids; however, this approach is not sensitive enough to effectively resolve microbial speciation. Another conventional method is cloning of the 16S rRNA gene, which requires transferring the amplified DNA fragments into a plasmid and randomly sequencing a few hundred plasmids. In soil samples, where one can expect to find at least 100,000 species, one would need to sequence 100,000 clones using this method, which is time consuming, expensive, and inefficient. Instead, a high-density photolithography microarray displaying 500,000 oligonucleotide probes complementary to diverse 16S rRNA sequences was used. This new technology allowed us to identify species or groups of bacteria present in the soil sample more efficiently. The study showed horizon B1 to harbor more microbial diversity than horizon A (Figure 1). The observed microbial biomass also appears to increase at deeper horizons (Figure 5). Clostridia, for example, was found in higher abundance at the 40-65 cm depth of the Hopland-B2 soil, whereas Cellulomonas (Actinobacteria) and Phyllobacteriaceae (Alpha-proteobacteria) were predominantly present at the top horizon (Hopland-A) (Figure 6).
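The array-based detection logic described above (calling a taxon "present" when enough of its diagnostic probes hybridize) can be caricatured in a few lines. The probe sequences, sample fragments, and 90% detection threshold below are hypothetical toy values, not the actual PhyloChip probe set or scoring scheme.

```python
# Toy taxon-detection sketch: a taxon is scored "present" when a large
# fraction of its probes match some 16S fragment in the sample.
# All sequences below are invented for illustration.
TAXON_PROBES = {
    "Clostridia":   ["ACGTAC", "GGTTAC", "TTACGG"],
    "Cellulomonas": ["CCGTAA", "AAGGTC", "GTCCAA"],
}

def detect_taxa(sample_seqs, probes=TAXON_PROBES, threshold=0.9):
    """Return the taxa whose probe hit-fraction meets the threshold."""
    present = []
    for taxon, probe_list in probes.items():
        hits = sum(any(p in seq for seq in sample_seqs) for p in probe_list)
        if hits / len(probe_list) >= threshold:
            present.append(taxon)
    return present

# Toy "sample": fragments containing all Clostridia probes and none
# of the Cellulomonas probes.
sample = ["TTACGTACGG", "AAGGTTACTT", "CCTTACGGAA"]
```

A real analysis scores perfect-match versus mismatch probe pairs and corrects for cross-hybridization, but the principle (parallel presence/absence calls across hundreds of thousands of probes, instead of sequencing clone by clone) is the same.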

Vineyard Optimization through Novel Characterization and Cluster Analysis: Applications of Geographic Information Science in Precision Viticulture. JAMES WOLF (University of California, Santa Barbara, Santa Barbara, CA, 93106) SUSAN HUBBARD (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

With viticulture and other agriculture comprising a significant portion of the California economy and water budget, accurate and precise information regarding soil and hydrological characteristics is crucial. With regard to viticulture, the science of growing wine grapes, 90% of the United States’ wine is produced within California. Furthermore, California is the nation’s greatest consumer of developed water, with roughly 80% of that water allocated to agricultural production. This research combines a wide array of data collected from both traditional and novel techniques into a single analysis. Traditional data acquisition techniques explored in this project include surveying of topography for elevation, slope, and aspect analysis; soil pit drilling for analysis of soil texture and chemical composition; and Time Domain Reflectometry. These traditional techniques offer discrete sampling, which necessitates interpolation between sample points. Novel techniques, on the other hand, provide more continuous data collection. Such techniques include Ground Penetrating Radar, electromagnetic measurements, Cone Penetration Testing for soil behavior types, and the Normalized Difference Vegetation Index. From these data, properties that are important to the production of wine grapes, such as soil moisture and texture, can be estimated. To manage the data collected, a Geographic Information System was developed. Then the Hierarchical Ordered Partitioning And Collapsing Hybrid (HOPACH) algorithm was applied to look for clusters of similar values, with the ultimate goal of delineating management zones within a single vineyard. In this way, a wine grower can optimize production by precisely planning where higher-quality grapes will be most suitable for planting. Also, in order to optimize water resource management, water balance simulations can be performed within the defined management zones to explore the impact of different irrigation strategies.
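HOPACH itself is distributed as an R/Bioconductor package; as a rough stand-in for the zone-delineation step, the sketch below applies generic Ward-linkage hierarchical clustering to synthetic vineyard attributes. All attribute values and the two-zone structure are invented for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Toy vineyard grid: each row is a block described by (soil moisture,
# clay fraction, NDVI). Two artificial populations stand in for two
# contrasting soil regimes within one vineyard.
zone_a = rng.normal([0.25, 0.15, 0.60], 0.02, size=(20, 3))
zone_b = rng.normal([0.40, 0.35, 0.45], 0.02, size=(20, 3))
blocks = np.vstack([zone_a, zone_b])

# Build the cluster tree, then cut it into two management zones.
tree = linkage(blocks, method="ward")
labels = fcluster(tree, t=2, criterion="maxclust")
```

In practice the number of zones would be chosen by a cluster-quality criterion (HOPACH uses median split silhouette) rather than fixed in advance, and the resulting labels would be mapped back onto the GIS layer to draw the zone boundaries.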

Visualization of Nanowires in Shewanella oneidensis with Transmission Electron Microscopy. DANIELLE JORGENS (California State University, Fresno, Fresno, CA, 93740) MANFRED AUER (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

A variety of toxic metals, like uranium, are by-products of industrial processes and threaten the fresh water supply of entire regions. Some microbes, like Shewanella oneidensis, can produce energy through the reduction of metals, using them as terminal electron acceptors. Shewanella oneidensis MR-1, a motile facultative anaerobe, can reduce uranium (VI), iron (III), manganese (III/IV), and other oxidized metals under anaerobic conditions. In the absence of soluble electron acceptors, S. oneidensis forms electrically conductive filaments known as nanowires. Evidence for nanowire expression in Shewanella sp. has come mostly by way of observation using scanning electron microscopy and atomic force microscopy. We have developed a new method in which S. oneidensis is grown directly on the surface of an electron microscopy grid, allowing the direct visualization of nanowire formation and maturation. A better understanding of nanowires will benefit the interests of effective bioremediation and alternative carbon-neutral energy production.

Windows XP (SP2) Security Settings, Windows Vista Firewall Settings and Remote Assistance. ZACHARIAH TANKO (Big Bend Community College, Moses Lake, WA, 98837) CHARLIE VERBOOM (Lawrence Berkeley National Laboratory, Berkeley, CA, 94720)

Security is an integral part of any software deployment plan at the Lawrence Berkeley National Laboratory (Berkeley Lab). There is a template at the Berkeley Lab that contains recommended security settings for Windows XP Professional deployment. The settings are in line with National Institute of Standards and Technology (NIST) recommendations. The current template is for Windows XP with Service Pack 1 (SP1), but most of the computers running Windows XP Professional at the Berkeley Lab already have SP2 installed, so the Berkeley Lab’s security settings template needs to be updated. Microsoft introduced a personal firewall as part of the operating system with Windows XP. Microsoft will soon introduce a new desktop operating system known as Windows Vista, which retains the Windows Firewall functionality of SP2. It is important for Berkeley Lab to undertake an extensive review of the firewall rules and settings in Windows Vista even before Microsoft releases an operational version; a good understanding will leave Berkeley Lab fully prepared. Remote Assistance provides a way for users to get the help they need when they run into problems with their computers, and a way for Help Desk departments to save on the cost of supporting users. The Computer Support Division at the Berkeley Lab is always looking for ways to improve user support at a reduced cost without compromising security.