Historical regular seminars
ComSci Fellows meet on a weekly basis for special programs and lectures
intended to involve them in discussions on current science, technology,
and technology policy issues. At least 26 half-day seminars are held
throughout the fellowship year, with speakers invited
from government agencies, the private sector, and academia. In lieu of
a speaker, regular seminars may take the form of a visit to scientific
and research facilities or industries in the Washington Metropolitan
Area. ComSci Fellows are encouraged to provide their input regarding
the selection of seminar topics, speakers, and visits for these weekly
sessions.
Class of 2004-2005
Gary Smith
IP Consultant
(September 29, 2004)
Topic: International Aspects of Intellectual Property
The ComSci Program’s regular seminars began on September 29th with
an engaging presentation and discussion with Mr. Gary Smith, an intellectual
property (IP) consultant. Mr. Smith’s IP projects include work
for the Turkish Patent Institute and the Egyptian Patent Office. Until
2002, Mr. Smith was Director of the Patent Cooperation Treaty (PCT) at
the World Intellectual Property Organization; he also served for 25 years
at the United States Patent and Trademark Office (USPTO), culminating
with a position as Director of the PCT International Division. Accordingly,
Mr. Smith gave a wide-ranging talk beginning with the basics of intellectual
property and moving on to cover the intricacies of the international
system for protection of IP rights. The ComSci Fellows were also interested
to learn that Mr. Smith had been a ComSci Fellow earlier in his career,
and that his assignment at the White House Office of Science and Technology
Policy had contributed to his career in international technology issues.
In introducing the ComSci Fellows to the basics of intellectual property,
Mr. Smith noted that generally speaking, there are three types of property – real
property (land), personal property, and intangible property (a type of
personal property). All have in common that the owner of property has
the right to exclude others from its use. Intellectual property is a
form of intangible property, and thus cannot be physically protected
the way one might protect land or personal property. Rather, intellectual
property addresses protection of different types of innovations, and
is designed to provide incentives for continuing technological, economic,
and artistic advances. One rationale for a legal system for the protection
of intellectual property is to promote its disclosure and public availability.
For example, without patent protection, it is likely that many inventions
would not be made public, and would instead be kept as trade secrets.
Mr. Smith explained that intellectual property is generally divided
into two broad categories – industrial property, including patents,
trademarks, and industrial designs; and copyright. In the United States,
the Constitution expressly calls for the protection of intellectual property
in the form of patents and copyrights: “The Congress shall have
Power . . . To promote the Progress of Science and useful Arts, by securing
for limited Times to Authors and Inventors the exclusive Right to their
respective Writings and Discoveries . . .” U.S. Constitution, Art.
1, § 8, cl. 8. Accordingly, the Patent Act provides for the protection
of inventions or discoveries, essentially including new and useful processes,
machines, products, and compositions of matter. Industrial designs, which
are ornamental in nature, are also protected pursuant to the Patent Act.
The Copyright Act calls for protection of “original works of authorship,” including
literary works, musical works, dramatic works, choreographic works, pictorial
and sculptural works, motion pictures, sound recordings, and architectural
works. As the Constitution specifies, both patent and copyright protection
are for limited times – generally, 20 years for patents, and the
life of the author plus 70 years for copyrights. The Lanham Act provides
for protection of trademarks, which include symbols, labels, packages,
names, words, and phrases that distinguish goods or services. Trademark
protection is available indefinitely.
In the United States, the United States Patent and Trademark Office
administers patent and trademark protection. The U.S. Copyright Office,
a unit of the Library of Congress, administers copyright protection.
However, national protection of intellectual property is insufficient,
because once it is disclosed it is easily transferred and exploited.
In regard to patents, the need for international protection became clear
in 1873, when foreign inventors who were concerned that their ideas would
be stolen and commercialized refused to provide exhibits for an International
Exhibition of Inventions in Vienna.
International protection of inventions, trademarks, and industrial designs
began with the Paris Convention for the Protection of Industrial Property,
which entered into force in 1884 with 14 member States. In 1886, international
protection of copyright began, with the Berne Convention for the Protection
of Literary and Artistic Works. Both the Paris and Berne Conventions
set up small international offices to administer the functions of the
treaties. Over the years, the number of treaties and the size of the
offices grew, eventually becoming the World Intellectual Property Organization
(WIPO) – an organization that administers 23 treaties on behalf
of 182 member nations. WIPO seeks to harmonize national intellectual
property legislation and procedures, provide services for international
applications for industrial property rights, exchange intellectual property
information, provide legal and technical assistance to developing and
other countries, facilitate the resolution of private intellectual property
disputes, and marshal information technology as a tool for storing, accessing,
and using valuable intellectual property information.
The most successful and widely used of the treaties administered by
WIPO is the Patent Cooperation Treaty (PCT); as mentioned above, Mr.
Smith was the Director for Administration of the PCT at WIPO, and also
at the USPTO. The PCT provides for a single international patent application,
which has legal effect in all countries that are bound by the treaty
and designated by the applicant. By filing a PCT application, an inventor
receives valuable information about the potential patentability of his
invention (through the international search report and the optional international
preliminary examination report) and has more time than under the traditional
patent system to decide in which of the designated countries to continue
with the application. Thus, the PCT system consolidates and streamlines
patenting procedures and reduces costs, providing applicants with a solid
basis for important decision-making. The PCT system now has 125 member
nations and had 110,065 international applications in 2003. Because these
applications covered more than one country, this total represents the
equivalent of about 8.5 million national patent applications.
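The equivalence quoted above implies an average number of designated countries per PCT application; a quick back-of-the-envelope check in Python (the two figures come from the talk, the derived average is our own arithmetic):

```python
# Figures reported for the PCT system in 2003 (from the seminar above).
pct_applications_2003 = 110_065      # international applications filed
national_equivalents = 8_500_000     # approximate national-application equivalent

# The implied average number of designated countries per application.
avg_designations = national_equivalents / pct_applications_2003
print(f"~{avg_designations:.0f} designated countries per application")
```

So each international application stood in, on average, for roughly 77 separate national filings.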
In closing, Mr. Smith emphasized that WIPO strives to ensure that intellectual
property protection benefits all nations. Accordingly, WIPO has developed
a forum to explore intellectual property issues arising from traditional
knowledge and folklore as well as the conservation, preservation, management,
sustainable utilization and benefit-sharing of genetic resources. In
addition, WIPO is giving special attention to issues facing least-developed
countries as they turn to meeting their intellectual property obligations
as members of the World Trade Organization.
Visit to the National Institute of Standards and Technology,
U.S. Department of Commerce
(Gaithersburg, Maryland)
(October 6, 2004)
Technology-based innovation remains one of the Nation’s most important
competitive advantages. Today, more than at any other time in history,
technological innovation and progress depend on the unique skills and
capabilities that abound at the National Institute of Standards and Technology
(NIST). NIST has a long and distinguished history of providing the necessary
standards, measurements, and measurement science and technology for the
United States and its industries. Founded in 1901 as the National Bureau
of Standards, NIST is a non-regulatory federal agency within the U.S.
Department of Commerce. Its mission is to develop and promote measurement,
standards, and technology to enhance productivity, facilitate trade,
and improve the quality of life for United States’ citizens. NIST
implements its mission within its laboratories, located within the Gaithersburg,
Maryland and Boulder, Colorado campuses, and three extramural programs:
the Baldrige National Quality Program, the Hollings Manufacturing Extension
Partnership, and the Advanced Technology Program.
The technology, measurements, and standards that are the essence of
the work done by NIST’s laboratories help United States’ industry
to invent and manufacture superior products and to provide services reliably.
NIST also promotes United States’ access to global markets and
a fair marketplace for consumers and businesses. The NIST Hollings Manufacturing
Extension Partnership strengthens the competitiveness of thousands of
America’s small and mid-sized manufacturers – not just preserving
but expanding jobs – with a broad array of technical and business
support services ranging from plant modernization and employee training
to business practices and information technology. NIST’s Baldrige
National Quality Program works closely with manufacturers, service companies,
and educational, health care, and non-profit organizations to develop
and disseminate world-class “best practices” for their management
and operation that result in higher quality products and services.
The NIST Advanced Technology Program (ATP), through a competitive research
and development cost-sharing program, fosters the development of emerging
technologies that enable revolutionary new products, industrial processes,
and services for the world’s markets and helps lay the foundations
for the new industries of the 21st Century.
Upon the ComSci Fellows’ arrival, Mr. Mat Heyman, NIST Chief of
Staff, gave a tour of the exhibits stationed throughout the main lobby
of the Administration Building in Gaithersburg. Walking by the displays,
Mr. Heyman explained how United States’ industries rely on NIST
for standard reference materials and metrology. “NIST is in your
house,” Mr. Heyman informed the ComSci Fellows as they stared at
a refrigerator full of bread, lettuce, milk and other food products.
He went on to explain, “NIST does not work on regulating standards
but works closely with regulatory agencies to ensure that they are all
working from the same base.” It is through the work performed at
NIST that measures of the nutritional value and shelf life of food
have been derived.
United States’ industry is NIST’s customer. NIST is the
keeper of the standard reference materials and makes them available for
industry to measure against in assuring the quality of their products.
Engineers in the NIST labs develop performance criteria and standards
for many of the products that affect our daily lives. They define codes
and standards to increase the structural integrity of our buildings and
their performance when acted upon by forces such as hurricanes or earthquakes.
NIST is also in your doctor’s office. NIST provides the reference
materials for ensuring the quality and accuracy of such things as blood
tests, radiation measurements for medical treatments, and composite materials
that make up dental fillings. In the NIST Crime Laboratory, performance
standards are determined for bullet-resistant armor, fingerprint systems,
and biometrics. But don’t forget, NIST is also very well-known for its
timekeeping expertise. NIST maintains the atomic clock in Boulder, Colorado,
that helps our national power grid run smoothly and supports the millions
of electronic transactions that occur each day, both nationally and internationally.
To compete in what is now a global economy, the United States depends
critically on its ability to conduct innovative research and then translate
that research into new innovative products with a high potential to fuel
economic growth. The mission of the Advanced Technology program is to
accelerate development of innovative, high-risk technologies that enable
multiple end use applications that improve the daily lives of Americans.
Mr. Marc Stanley, Director of the ATP, talked to the ComSci Fellows about
the diversity of technologies that were co-funded by NIST ATP and private
industry, ranging from technologies to improve the fitting of auto parts
for a higher quality product with a competitive edge to DNA chips. Within
the ATP, industry leads the way, identifying those technology areas where
it believes government investment can make the biggest impact. Since
1990, ATP has co-funded 768 projects involving United States for-profit
companies, universities, national laboratories, and non-profit organizations.
It was then time to leave the Administration Building and head for the
NIST Center for Neutron Research (NCNR), one of the four large neutron
scattering facilities in the United States. Neutrons are powerful probes
of the structure and dynamics of materials. They reveal properties not
available to other probes. They can behave like microscopic magnets,
can diffract like waves, or set particles into motion as they recoil
from them. These unique properties make neutrons particularly well-suited
to investigate all forms of magnetic materials such as those used in
computer memory storage and retrieval. Atomic motion, especially that
of hydrogen, can be measured and monitored, like that of water during
the setting of cement. Residual stresses such as those inside stamped
steel automobile parts can be mapped. Neutron-based research covers a
broad spectrum of disciplines, including engineering, biology, materials
science, polymers, chemistry, and physics. The NCNR supports important
NIST research needs, but is also operated as a major national user facility
with merit-based access made available to the entire United States’ technological
community. Each year, over 1,700 research participants from all areas
of the country, from industry, academia, and government, use the facility
for measurements.
A visit to the Ionizing Radiation Division within the NIST Physics Laboratory
was next on the agenda. Ms. Lisa Karam, Acting Division Chief, explained
to the ComSci Fellows the importance of the work performed in this lab
to quality assurance of nuclear pharmaceuticals. This Division provides
national standards for radionuclides used in 13 million diagnostic procedures
and 200,000 therapeutic nuclear medicine procedures annually in the United
States. Mr. Stephen Seltzer, Group Leader of the Radiation Interactions
and Dosimetry Group, showed the ComSci Fellows the radioactive seeds
used in radiation therapies to treat prostate cancer and to prevent restenosis
following balloon angioplasty. He explained how the work of his division
is critical to the calibration of these seeds for accurate dosage. Work
within the Ionizing Radiation Division extends into other areas beyond
medicine. Scientists are exploring applications in worker protection,
environmental protection, and even national defense. For example, after
the 2001 anthrax incident in the United States, research was performed
in this NIST laboratory to determine accurate doses of radiation required
to kill anthrax spores in mail. It is clear that the Ionizing Radiation
Division provides national leadership in promoting accurate, meaningful,
and compatible measurements of ionizing radiations and radioactivity
for applications that improve everyone’s quality of life.
The next stop brought the ComSci Fellows to NIST North, a part of the
NIST Gaithersburg campus where much of the Information Technology Lab
resides, including the Advanced Network Technologies Division where the
motto is “Provide the networking industry with the best in test
and measurement technology.” This Division is committed to improving
the quality of networking specifications and standards and to expedite
the commercial availability of new, high-quality networking products.
In this vein, the High Speed Network Technologies Group has delivered.
Ms. Nada Golmie, Group Leader, with several of her colleagues demonstrated
their Generalized Multi-Protocol Label Switching (GMPLS)/Lightwave Agile
Switching Simulator (GLASS) product. GMPLS/GLASS was developed to address
failure and recovery issues within networks that are fully connected
meshes and that typically comprise our Nation’s communications
infrastructure. With this tool, simulations of complex networks can be
run to perform sensitivity analyses and identify weaknesses within the
infrastructure, eliminating the need for testing on actual networks and
reducing costs.
The ability to manipulate molecules and atoms, and see them one-by-one
has advanced tremendously over the past ten years. Now, because of the
research performed within the NIST Manufacturing Engineering Laboratory’s
(MEL) Nanoscale Metrology Group, one can see and build things at the
atomic scale. Dr. Thomas LeBrun, a physicist within MEL’s Precision
Engineering Division, explained how nanomaterials can be used to build
nano-systems through the process of grabbing and manipulating them with
lasers that act as tweezers. With this technique, scientists can manipulate
components inside a cell without damaging them or the cell wall. Dr.
LeBrun then demonstrated the laser tweezers by “picking up” a
nanowire and moving it across a microfluidic plate. This breakthrough
will have a significant impact on the advancement of the United States’ microelectronic
manufacturing industries.
The last stop of the day was certainly no less intriguing than all of
the previous visits of the day. The Building and Fire Research Laboratory
(BFRL) at NIST performs studies on building materials, computer-integrated
construction practices, fire science and fire safety engineering, and
structural, mechanical, and environmental engineering. It is to NIST
and this lab that the Nation turned to lead a technical investigation
of the World Trade Center (WTC) disaster. Dr. Kevin McGrattan, a mathematician
within the Fire Research Division of BFRL, contributed to this investigation
by trying to answer questions like “Why did the WTC fire not look
much like other fires in high-rise buildings?” To answer this and
other fire-related questions, Dr. McGrattan utilized visual images from
outside of the World Trade Center buildings as input into his simulation
tool. The simulation uses numerical models that are applied to the numerous
factors associated with fires in high-rise buildings such as combustion,
smoke movement, and the interchange of hot and cool gases. Dr. McGrattan
was able to validate his simulation results by using the BFRL test chambers
where actual fires are created in a controlled environment instrumented
with a number of sensors for data collection. As a result of this study
and other on-going innovative research within BFRL, the Nation will benefit
through improvements in the way buildings are designed, constructed,
maintained and used.
Gregory Tassey
Senior Economist
National Institute of Standards and Technology (NIST)
U.S. Department of Commerce
(October 14, 2004)
Topic: Research and Development Investment Trends in Manufacturing
and the Role of Government
Dr. Gregory Tassey gave a compelling presentation about government science
and technology policy and the analytical tools with which to manage
policy. Over the past 15 years or so, many Federal agencies, and especially
NIST, have come under increasing political pressure to justify their
technical programs’ results.
Dr. Tassey began by stating that technical knowledge shouldn’t
necessarily be an end in itself. Rather, investment in basic and applied
research and development (R&D) is an input that leads to an output
of new technologies, which leads in turn to economic growth. Greater
economic growth is an outcome that can be used to address a variety of
social welfare issues, broadly defined, including a higher standard of
living, better health care, greater national security, and so forth.
He also made the distinction between public and private goods. Basic
science falls into the former category because it is widely used by many
people, and so much of it is done by the Government. Applied research
is a pure private good because it benefits only certain segments of the
population and so is typically conducted by private industry. Research
that is not so clearly basic or applied is grounds for debate about whether
Government or industry should be the primary sponsor. In response to
a question about the optimal mix of Government-private sector investment,
Dr. Tassey responded that this needs to be addressed at the microeconomic
level.
In advocating politically for their agencies’ science and technology
(S&T) programs, it is critical for analysts to identify specific
underinvestment in a particular area and to quantify this gap. This should
then lead to strategic planning.
Historically, our intensity of R&D investment hasn’t changed
significantly since the Sputnik era. Geographically within the United
States, a small number of states account for the vast majority of domestic
technological innovation.
In terms of being competitive internationally, “off-shoring” of
jobs has existed for centuries and is not necessarily an economic problem.
However, the plummeting United States’ trade balance since the
early 1990s indicates that the rest of the world has become more competitive.
He dismissed the protectionist rationale that trade barriers give domestic
industries “more time to catch up” as false, in that
such industries just become stale and inefficient.
Next generation (disruptive) technologies account for a very significant
share of industrial profits, although most technological innovations
tend to be incremental.
Dr. Tassey also stressed the importance of the government helping to
build appropriate industrial bases for long-term economic development.
Thus, for example, perhaps the Government should not worry so much about
developing new, more efficient weapons systems per se but should be concerned
with creating the infrastructure and incentives for a strong defense
industrial base that can respond agilely to the Pentagon’s needs.
Somewhat similarly, the goal of a Small Business Innovation Research
program should be to foster an industrial base of small companies that
can be more innovative, nimble, and fill niche markets more efficiently
than large corporations.
Dr. Tassey does a superb job of quantifiably documenting the value of
his agency’s S&T contributions, as well as similar issues on
a macroeconomic level. His economic analyses and understanding of how
politics, economics, and science and technology converge are excellent
models for other agencies. He also did a great job of communicating economic
concepts in layperson’s language. It is only a pity that his work
is virtually unique in the Government.
Dr. Tassey’s website is at: http://www.nist.gov/public_affairs/budget.htm.
Thomas A. Weber
Director, Division of Materials Research
Directorate for Mathematical and Physical Sciences
National Science Foundation
(October 20, 2004)
Topic: Materials Research at the National Science Foundation
Dr. Thomas Weber’s presentation provided a broad overview of materials
research at the National Science Foundation (NSF). In his opening remarks,
Dr. Weber highlighted NSF’s vision,
mission, and strategic goals. NSF was established through an Act of Congress
in 1950. NSF’s vision is to enable the Nation’s future through
discovery, learning, and innovation. NSF’s mission is to promote
scientific progress, advance national prosperity, and secure national
defense. NSF’s strategic goals highlight the role of Ideas, People,
and Tools.
Dr. Weber mentioned that NSF’s focus is on academic institutions
promoting research and education in all areas of science and engineering
except medicine and space. The Assistant Director for Mathematical and
Physical Sciences (MPS) manages the Division of Materials Research and
reports to the Director and Deputy Director of NSF who in turn report
to the National Science Board. In Fiscal Year 2004, a total of 24,860
people were involved in MPS activities.
NSF invests in the best ideas from the most capable people, as determined
by competitive merit review, which judges each proposal against intellectual
merit and broader impacts of the proposed activity. NSF’s support
for materials ranges from fundamental phenomena to functional materials,
devices, and systems. Some of the areas of research include synthesis,
processing, properties, theory and modeling, characterization, design,
and manufacturing. The “materials community” includes materials
scientists, physicists, chemists, engineers, educators and more.
The Division of Materials Research (DMR) funds diverse programs in the
areas of metals, ceramics, electronic materials, condensed matter physics,
etc. The crosscutting programs include materials theory, materials centers,
user facilities and instrumentation, and the Office of Special Programs.
Distributed mechanisms include focused research groups, workshops,
conferences, and NSF-wide programs such as CAREER and EPSCoR. DMR’s support
of materials in Fiscal Year 2003 was approximately $250 million. A total
of 5,814 people were supported by DMR’s research grants in Fiscal
Year 2003.
Some of DMR’s research facilities include Center for High Resolution
Neutron Scattering, Cornell High Energy Synchrotron Source, National
High Magnetic Field Laboratory, Synchrotron Research Center, and National
Nanofabrication User Facility. The Basic Science Cluster includes Condensed
Matter Physics, Polymers, and Solid-State Chemistry. Some examples of
research work are Tunneling Spectroscopy of Electron-in-a-Box Energy
Levels in Metal Particles, Polymers for Self-Assembled Biomaterials,
Dynamics of Macromolecules, and Construction of Metal-Molecule-Metal
Bridge.
The Advanced Materials and Processing Cluster involves metals, ceramics,
and electronic materials. Some examples of current research are in situ
processing of superconducting MgB2-Metal Composites, Domain Specific
Surface Reactivity of Ferroelectric Surfaces, and Growing Virtually Defect-free
Germanium on Silicon.
The Materials Research and Technology Enabling Cluster involves Materials
Research Science and Engineering Centers, Materials Theory, National
Facilities, Instrumentation for Materials Research, and the Office of
Special Programs. Some examples of current research are self-assembling devices,
computational materials design based on novel spectral density functionals,
and the Thomson Problem and Spherical Crystallography.
NSF’s strategic goals involve People (a diverse, internationally
competitive and globally-engaged workforce), Research Experience for
Undergraduates (REU) and Teachers (RET), and Partnerships for Research
and Education in Materials. In the summer of 2003, DMR supported 73 REU
sites and 21 RET sites in which more than 1,000 undergraduates and 100
pre-college teachers participated. DMR also has a Partnerships for Research
and Education in Materials (PREM) program, which awards up to $750 thousand
a year for five years to minority institutions.
One other strategic goal is Tools (accessible, state-of-the-art information
bases and shared tools). DMR supports materials instrumentation and instrument
development efforts through research awards, grants to centers, funding
instrumentation programs and major instrumentation programs.
NSF is actively involved in international activities, and support is
provided through regular awards and co-sponsorship of several international
workshops. One major initiative that is currently underway is to develop
a Materials World Net as a resource for research and education. Examples
of international cooperation are NSF-EC Workshops, International Materials
Institutes, United States-Africa interactions, implementation meetings in
the Asia-Pacific region and Africa, and planning activities with India,
Russia, and the Middle East.
Some data from current international materials collaboration projects
are: 247 collaborative proposals, 51 multi-year awards, and $18.4 million
total award money. Some other International Programs of Interest are:
MPS Distinguished International Postdoctoral Research Fellowships, Pan
American Advanced Studies Institutes, International Research Fellowship
Program, Japan Postdoctoral Fellowships, etc.
NSF is actively engaged with the National Nanotechnology Initiative
(NNI). NSF’s NNI funding in Fiscal Year 2003 was $221 million,
which includes efforts in areas such as biological sciences, engineering,
mathematical and physical sciences, etc. Some of the Fiscal Year 2005
program solicitations are in Nanoscale Science and Engineering (NSE),
Nanotechnology Science and Engineering Education (NSEE). Characteristics
of NSF Centers are: Interdisciplinary/multidisciplinary research groups,
educational component, industrial outreach, and shared instrumentation.
There are several NSE Centers such as Columbia, Cornell, Harvard, Northwestern
etc. NSF has several Materials Research Science and Engineering Centers
located throughout the country (http://www.mrsec.org).
NSF’s nanoscale efforts also include Educational and Societal
Outreach. NSF is educating students and society to be technologically
literate in nanotechnology and encouraging medical professionals to avail
themselves of the latest advances in nanotechnology. NSF is aware of the
societal and educational implications of science and technology advances
and is educating the community about the health implications of nanotechnology.
Finally, Dr. Weber spent some time educating the ComSci Fellows about
scientific ethics. He drew on an example involving a scientist at Bell
Labs who had severely violated scientific ethics.
Overall, this was a very productive seminar. The ComSci Fellows learned
a great deal about NSF, its initiatives in materials research, and NSF’s
focus on nanotechnology.
Websites of interest include: http://www.nsf.gov and http://www.nano.gov.
Anne Kelly
CEO/Executive Director
Federal Consulting Group
A Franchise of the U.S. Department of the Treasury
(October 27, 2004)
Topic: Changes, Challenges, and Opportunities for Future Leaders
Ms. Anne Kelly directs a team of senior executives who provide management-consulting
services to senior leaders throughout the Federal Government. She performs
this function from a broad perspective of the Federal Government obtained
through a variety of senior positions held at the United States Patent
and Trademark Office in the U.S. Department of Commerce. She knows the
inner workings of the Federal Government well and is able to work within
the system to change and improve it. She is a student of women’s
issues in the Federal Government and has been an activist in promoting
the role of women in the Federal Government.
Ms. Kelly presented herself as the hands-on coach that her position
requires her to be. She used powerful metaphors and figures of speech
and loaded the ComSci Fellows with many powerful quotes, which should
help the ComSci Fellows in their own respective quests to make a difference.
Some of these are represented here.
For example, in her presentation she made a compelling case for perpetual
change within the Federal Government (and outside of it for that matter).
She started from the proposition that society has seen more change
in the last 30 years than in all of human civilization before then. To
make this exponential development palpable, she used an effective model.
Assume human civilization, as we know it started 5,000 years ago with
the earliest writings and the invention of the wheel, and project these
5,000 years onto a period of 24 hours. In so doing it can be seen that
the first Olympics appear on our time scale about 13 hours ago. Ten hours
ago, paper was invented and three hours ago, we learned how to print
on it (i.e., Gutenberg). The Star-Spangled Banner was written (and distributed,
printed on paper) one hour ago. The Civil War was 45 minutes ago, the
moon landing 14 minutes. The Internet started three minutes ago, and
September 11, 2001 happened less than a minute ago. A society that changes
at this rate needs a government that can keep pace with it if it is to
lead. In other words it has to change very rapidly, or else it will become
obsolete.
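Ms. Kelly's model amounts to a simple linear rescaling, which can be sketched as follows. The event dates used here are illustrative assumptions for checking the arithmetic, not figures from the talk itself:

```python
# Sketch of the 24-hour model: project 5,000 years of civilization
# onto a single day (present = 2004, the year of the talk).

HOURS = 24
YEARS = 5000

def hours_ago(years_before_present: float) -> float:
    """Map 'years before the present' onto the 24-hour scale."""
    return years_before_present / YEARS * HOURS

# First Olympics, 776 BC -> ~2,780 years before 2004
print(round(hours_ago(2780), 1))    # ~13.3 "hours ago"
# Gutenberg's press, ~1450 -> ~554 years before 2004
print(round(hours_ago(554), 1))     # ~2.7 "hours ago"
# September 11, 2001 -> 3 years before 2004
print(round(hours_ago(3) * 60, 1))  # ~0.9 "minutes ago"
```

On this scale one calendar year compresses to roughly 17 seconds, which is why the events of the last generation crowd into the final minutes of the day.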
Accepting that change is necessary, Ms. Kelly touched upon the principles
and tools that leaders should have at their fingertips to lead change.
These ranged from the practical: the necessity to communicate and convince
(particularly through listening), to the political: “the grapevine
is a powerful tool to effectuate change.” Change should be seen
as a form of “creative destruction,” or, as an engineer would
have it, “planned obsolescence,” or, as a politician might,
“the burning platform.” Some of these tools were presented within the
context of a series of short stories and case studies, which are paraphrased
below.
-- Know the Problem: Government can be seen as a machine whose function
is the automation of [bad] processes. As soon as this is detected,
one has to act, but not randomly or merely for the sake of change.
Instead, follow a course of change followed by evaluation and adjustment.
Change is most productive as an evolutionary process.
-- Create Enthusiasm: In order to get anything done, one has to energize
the troops and get the top leaders committed, by showing what is in it
for them. Allow the boss to take credit for your success. Since every
leader is in fact some sort of a middle manager (“everyone has
a boss”), modern leaders should create a culture of “followership.” In
other words, it makes sense to teach people how to manage up. This means
that if you are the boss, lead by example (i.e., be led).
-- Document your work: According to Ms. Kelly this is the single most
effective use of e-mail, a tool with which many people otherwise do
more damage than good. E-mail is not a good communication tool for change
because it fosters mistrust.
-- Drive the change: Start the process – it makes no sense to
wait until the next boss arrives. The Department Secretary will go, regardless
of who gets elected (note that this remark was made less than one week
before the 2004 presidential elections, in which President George W.
Bush was to win a second term).
-- Baldrige Process: Ms. Kelly is an active examiner for the Baldrige
Quality Award, which places the customer in the driver’s seat.
The Baldrige criteria hold that after the customer, the business systems
are the most critical, followed by the leadership and the employees.
-- Patent Office: The customer used to be the law and law firms.
After careful analysis it turned out that this was ripe for a paradigm
shift. The Patent Office now has a new customer image: inventors. This
may sound trivial but, according to Ms. Kelly, it took a lot of work
to make happen.
-- Department of Homeland Security: At the South Texas port of entry
known as Port Isabel, at any given time 800 people are detained for
some sort of immigration law violation. Port Isabel had the reputation
of being the worst camp for immigration detainees. Top leadership was
aware of the problem but had no idea what the solution was. Finally,
the employees were asked what should be done about this. A two-and-a-half-day
session was organized in which consensus was reached on those things
that were wrong in the detainee center. There was also consensus that
the condition of the aliens should be improved. It turned out that the
alien was a compelling customer.
An important conclusion taken away from this presentation was that the
ability to be trusted is the single most important attribute a leader
can have. Being trusted is not something anyone can achieve overnight.
Ms. Kelly gave the ComSci Fellows some issues to take away and think
about. For example:
-- Recognize the current, ever-changing workforce. Accept the existence
of generational conflicts and enjoy the workforce’s diversity.
-- Allow 360-degree assessment [of yourself]. The more painful you think
it is going to be, the more necessary it is. It is a great trust builder.
-- Lead by example: if you want continuous learning, be a continuous
learner.
Kenneth Alibek
Executive Director
The National Center for Biodefense
George Mason University
(November 3, 2004)
Topic: Bioterrorism
Dr. Kenneth Alibek is the Executive Director for Education and Science
of George Mason University’s National Center for Biodefense and is
a Distinguished Professor at George Mason University. He also holds the
positions of President and Chief Scientist of Advanced Biosystems. Dr.
Alibek is responsible for establishing collaborations with scientific
and other organizations as well as overseeing research for the National
Center for Biodefense. As a Distinguished Professor of Medical Microbiology
and Immunology, Dr. Alibek conducts research and teaches in the areas
of microbiology, immunology, and biotechnology. At Advanced Biosystems,
he leads medical and scientific research programs dedicated to developing
new forms of medical protection against biological weapons and other
infectious diseases.
Dr. Alibek was born in Kazakhstan prior to the break-up of the Soviet
Union and defected to the United States in 1992. He was educated in the
Soviet Union and received multiple degrees in his field of expertise,
including Biological Sciences (Biotechnology), Moscow, Russia, 1990;
Ph.D., Microbiology, Moscow, Russia, 1984; and MD (specializing in Infectious
Diseases and Epidemiology), Tomsk, Russia, 1975. Dr. Alibek served as
First Deputy Chief of the civilian branch of the Soviet Union’s
offensive biological weapons program and has more than 20 years of experience
in the development, management and supervision of high containment (BL-4)
pathogen laboratories. He has extensive knowledge of biotechnology, including
bioprocessing, biological weapons threat analysis; antibacterial and
antiviral drug development; development of regimens for urgent prophylaxis
and treatment of the diseases caused by biological weapons; and mass
casualty handling. He is a former Soviet Army Colonel.
Since defecting to the United States, Dr. Alibek has subsequently served
as a consultant to numerous United States’ government agencies
in the areas of industrial technology, medical microbiology, biological
weapons defense, and biological weapons nonproliferation. He has worked
with the National Institutes of Health and testified extensively before
the U.S. Congress on nonproliferation of biological weapons, seeking
to raise the knowledge base and alertness regarding this important threat
to our country. Dr. Alibek has published articles in a number of classified
journals on developments in the field of biological weapons, the biological
weapons threat, and the medical aspects of biodefense.
Dr. Alibek began his presentation by defining biological weapons: weapons
based on pathogenic microorganisms or toxic substances of biological
origin, formulated in such a way that they are capable of disabling
and/or killing people and livestock, together with the munitions and
delivery systems for their deployment. He described the classes of weapons:
(1) Viral, (2) Rickettsial, (3) Fungal, (4) Toxin, and (5) Bio-regulators
(mediators of various systems). He then described the methods of delivering
the weapon to an adversary: (1) vectors (e.g., mosquitoes), (2) contamination
of food or water sources, or (3) aerosols (described as the most effective
method). Dr. Alibek further described the three categories of weapons
being developed by scientists and engineers in multiple countries.
The first category is “Lethal” weapons like anthrax, plague,
smallpox, Ebola virus, or yellow fever. The second category is “Lethal/Incapacitating,” like
West Nile encephalitis or SARS coronavirus infection. The last category
is “Incapacitating,” like influenza or monkeypox.
After describing the types of biological weapons and the purposes for
using them, Dr. Alibek described the manufacturing capabilities and depot
levels of the former Soviet Union that still exist in parts of Russia
and Kazakhstan: 200 tons of anthrax stockpiled in the Sverdlovsk facility,
20 tons of plague stockpiled in the Kirov facility, and 20 tons of smallpox
stockpiled in the Zagorsk facility. All facilities can manufacture similar
quantities annually to replenish stockpiles that are used or become
obsolete due to shelf-life issues, or to produce stockpiles of the other
agents listed above.
Dr. Alibek continued by describing research on modifying natural strains
of agents into more effective weapons: extending shelf life through
means such as bulk dry storage, introducing binders to protect the spores
when dispensed from munitions, and genetically altering strains to be
more resistant to antibiotic drug therapies. He also discussed the methods
for deploying such weapons: air-delivered cluster bombs, spray tanks,
ballistic missiles, cruise missiles, and special operatives.
Due to Dr. Alibek’s extensive educational background and interesting
life experiences, the ComSci Fellows felt they were extremely privileged
and fortunate to hear Dr. Alibek speak on such a relevant and important
subject of bioweapons and bioterrorism. Dr. Alibek stated there is evidence
that many countries conduct research on, and may also produce, biological
weapons, including North Korea, France, the United Kingdom, South Africa,
Iran, Iraq, Israel, Germany, and possibly Brazil, to name a few; indeed,
he stated that most countries have some sort of biological weapons research
program, if only to understand the threat and the countermeasures to
such a threat.
As for the Soviet Union’s policy, Dr. Alibek stated the Soviet
Union did not stockpile anti-agents or cures to protect its own population,
because it considered using bioweapons analogous to using other weapons
of mass destruction, under the veil of the policy of “mutual
assured destruction.” In other words, the weapons would only
be used as a last resort, where destruction of the entire modern world
civilization, similar to a nuclear holocaust, was the known result. Now
that the iron curtain has fallen and the Soviet Union no longer exists,
there is the pressing concern that such dreadful weapons, or the knowledge
to develop them, could be acquired by terrorist-supporting countries
or non-state actors, and the threat of the use of such weapons has increased.
Visit to Pittsburgh, Pennsylvania
(November 16-18, 2004)
The ComSci Fellows’ three-day trip to Pittsburgh, Pennsylvania
began by stopping first at Penn Power’s Bruce Mansfield Power Plant,
a coal-fired power plant in Shippingport, Pennsylvania. Penn Power is
part of FirstEnergy Corporation. The Bruce Mansfield Power Plant was
built in 1976 and was the first utility plant built with a scrubber system
to remove sulfur dioxide from its emissions. The scrubber system works
by spraying a liquid lime substance into the flue gas, which creates
calcium sulfite, a lime-based by-product.
Ms. Amanda Leech and Mr. John Hindman, Communications and Outreach
Manager, both from Science Applications International Corporation (SAIC),
met the ComSci Fellows outside the Plant’s facilities. Ms. Leech
introduced the group to Mr. James “Jim” Mooney, a bulk materials
specialist, with Bruce Mansfield. On the drive up to the first stop of
the tour of the facilities, Mr. Mooney pointed out various points of
interest on the grounds of the facility, such as where the coal is stored,
the stacks and the cooling towers. He told the ComSci Fellows that the
plant burns seven million tons of coal a year. The bulk of the coal comes
from Pennsylvania mines by barge and by rail. The facility is state-of-the-art
from an environmental perspective.
Upon arrival at the Plant, the ComSci Fellows were given a formal presentation.
It was learned that FirstEnergy developed a process called FOG (Forced
Oxidation Gypsum) for use in the scrubbing system of the smoke stack.
Conventional scrubbing creates a by-product, calcium sulfite, which is
usually placed in a landfill; FOG converts the by-product into commercial-grade
gypsum. Other interesting facts were:
-- Although river water is used in the cooling system, no waste water
is returned to the river.
-- Fifty-five percent of the operating costs of the plant are used for
the environmental system.
-- Five percent of the electricity generated by the plant is used for
the environmental system.
-- At full operating capacity, with all three generators running, 24,000
tons of coal a day are consumed.
-- Ninety-four percent of the sulfite is removed by the scrubbers; 98
percent removal is targeted.
-- 280,000 gallons of water a minute flow over the cooling tower.
-- The Plant produces 56-million kilowatt-hours of electricity a day.
-- The Plant employs 475 people.
-- The U.S. Environmental Protection Agency has its own monitors at
the Plant to ensure environmental compliance.
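The reported figures can be cross-checked with simple arithmetic. The coal heating value used below (24 MJ/kg, typical of bituminous coal) is an assumption for illustration, not a figure from the briefing:

```python
# Rough sanity check of the Bruce Mansfield figures reported above.

KWH_PER_DAY = 56e6             # reported electricity output per day
COAL_TONS_PER_DAY = 24_000     # reported coal consumption (short tons)
KG_PER_SHORT_TON = 907.18
HEATING_VALUE_J_PER_KG = 24e6  # assumed typical bituminous value

# Average electrical output implied by the daily kWh figure.
avg_output_mw = KWH_PER_DAY / 24 / 1000

# Thermal energy in the coal burned per day, converted to kWh.
thermal_kwh = (COAL_TONS_PER_DAY * KG_PER_SHORT_TON
               * HEATING_VALUE_J_PER_KG / 3.6e6)

efficiency = KWH_PER_DAY / thermal_kwh

print(f"average output ~{avg_output_mw:.0f} MW")  # ~2333 MW
print(f"thermal efficiency ~{efficiency:.0%}")    # ~39%
```

The implied efficiency of roughly 39 percent is consistent with a large coal-fired steam plant of this era, which suggests the reported tonnage and output figures hang together.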
The tour of the Plant concluded with a stop to tour the massive turbines
and the control room. High pressure steam that was created from the ignition
of the coal turns the turbines and this generates the electricity. The
control room utilizes a mix of computer controlled and non-computer controlled
sensors.
Later in the afternoon, the ComSci Fellows visited the National Gypsum
Company facility, located adjacent to the Bruce Mansfield Power Plant.
In 1999, the National Gypsum Company built an $85-million facility there
to manufacture wallboard from the gypsum created by the FOG process.
Gypsum is a mineral that naturally occurs
in many parts of the world. In scientific terms, it is hydrous calcium
sulfate. In nature, it usually occurs in veins or ledges and is normally
found close to the surface, where it can be mined or quarried easily.
Gypsum is the only natural substance that can be restored to its original
rock-like state by the addition of water alone. Benjamin Franklin was
one of the first to introduce gypsum to this country. Gypsum is used
in some well-known brands of toothpaste. It is often used as a plaster
to mold everyday objects like plates, cups, eating utensil handles, etc.
By far, the most prevalent use of gypsum is for wallboard manufacture.
Mr. Mark Young, Quality Assurance Manager at National Gypsum, hosted
the ComSci Fellows. After the introductions, the group embarked on a
walking tour of the facility. The first stop was the storage facility
for the gypsum. Over 1,000 tons of FOG a day from Bruce Mansfield is
used at National Gypsum. Additional daily operations of the plant, such
as how the gypsum is moved from the storage facility to the production
area, were explained to the group as they trekked around the facility
grounds.
The gypsum comes to National Gypsum from the Bruce Mansfield Power Plant
on a 1.5 mile-long conveyor belt. After finishing the tour of the outside
of the Plant, the group entered the actual production facility of National
Gypsum.
In the highly-automated, production section of the plant, the group
observed the gypsum slurry being sprayed between a moving sheet of light-colored
paper and another moving sheet of darker-colored paper, effectively making
a “sandwich” that was formed into wallboard at the forming
station. The long, continuous “sandwich” then travels on
belts and conveyors to a knife, where it is cut into panels of specific
lengths. This long line allows time for the gypsum slurry to harden before
it is cut (about four minutes). The panels are turned light-colored paper
side up and sent into the kiln to dry. It was explained that it takes
a total of approximately 45 minutes for the panels to go through the
four drying stages. The panels enter the kiln much like slices of bread
entering a food service toaster oven. The entire process line is one
quarter of a mile long. Once the wallboard is dry, it is a strong, hard,
and fire-resistant building material.
The ComSci Fellows were told that approximately 99 percent of the content
of the wallboard manufactured at National Gypsum is composed of recycled
material from power plants. The Plant produces approximately 3.2 million
square feet of wallboard a day. Many of the ComSci Fellows found the
simple elegance of the automated manufacturing process impressive. It
only takes 13 people on a shift to run the entire Plant. The process
is fast and has the capability to produce wallboard of different thicknesses
with different fire ratings. The tour of National Gypsum was considered
by many to be the highlight of the day’s events.
On the second day of the Pittsburgh visit, the ComSci Fellows found
themselves being led through a coal mine by Mr. Paul Stefko of the National
Institute for Occupational Safety and Health, U.S. Department of Health
and Human Services. Following the excellent tour and briefing in the
mine, the ComSci Fellows settled into a conference room at the National
Energy Technology Laboratory (NETL), where numerous presentations were
given, including a welcome and introduction by Mr. James M. Ekmann, Associate
Director, Office of Technology Imports and International Coordination;
and a brief on America’s energy picture (e.g., supply, distribution,
demand, deregulation, coal, gas, oil, renewables, nuclear, distributed
generation, fuel cells, hydrogen, advanced combustion, FutureGen, Clean
Coal Power Initiative, etc.) and an overview of NETL by Mr. Ekmann.
Following lunch, Dr. Anthony Cugini, Computational and Basic Sciences
Focus Area Lead, briefed the ComSci Fellows on “computational energy
science.” Additional briefings followed: “geological sequestration
and CO2 capture” by Dr. Curt White, Senior Management and Technical
Advisor, Office of Science, Technology and Analysis; “watershed
science and technology” by Mr. Terry E. Ackman, Geosciences Division,
Office of Science, Technology and Analysis; “environmental quality
technologies” by Dr. Evan Granite, Research and Chemical Engineer,
and Mr. Donald Martello of the Environmental Science Division, Office
of Science, Technology and Analysis.
The day ended with a visit to the Air Quality Monitoring Facility.
The first stop on the third day was to the University of Pittsburgh
Medical Center where the ComSci Fellows were welcomed by Ms. Jody Cervenak,
Chief Information Officer, Physician Division, and Mr. Dan Drawbaugh, Chief
Information Officer for the University of Pittsburgh. Also present were
Ms. Cathy Poole, Integrated Medical Information Technology Systems; Dr.
G. Daniel Martich, Executive Director of E-Records; and Dr. Loren Roth,
Senior Vice President and Chief Medical Officer, Quality Care, Associate
Senior Vice Chancellor of the Health Sciences at the University of Pittsburgh,
and co-chairman of bioterrorism preparedness.
Dr. Loren Roth was the first speaker of the morning and spoke about
the University of Pittsburgh Medical Center (UPMC) system, which consists
of the university and the medical center complex. The two separate corporate
entities share expertise and pursue the goal of providing the best possible
patient care. UPMC ranks eighth nationally in NIH funding, is a national
leader in the use of advanced information technology, and is the second-largest
employer in western Pennsylvania with 39,000 employees. One of UPMC’s
goals is to optimize health care delivery through information technology.
The second speaker of the morning was Mr. Dan Drawbaugh who continued
lecturing on the UPMC system and the electronic health record initiative.
UPMC is an integrated health care system consisting of 20 hospitals in
western Pennsylvania and 1 hospital in Palermo, Sicily. There are approximately
2,000 physicians with UPMC and more than 4,000 with privileges at UPMC
hospitals. In addition, UPMC operates an insurance division, provides
diversified services such as home care and nursing home care, and invests
in diverse health related industry, including bioinformatics and medical
equipment.
The next speakers were Dr. Martich and Ms. Cervenak, who spoke about
the National Strategic Agenda. The goals are to inform clinical practice
through electronic health records, interconnect clinicians, personalize
care and improve population health. The first goal has been met with
the development of electronic records (e-records) of patients. These
e-records are accessible to all of the hospitals/physicians in the UPMC
system. UPMC now requires data entry by all physicians including drug
prescriptions. The University component has also started a new course
for medical students/pharmacists/nurses on the benefits of data entry.
A pilot project, designated Med-Track, is aimed at meeting the second
goal of the National Strategic Agenda. Med-Track will improve the communication
infrastructure between physicians through e-records. The e-records will
allow physicians within the system to have access to medications, allergies,
lab data and radiographic data etc. for any patient with an established
e-record. In response to the third goal, a pilot project designated “health
track” is being tested by UPMC. The health track system will allow
physicians to directly send messages to patients and to continuously
monitor their patients. Similarly, patients will have better access to
their physicians. Built into the system is the ability of patients to
make on-line appointments and monitor other factors such as weight and
BMI. These efforts, based on the National Strategic Agenda, have not
gone unnoticed: Information Week recently recognized UPMC as one of the
most innovative users of information technology in medical and health care.
One of the benefits of this electronic system has been a dramatic decrease
in errors that were previously attributed to illegible handwriting.
The last speaker of the morning was Ms. Poole, who briefly spoke about
UPMC’s partnership with the U.S. Air Force and the development
of high-tech health initiative programs. In 2001, UPMC established
a partnership with the Department of Defense (DOD) that focuses on health
care delivery and technology. The overall goals of this partnership are
to improve patient care through advanced technologies, establish UPMC
as a national model for improving the Nation’s health care delivery
system, and support UPMC’s significant investment in technology.
This partnership with DOD was driven by a decline in specialists in the
private and military sectors, a need to provide ubiquitous access to
care and a desire to improve the quality of care. In 2003, UPMC established
the DOD Program Management Office, which centralizes management of the
DOD programs, identifies business development and assists in government
relations. The medical center will also soon enter into a partnership with
IBM/Hewlett-Packard to promote research in health-related information
technology.
The next stop on day three of the Pittsburgh trip was to Solar Power
Industries, Inc. in Belle Vernon, Pennsylvania. Mr. Richard Rosey, Vice
President of Marketing and Sales, greeted the ComSci Fellows. Mr. Rosey
first gave the group a brief lecture about the company’s history
and about solar power.
The company was first started as part of Westinghouse for the development
of a silicon crystal. It was then sold to EBARA Corporation, which makes
agricultural and water pump machinery. The new company became Ebara Solar
Inc. EBARA Corporation initially invested in Ebara Solar to develop energy
sources for remote irrigation and water supply pumps. In 2003, the parent
company EBARA cut its funding and auctioned off all of Ebara Solar Inc.’s
assets, which were purchased by King of Fans, best known as one of the Nation's
largest manufacturers of ceiling fans. King of Fans subsequently established
the present company, Solar Power Industries Inc., which has 60 employees
and is engaged in the manufacturing, marketing and sales of photovoltaic
(PV) solar module technology.
Solar Power Industries standard products include solar cells, modules
and systems. Approximately 95 percent of the company’s sales are
in solar cells. The solar cells are 150 mm by 150 mm, which is one of
the largest solar cells available in the industry. Their automated cell
processing line is capable of producing these cells using multi-crystalline
silicon wafer substrates, typically 300 microns thick. These cells are
available in two standard bus bar configurations: a two-bus and a three-bus
design. The front surface has a blue silicon nitride anti-reflective
coating deposited by PECVD. This coating minimizes reflectivity and increases
absorbance. The front bus bars and back interconnect contacts are screen-printed
and fired silver, with widths of 2 mm and 5 mm, respectively. The back
surface has an alloyed aluminum layer, making the back side positive
and the front side negative upon illumination. The main factor limiting
the number of these cells that can be sold is the supply of solar-grade
silicon.
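A back-of-the-envelope calculation shows roughly what one of the cells described above delivers. The efficiency (~14 percent, typical of multicrystalline cells of the period) and the standard 1,000 W/m² test illumination are assumptions, not figures from the visit:

```python
# Approximate output of a single 150 mm x 150 mm multicrystalline cell.

CELL_SIDE_M = 0.150       # reported cell dimension
EFFICIENCY = 0.14         # assumed, typical multicrystalline value
IRRADIANCE_W_M2 = 1000.0  # standard test condition (assumed)

area_m2 = CELL_SIDE_M ** 2
watts = area_m2 * IRRADIANCE_W_M2 * EFFICIENCY
print(f"~{watts:.1f} W per cell")  # ~3.2 W
```

At roughly 3 W per cell, the company's standard 30- to 100-watt modules would correspond to strings of about 10 to 30 such cells, which is consistent with the product line described.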
The standard power modules are available in 30-, 50- and 100-watt sizes,
with designs under way for up to 200 watts. The company also provides
custom designs with versatile packaging of the solar cells, opening the
way to integrating the power sources into many portable products, such
as communications, marine, recreational, automotive, and traffic control
applications. They also focus on working with architects to provide true
building integrated photovoltaic modules for commercial and industrial
utility grid connected applications.
Solar renewable energy has grown approximately 25 percent over the
past five years, but most of this growth has occurred outside the United States.
The U.S. Department of Energy has an annual budget designated for solar
power, but only a small fraction of the budget is for manufacturing. California
currently has legislation requiring that a certain percentage of its energy
be derived from solar power. Outside the United States, some countries (e.g., Germany,
Japan) promote the use of solar power. In this regard, most of Solar
Power Industries’ products are exported to Germany and China, whereas the
United States accounts for about ten percent of the market. The solar panels
are sold globally, whereas systems are mostly for local sales.
After the introduction to solar power and Solar Power Industries, Inc.,
the ComSci Fellows were given a field trip around the facilities and
witnessed the production and testing of solar cells.
David W. Houseknecht
Energy Program Manager
U.S. Geological Survey
(December 1, 2004)
Topic: Arctic National Wildlife Refuge (ANWR) Petroleum Assessment
Dr. David Houseknecht joined the U.S. Geological Survey (USGS) in 1992,
serving as Energy Program Manager until 1998. He has worked on Alaska
North Slope basin analysis and petroleum resource assessments since 1995.
He frequently has represented the USGS scientific perspective of ANWR,
NPRA, and other Alaska oil and gas issues to Congress and the Administration.
Previously, Dr. Houseknecht was a professor of geology at the University
of Missouri (1978-1992) and a consultant to the oil industry (1981-1992),
working on domestic and international projects. He received geology degrees
from Penn State University (Ph.D. 1978, B.S. 1973) and Southern Illinois
University (M.S. 1975).
Dr. Houseknecht provided a very timely, interesting, and informative
description of the 1998 petroleum reserve assessment conducted by USGS
for the Arctic National Wildlife Refuge 1002 Area; the presentation included
an economic analysis of the viability of making the large investment
in the exploration, production, and pipeline infrastructure necessary
to bring the remote petroleum reserves to market in the United States
and abroad. Dr. Houseknecht also described the environmentally sensitive
issues involved in the potential development of the reserves, which are
located in the northwestern part of ANWR on the Arctic Ocean coastline.
ANWR was established by the Alaska National Interest Land Conservation
Act in 1980. In Section 1002 of the Act, Congress deferred a decision
regarding future management of the 1.5-million-acre coastal plain (“1002
Area”) in recognition of the area’s potentially enormous
oil and gas resources and its importance as wildlife habitat. A report
of the resources (including petroleum) of the 1002 Area was submitted
in 1987 to Congress by the Department of the Interior (DOI). Since completion
of that report, numerous wells have been drilled and oil fields discovered
near ANWR, new geologic and geophysical data have become available, seismic
processing and interpretation capabilities have improved, and the economics
of North Slope oil development have changed significantly.
The new assessment involved three years of study by USGS scientists,
who coordinated work with colleagues in other Federal agencies, Alaska
state agencies, and several universities. New field studies were conducted,
new well and sample data were analyzed, and new geophysical data were
acquired. Perhaps most importantly, all 1,400 miles of seismic data collected
by a petroleum-industry consortium in 1984-1985 were reprocessed and
reinterpreted. Seismic data collection within ANWR requires an act of
Congress, and these are the only such data ever collected within the 1002
Area. All this information was integrated as basic input into the petroleum
assessment. The results of the study indicated the total quantity of
recoverable oil within the entire refuge is estimated to be between 5.7
and 16.0 billion barrels (95 percent and 5 percent probability range),
with a mean value of 10.4 billion barrels. Recoverable oil within the
ANWR 1002 Area is estimated to be between 4.3 and 11.8 billion barrels
(95 percent and 5 percent probability range); with a mean value of 7.7
billion barrels.
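As an illustration only, the relationship between such a 95-percent/5-percent fractile range and a mean can be sketched with a lognormal resource-size model, a distribution commonly assumed in this kind of assessment. The actual USGS aggregation method is more involved; this is just a consistency check on the reported numbers:

```python
# Fit a lognormal to the reported ANWR fractiles and compare the
# implied mean with the reported mean of 10.4 BBO. Illustrative sketch,
# not the USGS methodology.
import math

f95, f5 = 5.7, 16.0  # reported fractiles, billion barrels of oil
z05 = 1.645          # standard normal 95th-percentile z-score

median = math.sqrt(f95 * f5)            # geometric mean of the fractiles
sigma = math.log(f5 / f95) / (2 * z05)  # lognormal shape parameter
mean = median * math.exp(sigma**2 / 2)

print(f"implied mean ~{mean:.1f} BBO")  # ~10.0, close to the reported 10.4
```

That the implied mean lands within a few percent of the reported 10.4 billion barrels suggests the published fractiles and mean are mutually consistent under a roughly lognormal model.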
A previous assessment of the ANWR 1002 Area’s oil resources was
conducted as part of the 1987 Report to Congress; however, the estimate
made was based on the amount of in-place reserves (not recoverable).
The current assessment for ANWR 1002 indicates an overall increase in
oil reserves when compared to the 1987 estimate; ranges are 11.6 to 31.5
billion barrels of oil (BBO) versus 4.8 to 29.4 BBO (95 percent and 5
percent probabilities) and mean values are 20.7 BBO versus 13.8 BBO.
The increase results from improved
resolution of reprocessed seismic data, which allowed the identification
of many more potential petroleum accumulations in parts of the 1002 Area,
and analog information provided by recent nearby oil discoveries.
Dr. Houseknecht’s formal presentation was followed by a question
and answer period that included topics such as the potential impact of
oil field operations on the fragile ecosystem found in the North Slope
of Alaska, the impact of developing the oil reserves on the current United
States’ crude oil imports, and the likelihood of these oil reserves
being developed at all due to public sentiment for preservation of the
refuge. (Selected excerpts are from USGS Fact Sheet No. 0028-01).
Kathryn Olesko
Associate Professor of History
Department of History and BMW Center for German and European Studies
Georgetown University
(December 8, 2004)
Topic: The Role of Science and Technology in Daily Life
Dr. Kathryn Olesko is Associate Professor in the History Department
of Georgetown College and in the Core Faculty of the School of Foreign
Service, where she is presently Director of the Master of Arts in German
and European Studies Program. She majored in Physics and Mathematics
as an undergraduate at Cornell University, where she also received her
master’s and doctoral degrees in History of Science. At Georgetown
University since 1981, she teaches courses in the history of science
and technology, European intellectual history, German history, and European
Civilization. Her research focuses on the social history of science and
technology in Germany, with special emphasis on how rational beliefs
and actions relate to daily life, local cultures, and personal and professional
identities. In addition, her work covers issues in historical methodology,
everyday life, gender, and industrialization.
She is a former Director of Georgetown’s Program in Science,
Technology and International Affairs and a former Co-Director of the Center
for the Environment. She has held visiting appointments at Princeton
University, Cornell University, and the Max-Planck-Institute for the
History of Science and fellowships from the National Science Foundation
and the National Endowment for the Humanities. She is a Fellow of the
American Association for the Advancement of Science.
Dr. Olesko has published widely on the history of science in Europe
and the United States, and is editor of the annual journal, Osiris, published
by University of Chicago Press for the History of Science Society. Her
current research is on the cultural foundations of science in Germany,
especially the cultural meaning of precision measurement.
Charles E. McQueary
Under Secretary for Science and Technology
U.S. Department of Homeland Security
(January 5, 2005)
Topic: Organization of Science and Technology Activities at the
Department of Homeland Security
Confirmed by the U.S. Senate in March 2003, Dr. Charles McQueary began
his discussion by referring to the monumental reorganization that occurred
within the Federal Government after the 9/11 tragedies. There was an
accelerated ramp up of concern and emphasis on our Country’s internal
security capabilities, which resulted in the development of many programs
and organizational changes. The most monumental of these was the formation
of the U.S. Department of Homeland Security (DHS). Dr. McQueary explained
that although currently effective, the DHS is in its infancy and is still
in the process of defining a clear and adequate mission structure to
evolve and advance to the next stage.
One important component of success for the newly developed Department
was the application of the Nation’s scientific capabilities to
develop technologies to protect against terrorist attacks. A Science
and Technology (S&T) Division was established to be the lead in conducting
research and development activities specifically related to this cause.
Dr. McQueary explained that there are four other divisions besides S&T:
(1) Border and Transportation Security; (2) Emergency Preparedness and
Response; (3) Information Analysis and Infrastructure Protection; and
(4) Management. Additionally, besides the five Directorates of DHS, several
other critical agencies are folding into the Department or being newly
created. He said that technology plays an important role in each of these
offices. Each one’s success is partly attributed to its ability
to efficiently and effectively expedite the process of getting mission
appropriate technologies to those on the front lines of their homeland
protection activities.
One area of concern, which has greatly increased its capabilities and
efficiency by utilizing counterterrorism driven technological developments,
is border security. Considerable progress has been made in the ability
of airports, seaports and border patrols to screen cargo and persons
crossing the border and identify threats or potential threats appropriately.
Technologies to improve capabilities for confirming the identities of
international travelers to the United States have sharply increased the
numbers of criminals arrested at border crossings as well as previously
identified fugitives apprehended. Utilizing new digital fingerprint
technology, over 23,000 criminals or terrorists were arrested during
one quarter of 2004.
The DHS S&T planning for the future has adopted the philosophy of
applying a needs- and risk-based approach to research and development
(R&D). The DHS S&T Division leads a department-wide effort to
address R&D requirements of all DHS components by collaborating with
interagency partners to develop an overarching National Strategic Plan
for S&T initiatives. The approach is not to exhaust Department resources
on developing new technologies for their own sake, but to identify existing
requirements and problems and develop solutions to them. The DHS S&T R&D
budget for Fiscal Year 2005 is $1 billion. The budget is aligned with
addressing major chemical, biological, radiological, nuclear, high explosive,
and cyber-related threats. New countermeasures are continually being
developed to defend against each of these. A systems engineering approach
which allows for flexibility and reusability of technologies is being
followed to improve the Nation’s capabilities to prevent, protect
against, and respond to terrorist-related events.
The Fiscal Year 2005 budget for biological countermeasure research was
provided an 84 percent increase over that for FY 2004. Goals in the future
include expanding the existing BioWatch capabilities to monitor air in
urban areas for biological threats. This technology will offer unprecedented
protection to over 30 cities. Second generation BioWatch technologies
will boost efforts in sample collection, analysis, and testing. The Washington,
D.C. metropolitan public transportation system currently has first generation
BioWatch capability. One area of focus that still needs improvement
is the response time needed to react to a detection alert.
The Fiscal Year 2005 budget allocates 12 percent of its funding to research
and development pertaining to prevention, protection against, and recovery
from radiological or nuclear release. Field testing of radiation detection
technologies is currently underway in the actual operating environment
of the Port Authority of New York and New Jersey. Developing advanced
methods for detecting radioactive materials at our borders is a major
concern.
The Homeland Security Advanced Research Projects Agency (HSARPA) is
the primary funding arm of the S&T portion of DHS R&D. This organization
engages the private sector primarily in efforts to detect and counter
chemical, biological, radiological, nuclear and explosive (CBRNE) and
cyber attacks. Around 90 percent of their overall focus is not new development,
but improving existing technologies that are reliable, cost-effective
and can be produced quickly. HSARPA’s current areas of focus include personal
protective equipment for emergency responders, cyber security, unified
incident command technology, CBRNE detection systems and improvised explosive
device detection.
Major efforts are currently underway to improve technologies that provide
protection of ports and coastal waters by, among other things, improving
recognition of small ships or rafts and increasing security on cargo
bound for the United States. Hawkeye is an integrated port and coastal
maritime surveillance system used by the U.S. Coast Guard and other entities.
It helps detect, track, and identify various vessel traffic. Currently
operating in Miami with plans to expand to
Key West, Hawkeye improves detection of possibly harmful and illegal
shipments and decreases maritime law enforcement reaction times. DHS
S&T Border and Transportation Security personnel are developing futuristic “smart
container” systems to enhance security and flow of commerce. Advanced
cargo container security devices are being developed to detect tampering
and track the status of cargo moving through a supply chain. Currently,
only about three percent of all cargo can be checked using a manual effort.
The S&T Office of Interoperability is focused on enhancing emergency
response capabilities of public safety officials and first responders
at all levels of government nationwide. They set national standards and
foster interoperability and compatibility in equipment, communications
and in first responder training. Based on problems with the incompatibility
of communication equipment and systems during 9/11, the S&T Office
of Interoperability is working on implementing a central communication
relay base that will provide the necessary filtering for any one responder
to be able to talk to another.
Additional resources for S&T innovation come from the expanding
network of university-based R&D Centers of Excellence, each with
a different focus on terrorism. Research topics include food security,
foreign animal diseases, behavioral and social aspects of terrorism,
and microbial risk assessments. There are currently four locations with
four additional locations to be added in Fiscal Year 2005.
John S. George
Physiologist
Biological and Quantum Physics
Los Alamos National Laboratory
(January 12, 2005)
Topic: The Human Brain Project
Dr. John George is a research scientist in the Life Sciences and Physics
Divisions at Los Alamos National Laboratory. Dr. George lectured to the
ComSci Fellows on his work in the Human Brain Project at Los Alamos.
The goal of research supported by the Human Brain Project is to develop
composite techniques for non-invasive, functional brain imaging that
provide resolution superior to that of any single available technique. His
group is developing experimental, theoretical and computational procedures
to combine anatomical Magnetic Resonance Imaging (MRI), functional MRI,
and Magnetoencephalography (MEG) into an integrated structural/functional
imaging technique that exploits the strengths and minimizes the weaknesses
of each technique when used alone.
Dr. George first introduced the different techniques that are currently
used to look at brain function/injury, which include MRIs, PET scanning,
MEGs and optical methods. He also explained that there are differences
in conductivity of neural tissue which can be exploited by these different
methodologies. For example, there are differences between the grey matter
and white matter of the brain. The grey matter, which is basically a
sheath around the brain, is the area where the actual "processing" is
done whereas the white matter is the network that provides communication
between different grey matter areas, and between the grey matter and
the rest of the body.
Dr. George’s group uses the retinal model for its studies. Visual
stimulation to the retina can be used to investigate communication between
different types of cells that are found in the human brain. These studies
essentially involve using different types of retinal cell stimulation
to investigate neuroimaging patterns using different neuroimaging techniques
such as functional MRI. Because his work centers on functional neuroimaging
and the development and application of techniques for imaging neurofunction,
Dr. George also works on neural electromagnetic measurement (in particular
MEG and EEG), which provides a number of advantages for the non-invasive
characterization of neural function. As part of his lecture, Dr. George
provided examples of imaging fast optical signals from somatosensory
cortex when different whiskers on a rat are stimulated, demonstrating
the specificity of a response to different stimuli.
Besides his studies in neuroimaging, Dr. George has also been involved
in the development of software that uses digital MRI data to provide
individual anatomical context for source
localization and as a geometrical constraint for modeling. His group
has also developed novel techniques for confocal microscopy and endoscopy,
developed advanced instruments and modeling strategies for macroscopic
photon migration spectroscopy and time-resolved optical tomography, and
demonstrated the feasibility of thermal imaging of neural function. Recently
they have demonstrated dynamic microscopic imaging of fast intrinsic
signals that are tightly coupled to the neuronal electrical response.
Steven L. Rolston
Department of Physics
University of Maryland
(January 12, 2005)
Topic: Quantum Computing
Dr. Steven Rolston received a Ph.D. in Nuclear Physics from the State
University of New York at Stony Brook. He was a post-doctoral research
associate in Atomic Physics at the University of Washington and at Harvard
University. Dr. Rolston was a member of the technical staff of the National
Institute of Standards and Technology (NIST), U.S. Department of Commerce.
Additionally, Dr. Rolston is with the Atomic Molecular and Optical Group
at the Physics Department at the University of Maryland, College Park.
Dr. Rolston explained that quantum computing is a new field combining quantum
mechanics (QM) and information science. He gave a brief history of both,
specifically explaining QM oddities and superposition in atoms. He further
explained the Einstein-Podolsky-Rosen Paradox and entanglement.
Dr. Rolston explained that quantum bits (qubits) can exist in a superposition
of 0 and 1, which gives a quantum register massive storage capability. He
then explained Shor’s Algorithm and presented information about cryptography:
widely used public key cryptographic systems rely on the difficulty of
factoring large numbers, so quantum computing could revolutionize computer
science by applying Shor’s Algorithm to turn factoring into a tractable
quantum mechanics problem.
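The “massive storage capability” of qubits can be illustrated with a short sketch (not from the seminar, and purely classical): describing an n-qubit register classically requires 2^n complex amplitudes, so every added qubit doubles the description size.

```python
import math

def equal_superposition(n):
    """Classical state-vector description of an n-qubit register placed in
    an equal superposition of all 2**n basis states. The length of the
    list is the point: each added qubit doubles the storage required to
    describe the register classically."""
    dim = 2 ** n
    amplitude = 1 / math.sqrt(dim)       # each basis state equally weighted
    return [complex(amplitude, 0.0)] * dim

state = equal_superposition(10)
print(len(state))                        # 1024 amplitudes for only 10 qubits
print(sum(abs(a) ** 2 for a in state))   # measurement probabilities sum to 1.0
```

Even a few dozen qubits would exceed any feasible classical memory, which is why simulating quantum systems classically is hard and why quantum simulation and Shor-style speedups are attractive.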
Dr. Rolston explained that quantum simulation is where one quantum system
simulates another. Quantum communication with attenuated sources is
physically secure in principle and has been demonstrated over kilometer
distances at NIST.
Dr. Rolston summarized the state of the field: many qubit technologies
have been proposed but few demonstrated; the maximum number of entangled
qubits achieved is four; no system has yet demonstrated sufficiently low
decoherence, sufficient fidelity, or a sufficient number of qubits; and
single-qubit measurement has been shown in only a few systems.
Barry Bozeman
Regent’s Professor of Public Policy
School of Public Policy
Georgia Institute of Technology
(January 26, 2005)
Topic: R&D Laboratories in the United States’ National
Innovation System
Dr. Barry Bozeman specializes in science and technology policy, as well
as organization theory and design. He is the author or editor of fourteen
books, including Red Tape and Bureaucracy (Prentice Hall, 2000)
and Limited by Design: R&D Laboratories in the U.S. National Innovation
System (Columbia University Press, 1998). For nearly 20 years, Dr.
Bozeman was on the faculty of the Maxwell School of Public Affairs, Syracuse
University, where he was jointly appointed in the L.C. Smith College
of Engineering and was founding Director of the Center for Technology
and Information Policy. His government experience includes positions
at the Ohio Legislative Service Commission, the National Science Foundation,
and Japan's National Institute of Science and Technology Policy. Dr.
Bozeman has been involved in a wide array of public policy consulting
activities. Among others, he has served as a technology policy consultant
to the U.S. Department of Commerce, Office of the Assistant Secretary
for Technology Policy and the National Science Foundation's Office of
Evaluation. Dr. Bozeman received his Ph.D. in Political Science from
Ohio State University.
Dr. Bozeman has been studying laboratories all over the world (16,000
labs) from the perspective of how they are organized, how innovative
and successful they are, and what national-level guidance and constraints
they face. He has written many papers and books on the subject. From
his analysis he has drawn conclusions about the attributes that help
laboratories succeed: national-level laboratory policy can strengthen
a laboratory’s mission and collaborative efforts and improve how successfully
it transitions research to the market economy. Dr. Bozeman
emphasized national-level policymakers can assist in focusing national
goals. Currently, there are a few countries outside of the United States
that focus their laboratories better than the United States, by developing
a National Innovation System, a rational research plan to affect the
economy based on metrics. His analysis studied why the United States
is so innovative even though, in his opinion, it is not organized from
a national-level policy perspective. He concluded that the United States
is successful because of the order of magnitude more funding it expends
on research compared to other nations. He also stated that
the number of universities in the United States as compared to all other
nations, the desire of international students to attend the universities
in our country, and the desire of international students to remain in
the United States, assist in the United States’ success in innovation.
Mark Modzelewski
Managing Director
Lux Research
(January 26, 2005)
Topic: Presentation of Lux Research Analysis of Competitive Position
of U.S. States in Nanotechnology Research and Commercialization
Mr. Mark Modzelewski is Managing Director of Lux Research. He founded
and was Executive Chairman of The NanoBusiness Alliance, the world’s
fastest growing technology association. He is a member of the Nanotechnology
Technical Advisory Group to President Bush’s Council of Advisors
on Science and Technology (PCAST).
Lux Research is the world’s premier research and advisory firm
focusing on the business and economic impact of nanotechnology and related
emerging technologies. Lux Research has been involved in nanotechnology
for four and a half to five years and is influential in United States
technology public policy. For benchmarking, Lux Research triangulates data from
Fortune 500 firms, intelligence agencies/DOD, and university/nanotechnology
start-ups.
Mr. Modzelewski defined nanotechnology as the “purposeful engineering
of matter at scales of less than 100 nanometers (nm) to achieve size-dependent
properties and functions” and identified a range of products: gold
nanoshell cancer treatments, artificial setae, carbon nanotube crossbar
memory, and semiconductor nanocrystal biolabels.
Nanomaterials enable premium prices and high margins (e.g., clothing
and tennis balls). Materials (e.g., nanotubes), life sciences (e.g.,
alternative to bone) and storage and computing devices are examples of
sectors using nanotechnology.
Governments, corporations and venture capitalists will spend more than
$8.6 billion world-wide on nanotechnology research and development in
2004, with North America and Asia spending about equally. Media coverage
of nanotechnology, both positive and negative, is increasing exponentially,
as are scientific articles and patents. Mr.
Modzelewski stated that the United States is in a great position for
generating patents in nanotechnology.
Andrew F. Mazzara (USMC, retired)
Director
Institute for Non-Lethal Defense Technologies
Penn State University
(February 2, 2005)
Topic: Non-Lethal Defense Technologies and Future Warfare
Colonel Andrew Mazzara gave a very interesting and informative briefing
on non-lethal defense technologies (NLDT) and their use in military and
law enforcement settings. He began by giving some background on himself
and the Institute for Non-Lethal Defense Technologies (INLDT). It is
basically the only such academic-based center in this area and is part
of Penn State University’s Applied Research Laboratory. Penn State
University ranks second in the Nation in terms of university-based total
defense research funding. Penn State University and the INLDT do not
develop any non-lethal defense technologies, but the INLDT assesses the
science and engineering, as well as the program plan for various such
technologies and advises various units of the military and law enforcement
on these technologies' best usage.
Colonel Mazzara spent 28 years in the U.S. Marine Corps in various positions
dealing with artillery. At the end of his formal military career, he
was asked to serve as the leader of a new Department of Defense team
studying non-lethal technologies. He then retired from the military and
worked on other defense-related issues before heading up the INLDT for
the past several years.
There are a wide variety of NLDTs including tasers and other electrical
stun guns, chemical agents, mechanical means such as nets and spikes
to stop vehicles (the military typically wants to stop approaching cars
and law enforcement typically wants to stop fleeing suspects), and delivery
systems that use acoustic energy, ultrasound, and various forms of electromagnetic
radiation to temporarily disable disruptive personnel.
The military uses the term "non-lethal" while law enforcement
personnel typically prefer "less lethal." This is because virtually
any technology, especially one that uses physical force, can cause accidental
deaths in at least a relatively small proportion of instances.
Colonel Mazzara discussed the concept of four generations of warfare
historically. The first generation entailed muskets and other small,
inaccurate firearms, as well as small units of personnel. The second
generation involved machine guns and longer range artillery. The third
generation was characterized by precision munitions and increased firepower.
Currently, we are in the fourth generation with few boundaries between
formal war and peace, and more emphasis on such things as psychological
and information warfare.
This fourth generation warfare is also characterized by the "three
block war." This urban warfare scenario entails military personnel
engaging in a humanitarian mission on one city block, while their comrades
are involved in low intensity conflict on an adjoining block, and the
third consecutive block is characterized by full combat operations.
In general, NLDTs are useful in urban warfare and some crowd control
situations, but are not appropriate for deterring terrorist activities
(such as improvised explosive devices in current-day Iraq). NLDTs could,
however, be successfully deployed to defuse a terrorist hostage situation.
In 1997, the Marine Corps was designated the executive agent for a joint
service (DoD-wide) NLDT program. The Army actively pushed to obtain this
responsibility, while the Marines did not. However, Colonel Mazzara felt
it was precisely because the Marines were culturally disinclined towards
NLDTs that they were assigned this responsibility, the thinking being
that if the Marines embraced NLDTs, then the other military services
would follow suit.
There is a ~$43 million DoD budget for NLDT.
NLDT should not necessarily be the first resort of military or law enforcement
personnel and should not be the only option for any deployed military
force. Much discretion for NLDT use is still given to individual military
and police commanders. There has been discussion about whether
a national policy is needed, but in general these commanders cherish
their individual discretion.
There are many legal reviews before any weapons are fielded. Colonel
Mazzara went over some of the relevant treaties that cover chemical weapons
usage, for example. Overall, he did an excellent job outlining the backdrop,
scenarios and challenges, the technologies themselves, technical issues,
and non-technical issues surrounding NLDT usage. This is likely to be
a future growth area for both the military and law enforcement.
For more information, refer to the website at: http://www.nldt.org.
Michael S. Francis
Program Director
DARPA’s Joint Unmanned Combat Air Systems
(February 9, 2005)
Topic: The Next Generation of Unmanned Air Vehicles
Dr. Michael Francis opened his presentation by pointing out that unmanned
air vehicles have been around for quite some time. Shortly after the
Wright Brothers’ first manned heavier-than-air powered flight at
the beginning of the 20th Century came the "Kettering Bug," the
first UAV.
The development of UAVs has come a long way since then. DARPA views these
vehicles as information systems with an air vehicle as a peripheral.
Dr. Francis contended that many of today's modern aircraft could be considered
as autonomous in the same way. The Boeing 777 and stealth bombers really
could fly themselves and in the case of military aircraft, the human
is on board only to authorize weapons release.
The first generation of UAVs differed only slightly from regular air
frames, with the crew on the ground instead of in the cockpit. The pilots
controlled every maneuver of the vehicle. The UAV was of conventional
design and required a reliable line of sight in order to ensure the vehicle
could be commanded and controlled. These platforms were often given missions
that were considered "dull" for piloted aircraft.
This was contrasted with the contemporary UAV systems, which have moved
to more and more autonomy. The pilot is now called the operator because
the UAVs offer no manual flight controls. They have become almost fully automated
from take-off, navigation, to landing the vehicle itself. Today's UAVs
have extended range and reach, incorporating unique and specialized platforms
housing multi-sensor packages.
The future UAV missions will be anything but dull; in fact, they will
be designed to go places that are extremely dangerous. Missions will
include Suppression of Enemy Air Defenses (SEAD), electronic attack,
surveillance, and strike. Their range will be extensive with significant
endurance once arriving in the area of interest, even with airborne refueling
capability. They will be capable of large payload capacity to include
synthetic aperture radar, electro-optic, infrared, and electronic warfare
capabilities.
The future, in DARPA's plans, will see collaborative operations between
multiple UAVs where formation flight will be choreographed deep into
denied environments. They will be capable of monitoring their own damage
from enemy fire and automatically compensate for changes in aerodynamics
through advanced software. Formations will be able to adapt to the loss
of one or more vehicles to shoot-downs and will be able to integrate out-of-range
UAVs for bi-static pulsing of targets while maintaining the stealth cover
of those in harm’s way. These attributes will only serve to increase survivability
with "predictable effects but unpredictable tactics."
DARPA has made significant progress to date with two prototypes, the
X-45A (first flown in May 2002) and the X-47A (first flown in February 2003).
The X-45A accomplished a multiple vehicle (with a piloted aircraft) coordinated
demonstration in August 2004, the first test on the path to collaborative
operations.
Dr. Francis explained DARPA's business model used in developing these
first prototypes. They have developed a common operating system between
the two defense industry giants participating in the UAV development.
They have also hired an "integrator/broker/observer" to serve
as a third party, facilitator, and referee. DARPA has also allowed for
small business to break into this field so dominated by industry giants
by some unique business practices, which serve to promote competition
and ownership. Dr. Francis described the results as "increasing
the idea pool while decreasing the technological risk."
Dr. Francis concluded with what he saw as some broad challenges to his
project. The first and maybe the highest hurdle is building user and
public confidence over reliability and safety of unpiloted aircraft.
He also discussed the regulatory barriers including airspace control
and vehicle/system certification standards. Integration into the infrastructure
would have to be addressed (e.g., basing, logistics, and maintenance).
Lastly, Dr. Francis discussed meeting affordability expectations. UAVs
were sold on the prospect they would be less expensive than their manned
counterparts. But as more expensive and complex payloads are added and
R&D costs are factored into the baseline, these low-cost expectations
will have to be tempered until mass production lowers the per unit cost.
Dr. Francis indicated that DARPA would be handing the project over to
a joint program office led by the Air Force in Fiscal Year 2007.
Michael Rodemeyer
Executive Director
Pew Initiative on Food and Biotechnology
(February 16, 2005)
Topic: Challenges Facing Genetically Modified Food and Other Agricultural
Biotechnology Products
Genetically modified foods and agricultural products are a lightning
rod for many, although 65 percent of the population knows virtually nothing
about them. The subject is politically-charged, and even terminology
is subject to considerable debate.
Intentional genetic modification of agricultural products has been going
on for over 4,000 years, as man has repeatedly tried (and succeeded)
to selectively breed and strengthen particular characteristics into agricultural
products, including animals. Between these intentional modifications,
as well as naturally-occurring hybridization and cross-pollination, there
are no natural foods today that are the same as those found 4,000 years
ago. These facts are important to recognize because many people today
feel that only recently have agricultural products undergone any genetic
changes. This is clearly not true, although it is correct to say that
only recently have recombinant bioengineering techniques been applied
to achieve genetic modification of foodstuffs. However, the use of recombinant
techniques has the potential to surpass 4,000 years of genetic modifications
in one fell swoop.
The history of food genetically-modified via recombinant techniques
(GM foods) is a relatively short and turbulent one. In the public’s
perception, GM foods have moved uncomfortably fast. The 1990s marked the
first introduction of recombinant-based products for public consumption
and release, including the Flavr Savr tomato, Golden Rice, Ice Minus,
and StarLink corn. Common threads among these products are that they
fared poorly in the marketplace and drew heated opposition from activist
groups concerned about potential health effects and invasion of native
agricultural species. The 1990s also marked the introduction of recombinant
bovine growth hormone and the outbreak of Mad Cow Disease in the United
Kingdom. Though neither was connected to genetically modified (GM) foods,
activist groups were nevertheless able to link them in the public’s perception.
Despite a National Academy of Sciences report that concluded the recombinant
techniques used to create GM foods posed no greater health concern than
other longstanding methods to achieve genetic changes (e.g., gamma radiation,
chemical treatment), there continues to be strong resistance by several
countries as well as activist groups, for a variety of reasons, which
include:
-- Segregation Issues: Segregation and labeling of GM-derived products
is desired by many, but is problematic with the current agricultural
infrastructure. Crops are harvested from many sources and combined into
large containers for shipment and processing. They are extensively co-mingled,
and to keep GM and non-GM products separate requires a vastly different
infrastructure, which would necessitate cleaning of containers and farm
processing equipment to ensure no cross-contamination – all at
very high expense. In the United States, most GM food is fed to animals
or processed to extract components (e.g., corn syrup), so it is several
levels removed from direct human consumption, which keeps concerns from
surfacing.
-- Adventitious Presence: Organic growers do not want contamination
by GM crops and express concern about the presence of GM crops on adjoining
farms. This is even more of a concern when crops are used to produce
biologics. There is tension between local governments that want to encourage
high-technology investments in their areas, and local farmers of non-GM
crops that feel threatened by potential cross-contamination and becoming
non-competitive against the higher value crops.
-- GM Insects and Animals: There are already celebrated examples of
GM animals, such as cloned livestock, but GM livestock have also been
promoted for potential BSE (Mad Cow Disease) resistance and are currently
being developed as bioreactors to produce important proteins, vaccines
and other pharmaceutical products. GM insects, though not yet released
into the environment, have been proposed as a means to control or eradicate
pests and insect-transmitted diseases. While regulators have begun to
consider the framework of jurisdiction of GM animals and insects, there
remain a great many issues still unresolved in this area. For example,
the Food and Drug Administration (FDA) has recently taken the position
that transgenic salmon should be classified as a new animal drug. This forces producers
to follow the same path as pharmaceutical manufacturers in proving the
safety and efficacy of new drugs, which can take up to a decade and cost
up to $1 billion in protracted clinical trials for full approvals. However,
it remains unclear whether the FDA has jurisdiction in the event of accidental
release of these salmon into the wild and what would happen if they cross-bred
with non-GM salmon. This may now cross over into the Environmental Protection
Agency’s (EPA) domain, but these interagency boundaries and jurisdictions
have not been resolved.
-- Social and Cultural Issues: As stated above, the pace of development
and adoption of GM foods has been uncomfortably fast for many in the
general public, and this has led to the strong activist responses
that the GM food industry is encountering. In addition, GM activities
have been associated with globalization worries and concerns about an “American
takeover” on a global scale. This, again, has given many groups
reason for pause and resistance.
Going forward, there will be continued improvements in existing
GM products, but there will probably be no significant new product introductions
for the time being. However, as patents begin expiring, open sources
for GM products continue to expand, and other countries begin to develop
their own GM varieties, the acceptability of GM products worldwide will increase.
There have been attempts to establish the regulatory policy frameworks
required for GM animals and insects, but clearly this remains
a major hurdle.
Ian Noble
Professor of Global Change Research
Australian National University and Adviser to the BioCarbon Fund
The World Bank
(March 5, 2005)
Topic: Global Climate Change
Mr. Ian Noble is the Professor of Global Change Research at the Australian
National University and is currently on a staff exchange program with
the World Bank in Washington, D.C., where he is an advisor on the BioCarbon
Fund and on issues relating to adaptation to climate change. He was responsible
for the technical design of the BioCarbon Fund, which is now operational
and is expected to support projects in developing countries using finance
from the private and public sectors. In Australia, he participated in
the public and policy debate over responses to climate change and served
as a Commissioner in an inquiry into the future of the Australian forests
and forest industries. Mr. Noble is an ecologist by training and has
over 100 publications on topics including animal behavior, vegetation
dynamics, ecosystem modeling, expert systems and the science-policy interface.
In 1999, he was elected as Fellow of the Australian Academy of Technological
Sciences and Engineering.
Mr. Noble’s briefing was titled, “Climate Change: The State
of Play.” The briefing discussed the quantitative tools and processes
he currently uses to guide his recommendations to the World Bank’s
BioCarbon Fund. The models he uses interpolate thousands of years
of carbon data found in ice formations, attempting to extrapolate
future temperatures and sea levels and to gauge how human activity is
affecting the Earth’s climate. His data show that
the Earth’s average temperatures will continue to rise over the next
100 years, causing weather patterns to become more extreme. The cycles
of rainy and dry seasons will last longer, possibly making it difficult
for indigenous people to cope with the types of crops they
are used to planting and eating, especially in third world countries
that do not have the resources to irrigate their crops or plant perennials.
Land currently in temperate regions (the United States, India,
Europe) will become warmer and drier for more days of the year, while
land farther north (Russia, Canada, China), which is currently colder,
will become more usable for growing crops. The primary cause of the change
in temperatures and precipitation is the predicted increase of
carbon in the atmosphere, mostly in the form of carbon dioxide, from
emissions produced by human activity (pollution). Mr. Noble emphasized that
reversing the current upward trend in atmospheric carbon will require
all the nations of the world to work together. If nations do not expend
resources to mitigate the release of carbon, they will have to expend
resources to adapt to the changing climate.
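The fit-the-past, project-the-future reasoning behind such models can be illustrated with a minimal sketch. The data points and the linear model below are invented for illustration only; the models Mr. Noble described interpolate real ice-core records and are far more sophisticated than a straight-line trend.

```python
# Illustrative sketch only: fit an ordinary least-squares trend line to a
# short (invented) temperature-anomaly series and extrapolate it forward.
# This is NOT the World Bank/BioCarbon Fund methodology, just the simplest
# version of "fit past observations, project the future."

def linear_fit(xs, ys):
    """Return (slope, intercept) of the ordinary least-squares line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical temperature anomalies in degrees C -- not real measurements.
years = [1960, 1980, 2000, 2020]
anomaly = [0.0, 0.2, 0.4, 0.8]

slope, intercept = linear_fit(years, anomaly)
projected_2100 = slope * 2100 + intercept
print(f"trend: {slope:.3f} C/year, projected 2100 anomaly: {projected_2100:.2f} C")
```

A real projection would also have to model feedbacks and emission scenarios, which is why Mr. Noble's results are expressed as ranges rather than a single trend line.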
William K. Hubbard
Associate Commissioner for Policy and Planning
U.S. Food and Drug Administration (FDA)
(March 9, 2005)
Topic: Drug Importation
The focus of the presentation was to provide a broad overview of drug
importation. In his opening remarks, Mr. William Hubbard highlighted
key facts about FDA. Subsequently, he discussed FDA’s mission, legislative
history, and the Food, Drug, and Cosmetic Act. Next, he
briefly discussed the history of drug importation and why United States consumers
are buying foreign drugs (the key drivers being price savings, the ease
of Internet purchasing, and confidence in the quality of drugs from places
such as Canada). FDA considers drug importation a fundamental
challenge to the United States’ drug regulatory system.
Mr. Hubbard highlighted some of FDA’s concerns through interesting
visual aids: volume, scope, resources, regulatory authority,
uncertainty about the origin of drugs, “sameness,” untruthful
websites, pharmacy quality issues, disclaimers, and counterfeits. Some
websites claim to be operating in Canada when they are not. A simple
laboratory analysis of popular prescription drugs (e.g.,
Lipitor, Viagra, and Ambien) offered through a website called “Canadian
Generics” failed on most criteria, such as “potency,” “dissolution,” and “impurities.” Many
times fake drugs are sent to United States consumers from shadowy
operations in the third world. Another example of a violation: drugs are
sent as substitutes for a United States brand-name drug even when
the label clearly says the drug is not interchangeable.
Lately, a few states in the United States have been undertaking drug
importation ventures as a service to low-income residents and/or seniors
in their states. Mr. Hubbard presented the issues associated with one such
program in place in the State of Wisconsin. The PSW findings on this
program noted the following: (1) one-third of total prescriptions violated
state agreement; (2) 237 impermissible drugs dispensed; (3) 134 drugs
were non-FDA approved; and (4) six drugs required refrigeration. The
FDA findings noted that two-thirds of total prescriptions violated state
agreement and mostly generic drugs were dispensed. A major point of concern
with the use of the Internet pharmacies (that are linked on the State
of Wisconsin homepage) is that the customer has to sign a release waiver
stating that he/she would not sue the pharmacy for any loss whatsoever.
Often, counterfeit drugs are dispensed and the packaging looks very similar
to the real one. The chain of events after a customer places an order
with an Internet pharmacy is that the pharmacy faxes the order to
an outfit in the Bahamas (for example), which in turn sends the medications
to the United States customer (often making the shipment appear to have
come from Canada or the United States). Many times United
States customers think that they are getting a good deal on price
when they order through these Internet pharmacies. However, a
simple inquiry at the corner pharmacy in the United States shows
that United States prices on common brand-name drugs
are often cheaper than those offered by the Internet pharmacies.
The U.S. Department of Health and Human Services (HHS) published a report
concluding that personal drug importation cannot be done safely, while
commercial importation is possible only with significant new resources and
authority. The report noted that the cost of such a legalized system would
be quite large compared to the savings. It also stated that future
research and development (R&D) would be significantly affected, with
consequent impacts on intellectual property rights (IPR) and liability.
The U.S. Commerce Department published a report on price controls overseas.
The report indicates that price controls in Europe limit R&D, that generic
drugs are inadequately used in Europe, and that eliminating foreign drug
price controls would benefit United States companies.
Mr. Hubbard discussed some avenues that could assure safety: increased
inspection at the border and of foreign pharmacies and foreign manufacturers;
implementation of an alternative drug review process; reciprocity with
foreign countries; and tracking and tracing technology for reimported
American drugs.
The following questions remain: Will Congress enact legislation? Will
congressional leaders permit and vote on importation? Will consumers
continue to seek foreign drugs? Will demand level off? Will more states
and cities establish importation programs? Will FDA take legal action?
The ComSci Fellows learned a great deal about the issues surrounding drug
importation in the Internet age, its political twists, and the tremendous
challenges that FDA and U.S. Customs personnel face in protecting
United States consumers from counterfeit and illegal
drugs from overseas.
Additional information can be found at: http://www.fda.gov/.
Mark A. Boroush
Senior Policy Analyst
Office of Technology Policy
Technology Administration
U.S. Department of Commerce
(March 23, 2005)
Topic: The Department of Commerce’s Role in Technology Transfer
Mr. Mark Boroush, a Senior Policy Analyst in the Office of Technology
Policy (OTP), Technology Administration, U.S. Department of Commerce,
talked to the ComSci Fellows about the role of the OTP in federal technology
transfer policy. Mr. Boroush began by explaining that technology transfer
encompasses a broad range of activities in which technology or knowledge
is developed by one entity and transferred to another. In the government,
technology transfer usually refers to the transfer of early stage technology
developed under federal funding to public or private entities for the
purpose of further development and commercialization. Such transfer often
involves invention disclosure, patenting, patent licensing, and cooperative
research and development (R&D) agreements. However, technology transfer
may take place by other means such as publication or transfer of equipment
or personnel. Many organizations, including universities, private companies,
and government laboratories, participate in technology transfer, and
the activity may be international in scope. In bringing science to market
in this way, the biggest challenge is often convincing industry to invest
enough resources to bridge the “valley of death” between
discovery and commercialization.
Mr. Boroush emphasized that the United States will have to work to sustain
its technological leadership in the years ahead, so as to maintain the
United States as the preferred location for innovation, growth, and creation
of economic value. In this endeavor, new know-how and technology flowing
from federally-funded R&D will continue to be a critical resource
for United States’ economic competitiveness. National R&D spending
is substantial, totaling $276 billion in Fiscal Year 2002, of which 18
percent was for basic research, 24 percent for applied research, and
58 percent for development. Moreover, the Federal Government provided
59 percent of the support for basic research, and 32 percent of applied
research.
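The dollar amounts implied by these percentages are easy to work out. The $276 billion total and the shares come from the figures above; the breakdown below is simple arithmetic, not additional reported data.

```python
# Back-of-the-envelope check of the FY 2002 R&D breakdown quoted above.
# Total and percentage shares are from the text; dollar figures are
# computed here and rounded to the nearest $0.1 billion.
total_billion = 276.0
shares = {"basic research": 0.18, "applied research": 0.24, "development": 0.58}

amounts = {k: round(total_billion * v, 1) for k, v in shares.items()}
assert abs(sum(shares.values()) - 1.0) < 1e-9  # the three shares cover the total
print(amounts)  # basic ~ $49.7B, applied ~ $66.2B, development ~ $160.1B
```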
The current framework for technology transfer of innovation arising
from federally-funded R&D began in 1980, with the enactment of the
Technology Innovation Act of 1980 (“Stevenson-Wydler Act”)
and the University and Small Business Patent Procedures Act of 1980 (“Bayh-Dole
Act”). These and subsequent laws provided for the patenting and
licensing of inventions developed in federal laboratories and federally-funded
laboratories, such as those of universities or small businesses. According
to a 2003 report by the President’s Council of Advisors on Science
and Technology, the transfer of government-funded technology to the private
sector has grown significantly, and has become an increasingly important
part of the overall industrial commercialization of technology. Moreover,
such technology transfer has resulted in many commercial successes and
led to entirely new technology-based industries such as biotechnology
and information technology.
The Department of Commerce’s Technology Administration and Office
of Technology Policy play key roles in federal technology transfer policy
development. The Under Secretary and Assistant Secretary have statutory
roles, and participate in policy discussions and development with practitioners,
federal agencies, the Administration, Congress, and other stakeholders.
In addition, the Department of Commerce’s Office of General Counsel
is the lead agency in resolving legal issues raised by federal agency
patenting and licensing of intellectual property. OTP has a long history
of contributing to the development of major technology transfer legislation.
Another way in which OTP contributes to policy development is through
the coordination of the Interagency Working Group on Tech Transfer (IWGTT).
IWGTT is a long-standing working committee that includes technology transfer
principals from most of the federal science and technology agencies,
including the Department of Agriculture, the Department of Defense, the
Department of Energy, the Environmental Protection Agency, the Department
of Health and Human Services, the Department of the Interior, the National
Aeronautics and Space Administration, the Department of Transportation,
and the Veterans Administration, as well as the Department of Commerce.
The IWGTT is chaired by OTP, and meets monthly to discuss policy issues
and related topics of interest to the federal laboratory technology transfer
community. The agencies reflect a diversity of technology transfer programs,
priorities, authorities, and goals, so IWGTT focuses on advancing policy
in areas of consensus. The group is recognized by Congress as a source
of informed opinion, and has been influential in shaping policy.
OTP also contributes to technology transfer policy by producing congressionally
mandated annual performance reports on federal laboratory technology
transfer. The Technology Transfer Commercialization Act of 2000 governs
the current reporting process, which involves annual performance reports
to the Office of Management and Budget, the President, and Congress.
As a result of this law, the previously “irregular” reporting
process became an annual, substantive matter. The law requires each agency
that operates a federal laboratory to report, “to the Office of
Management and Budget, as part of the agency’s annual budget submission,
on the activities performed by that agency and its Federal laboratories” pursuant
to the technology transfer laws. The agencies submit these annual reports
to the Department of Commerce, where OTP uses them to develop an Annual
Summary Report, including information on all agencies with federal laboratories.
The Annual Summary Report is then submitted to the President and Congress.
The reports cover the goals of each agency’s technology transfer
program; the role of technology transfer in the agency’s mission;
statistics on patenting, licensing, and cooperative agreements; downstream
outcomes such as new products, businesses, job creation, income, and
impact on the economy; and overall “bang for the buck” resulting
from program resources and the intellectual property portfolio. While
this reporting is not quite the same as that required in the Government
Performance and Results Act, it is kindred in spirit and is one response
to the call for greater accountability.
Mr. Boroush predicted that in the future, publicly-funded research would
be increasingly important as a foundation for innovation contributing
to United States’ competitiveness in the global economy. Accordingly,
OTP’s role in developing policy that will make technology transfer
and public/private partnering work more efficiently will be increasingly
important. Moreover, Mr. Boroush was optimistic that this endeavor will
be successful because OTP has learned to ask the questions that prepare
it to undertake this important policy task.
Visit to the Science Applications International Corporation’s
(SAIC) Public Safety Integration Center (PSIC)
(McLean, Virginia)
(March 30, 2005)
Dr. James W. Morentz, Vice President for Homeland Security Technology
and the Director of the Public Safety Integration Center hosted the ComSci
Fellows at their McLean Headquarters. He gave an overview of the lab
and the PSIC mission: to prevent natural, technical, and/or terrorist
events, and, for those events that cannot be prevented, to detect them
in order to alert and protect people and the economy and to respond and
recover effectively.
The ComSci Fellows were able to see first-hand how integrated systems
can be quickly pulled together from a wide variety of legacy hardware
and software and newer government off-the-shelf (GOTS) and commercial
off-the-shelf (COTS) products. A demonstration showed how law enforcement
personnel and first responders could respond to a fictitious suspicious
package at a restaurant in the Washington, D.C. area.
A call was simulated from the restaurant to a 911-dispatch center, which
contacted local police. An explosion model output using the approximate
size of the package described at the restaurant was used to predict affected
areas. The analysis was then superimposed on a map of the local area
and sent to first-responders and displayed on a simulated computer in
a police cruiser on the scene. Software identified where local traffic
should be diverted away from the scene and which police officers were
closest to the divert points.
Public safety radio interoperability is a fundamental capability in
the PSIC, which houses an audio interconnect device for linking disparate
land mobile radio systems and commercial wireless networks together on
a "demand-authorized" basis.
SAIC is working with numerous vendors and service providers to demonstrate
their latest technologies. The vendors work across six major areas towards
a solution: collaboration, access control, intelligence and surveillance,
vulnerability and consequence assessment, interoperable incident management,
and public safety communications.
The goal of the PSIC is to provide the ability to make the right decision
whenever and wherever it is needed, from on-scene first responders to
national decision makers.
Visit to the Turner-Fairbank Highway Research Center
Federal Highway Administration (FHWA)
U.S. Department of Transportation
McLean, Virginia
(April 14, 2005)
The ComSci Fellows spent an exceptional day at the Federal Highway Administration’s
Turner-Fairbank Highway Research Center in McLean, Virginia. The group
was welcomed by Mr. Dennis Judycki, Associate Administrator for Research,
Development and Technology, who also provided an overview of the facility.
Briefings on FHWA Corporate Research and Technology Management and Services
were given by Ms. Deb Elston, Corporate Research and Technology; Ms.
Marci Kenney, Program Development and Evaluation; and Mr. John McCracken,
Research and Technical Services. An overview of the Research and Development
(R&D) Offices was given by Dr. Steve Chase, Infrastructure R&D
and current ComSci Fellow; Mr. Toni Wilbur, Operations R&D; and Mr.
Tom Granda, Safety R&D.
The afternoon was spent touring various laboratories, including the
Structures Lab, the Hydraulics Lab, the Bridge Management Information
Systems Lab, the Aerodynamics Lab, the Traffic Research Lab, the Photometric
and Visibility Lab, and the Highway Driving Simulator. The ComSci Fellows
were treated to a Segway demonstration and had the opportunity to ride
one.
Visit to the Produce Quality and Safety Laboratory, Agricultural
Research Service
U.S. Department of Agriculture
(Beltsville, Maryland)
(April 20, 2005)
This site visit, hosted by Dr. Arvind Bhagwat, Dr. Ken Gross and Dr.
Jim McAvoy, consisted of an overview slide presentation about the mission,
capabilities, and some highlighted research activities at the Beltsville
Agricultural Research Center (BARC).
BARC is the largest and most diversified agricultural research complex
in the world. They conduct research to develop and transfer solutions
to agricultural problems of high national priority and provide information
access and dissemination in order to ensure high-quality safe food and
other agricultural products; assess the nutritional needs of Americans;
sustain a competitive agricultural economy; enhance the natural resource
base and the environment; and provide economic opportunities for rural
citizens, communities, and society as a whole.
One traditional role of BARC is well-known to consumers, and that is
providing objective reference data on food ingredients and commodities,
such as recipes, standardized food labels, and manufacturers' product
information. They are also responsible for standardizing other reference
data, such as portion sizes, nutrient values of ingredients and commodities,
and food group classifications.
Poultry sperm lose functional competence very quickly during liquid
and cryogenic storage, so it is imperative to develop successful storage
methods. BARC scientists have determined the influence of sperm phenotype
on liquid and cryogenic storage conditions. They also determined the
molecular basis of sperm subsistence in the sperm storage tubules using
serial analysis of gene expression (SAGE) and developed an in vitro model
to elucidate molecular and cellular events regulating prolonged oviductal
sperm storage and sustained fertility in poultry.
Non-thermal technologies for inactivation of pathogenic and spoilage
bacteria in foods are another priority area for BARC. Non-thermal approaches
are much better at preserving food flavor and texture, but have not been
as effective as thermal methods in bacterial control. The current emphasis
is to integrate bacteriophages and bacteriocins with other antimicrobials
and with high-pressure processing to enhance microbial inactivation. There
are already several noted commercial applications, including treatment
of orange juice, guacamole, and shucking of mussels. High pressure approaches
also hold potential for neutralization of prions associated with bovine
spongiform encephalopathy (Mad Cow Disease).
New knowledge derived from genome information is critical for understanding
numerous biological processes that operate in plant pathogens and for
developing novel strategies to combat plant diseases. One approach for
introducing new genetic information is to develop and implement viral-based
vectors for the rapid delivery and expression of foreign gene sequences
in plants as a means of testing sequences and protein products which
may be useful in plant and animal disease control, as well as engineering
other desirable characteristics.
Controlling the growth of human pathogens on fresh-cut fruits and vegetables
is a growing concern, as the demand for these types of consumer products
is exploding. The objective of the BARC research is to gain a greater
understanding of the biochemical and molecular genetic mechanisms involved
in the attachment, survival, and growth of foodborne pathogens on fresh
fruits and vegetables.
The overview was followed by an opportunity to evaluate fresh-cut apple
slices treated with antioxidant and preservative solutions developed
at BARC for maintaining the quality and shelf-life of fresh-cut apple
slices. The objective for the dip solution is to prevent browning; maintain
firmness, aroma, and flavor; inhibit microbial growth; be inexpensive;
and be formulated from natural or other additives already accepted in
the food industry. In fact, BARC scientists were successful and had conducted
extensive laboratory and consumer taste-testing to validate their findings.
The same group that developed the dip solution and conducted consumer
taste-tests gave the ComSci Fellows an opportunity to repeat these consumer
taste-tests. The ComSci Fellows evaluated the instrumental and sensory
attributes of apple slices of the “Gold Rush,” “Granny
Smith,” “Fuji,” and “Pink Lady” varieties,
treated with various dip solutions. It was truly remarkable
how well the best prototype dip solution preserved, and for some tasters
actually improved, the flavor and texture of the treated apples.
The visit to BARC concluded with brief tours of some of the laboratories,
which largely resembled those of typical wet chemistry research laboratories.
Visit to the Goddard Space Flight Center
National Aeronautics and Space Administration (NASA)
(Greenbelt, Maryland)
(April 20, 2005)
Dr. D. J. Emmanuel, Visitor Center Operations Manager, and Ms. Nina
Harris of the Public Affairs Office were tour guides for the ComSci Fellows’ visit
to NASA Goddard Space Flight Center. After starting at the Visitor Center,
the group viewed the Hubble Space Telescope cleanroom. They also toured
other spacecraft test facilities in the same building complex such as
the centrifuge and acoustics chambers.
The ComSci Fellows toured the Earth Science Control Center in Building
32 where personnel operate and control a variety of remote sensing spacecraft
that track weather and perform atmospheric, terrestrial, and oceanographic
scientific measurements. Mr. Paul Ondrus handled various questions about
the backgrounds and experiences of the personnel who work in this control
center and the types of data that these spacecraft handle. Goddard Space
Flight Center is the lead NASA Field Center for earth science and more
information about this is available at: http://www.earth.nasa.gov.
The ComSci Fellows enjoyed their tour of the Scientific Visualization
Studio. Dr. Horace Mitchell, a project manager there, demonstrated some
exciting video animations that show changing conditions in Earth and
space science and the like. This studio is a unique facility with
the ability to convert the bytes of information that scientific spacecraft
gather into dramatic visual demonstrations for both scientists and the
general public.
The tour guides also gave the ComSci Fellows some general background
about NASA and Goddard Space Flight Center in particular. More information
is available online at: http://www.nasa.gov and http://www.gsfc.nasa.gov.
Information about NASA history is available at http://history.nasa.gov.
30th Annual AAAS Forum on Science and Technology Policy
(April 21-22, 2005)
Dr. Gilbert S. Omenn, Professor of Internal Medicine, Human Genetics,
and Public Health at the University of Michigan; and President of AAAS,
opened the 30th Annual Forum on Science and Technology Policy and introduced
keynote speaker, Dr. John H. Marburger, Director of the Office of Science
and Technology Policy, Executive Office of the President. Dr. Marburger
defended United States’ research and development investment and
focused his remarks on budgets and the measures of the strength of American
science and technology.
The morning’s plenary session followed Dr. Marburger and concerned
budget and policy issues for research and development (R&D) in Fiscal
Year 2006. Panel members included: Mr. Paul Posner of the U.S. Government
Accountability Office; Mr. Kei Koizumi of AAAS; Mr. Scott Lilly, Senior
Fellow at the Center for American Progress; and Mr. Robert Klein, Chair
of the Board, California Institute for Regenerative Medicine.
Dr. Arden L. Bement, Jr., Director of the National Science Foundation
(NSF), was the featured luncheon speaker. In his remarks, he sketched
the vision of the Foundation. NSF’s aim, he said,
is to foster the Nation's science and engineering strength to power our
economic and social future.
The concurrent afternoon sessions addressed three topics: (1) the future
of scientific communication, (2) a systemic view of the science and technology
(S&T) workforce, and (3) science and global health disasters. Featured
speakers included:
-- Mr. Clifford A. Lynch, Executive Director, Coalition for Networked
Information
-- Dr. Carol Tenopir, Professor, School of Information Sciences, University
of Tennessee-Knoxville
-- Dr. David Stern, Director of Science Libraries and Information Services,
Kline Science Library of Yale University
-- Dr. Donald W. King, Research Professor, School of Information Science,
University of Pittsburgh
-- Mr. Anthony P. Carnevale, Senior Fellow, National Center on Education
and the Economy
-- Mr. William O. Berry, Acting Deputy Under Secretary of Defense for
Laboratories and Basic Science, U.S. Department of Defense
-- Ms. Joan Robinson-Berry, Director, External Affiliation, The Boeing
Company
-- Dr. Norine Noonan, Dean, School of Science and Mathematics, College
of Charleston; and Member, AAAS Board of Directors
-- Ms. Shirley Malcom, Director, Education and Human Resources, AAAS
-- Dr. Henry Masur, Chief, Critical Care Medicine Department, Warren
G. Magnuson Clinical Center, National Institutes of Health; and President-Elect,
Infectious Diseases Society of America
-- Dr. Ali Khan, Associate Director for Science, Division of Parasitic
Diseases, National Center for Infectious Diseases, Centers for Disease
Control and Prevention
-- Dr. Charles H. Riemenschneider, Director, Liaison Office for North
America, Food and Agriculture Organization of the United Nations
-- Dr. Irina V. Dardynskaia, Research Associate Professor, Environmental
and Occupational Health Sciences, School of Public Health, University
of Illinois at Chicago; and Associate Director, Project on International
Research and Training in Occupational and Environmental Health in Russia,
Ukraine, and Belarus
-- Dr. Jerome Donlon, Acting Director, Office of Research and Development
Coordination, Office of Public Health Emergency Preparedness, U.S. Department
of Health and Human Services
The day ended with the 2005 AAAS William D. Carey Lecture. The annual
lectureship recognizes individuals who exemplify the leadership of William
D. Carey in articulating public policy issues related to science and
technology.
The Honorable Rush Holt (D-New Jersey) told the audience that the diminished
influence of science in public policy is posing significant risks to
the United States across a range of areas, from energy and the economy
to education and the underlying spirit of the Nation.
Friday’s program began with a plenary session on “The Role
of R&D in the U.S. and Global Economies,” chaired by F.M. Ross
Armbrecht, Executive Director, Delaware Foundation for Science and Mathematics
Education. Speakers from the United States, Japan, India, and Iran made
presentations considering the centrality of R&D in national innovation
systems and the policy responses that are needed. Their presentations
also provided insights on how the United States measures up in its strategic
use of science and technology capacity, and how other nations use S&T
for economic development.
Dr. Peter Cannon, Managing Partner of VRE Company, presented “A
View from U.S. Industry.” He pointed out that there is no official
United States industrial policy, but various laws and policies are favorable
to R&D. Federal funding, private funding, R&D tax credits and
other incentives effectively create over $300 billion of funding, an
amount that is larger than the total revenue of some countries or United
States sectors, such as aerospace. Dr. Cannon emphasized the importance
of government partnerships with industry, highlighting the Defense Advanced
Research Projects Agency, which is currently threatened; the National
Institute of Standards and Technology; and the National Science Foundation.
He also considered return on investment, and said that revenues resulting
from products developed from investments in R&D are very high. Thus,
science and technology have a substantial payoff in jobs and economic
growth.
Mr. Martin Neil Bailey, Senior Fellow at the Institute for International
Economics and former Chairman of the President’s Council of Economic
Advisors (PCEA), gave his “Perspective on U.S. Federal Government
Activities.” Mr. Bailey stated that the Clinton Administration
recognized the importance of funding science and technology, but the
Bush Administration instead relies on tax cuts and deregulation. Clinton’s
proposed S&T budgets were typically cut by Congress, except for the
budget of the National Institutes of Health, which was often increased
because of the benefits of biomedical research to the Nation’s
health. Thus, Mr. Bailey suggested that scientists should highlight “Pasteur’s
Quadrant” – science that is both basic and applied – in
today’s effort to avoid S&T budget cuts. However, he cautioned
that scientists should not misrepresent the potential payoffs of research,
because that could lead to a perception that science funding “fleeces” the
public. Mr. Bailey suggested that four factors have helped the United
States remain an economic and industrial leader: the dominance of the
United States in high technology; the collapse of the Soviet Union; the
relative economic decline of Japan; and the acceleration of United States’ productivity
in the mid-1990s. A flexible regulatory climate also fosters innovation
in the United States; other countries are far more restrictive. United
States’ scientific and technological growth has also benefited
from an influx of foreign students, but this source of talent is now
threatened by restrictions on immigration. In the future, the United
States will be both a source and recipient of technology used by large
developing countries such as China and India.
Thereafter, three speakers discussed the role of science and technology
in countries at very different stages of scientific and technological
development. Dr. Masayuki Kondo, of Yokohama National University, presented
a “Comprehensive Review of Japan’s S&T Plans,” which
emphasizes four high technology disciplines – nanotechnology, environmental
science, life sciences, and materials science. Although Japan’s
science and technology is highly developed, its approach is quite different
from that in the United States in regard to permanence of researcher
employment and laboratory size. The linkages between science and patents
are good in the United States, but not as strong in Japan. In addition,
United States’ patents are licensed much more frequently than Japanese
patents.
Dr. Marco DiCapua, Science Counselor (India), U.S. Dept. of State, discussed “India’s
Strategic Use of S&T Capabilities.” He emphasized that in the
future, the combination of demographics and growing technological capability
could make India and China world economic leaders. Although at one time
these countries dominated world trade, they fell behind because of a
lack of incentives to innovate and improve product quality. Creative
people immigrated to the United States, to the detriment of India and
the gain of the United States. However, increased competition in the
1990s has helped innovation and product variety. Moreover, India stumbled
onto a new production model for intellectual goods that has revolutionized
the global economy. In this model, developed countries such as the United
States have tapped the vast and inexpensive intellectual talent of countries
such as India by outsourcing software development and other technological
endeavors. Intercontinental fiber optic cables have fostered the transfer
of software and data, and allowed Indian scientists to remain at home.
India is also strengthening its capabilities in biotechnology, pharmaceuticals,
agriculture, and environmental sciences. While an exclusion from patent
protection enhanced the development of a generic pharmaceutical industry,
India must now comply with World Trade Organization requirements, and
provide patent protection for all technologies. Although this will inhibit
India’s production of “copycat” pharmaceuticals, it
may position India to become a global partner in clinical trials of drugs,
vaccines, and medical devices. In sum, technological interactions between
the United States and Asia will increase in the future, with uncertain
effects on both sides, and the possibility of significant changes in
the global economy.
Dr. Hessamaddin Arfaei, of the Iranian Institute for Theoretical Physics
and Mathematics, concluded the morning session with a presentation on “The
State of Science in Iran and its Development in Recent Years.” Dr.
Arfaei noted that private industry has only a minor role in science and
technology in Iran. Initially, Iran aimed to train people for government
service and management, but now its goal is to increase Iran’s
role in scientific research. The largest growth took place after 1995,
with publishing rising from 2,000 papers a year in
2002 to 3,000 in 2003. Despite this increase, Iran produces only a very
small fraction of the world’s science. Before the mid-1980s most
of the research was applied research, not basic research. Although the
basic research taking place now is comparable in quality to that of other
nations, it has not yet had much impact on Iranian society. In “Declaration
2001,” Iran announced its goals of increasing science and technology
in areas that would contribute to cultural development, survival, peace,
and national security. Iran has increased R&D spending from 0.8 percent
of its annual budget in 1978 to the current level of 1.4 percent. Iran
hopes to increase its collaborative efforts and S&T budget so that
in the future, Iran’s research will contribute to its economy.
The Plenary Session in the afternoon, “Science Versus Society?
When Scientific Interests and Public Attitudes Collide,” was moderated
by Dr. Albert H. Teich, Director, Science and Policy Programs, AAAS.
The theme was issues at the intersection of science and religion.
Dr. Eugenie C. Scott, Executive Director, National Center for Science
Education, spoke on “Evolution vs. Creationism/‘Intelligent
Design,’” and explained how scientists can respond to arguments
opposing the teaching of evolution in school. Dr. Scott began by explaining
the history of the dispute over the teaching of evolution in public school
classrooms, beginning with the famous Scopes “Monkey Trial” in
1925. She also explained more recent cases, including the Supreme Court
cases Epperson v. Arkansas (1968) and Edwards v. Aguillard (1987). In
these cases, the Court struck down certain state laws either prohibiting
the teaching of evolution or requiring the teaching of creationism as
impermissibly advancing a religious belief in violation of the Establishment
Clause of the First Amendment. The dissent in the latter case, however,
paved the way for continuing dispute and the repackaging of creationism
as “Intelligent Design.” Nonetheless, today’s anti-evolution
arguments are not very different from those made by William Jennings
Bryan in the Scopes trial. The three “Pillars” of those arguments
are that evolution is a “theory in crisis,” evolution and
religion are incompatible, and that it is only “fair” to
teach creationism with evolution. Dr. Scott rebutted those arguments,
but emphasized that attacks on the teaching of evolution will continue,
and that there are many anti-evolution bills before state and other legislatures.
Accordingly, she urged scientists to explore these issues further.
Dr. John Gearhart, Professor of Medicine, Johns Hopkins University,
made a presentation on “Human Embryonic Stem Cell Research and
Therapeutic Cloning.” In November 1998, Dr. Gearhart’s research
group was one of the first two teams to publish groundbreaking research
on the isolation and maintenance of human stem cells in culture, leading
to the possibility of using human tissues grown in the laboratory to
treat diseased and injured organs. However, even though his research
was privately-funded and was not subject to the federal limitations on
human embryo research because his stem cells were isolated from fetal
tissue, those who oppose stem cell research have attacked Dr. Gearhart
and his research. Accordingly, he has become an advocate for stem cell
research, spending much of his time in discussions in public forums such
as churches. He receives thousands of e-mails daily, primarily from people
wanting body parts. Although fulfillment of this age-old dream of replacing
worn-out organs is still far in the future, it could become a reality
if society invests in the research. However, research in this area has
been distorted for political gain. He believes that our pluralistic society
should accommodate stem cell research, and is pleased to see the state
initiatives to support it. He hopes that federal funding will be expanded,
and that the United States will not lose its international prominence
in stem cell research.
Dr. Lawrence M. Krauss, Professor and Chair, Department of Physics,
Case Western Reserve University, made a presentation entitled, “When
Sentiment and Fear Trump Reason and Reality.” Dr. Krauss cited
various statistics indicating that scientific ignorance is pervasive,
and many are superstitious. Journalists, who are taught to present two
sides to every story, inadvertently abet this state of affairs. In regard
to science, however, one side is usually wrong. The scientific community
is not well-equipped to counter those who attack it. Moreover, the public
is often uncomfortable with the perceived risks of science and is impatient
with the pace at which its benefits are realized. In addition, the public
often doesn’t like to hear the facts, especially when they are
uncertain, e.g., in regard to global warming. Dr. Krauss believes that
when fighting scientific ignorance, it is important to avoid offending
contrary sensibilities, and to ensure that scientists are perceived as
open-minded, honest, and fair. Moreover, religion and science are compatible,
and the scientific community should not become estranged from the rest
of society. Scientists need to become community activists to open the
lines of communication and ensure that the scientific view is not drowned
out.
James Scanlon
Executive Staff Director
National Committee on Vital and Health Statistics
U.S. Department of Health and Human Services (HHS)
(April 27, 2005)
Topic: HHS Information Quality Guidelines
This presentation was about how the Department of Health and Human Services
deals with the information it generates in the pursuit of its mission.
This includes information that is generated both within the Department
and by contracts let by the Department. Typically this is information
related to human health, and by its nature it has the potential to provoke
reactions from producers and consumers of goods and services alike.
Over the years, there have been several rules and regulations that have
attempted to control the validity and quality of information; in particular,
information that is broadly disseminated to the public. The Data Quality
Act of 2001 was mentioned, as were the Office of Management and Budget
(OMB) Data Quality Guidelines of September 2001. Further regulations
at the agency level have been in effect since October 1, 2002.
This regulatory program is comparable to an engineered quality control
plan, but it is difficult to administer because the information is not
as verifiable as most engineering data.
The guidelines apply to all substantive information disseminated by the
agency, with a couple of exceptions. Included are scientific and technical
reports, statistics, and authoritative information:
-- Results of scientific assessments and research studies.
-- Statistical and analytic studies and products.
-- Programmatic and regulatory information including program evaluations.
-- Public health surveillance, epidemiological and risk assessment studies
and information.
-- Authoritative health and medical information.
Generally excluded is administrative information. Another intriguing
exception is peer-reviewed scientific work. This exception is interesting
because it indicates a continued trust in both the scientific method
and the scientific community. Lastly, information can be excepted if
a proper disclaimer is added to the dissemination. This allows individual
scientists to publish data in a non-peer reviewed environment.
Quality guidelines relate to quality, objectivity, utility and integrity
of information. Besides the quality of the data itself, the quality of
the process by which changes to information are made when mistakes are
found is subject to the guidelines.
Finally, the guidelines include reporting requirements to the Office
of Management and Budget.
Visit to the Federal Bureau of Investigation (FBI) Laboratory
Quantico, Virginia
(May 4, 2005)
Dr. Joseph DiZinno, Deputy Administrator, gave the ComSci Fellows an
overview of the Laboratory and its facilities. Because the facility is
accredited by the American Society of Crime Laboratory Directors (ASCLD),
some spaces are off-limits. The FBI processes about 10,000 cases annually,
involving approximately 150,000 items of evidence and 300,000 examinations.
Examiners interpret and testify by comparing the known to the
unknown. Of the approximately 360 employees, the majority are non-agent
scientists; there are about 64 agents in the labs. The Evidence Response
Team collects and preserves evidence in the field. Most of the evidence
is for FBI cases; however, the FBI provides examinations and testimony
free of charge for state and local law enforcement cases. The FBI is
looking at business process mapping to make science better, faster, and cheaper.
The Research Partnership Program improves and leverages forensic science
with joint publishing and peer review. There is a Quality Assurance audit
of laboratory work to ensure procedures are followed. State and local
governments have access to the forensic databases with state and local
fingerprints entered by the local units. Since 9/11, there is more cooperation
and sharing of intelligence. The FBI’s number one priority is preventing
terrorism, so international testing has increased.
The ComSci Fellows were able to get short glimpses of the work done
in several of the case-working units. For example, in the Explosives
Unit, the ComSci Fellows learned that much of the work done involved
post-blast assistance, where they need to analyze many pieces of possible
evidence to aid in piecing together what type of improvised explosives
device was employed, and evidence to trace the origin of the pieces.
To this end, the Explosives Unit has extensive cooperation with most
of the other units in the Laboratory.
Work being done in the Chemistry Unit ranges from more general analyses,
such as drug quantification, pharmacy, toxicology, trace evidence (paint
analysis) inks/dyes, polymers, and accelerants. For many of these analyses,
robotics is used for sample preparation.
Firearms identification is performed in the Ballistics Unit. There is
a reference collection of 5,700 firearms, which starts with Civil War
revolvers and includes automatic weapons. The collection serves as both
a reference library and a parts resource, if a part is needed to obtain
a "bullet signature" from a gun. A water recovery tank is used to discharge
ammunition and recover the bullets for analysis. The FBI destroys from
600 to 1,000 guns annually and converts them into manhole covers.
The two DNA units, Mitochondrial DNA Unit (DNA Unit II) and Nuclear
DNA Unit (DNA Unit I), serve different functions, and the physical state
and amount of evidence determines which type of analysis will obtain
the most useful data. DNA Unit II usually is involved in cold case work,
where the evidence is often degraded and only small samples are available
that must not be contaminated. This type of
DNA provides more general identification, as it indicates familial DNA
and not identification specific to a single individual. Family members
of missing persons submit their DNA for comparisons to skeletal remains.
DNA Unit I, on the other hand, works more with bodily fluid samples,
and determines the unique DNA signature. This unit is connected with
the Combined DNA Index System (CODIS) and the National DNA Index System
(NDIS).
The Trace Evidence Unit deals mostly with hair and fibers comparisons,
which are extremely useful, but not as concrete as DNA evidence. This
Unit also does work with clothing, rope/cordage, glass, soils, as well
as skeletal remain analysis.
The Latent Print Unit deals not only with fingerprint analysis, but
also with latent prints in general, which are prints that require some
kind of development for visualization. Along with the demonstration of
several techniques for print development, it was explained that superglue
binds to moisture on fingerprints.
The Questioned Documents Unit deals with shredded documents, both strip-cut
and cross-cut, sorting the shreds by color. This unit performed the typewriter analysis for the
Unabomber case.
The Cryptanalysis and Racketeering Unit examines evidence relating to
criminal and terrorist organizations. They informed the ComSci Fellows
that Julius Caesar is credited with inventing codes and that Thomas Jefferson
is the father of American cryptography. The ComSci Fellows heard about
the Unit’s success with encrypted codes and messages of the Brian
Patrick Regan spy case.
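The cipher attributed to Julius Caesar is a simple alphabetic shift, in which each letter is replaced by the letter a fixed number of positions further along the alphabet. As a purely illustrative aside (a minimal sketch, not any method the Unit described), it can be expressed in a few lines of Python:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` places, wrapping around the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            # Preserve case by shifting relative to 'A' or 'a'.
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return ''.join(out)

# Encrypting with a shift of 3, as Caesar reportedly used;
# decrypting is simply the opposite shift.
ciphertext = caesar("Attack at dawn", 3)   # "Dwwdfn dw gdzq"
plaintext = caesar(ciphertext, -3)         # "Attack at dawn"
```

Such substitution ciphers are trivially broken today, which is precisely why modern cryptanalysis of the kind practiced by the Unit is a far more demanding discipline.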
The ComSci Fellows agreed that the tour presented a fascinating look
into the range of work done at the Laboratory. One of the more memorable
stories included a description and inside story of the "Shoe Bomber."
Dena Puskin
Director, Office for the Advancement of Telehealth
Health Resources and Services Administration
U.S. Department of Health and Human Services
(May 11, 2005)
Topic: Telehealth – Through the Looking Glass
Telehealth is a new paradigm in delivering health care where the patient
can be geographically separated from their health care provider, but
maintains a virtual link. It can be thought of as a toolbox of technologies
applied to diverse health care needs in a wide range of health care settings.
It integrates or draws upon many different fields, including consumer
health services, information technology, e-business/e-commerce, and many
different educational areas. Telehealth attempts to address some of the major
challenges to cost-effective health care in the United States, including
difficult access to care for certain geographically and functionally isolated
populations, medical errors, an aging (and less mobile) population exerting
a greater demand on the health care system, and shortages of adequate
health care providers.
Dr. Dena Puskin described some of the activities undertaken by the Office
for the Advancement of Telehealth (OAT). OAT has awarded over $250
million in grants since 1989, with $34 million in Fiscal Year 2005 alone.
Although the majority of the awards are congressionally-mandated projects,
there are competitively-selected projects as well, with coverage in 43
states and the District of Columbia. Although OAT accounts for the major
fraction of federally-supported telehealth activities, the above statistics
undercount the full federal investment. There are many allied projects,
not officially classified as telehealth, but with significant relevance
such as distance learning, health informatics, consumer health information
and mentoring activities. There are also significant telehealth
efforts at the National Institutes of Health, the Department of Defense,
and the National Aeronautics and Space Administration. OAT chairs
the Joint Working Group on Telemedicine to ensure coordination and promote
information sharing among the participating agencies. The clinical services
covered by OAT, as well as the other agencies, are quite broad, and include
mental health, dermatology, diabetes, cardiology, radiology, nutrition,
orthopedics, trauma/ER, surgery and endocrinology. Services rendered
typically take two forms: (1) “Store-and-Forward” where the
primary provider takes a collection of still images and sends them to
a consulting physician for evaluation; and (2) “Home Health” where
low-cost equipment is kept at the patient’s home and is used to
interactively check vital signs, monitor medications and general patient
condition, and visualize the patient.
Despite the convergence of sophisticated information technology, immense
pressure to reduce health care costs by the governmental and private
sectors, and the growth of e-commerce and the Internet, growth in telehealth
has been modest at best. The telehealth community, thus far, has only
produced anecdotal evidence to support a business case for further investment,
but this has not been enough to convince policymakers and the larger
health care industry. Consequently, a recent focus of OAT has been to
actively make sure that new projects focus on obtaining rigorous longitudinal
data so decision makers have the information they need to objectively
evaluate telehealth’s clinical efficacy and cost-benefits. This
includes metrics in four key areas: (1) improving access to needed services;
(2) reducing rural practitioner isolation; (3) improving health system
productivity and efficiency; and (4) improving patient outcomes. These
metrics are expected to provide valuable information going forward, but
it will still take several more years to refine these metrics and acquire
the longitudinal data necessary.
Visit to the U.S. Army Medical Research and Materiel Command
(USAMRMC)
Fort Detrick, Maryland
(May 25, 2005)
The ComSci Fellows were welcomed to the U.S. Army Medical Research and
Materiel Command (USAMRMC) in Fort Detrick, Maryland by Chief of Staff,
Colonel Gina Deutsch, who gave the group an overview of the USAMRMC.
The mission of USAMRMC is to enhance the protection of soldiers and leverage
technology to protect the war fighter. In this regard, USAMRMC studies
environment, psychology and combat care and is the only branch of the
military to carry out this type of research and development. The mission
encompasses military operational medicine (how to cope with injuries
on the battlefield), military infectious diseases (biodefense research,
education, and training), and congressional programs to define health care
problems and develop solutions. The core research programs at USAMRMC
are in infectious diseases, military operational medical care, combat
casualty care, and medical chemical and biological research. Overall, the personnel
at USAMRMC are one-third military, one-third civilian and one-third contract
employees.
Mr. Bill Lebherz then spoke to the group about the medical chemical
and biological defense program at Fort Detrick. In 2003, there was a
reorganization of the science management and technology transfer divisions
to address the required capabilities of the armed forces. This reorganization
involved cooperation among the government, industry, and academic
institutions. Because of the unique laboratory facilities at Fort Detrick,
the USAMRMC carries out the efficacy testing for some of the most dangerous
infectious agents and chemicals, whereas the pharmaceutical industry
has been given the task of developing therapeutic agents/vaccines to combat
these agents. These include prophylaxis, pretreatment strategies, nerve
agent therapeutics, vesicant agent therapeutics and decontamination devices
for medical chemical defense and multivalent vaccines, therapeutic vaccines
and alternative delivery methods to combat bioagents.
The third speaker of the morning was Colonel Erik Henchal, a microbiologist
and expert in bioterrorism. Colonel Henchal spoke to the group about
the studies at USAMRMC to combat bioterrorism. Their mission is to conduct
basic and applied research on biological threats resulting in medical
solutions (i.e., prophylactic vaccines, therapies and medical diagnostics
to protect the war fighter). Colonel Henchal once again emphasized the
unique facilities at Fort Detrick and how these facilities have helped
the USAMRMC remain at the forefront of this type of research.
In the afternoon, the ComSci Fellows were greeted by Mr. John Winston
and Ms. Tony Story, who introduced the group to the Telemedicine and
Advanced Technology Research Center (TATRC) and gave the group a tour
of a portable medical facility designed for rapid deployment in the battlefield.
The mission of TATRC is to apply technology to predeployment, deployment,
and post-deployment needs (e.g., care for amputees). The TATRC is an entrepreneurial
branch of the USAMRMC, which is run very much like a business and has
an integrated research team, an integrated product team, and a product
line review. Part of its funding goes to SBIR and STTR programs, which
are set-aside funds for congressionally-directed research. Some of the
products that have been developed through TATRC include the test tube
simulator, digital x-ray, medical robotics and retinal imaging. After
viewing a brief video describing the TATRC vision of directions for using
robotics to improve rescue of soldiers from the battlefield as well as
potential victims of chemical or bioterrorist attacks, the ComSci Fellows
witnessed first-hand the impressive progress that has been made towards
making this vision a reality. Other innovations that have been developed
include the electronic information carrier in a soldier’s dog tag,
which documents injuries and medical care in real time. The
day ended with a tour of the makeshift hospital and demonstrations
of the capabilities of these MASH-style units.
Visit to the Smithsonian Environmental Research Center
Edgewater, Maryland
(June 8, 2005)
The Smithsonian Environmental Research Center (SERC) is one of five
Smithsonian Institution research divisions. SERC focuses on basic science
with some obvious potential applications. SERC was formally established
in 1983 with some of its predecessor organizations dating back to 1929.
Approximately 180 people work at SERC in Edgewater, Maryland. Of these
people, about 17 are principal investigators. SERC is open to the public
during the week and also features two self-guided tours.
After a fun and fascinating canoe trip, the ComSci Fellows were briefed
about the Marine Invasions Research Lab by Mr. Whitman Miller. Much of
the discussion centered on biological species that are introduced into
coastal waters from large ships' ballast water. Large shipping vessels
such as those used for containerized shipping take on ballast water when
they don't have heavy cargo aboard and release ballast when they do have
heavy cargo. While ships are supposed to take on and discharge ballast
water on the open seas, often they do this in coastal areas instead,
causing ecological problems. Tankers typically discharge ballast water
all at once, while container ships dump ballast water more intermittently
as containers are offloaded at different ports.
Species such as zebra mussels and kudzu are carried in ballast water
and can clog municipal water systems. In addition, certain snails eat
other shellfish in the Chesapeake Bay watershed, causing economic impacts.
Other species can foul water and cause diseases such as cholera in humans
and animals that drink this water. If nothing is done to deal with such
problems, further homogenization of coastal species is likely.
On-board treatment systems are showing some promise and some possibilities
include ultraviolet light treatment, physical filtering systems, and
deoxygenation (by pumping in nitrogen).
This Lab is also the home of the National Ballast Information Clearinghouse.
This central registry of information about ballast discharges enables
scientists to understand more about these problems. This clearinghouse
is administered jointly with the U.S. Coast Guard and analyzes such trends
as how many ships enter the United States, how many carry ballast water,
how many discharge it, and where. There are now sizeable daily
fines for commercial ships that do not report the required data.
The Lab also looks at hull fouling organisms such as certain kinds of
barnacles that cling to the hulls of big ships and have a biological
effect on other species. There are special paints to defeat these hull
fouling organisms, but they are very toxic and so some researchers would
like to find better mechanisms to address this set of issues.
Approximately 32 people work in the Marine Invasions Lab. The Lab has
Memoranda of Understanding for formal cooperation with Portland State
University and San Francisco State University.
Next, the ComSci Fellows had an interesting presentation from Dr. Wayne
Coats of the Protistan Ecology Lab. (Protists are defined as one-celled
organisms that have characteristics of both plants and animals, such
as algae, yeasts, and protozoans.)
The discussion centered on dinoflagellate algae that cause "red
tides," a host-parasite system in which both host and parasite are
dinoflagellates; shellfish become contaminated and cause illness in
those who eat them. Dr. Coats explained that his Lab looks at the
formation, persistence, and decline of such biological phenomena. He
and his colleagues also study viruses.
Overall, scientific studies of such host-parasite systems demonstrate
that there is a complex microbial food web (not just the simplistic hierarchical
predator-prey food pyramid). Scientists try to target host specificity.
They are not introducing new parasitic species, but rather targeting
where and when they introduce existing species.
After lunch, Ms. Sharyn Hedrick gave the ComSci Fellows an introduction
to the Phytoplankton Lab. Phytoplankton is at the base of the food chain
and contributes to the brown color of the Chesapeake Bay (not mud). She
has spent several decades sampling the nearby Rhode River and analyzes
such things as light absorption and attenuation. Over this time, she
has seen construction of buildings near the Bay's edges cause significant
ecological damage.
There are seasonal "dead zones" in the Bay in which there
is no oxygen in certain areas and thus no life. In winter, however, new
cold water typically flows in and increases the oxygen content.
After this Lab, the ComSci Fellows met with Mr. Jess Parker, a forest
ecologist at SERC. He led the group on a very informative and engaging
walk through the woods. He stressed the human context for forest development.
He pointed out many varieties of trees and their characteristics. Dominance
is a forest ecology term referring to the fact that a small initial advantage
confers large growth advantages later on for trees. He also explained
that, perhaps contrary to a layperson's intuition, the term old growth
forest refers to a forest with old, medium, and young trees and no evidence
of a key precipitating event, so the overall age of the forest is hard
to date.
SERC’s website can be found at: http://www.serc.si.edu.
A short history of SERC can be found at: http://www.si.edu/archives/historic/history.htm.
Visit to the National Aquarium
(Baltimore, Maryland)
(June 28, 2005)
What do you see when you walk into an aquarium? You see marine life – fish,
marine mammals, plants and maybe even reptiles. But do you see ecosystems?
That’s what Dr. Glenn Page, Director of Conservation for the National
Aquarium in Baltimore, wants you to see when you walk into the Baltimore
Aquarium. Dr. Page’s vision for the Aquarium, and as a matter of
fact, for the rest of the large, not-for-profit aquariums around the
world, is to become a center for inspiration as well as education. That
vision is becoming reality as more and more opportunities, through the
work of Dr. Page’s group, are created for responsible action by
individuals. One such opportunity is the Chesapeake Bay Initiative. The
Chesapeake Bay, which drains a unique 64,000-square-mile watershed yet
is itself extremely shallow, is severely threatened by human activities. The Aquarium’s
Chesapeake Bay Initiative involves restoration of wetlands through the
planting of beneficial marsh grasses by residents within the wetland
communities. These individuals not only plant the grasses; many have
grown them from seed as well. This personal involvement provides the
incentive to continue proactive engagement by monitoring the health
of the wetland, water quality, and use by birds, fish and other wildlife.
Action at the community level helps to ensure consistent and long-term
restoration.
The Baltimore Aquarium does inspire visitors with the many informative
exhibits that are constructed to present a natural environment for the
species highlighted in the exhibit. One of the main attractions is the
shark tank that encircles visitors as they wind down ramps from one level
to another. The ComSci Fellows were able to view these sand tiger and
nurse sharks from above when they were allowed to go “behind the
scenes” on a bridge crossing the open surface of the shark tank.
While touring behind the exhibits, the ComSci Fellows were also able
to see the tanks where sharks and dolphins are brought for examinations
and treatments. The health of the animals is of utmost importance to
the Aquarium staff.
While viewing the stingrays gliding and turning through their 265,000-gallon
pool with a few sharks and a sea turtle, the ComSci Fellows were able
to watch divers carefully feed these animals. Some of the sharks are
collected from the ocean, and after a year in the exhibit, they are tagged
and released as part of the Cooperative Shark Tagging Program of the
National Marine Fisheries Service. Scientists are also finding that through
tagging stingrays in the wild, much is being learned about discrete populations
and their migration patterns off of the North and South American coastlines.
Visit to the University of Maryland’s Center of Marine
Biotechnology
(Baltimore, Maryland)
(June 28, 2005)
The ComSci Fellows’ final visit of the fellowship year was made
to the University of Maryland’s Biotechnology Institute (UMBI),
Center of Marine Biotechnology (COMB) in Baltimore, Maryland. COMB has
earned international acclaim in its two Programs of Excellence on aquaculture
and fisheries biotechnology, and marine microbial biotechnology, and
throughout the ComSci Fellows’ visit the reason for such acclaim
became clear. Crabs and oysters, once in abundance off of the Maryland
and Virginia shores, are now dwindling and endangered. COMB is working
towards reversing these trends. As state lawmakers consider the introduction
of the Asian oyster into the Chesapeake Bay, COMB scientists are studying
potential risks of introducing this non-native species and their potential
impact on the health of the Chesapeake Bay. One question being asked
is whether the Asian oyster, once exposed to the environmental stresses
of the Chesapeake Bay, will be able to resist the parasites that have
decimated the native oyster population. COMB is an optimal
lab from which to study non-native species, as they are an isolated “warehouse” eliminating
the threat of introduction of the species into the natural habitat prior
to a complete and viable assessment.
Blue Crab populations are also in danger along the shores of Maryland
and Virginia. A once-thriving population is showing signs of stress from
over-fishing in under-protected estuaries. The Blue Crab Research Program
at COMB has demonstrated that Blue Crabs, with their multi-stage, complex development
cycle, can hatch and grow in tanks from the larval, or “zoea,” stage
through their post-larval “megalops” stage (in which they
resemble tiny crayfish or lobster) all the way to juvenile miniature
Blue Crabs. In fact, the program has been so successful that COMB has
run out of room to stock all of the baby crabs.
Stocking the Chesapeake Bay with hatchery-raised crabs is a beginning
to rejuvenating the population of Blue Crabs, but once released, these
crabs face an uphill battle in an environment that offers insufficient
grasses and plant life to hide from predators when they molt. The greatest
predator to a Blue Crab is its brother or sister. Cannibalism is rampant
in the crab world, and with random molting patterns making soft-shelled
crabs available at any time, there is plenty of
opportunity to be eaten. COMB scientists are addressing this issue by
trying to determine how to make these brother and sister crabs molt at
the same time. Such a phenomenon would significantly increase the probability
of survival for each crab.
Extremophiles are microorganisms that thrive under conditions that, from
a human perspective, are clearly extreme: high temperature, pH, pressure,
and salt concentration, to name a few. Each group has unique features
that can be exploited to provide biomolecules with a wide variety of
applications ranging from manufacturing to medicine. COMB scientists
are investigating marine microbes that fall into the category of thermophiles,
microorganisms that thrive in high temperatures. They are presently interested
in those that live in temperatures above 100°C. The hope is that these
microorganisms will produce enzymes that will lead to pharmaceuticals
to fight cancer or even produce hydrogen from carbon intake that may
someday help to fuel our future hydrogen economy.
Workshop on Regional, State and Local Initiatives in Nanotechnology
(September 30 - October 1, 2003)
The Workshop on Regional, State and Local Initiatives in Nanotechnology was
sponsored by the National Nanotechnology Coordination Office (NNCO) and the
U.S. Department of Commerce as part of the National Nanotechnology Initiative
(NNI) (see www.nano.gov).
Following a welcome by Dr. Mike Roco, Chairman, Subcommittee on Nanoscale
Science, Engineering and Technology (NSET), Dr. Clayton Teague, Director,
NNCO, stated that the purpose of the workshop was threefold:
- To provide regions, states and locations with information, models, and
networking opportunities to assist them in developing, launching, and nurturing
nanotechnology initiatives;
- To provide information on federal programs relevant to such initiatives;
and
- To assist NSET/NNCO in collecting and disseminating such information.
Dr. Teague also pointed out that state and regional nanotechnology initiatives
allow for accelerated introduction of these technologies into the marketplace,
enable regional, state, and local entities to plan and prepare for disruptions
produced by transitions involving nanotechnology innovations, and provide fundamental
links to federal and international activities in nanotechnologies.
Mr. Phillip J. Bond, Under Secretary of Commerce for Technology, U.S. Department
of Commerce, followed with a keynote speech. He stressed that nanotechnology
is a new frontier in science, one in which the United States may not hold
its usual front-running position. Mr. Bond commented that it is critical
to United States national security to quickly gain leadership in this crucial
technology area. He stated that while the U.S. Government is spending $800
million this year on research and development in this field, spending funds
is not the only requirement. He said we must concentrate on public education
concerning the importance of this technology, facilitate technology transfer
and have a global environment based on trade agreements that effectively
permit the international transfer of these technologies.
The Honorable Aris Melissaratos, Secretary, Maryland Department of Business
and Economic Development, then presented an overview of nanotechnology in
the State of Maryland. The U.S. Government, he stated, spends $9 billion
on research and development in the State of Maryland, mostly in the area
of biotechnology. Mr. Melissaratos noted that the future for each state rests
on its ability to pursue research, development, and engineering in this new
and challenging nanotechnology field, and estimated that in ten years the
market for nanotechnology products could reach $400 billion. He then spent
a few minutes talking about each of the larger science and technology organizations
in Maryland.
Dr. David Sampson, Assistant Secretary for Economic Development, U.S. Department
of Commerce, spoke on the importance of maximizing the impact of every grant
dollar, while fostering regional innovation and competitiveness in nanotechnology.
He said that private industry spends twice as much on technological research
and development as government does; this spending fuels economic expansion
and development, which in turn creates jobs. Dr. Sampson further pointed
out that technology exports accounted for 29 percent of all exports in 2000.
He then concluded his presentation by stressing that commitment by local
communities is crucial to developing a vibrant nanotechnology infrastructure.
These opening speeches were followed by a series of case studies in several
representative states. Ms. Mary Jo Waits, Associate Director, Morrison Institute
for Public Policy, Arizona State University presented the case study for
Arizona. She explained that Arizona chose clusters having business interdependence,
i.e., where businesses relate to each other through the buyer-supplier “food
chain,” as competitors, or as partners; that are export oriented, i.e.,
where many of the companies in the cluster sell products or services to companies
outside the region; and where the cluster is an existing or emerging area
of specialization such as nanotechnology. She stated that the enduring competitive
advantages in a global economy lie increasingly in local things - knowledge,
relationships, and motivation – that distant rivals cannot match.
Dr. Jo Anne Feeney, Senior Business Strategist at Albany NanoTech, observed
that a new paradigm is emerging for technology commercialization and regional
development that merges the strengths of industry with those of government
and universities. She stated that Albany NanoTech at the University at Albany – State
University of New York – is home to the Center of Excellence in Nanoelectronics.
This, she observed, is one of six New York Centers of Excellence, and serves
as the main catalyst for industry-university-public collaboration to develop
innovations that enable the integrated circuit industry to sustain its historic
progress.
Dr. Fraser Stoddard from the California NanoSystems Institute (CNSI) and
the University of California at Los Angeles gave a summary of the characteristics
of an effective regional nanoscience initiative – establishing a viable
organization devoted to promoting this emerging technology; facilitating
transfer of nanotechnology from the academic or governmental research organizations
to industry; and encouraging establishment of educational curricula that
can result in the best trained and capable research technicians and scientists.
Dr. Warren Ford of Oklahoma State University explained that the coordination
of nanotechnology research programs in Oklahoma began in
January 2000 when a group of faculty organized the Oklahoma Network for Nanostructured
Materials (NanoNet). The NanoNet proposed research on single-wall carbon
nanotubes, molecular beam epitaxy routes to semiconductor quantum dots, and
solution-grown colloidal particles, and assembly of these building blocks
into devices. Dr. Ford pointed out that in May of this year the Oklahoma
Legislature passed a resolution creating the Oklahoma Nanotechnology Initiative
(ONI) to further business in nanotechnology via cooperation among companies,
financiers, academe and government.
Mr. Sean Murdock described AtomWorks, an initiative of the Illinois Coalition,
of which he is Executive Director. AtomWorks was formed to foster nanotechnology
in Illinois and, more broadly, to position the Midwest as a world leader in commercializing
nanotechnology-enabled innovations. In less than one year, Mr. Murdock noted,
AtomWorks has made significant progress in creating the entrepreneurial ecosystem
that will be required to secure global leadership. AtomWorks, Mr. Murdock
pointed out, has accomplished these activities by focusing on these four
key activity platforms: 1) Education/Awareness, 2) Advocacy, 3) Resource
Aggregation and Integration, and 4) Community Building.
Dr. Barry Stein, Ben Franklin Technology Partners, followed with an overview
of nanotechnology in Southeastern Pennsylvania. He reviewed the formation
and mission of the Nanotechnology Institute (NTI), saying that NTI arose
from the confluence of three factors:
- The Federal Government’s National Nanotechnology Initiative, which
identified nanotechnology as a principal enabling technology of the early
21st Century, and which provides substantial federal funding to stimulate
nanotechnology
activity;
- The creation of a Commonwealth of Pennsylvania Authority to encourage major
university-based research and development initiatives with integral commercialization
components; and
- Regional interest and capability in the field as evidenced by the participation
of more than 100 company and university representatives in the region’s
first NanoForum, organized by the Ben Franklin Technology Partners of Southeastern Pennsylvania
(BFTP/SEP), at which National Nanotechnology Initiative staff presented the
promise and opportunities of nanotechnology.
The formal presentations were followed by panel and individual discussions
among the participants. Overall the conference accomplished the three goals
stated at the beginning of the first day and reiterated here:
- To provide regions, states, and localities the opportunities for networking;
- To provide information on federal programs relevant to nanotechnology initiatives;
and
- To assist NSET/NNCO in collecting and disseminating information on nanotechnology.
The second day of the workshop continued presentations on the diverse strategies
different regions and states have taken to further initiatives in nanotechnology
to support economic growth. Leading off the sessions were talks focused on
nanotechnology work force development and education. The first presentation
in this session was from Mr. Paul Hallacher, who described the Pennsylvania
regional program, NanoManufacturing Technology (NMT) partnership. The NMT
partnership has now evolved into a consortium of 31 higher education “partners,” including
community colleges, technical colleges, and the state universities, which
support programs including a semester of nanotechnology courses at the Penn
State Nanofabrication
Facility. Continuing the discussion on education in nanotechnology, Mr. Barry
Stein of Ben Franklin Technology Partners described a series of courses,
leading to an AA degree in bio(nano)technology. The regional program includes
select community colleges in Pennsylvania, New Jersey, and Maryland.
In the discussion period following the presentations, the future of workforce
development in nanotechnology was debated. It was agreed that there were
uncertainties in how quickly demands for jobs in this area might grow, and
exactly how nanotechnology
will develop. Currently, micro- and nano-electronics have the biggest demand
for new jobs. It was stressed that there doesn’t seem to be a downturn
in nanotechnology itself; however, the downturn in the national economy is
affecting demand for nanotechnologists, as new hiring is generally down.
The next session covered research infrastructure development. Dr. Tom Picraux
of Arizona State University (ASU) described how investment was made in Arizona’s
Nanotechnology programs through a voter initiative, Proposition 301, which
was passed in November 2000. This initiative imposed a 0.6 percent sales
tax increase for the purpose of enhancing education in Arizona. A portion
of this
20-year initiative provides funds for development of infrastructure enhancement
at the state universities in support of new jobs in Arizona. At ASU, the
investment focuses on biotechnology, nanotechnology, information technology,
and manufacturing
science. The Arizona Biodesign Institute (AzBio) was created in 2002, with
a focus on combining training in biotechnology, nanotechnology and information
technology. Dr. Picraux stressed the investment strategies behind the programs,
which link local strengths with regional strengths. He also emphasized the
program’s interdisciplinary focus, the need to build high-impact focus
areas and teams for joint use, and the importance of partnerships with
regional government and industry.
Dr. Mike Roco of the National Science Foundation (NSF) then presented a national
viewpoint on nanotechnology, describing the NNI Centers with nanotechnology
education components, including those around National Laboratories, such
as Argonne, Brookhaven, and near the University of California at Berkeley.
The final presentation of the workshop was by Mr. Steve Crosby, editor of
the Small Times magazine, on the annual state-by-state ranking by status
in small
technology. This is a sort of “economic race,” where various
statistics are analyzed to compile the rankings. These statistics include
venture capital
tracking, grant-tracking, and private investment of research in universities
toward commercialization, measures of a well-trained work force, and the
cost of doing business in an area. The ranking is in the May/June 2003 issue
of
Small Times and on its website (www.small_times.com), with more information,
including graphics.
At the conclusion of the formal sessions, the organizers requested that the
audience suggest some preliminary points to aid in drafting the report on
the workshop; in other words, what were the key points to take away from
the workshop,
and what were perceived as the key issues?
Among the suggestions:
- To develop “one pagers,” or white papers, to educate state
and local policy-makers;
- To maintain a sense of regional engagement as a new model of economic development;
the need to partner regionally, not just by state;
- To keep in mind that there are great differences in maturity of the different
programs at regional/state levels;
- As the workshop presentations showed, the different programs share
similarities in process, but not necessarily in conclusions and strategies;
- Each program will always have to map out a unique “go to market” strategy
based on the personalities and politics involved;
- The interaction between industry-academia-government is complex, and it
would help implementation to build on other successes;
- Attaining credibility is an important tool for establishing legitimacy;
- There needs to be more work from the programs to support infrastructure;
- The communication process is very important.
The Workshop was interesting in that it focused on the process of encouraging
development, and not so much on the science of nanotechnology. Many models,
both within-state and regional approaches, were presented. There were many
innovative approaches, which ranged from how to promote education in the
new area of nanotechnology to various economic development plans that would
encourage nanotechnology firms to locate to a state or region.
Visit to the National Institute of Standards and Technology
U.S. Department of Commerce
Gaithersburg, Maryland
(October 8, 2003)
The National Institute of Standards and Technology (NIST) is described as
the hidden jewel of the national laboratories because of its paramount
focus on ensuring the reliability and consistency of measurements and standards in
the United States of America – a preeminent position it has held for
more than 100 years.
Established in 1901 as the National Bureau of Standards, NIST today is not
only charged with developing measurements and standards, but also with promoting
technology to enhance productivity, facilitate trade, and improve the quality
of life. Even though NIST functions as a non-regulatory agency, its research,
measurement tools and technical services are integrated deeply into many of
the systems and operations that drive our economy.
NIST laboratories provide technical leadership for vital components of the
Nation’s technology infrastructure. NIST remains the steward of the United
States measurement system and is known for its achievements in physical measurements,
standards developments, test methods, and basic scientific and technical research.
On the threshold of its second century, NIST was granted an expansion
of its duties to include:
- the Advanced Technology Program, to support and enable innovative technologies
that otherwise would not be pursued through co-funding of industrial partners
in the pursuit of pioneering technical research;
- the Baldrige National Quality Program, to encourage and assist United
States’ organizations
in their quest for performance excellence and quality improvement efforts,
as well as manage the highly acclaimed Malcolm Baldrige National Quality
Award; and
- the Manufacturing Extension Partnership Program, to co-fund a nationwide
network of nonprofit technical and business assistance centers to help smaller
manufacturers identify and implement modern production techniques.
The ComSci Fellows started their exciting and very hectic day of touring with
Dr. Arden Bement, the 12th Director of NIST. Dr. Bement gave the group an introductory
tour of the visitor center exhibits, followed by a brief overview of the Institute.
He started by explaining that NIST is located in the Department of Commerce
under the Technology Administration. Counting the scientists, engineers, guest
researchers, technicians, support and administrative staff, NIST has approximately
3,000 people working at its two major sites in Gaithersburg, Maryland and Boulder,
Colorado. NIST has maintained a relatively stable operating budget of $864
million, which includes income from standard reference materials and other
collected fees.
Mr. Marc Stanley, Director of the Advanced Technology Program (ATP) followed
with a high-energy presentation starting with the statement that the ATP, through
its co-funding of powerful new technologies that underlie a broad spectrum
of potential applications, has reaped $16 billion worth of advanced
technology from the $2.1 billion it has invested in the past 13 years of its
existence. The ATP has fostered economic growth by encouraging high-risk, high-payoff
industrial research and development. By sharing the cost of these projects,
the ATP augments industry’s ability to pursue promising technologies,
often accelerating their development for the highly competitive national and
international marketplace. Because of the speculative nature of the projects
it co-funds, ATP has attracted much controversy and criticism. In
spite of the constant need to fight for its budget every year, ATP has managed
to co-fund projects from more than 1,000 small and large organizations in areas
such as electronics and semiconductors, manufacturing technology, information
technology, computing, chemicals, biotechnology and advanced materials.
The next stop of the day took the ComSci Fellows to the NIST Center for Neutron
Research where Dr. J. Michael Rowe, the Director, explained to the group how
the Center is used yearly in collaborative and individual research by over
1,600 scientists and engineers from industry, academia and the government.
The reactor is primarily used to help measure materials at the atomic or molecular
level. It is considered a world-class instrumented laboratory for cold
neutron research and has figured prominently in determining specifications
for measurement technology.
Maintaining consistency in radiation dosage is a critical area of measurement
technology. Ms. Lisa Karem, Acting Chief, and staff led the ComSci Fellows
to the next stop, which took the group to the Quality Assurance in Radiation
Measurement Laboratory in the Physics Laboratory. The importance of getting
consistent dosage of radiation is just as crucial for medical protocols as
it is for irradiation processing of mail. NIST is currently leading a subgroup
of the interagency working group on irradiation processing requested by the
Homeland Security Office.
The Advanced Metrology Laboratory (AML) is nearing completion and will be
a state-of-the-art facility capable of providing research environments not
available in any other laboratory in the world. Mr. Jim Bartlett, Quality Assurance
Manager in the Plant Division led the ComSci Fellows through various sized
lab configurations. They were able to see how the heating and cooling systems
throughout one of the main buildings were installed with vibration isolation,
and sometimes double vibration isolation, to maintain a consistent environment. When
the AML is ready for occupancy in 2004, the 47,480 square-meter (yes, that
is completely metric) building will aid NIST and its industry partners to achieve
higher quality reference materials, improved measurements and standards, and
more rapidly developed research advances.
At their next stop, the ComSci Fellows learned one of the reasons why nanotechnology
is big at NIST. Mr. John Henry Scott, Physicist in the Surface and Microanalysis
Science Division of the Chemical Science and Technology Laboratory showed the
group that with the use of a powerful analytical electron microscope one can
image individual atoms and characterize both natural and engineered nanostructures.
Images from this special microscope can also be used to determine atomic-scale
defects in materials as well as determine the spatial distribution of chemical
elements.
Finally the tour ended with Mr. Kevin McGratten, Mathematician from the Fire
Research Division of the Building and Fire Research Laboratory. The ComSci
Fellows were shown a special computational model developed to show where sprinklers
would be most useful in a room, depending on how the room was used. Various
configurations could be analyzed depending on the inputs to the model calculations. The collapse of
New York City's World Trade Center structures following the terrorist attacks
of Sept. 11, 2001, was the worst building disaster in recorded history, killing
some 2,800 people. NIST is taking the lead in conducting a three-part plan
to investigate and study the contributing factors, devise a program to provide
a technical basis for improving codes, standards and practices, and develop
a program to provide guidance and better prepare facility owners to respond
to future disasters—especially in regard to how long fire marshals have
to get people safely away from danger.
Gregory Tassey
Senior Economist
National Institute of Standards and Technology
(October 22, 2003)
Topic: Economic Impact of Government R&D Investment
Dr. Gregory Tassey’s presentation, “R&D Investment Trends
and the Role of Government,” provided an opportunity to look at research
and development (R&D) funding within the Federal Government and how R&D
policy impacts the future United States economic climate. Dr. Tassey has spent
his career at the National Institute of Standards and Technology (NIST) engaged
in analyzing the economics of high-tech industries, conducting economic impact
studies, and making economic policy assessments for NIST. He has participated
in governmentwide and joint industry-government policy development. Dr. Tassey
has written three books on the economics of technology policy, published 30
articles in economics policy journals, and recently published Methods for
Assessing the Economic Impacts of Government R&D.
Looking at current R&D policy and its influence on funding, Dr. Tassey
distinguishes between the amount and composition of R&D investment. Beginning
with a set of macro investment and economic impact trends, he uses the life
cycle progression of technology to show the evolution of technology policy
analysis and policy options, including specific insights into the NIST laboratories’ role
of providing technology infrastructure.
The steps in R&D policy analysis, which include review of the causes for
underinvestment, estimation of underinvestment, and technology-based inefficiencies,
were stated at the national aggregate level. Dr. Tassey’s view is to
start with economic policy rationales when determining the scope of technology
infrastructure needs and use NIST funding over time as an example of inadequate
role development and strategic planning.
In terms of R&D composition, the percentage and type of high-tech industries
are what is most important. Dr. Tassey noted that less than 10 percent of
today’s economy is considered high-tech (electronics, pharmaceuticals,
communication services, software and computer-related services). And two-thirds
of America’s R&D is concentrated in the ten states that make up
less than half of the United States population.
A real problem for policy-makers is not having clear R&D composition indicators
to help determine amounts of underinvestment over the technology life cycle.
From years of studying government R&D investment, Dr. Tassey developed
a conceptual framework to probe for causes of R&D underinvestment. The
framework, which he presented, generalizes how the major elements of an industrial
technology combine to create value added (Gross Domestic Product). Dr. Tassey
contends that the conventional “black box” model, by not explicitly
considering these elements, fails to give decision-makers the ability to identify
policy leverage points. Using his disaggregated model instead, Dr. Tassey pointed
to the importance of “infratechnologies” and “generic technologies” for
the adaptation of biotechnology’s basic science to commercial products.
Underinvestment in the public good technology elements has been growing steadily
as companies shy away from risky technology application investment and government
support of programs such as the Advanced Technology Program (ATP) fluctuates
on a yearly basis. Bridging the gap to lessen the barrier would support more
technology commercialization, but would require a long-term perspective on
economic growth.
According to Dr. Tassey’s studies of R&D intensity indicators such
as the Industrial Research Institute’s “Sea Change” Index,
a steady decline in annual planned investments in generic technology shows
a systematic shift under way in the composition of R&D. This indicates
a general decrease in longer-term investment. Unfortunately, the indicator
does not provide information on specific industries and technologies.
A better understanding of the longer-term structural trends in policy that
support R&D’s evolution from basic science to commercialization,
instead of short-term business agendas, would help to guide the amount of funding
and the composition of R&D investment. Dr. Tassey advises that the use
of economic impact assessments can also help influence R&D policy in terms
of technology transfer.
Finally, Dr. Tassey shared his substantiated “Wish List” of what
R&D funding should look like in the United States based on a national innovation
system strategy. He suggested increasing current funding levels from $130 billion
to $400 billion, matching the higher rate of investment that occurs in the
manufacturing sector. He also cited studies that indicated national R&D
should increase by a factor of between two and four. Dr. Tassey noted that
Europe’s answer to ATP is now funded at $4.5 billion a year whereas the
United States’ counterpart for civilian technology, ATP, has to fight
for an annual budget of less than $200 million.
To review the Assessment on Economic Impact of R&D Report, visit: http://www.nist.gov/director/prog-ofc/report03-1.pdf.
William H. Hooke
Senior Policy Fellow and Director
American Meteorological Society
(October 29, 2003)
Topic: Atmospheric Policy: What is it? Why Should I Care?
Dr. William Hooke is the Director of the Atmospheric Policy Program at the
American Meteorological Society (AMS). In this position, he is responsible
for atmospheric policy development across a wide range of disciplines. His
current policy research interests include natural disaster reduction, historical
precedents as they illuminate present-day policy, and the nature and implications
of changing national requirements for weather and climate science and services.
Dr. Hooke began by indicating that atmospheric policy has been impacted by
basic population needs, as well as by population growth. In addition, policy
has been influenced by the increase in resource consumption and by advancements
in technology. These advancements have been far-reaching, including those in
computation, biotechnology, and transportation. Other basic considerations
in the development of policy have concerned assumptions that we have lived
by for years: 1) the assimilative capacity of the atmosphere is infinite, 2)
the climate is unchanging, and 3) weather is unpredictable. As has been
realized over the years, these assumptions were erroneous. Instead, it
was the changes in our understanding of these phenomena that motivated the
development and evolution of current atmospheric policy.
Dr. Hooke then examined each of the agencies represented by the ComSci Fellows,
describing specific policies associated with each. These policies and programs
ranged from plume modeling and atmospheric dispersion, relevant to homeland
security concerns, to air quality, which pertains to the health concerns related
to respiratory ailments.
Dr. Hooke then explored the grand challenges in atmospheric policy, which
he articulated as:
- global observations,
- international data sharing (e.g., the Global Earth Observation System),
- public/private/academic roles, and
- education and training.
Following an enumeration of these challenges, a discussion ensued, which included
all aspects of Dr. Hooke’s previous points. Topics that were discussed
included marketing with respect to the global environment and the value of
weather, the formation of policy and the techniques to move it forward, the
role of public opinion in swaying policy, using common sense when enacting
regulations, and the Global Earth Observation System effort.
Mark Boroush
Senior Policy Analyst
Office of Technology Policy
Technology Administration
U.S. Department of Commerce
(November 5, 2003)
Topic: The Department of Commerce’s Role in Technology Transfer
The Department of Commerce and, specifically, the Technology Administration
(TA) play a significant role in developing the Nation’s policy of technology
transfer and reporting the results. Mr. Mark Boroush, an economist with extensive
experience in science and technology policy, delivered a comprehensive presentation
on this role, tracing its legislative history and progress.
Technology transfer refers to the process by which science is commercialized.
Mr. Boroush described the federal effort as harvesting high-risk research funded
by taxpayer dollars. It can take the form of:
- Cooperative R&D agreements (CRADAs),
- Invention disclosure and patenting,
- Licensing of inventions and other intellectual property, and
- Other means of knowledge dissemination, such as publishing research results
in peer-reviewed journals and participating in standard-setting activities.
The last two decades have seen substantial legislative activity as a result
of a huge concern over the ability of the United States to compete globally.
While the United States was unmatched in basic science, it was losing the game
of transferring technology into commercially viable products. Adequate infrastructure
was not in place.
Seeking to bolster the commercialization of federal research, Congress enacted
the Technology Innovation Act of 1980, referred to as the Stevenson-Wydler
Act, and the University and Small Business Patent Procedures Act of 1980, known
as the Bayh-Dole Act. The Stevenson-Wydler Act established the “Office
of Research and Technology Applications” at the laboratory level. A 1986
amendment called for the Department of Commerce to prepare reports on federal
technology transfer activities, the biennial report series.
The Bayh-Dole Act granted patent and licensing rights to universities and
small businesses, along with federal agencies, for their inventions. In 1984,
an amendment to the Act gave new oversight authority to the Secretary of Commerce,
which included developing guidelines and techniques to help agencies in their
technology transfer efforts.
Recent legislation has added new reporting requirements. In the spirit of “management
for results,” substantive annual performance reporting requirements
were mandated by the Technology Transfer Commercialization Act of 2000. The
ten major federal agencies with significant laboratory operations (Agriculture,
Commerce, Defense, Energy, EPA, Health and Human Services, Interior, NASA,
Transportation, and Veterans Affairs) are required to include a report on
technology transfer accomplishments and plans in their annual budget submissions
to the Office of Management and Budget. Under its reporting mandate, the
Department of Commerce provides its assessment in a summary report to the
President and Congress based on the agency submissions, which can be seen
at: http://www.ta.doc.gov/Reports.htm.
As part of its policy-making role, TA chairs the Interagency Working Group
on Technology Transfer, which consists of representatives from major agencies
with sizable science and technology operations. This group, which holds monthly
open meetings at Commerce, has played a significant role in shaping policy.
TA also works with other influential organizations, such as the President’s
Council of Advisors on Science and Technology (PCAST at http://www.ostp.gov)
and the Organization for Economic Cooperation and Development (OECD). Mr. Boroush
cited the Federal Laboratory Consortium as another good source of information
(http://www.federallabs.org).
According to Mr. Boroush, the United States has recovered significantly in
its ability to compete globally because of its technology transfer and intellectual
property policies. Yet, other nations are not standing still. Mr. Boroush sees
publicly-funded research as increasingly important to United States economic
competitiveness. Performance reporting and improved metrics on technology transfer,
which include statistics on patents and licenses and analysis of downstream
outcomes in terms of effects on the commercial entity, the economy and the
laboratory, are allowing a clearer assessment of the technology transfer process
and the public/private partnership. Policy-makers will continue to work to
make the process more efficient and productive, seeking balances such as that
between the open nature of scientific discovery and the protection of intellectual
property rights.
Workshop on Converging Technologies, Emerging Challenges: Societal, Ethical
and Legal Issues at the Intersections of Nanotechnology, Biotechnology, Information
Technology and Cognitive Science
(November 6, 2003)
This special seminar, which was attended by the ComSci Fellows, was hosted
by the Technology Administration of the U.S. Department of Commerce.
Mr. Phillip J. Bond, Under Secretary of Commerce for Technology, introduced
the overall topic of nanotechnology and then turned the floor over to Ms. Sonja
Miller, Founder of Converging Technologies Bar Association. Ms. Miller briefly
reviewed the field of converging technologies by pointing out that this area
is composed of four major branches of science: 1) nanoscience and nanotechnology;
2) biotechnology and biomedicine, including genetic engineering; 3) information
technology, including advanced computing and communications; and 4) cognitive
science – the convergence of psychology, cultural anthropology, linguistics,
economics, sociology, neuroscience, artificial intelligence, and machine learning.
The next Renaissance in these technologies, she pointed out, will be convergence – the
merging of distinct technologies or industries into a unified whole. Scientific
leaders across disciplines, industry experts, and policy-makers envision that
unification of science and technology will achieve vast improvement in our
physical, mental, and social capabilities and well-being, as well as in enhancing
our quality of life, upgrading our educational systems, and increasing our
Nation’s security and economic clout.
Ms. Miller continued by summarizing the societal, environmental, ethical,
and legal issues for the aforementioned four technologies. Nanotechnology,
she pointed out, has already caused concern in the government where $50 million
has been allocated for studies addressing the environmental implications and
remediation of such technology. In the realm of nanobiotechnology, the question
of who has access to, owns, controls, monitors, and stores your genomic-phenomic
profile will have to be determined. She also asked: can a prospective employer or
insurance company discriminate against you because your genetic profile reflects
a propensity for a certain disease or addiction? Moving to information technology,
Ms. Miller posed the following questions: will the gap separating the information
haves and have-nots be closed or further widened by nanotechnology; and will
we have lost our sense of privacy forever, or will the meaning of privacy change
to parallel the leaps in scientific advancement? Finally, she addressed cognitive
science by asking the question: what is the morality of using genomics, molecular
imaging, and pharmaceuticals based on nanotechnology, to affect the human brain?
The future may well see converging technologies producing direct interfaces
between the human brain and machines, thereby transforming the workplace environment,
sports and communication between peoples.
She summarized by saying that we have entered the Age of Transition. The
unification of science from the nanoscale upward, she continued, has brought with
it the identification of new ethical areas: nanoethics, bioethics, infoethics,
and neuroethics. She concluded by saying that convergence of these technologies
will accelerate our understanding of who we are and what our capabilities
may be. Strategies for effective transformation must be developed across
all disciplines and sectors of society. By forming unique and novel collaborations
and partnerships, together, we ensure that science and technology will be
developed for the benefit of mankind.
Richard G. Newell
Fellow
Resources for the Future
(November 19, 2003)
Topic: How Economists View Technological Change and Technology Policy in the
Energy Area
Energy policy, technological change, and global climate change converge at
the leading edge of today’s environmental policy debates. In order to
help the ComSci Fellows attempt to disentangle and think clearly about these
issues, and understand the role of public policy in this debate, Dr. Richard
Newell provided an economist’s perspective on energy and technological
change. Dr. Newell is a fellow at Resources for the Future (RFF), a leading
policy institute dedicated to providing objective economic analyses of environmental
and resource issues.
The core issue of energy policy is how to provide as much energy as possible
at an affordable price, while minimizing the deleterious aspects, or by-products,
of energy production. Dr. Newell described a central question of economic analyses
concerning both energy policy and technical change: what are the incentives
to engage in either one of them? Earlier in the year, the ComSci Fellows heard
from Dr. Gregory Tassey of the National Institute of Standards and Technology,
who talked about the role of technology research and development (R&D)
in economic growth. Dr. Newell pointed out that adoption may be a question
very different from technology development. Especially with respect to energy
technology, “build it and they will buy it” (to rephrase a well-worn adage)
may not hold. This is particularly true when an environmental technology
may be costly and intended to remove or reduce production of an unmarketable
product, such as pollution or greenhouse gases.
Dr. Jeffrey Leonard, in another ComSci seminar, pointed out cases where profit-increasing
investments in environmental technology facilitate adoption. Dr. Newell posed
the question, however, of what governments can or should do when the gains
from energy technology may not improve the bottom line. He introduced the concept
of market failure – where private markets fail to create demand (or supply)
for commodities or technologies that might be socially desirable.
If low-emission energy production is something that would benefit society,
Dr. Newell suggested several reasons why markets for such technology may fail
to develop. If energy users do not pay the full cost of pollution originating
from energy production, such as health or ecological costs, then energy producers
may have little incentive to invest in low-emission technologies. If the benefits
of a new technology are widely disseminated, then there may be underinvestment
in R&D – the so-called intellectual property problem. Even when technology
exists, adoption may be slow because potential users may want to see how it
performs before investing in the infrastructure (supply, distribution, support
industries) necessary to promote adoption.
Dr. Newell described two areas where government policies can ameliorate market
failures in clean energy markets – support for technology R&D and
support for technology adoption. Direct research investments are a key government
mechanism to support technological development. Because industry R&D investments
track very closely with energy prices, government investment during periods
of low energy prices can facilitate continual innovation. Directed research
may be used to promote specific directions of new knowledge, but questions
do remain as to who “owns” new knowledge and whether market incentives
will exist to bring new developments to market.
Adoption policies are another area where public involvement has been active
in the past. Tax credits can provide incentives for both producers and consumers
of energy. Credits on production technologies can lower the cost of energy
production using low-emission technologies. Credits for energy efficient machinery
can equalize the cost to energy users of reducing energy costs through purchases
of new machinery. A portfolio system for energy production, which defines a
proportion of production that producers must meet using low-emission technologies,
may provide producers with the incentives to find maximum efficiency through
trading of emission permits.
The real challenge facing those involved in energy policy, according to Dr.
Newell, is in recognizing that a tension exists between technology policies
and behavioral policies in achieving energy goals. Government policy-makers
need to understand that technological possibility may not align with economic
feasibility and that the most effective lever of policy may be to find ways
to solve market failures.
The website for RFF is http://www.rff.org.
Visit to Western Maryland and Pittsburgh, Pennsylvania on Coal and Power Topics
(December 10-11, 2003)
The ComSci Fellows were very fortunate to have the opportunity to participate
in a program designed to inform them on various aspects of coal and other power
topics (excluding nuclear topics). The program included visits not only to
a coal-fired power plant, but also to the National Energy Technology Laboratory
and to a pilot fuel cell facility.
The first stop was at the AES Warrior Run coal-fired power plant in Cumberland,
Maryland. The AES staff provided an overview of the unique capabilities of
their facilities, followed by a tour.
Warrior Run is a relatively small power plant, producing up to 180 MW of electricity.
The plant went operational in February 2000 as part of the Pennsylvania-New
Jersey-Maryland (PJM) power grid. The primary fuels are coal and limestone,
which feed a Circulating Fluidized Bed (CFB), a fairly new design that helps
reduce NOx and sulfur emissions. Beverage-grade CO2 gas is co-generated from
the steam of the power plant and is sold commercially. The plant is co-located
with a CO2 processing facility, thus simplifying the delivery process. Another
by-product – flue ash – is used for mine filling and road construction.
The limestone in the flue ash makes it desirable for mine filling by reducing
the acidity of the ground water around the coal mining sites.
The plant burns approximately 650,000 tons of coal per year, all from nearby
mining sites in Maryland, and uses 120,000 tons of limestone. Besides electricity,
the plant produces approximately 45,000 tons of CO2 gas. The unique mixture
of limestone and coal in the generation process helps reduce harmful emissions
of NOx and SO2, and aids the extraction of food-grade carbon dioxide.
A question to the AES Warrior Run staff on the benefits of the CFB’s clean
coal technology yielded detailed emissions figures. NOx levels directly after
combustion, prior to further control techniques, are typically 0.15 lb/MMBtu,
which is already beneath the U.S. Environmental Protection Agency (EPA) limit
of 0.16 lb/MMBtu. Applying Selective Non-Catalytic Reduction (SNCR) using
anhydrous NH3 reduces NOx further, to 0.10 lb/MMBtu. This puts the plant below
EPA New Source Performance Standards and considerably below typical coal-fired
power plants in operation today.
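These emission rates lend themselves to a quick back-of-the-envelope check. The sketch below combines the plant’s quoted 650,000 tons of coal per year with an assumed heating value of 24 MMBtu per ton of coal; the heating value is an illustrative assumption, not a figure provided during the visit.

```python
# Rough annual NOx arithmetic for a plant like Warrior Run.
# Source figures: 650,000 tons of coal/year; 0.15 lb/MMBtu after combustion;
# 0.10 lb/MMBtu after SNCR. The 24 MMBtu/ton heating value is an assumed
# typical value for bituminous coal, not a number from the visit.
COAL_TONS_PER_YEAR = 650_000
HEATING_VALUE = 24          # MMBtu per ton of coal (assumption)
NOX_PRE_SNCR = 0.15         # lb/MMBtu, directly after combustion
NOX_POST_SNCR = 0.10        # lb/MMBtu, after SNCR with anhydrous NH3

heat_input = COAL_TONS_PER_YEAR * HEATING_VALUE  # MMBtu per year

def annual_nox_tons(rate_lb_per_mmbtu: float) -> float:
    """Annual NOx in short tons for a given emission rate."""
    return rate_lb_per_mmbtu * heat_input / 2000  # 2,000 lb per short ton

pre = annual_nox_tons(NOX_PRE_SNCR)
post = annual_nox_tons(NOX_POST_SNCR)
print(f"SNCR cuts NOx from roughly {pre:,.0f} to {post:,.0f} tons/year "
      f"(a {(pre - post) / pre:.0%} reduction)")
```

Under these assumptions, SNCR trims annual NOx from about 1,170 to about 780 short tons, a one-third reduction.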
To review more on the Warrior Run Power Plant visit:
http://www.aes.com/businesses/business.asp?projectID=7.
The second stop on the tour was at the Siemens Westinghouse Laboratory in
Pittsburgh, Pennsylvania. The Laboratory is a pilot manufacturing facility
for fuel cell development. Fuel cells are comparable to battery power in their
applications and mobility. However, while batteries operate on stored energy,
fuel cells generate their own energy from ongoing chemical reactions, converting
chemical energy directly into electricity and heat. Hydrogen and carbon
monoxide (CO) are the most commonly used fuel sources. Although there is a
variety of fuel cell power sources and designs in use in the field, Siemens
Westinghouse is focusing its efforts on the Solid Oxide Fuel Cell (SOFC) in
a tubular design (rods). The SOFCs offer a vast improvement over combustion
engines with regard to fuel efficiency (50 percent efficiency versus 30 percent),
waste products (environmentally-friendly water and heat instead of greenhouse
gases), and an increased longevity of power over batteries. In addition, SOFCs
provide low noise operation and are air-cooled, so no water is needed. If coupled
with a gas turbine, a hybrid capability could increase the energy efficiency
of fuel consumption up to 60 percent.
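Because fuel burned per kilowatt-hour scales as the reciprocal of efficiency, the quoted efficiency figures translate directly into fuel savings. A minimal sketch using only the percentages above:

```python
# Fuel consumed per unit of electricity scales as 1/efficiency, so the
# efficiency figures from the talk (30% combustion engine, 50% SOFC,
# 60% SOFC/gas-turbine hybrid) imply proportional fuel savings.
EFF_COMBUSTION = 0.30
EFF_SOFC = 0.50
EFF_HYBRID = 0.60

def fuel_savings(new_eff: float, old_eff: float) -> float:
    """Fraction of fuel saved for the same electrical output."""
    return 1.0 - old_eff / new_eff

print(f"SOFC vs. combustion engine: "
      f"{fuel_savings(EFF_SOFC, EFF_COMBUSTION):.0%} less fuel")
print(f"Hybrid vs. combustion engine: "
      f"{fuel_savings(EFF_HYBRID, EFF_COMBUSTION):.0%} less fuel")
```

For the same electrical output, the SOFC thus needs 40 percent less fuel, and the hybrid 50 percent less, than a 30-percent-efficient combustion engine.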
Siemens Westinghouse initially sees this technology as a player in the stationary
back-up power market until costs can be decreased. Also, once the fuel cell
size can be reduced in addition to a decrease in cost, the company may be able
to replace batteries with fuel cells in mobile applications, such as in cellular
phones. Currently, a 250 kW prototype fuel cell generator is about the size
of a tractor-trailer.
The markets that can currently be targeted by the SOFC are felt to be stationary
applications in the 5 kW – 250 kW range, including hospitals, office buildings,
and utilities. Other potential markets would include portable power of < 1 kW
(e.g., military uses), residential power of 1 kW – 10 kW, and remote
applications in the range of 1 kW – 50 kW. The first commercially-available
generator will be the CHP-125, a 125 kW unit with an electrical efficiency of
47 percent, which is planned for delivery in late 2006 to early 2007. Future
plans include the development of hybrid systems.
Following the presentation, a tour of the laboratory facilities took place,
where one of only two hybrid SOFC generators in existence was viewed.
To learn more about Siemens Westinghouse fuel cell development visit: http://www.pg.siemens.com/en/fuelcells.
On day two of the ComSci Fellows’ visit to Pittsburgh, they visited the
National Energy Technology Laboratory (NETL) for a series of presentations.
These lectures ranged from the overall view of America’s energy picture
to how NETL measures and analyzes particulate matter relating to human health
issues.
The morning lectures started off with two very informative presentations given
by Mr. James Ekmann, Associate Director, Office of Systems and Policy Support,
entitled, “America’s Energy Picture” and “NETL Overview.” Mr.
Ekmann first gave the group a big picture look at America’s growing energy
issues by showing how energy profoundly impacts our quality of life, the economy
and the environment. The statistics he revealed showed that the world’s
demand for energy is ever increasing, especially in the use of fossil fuels.
Current projections show that fossil fuels will continue to be dominant until
at least 2030, which means carbon emissions from these fuels will continue
to rise. Many scientists, educators and politicians believe CO2 emissions from
various energy sources are a major contributor to the global warming potential.
Besides discussing processes such as sequestering carbon and developing technologies
(e.g., fuel cells) to efficiently produce, distribute and store hydrogen for
future energy needs, Mr. Ekmann provided an overview of our Nation’s
energy infrastructure and presented what NETL’s key role will be in monitoring
and protecting this infrastructure. Challenges to energy assurance in the United
States are vast, requiring diligence and perseverance to ensure that our Nation
is not susceptible to terrorist attack or cascading system failures.
Mr. Ekmann’s second talk focused on the mission and program outcomes
of NETL. NETL, one of the Department of Energy’s (DOE’s) 17 national
labs, has an annual budget of approximately $740 million and implements
a science and technology development program to resolve the environment, supply
and reliability constraints of producing and using fossil resources. It has
a technology transfer emphasis, conducting business with 43 foreign countries
and throughout the United States. NETL also has a focus on education, which
includes community initiatives and over 200 extramural research projects with
academic institutions. Outcomes from NETL’s programs assist our Nation
in the following areas: technology, policy, competitiveness, stability, work
force and region. For example, in the areas of technology and workforce, NETL
assists in providing acceptable, affordable, and available energy to the United
States in mid- to long-term timeframes, as well as providing a trained energy
work force through university research programs. Further information on NETL
can be found at: www.netl.doe.gov.
The third presentation was given by Dr. Richard P. Noceti, Director, Fuels
and Process Chemistry Division, on “Computational Energy Research at
NETL.” Dr. Noceti explained computational approaches that help experimental
research to address energy production, storage, distribution, conservation
and security. Computational energy research focuses mostly on the interaction
of atoms at the nano-scale level. The goals of computational energy science
are to develop science-based computational tools and apply them to simulate
clean, highly efficient energy plants of the future, as well as developing
virtual simulation capabilities. The development of such capabilities provides
NETL a prediction tool for interactions of turbines, fuel cells, combustors,
environmental control systems and other major components. This virtual simulation
concept can also predict dynamic responses of an entire energy plant.
The last four lectures focused on particular areas that have a direct influence
on America’s total energy picture. Mr. Terry E. Ackman, Water and Energy
Team Leader, Environmental Science and Technology Division, spoke on the “Water
and Energy Interface.” Mr. Ackman made it very clear that water and energy
are closely linked. It is recognized that water is vital for all forms of energy
production, and that there is a need to ensure that energy requirements are
met in a sustainable manner. The growing demand for fresh water in commercial,
industrial and public sectors is staggering. Power plants are the second largest
users of fresh water in the United States, requiring 132 billion gallons per
day. Another water and energy interface with a major effect on freshwater supplies
in the United States is underground coal production by various mining methods.
The possibility of contaminated ground and surface water mixing with rivers
and streams is a very real problem.
Mr. Robert Warzinski with the Environmental Science and Technology Division
in the Office of Science and Technology spoke about the processes needed for
safe, cost-effective and verifiable carbon sequestration. The principal focus
of this type of research is the capture and permanent storage of CO2, as well
as other toxic emissions. Coal seam, brine and ocean sequestration and the
core facilities to study and refine the necessary capture and storage techniques
are being developed to address this important issue. Though there are still
significant barriers facing scientists in the capture, separation and sequestration
of CO2, the Unique Modular CO2 Capture Facility, the Geological Sequestration
Core Flow Lab and NETL’s High-Pressure Water Tunnel Facility are leading
the way in training scientists and engineers in sequestration research.
The talk from Mr. Evan Granite with the Clean Air Technology Division in the
Office of Science and Technology was entitled, “GP-254 Process for Photochemical
Removal of Mercury from Flue Gas.” He is involved in a $14 million external
and in-house mercury research program at NETL. This program is the largest
in the United States and its major goal is to develop more effective mercury
control options in anticipation of future regulations. Mr. Granite explained
that mercury is difficult to capture and measure and the present benchmark
technology has some application deficiencies. One type of application that
does show more promise in measuring elemental mercury in the environment is
a photochemical oxidization process.
The final lecture by Mr. Donald Martello, also with the Clean Air Technology
Division, was on “Ambient Air Sampling and Analysis,” and focused
on coal power and PM2.5 (particulate matter 2.5 microns and smaller in diameter).
Power plant emissions probably contribute significantly to secondary PM2.5
mass and the effect of those emissions on human health is not well-known. The
program objective is to provide high-quality scientific data and analysis for
use in policy and regulatory determinations. In other words, they are attempting
to identify knowledge gaps between ambient air measurements and personal exposure.
The FY 2004 work plan objective is to continue comprehensive ambient air monitoring
at NETL with specialized monitors and to submit monitoring and characterization
data to scientific databases.
Ari Patrinos
Associate Director of Science for Biological and Environmental Research
U.S. Department of Energy
(December 17, 2003)
Topic: Public Support for Gene Research and Biotechnology: Public/Private
Partnerships and the Ethics of Genetic Research
While many people are familiar with the recent Herculean effort to sequence
the human genome, few people are probably aware that the federal program to
understand the human genetic code originated from concern about the effects
of ionizing radiation on humans. The link between radiation and human biology
is based on observations that genes are the mechanism through which radiation
effects are expressed. As Dr. Ari Patrinos describes it, these basic concerns
and observations were the basis for what is now an extremely diverse exploration
of the frontiers of biology. As the Associate Director for Biological and Environmental
Research in the Department of Energy’s Office of Science, Dr. Patrinos
oversees a wide research agenda at the Department of Energy (DOE), one which
combines elements of the biological, physical, and computational sciences.
As part of his management portfolio, Dr. Patrinos is also Director of the
DOE component of the U.S. Human Genome Project. He provided the ComSci Fellows
with an in-depth view of this aspect of his office’s research, relating
both its history and its future direction. He also provided some evocative
lay facts about DNA – it is a molecule six feet long when unraveled;
all the DNA from a single person’s cells, when placed end-to-end, would reach
to the moon and back three times; typing out the sequence of the base pairs
(the paired nucleotide bases making up DNA) using just a single letter, A, T, G, or
C to represent a pair, would fill three New York City telephone books for a
single person’s DNA.
By the 1980s, some science policy-makers came to the conclusion that, in order
to really understand genetic phenomena, a complete sequence of the human genome
would be necessary. This was controversial because some scientists believed
that sequencing the three billion base pairs of DNA making up the human genome
would be wasteful, given that genes are made up of small sets of base-pair
DNA sequences and much of the complete genome is thought to be “junk” DNA.
Additionally, there was concern that a project of this scale would give
research an industrial character and crowd out research in other programs.
Moreover, the science philosophy behind the human genome project departed from
traditional approaches in biological research, which are based on formulating
hypotheses and then testing them with data. In very broad terms, some saw the
effort to sequence the human genome as a massive effort to collect data that
would then be used to generate hypotheses.
In spite of these controversies, Congress appropriated funds to support the
Human Genome Project, which was launched in 1990 and expected to take 15 years
and $3 billion to complete. At the time, sequencing technology was crude, compared
to today’s technology, and sequencing large amounts of DNA cost approximately
$1 per base pair (today, the cost is one-tenth of one cent per base pair).
After an initial period of developing sequencing techniques in a large number
of laboratories, the program focused on the most promising techniques and concentrated
the sequencing effort in three United States laboratories.
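The cost figures quoted lend themselves to simple arithmetic: at $1 per base pair, sequencing three billion base pairs matches the project’s $3 billion budget, and the present rate of one-tenth of one cent per base pair implies roughly $3 million per genome. A quick sketch:

```python
# Arithmetic on the sequencing costs quoted by Dr. Patrinos:
# 3 billion base pairs at $1 per base pair (circa 1990) versus
# one-tenth of one cent per base pair (at the time of the talk).
BASE_PAIRS = 3_000_000_000
COST_PER_BP_1990 = 1.00    # dollars per base pair
COST_PER_BP_NOW = 0.001    # one-tenth of one cent, in dollars

cost_1990 = BASE_PAIRS * COST_PER_BP_1990  # matches the $3 billion budget
cost_now = BASE_PAIRS * COST_PER_BP_NOW    # roughly $3 million per genome

print(f"1990: ${cost_1990:,.0f}; now: ${cost_now:,.0f} "
      f"({COST_PER_BP_1990 / COST_PER_BP_NOW:,.0f}x cheaper)")
```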
Dr. Patrinos shared some insight into the relationship between the public
program run by DOE and the National Institutes of Health (NIH) and a private
venture to sequence the human genome organized by Celera. The public/private
partnership generated some controversy over where public information ended
and private intellectual property began. Many in the public sector were concerned
about a for-profit company being able to control information about human gene
sequences (both access and timing of access), even if the information was free
to researchers but on a subscription basis for other life-sciences companies.
These controversies have not yet been resolved.
Similarly, there are a number of other social, legal and ethical implications
of the Human Genome Project. Dr. Jim Watson, co-discoverer of the structure of
DNA, was the first director of the project in DOE and was quick to realize that scholarly
research was needed on these social issues, and NIH and DOE spent three to
five percent of the research budget to address these questions. Given the degree
to which genetic sciences are poised to affect modern society, there is a great
need to educate those public policy experts on the science involved. Dr. Patrinos
has funded and attended many week-long workshops to educate judges on biotechnology.
He has also provided educational programs for legislators responsible for drafting
legislation regarding genetic information to assure that they have an understanding
of modern molecular genetics.
Although the Human Genome Project may be the most visible, the Office of Biological
and Environmental Research has four areas of research. The life sciences program
includes the Human Genome Project, as well as research on structural
biology, low-dose radiation, and Genomes to Life, which develops novel research
and computational tools to understand and predict the behavior of complex biological
systems. The climate change program started its carbon dioxide research work
in 1977, and now engages in climate modeling, atmospheric radiation measurement,
carbon and ecosystem research, and integrated assessments to develop and evaluate
environmental and economic costs/benefits of options to reduce carbon dioxide
emissions. The environmental remediation program develops strategies for bioremediation
of metals and radionuclides at DOE sites and explores computational methods
for modeling molecular and environmental processes. Lastly, the medical sciences
program is at the forefront of nuclear medicine, including radiotracers, imaging
techniques and advanced imaging instrumentation.
Further information can be found at: http://www.science.doe.gov/ober/hug_top.html.
Lawrence H. Landweber
Senior Advisor
Directorate for Computer and Information Science and Engineering
National Science Foundation
(January 7, 2004)
Topic: Cyberinfrastructure: Revolutionizing Science and Engineering
Dr. Lawrence H. Landweber is a Senior Advisor to the Assistant Director of
the National Science Foundation (NSF), as well as Research Professor and Professor
Emeritus of Computer Science at the University of Wisconsin. Since 1977, he
has been involved in many important developments in the evolution of computer
networks. In 1979, Dr. Landweber chaired CSNET, the Computer Science Network,
which served as a validation of the Internet concept. He subsequently led the
University of Wisconsin’s participation in the NSF-Defense Advanced Research
Projects Agency (DARPA) Gigabit Testbed project.
Dr. Landweber’s presentation focused on the emergence of cyberinfrastructure,
an integrated system of computation, communication, and information that supports
a range of applications, and on the role played by NSF in facilitating the
development of computing networks and infrastructure. In the 1960s, NSF provided
early support of campus computing and, in the 1970s, supported advanced scientific
computing research. By the early 1980s, NSF had begun to support cross-country
networks, such as CSNET. The 1980s also saw the beginning of the NSF supercomputer
initiative and the birth of NSFNET. In the early to mid-1990s, the National
Center for Supercomputing Applications, an NSF partnership at the University
of Illinois, Urbana-Champaign, released the Mosaic web browser, and in 1995,
the Internet was privatized. NSF has recently funded the Extensible Terascale
Facility and the National Middleware Initiative.
Information is being collected at a rate that is challenging our ability to
store and analyze it. There are an increasing number of scientific data collection
efforts that have current archive sizes measured in petabytes (a petabyte is
10^15 bytes, or about one-half of the combined contents of the Library of Congress).
The ability to acquire huge volumes of data poses exceptional challenges regarding
storage, analysis and transmission. Dr. Landweber noted that cyberinfrastructure
is in its infancy and is a key emergent technology.
The development of grid computing is expected to meet future cyberinfrastructure
needs. Large clusters of as many as 1,000 computers are interconnected, and
computing jobs are executed by components of the cluster according to scheduling
agents. An extension of this concept is distributed computing, in which large
computing tasks are broken into discrete units that are independently computed
and then recombined upon completion. Commercial grids using distributed computing
are emerging, providing vast resources to users who may not otherwise have
access to this level of resources.
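The split, compute, and recombine pattern described above can be sketched with Python’s standard library. The chunk size and the summing workload below are illustrative choices, not details from the talk:

```python
from multiprocessing import Pool

def process_chunk(chunk):
    """One discrete, independently computable unit of work (here, a sum)."""
    return sum(chunk)

def distribute(data, n_workers=4):
    """Break a large task into chunks, compute them independently across
    worker processes, then recombine the partial results."""
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(n_workers) as pool:
        partials = pool.map(process_chunk, chunks)  # scheduled across workers
    return sum(partials)  # recombine upon completion

if __name__ == "__main__":
    # Same answer as sum(range(1_000_000)), but computed in parallel pieces.
    print(distribute(list(range(1_000_000))))
```

Commercial grids apply the same idea at vastly larger scale, with scheduling agents dispatching units of work across clusters rather than across local processes.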
Future cyberinfrastructure faces many challenges, including the need to transmit
hundreds of terabits (10^12 bits) per second in network backbones and to bring
online systems delivering hundreds of petaflops (a petaflop is 10^15 floating
point operations per second). Storage technology will have to be developed to
facilitate exabyte (10^18 byte) collections.
There will also be the need to integrate and process data from millions of
sensors.
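To put these quoted scales in perspective, a rough calculation helps; the 100 Tb/s backbone figure below is an assumed point within the “hundreds of terabits” range, not a number from the talk:

```python
# Rough arithmetic on the network and storage scales quoted above.
TERABIT = 10**12    # bits
PETABYTE = 10**15   # bytes
EXABYTE = 10**18    # bytes

backbone_bps = 100 * TERABIT          # assumed backbone capacity, bits/second
seconds = EXABYTE * 8 / backbone_bps  # time to move one exabyte

print(f"Moving one exabyte over a 100 Tb/s backbone: {seconds:,.0f} seconds")
print(f"Petabytes per exabyte: {EXABYTE // PETABYTE:,}")
```

Even at that assumed capacity, shipping a single exabyte-scale collection would monopolize the backbone for the better part of a day, which is why storage and transmission are treated as joint challenges.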
In addition to government-funded efforts, universities and corporations have
been partnering to address cyberinfrastructure needs. In closing, Dr. Landweber
discussed the National LambdaRail (NLR) project, which he views as an especially
important and innovative example of a university-corporate partnership. The
NLR project is a dedicated network of fiber optic lines spanning the country,
with the goal of providing ultra-high capacity and high-speed links between
selected major research centers. The United States has fiber optic capacity
in excess of what is needed commercially. This vast capacity will enable researchers
at member institutions to share data and computing resources across institutions,
in addition to conducting research on new networking technologies.
The NSF website is: http://www.nsf.gov.
Gary Smith
International Advisor to the Turkish Patent Institute and Consultant on International
Patent Issues
World Intellectual Property Organization
(January 14, 2004)
Topic: International Intellectual Property Issues and WIPO
With an extensive career in intellectual property, Mr. Gary Smith provided
an overview of the World Intellectual Property Organization (WIPO) and related
issues. From 1995 until 2002, Mr. Smith served as the Director of the Patent
Cooperation Treaty at WIPO. Prior to joining WIPO, he had a distinguished
25-year career with the U.S. Patent and Trademark Office, part of the U.S.
Department of Commerce.
Mr. Smith began his talk with a short primer on intellectual property, which
consists of patents, trademarks, industrial designs and copyrights. As he explained,
protection of intellectual property provides economic and legal incentives
to share works of creativity with the public. To be considered for a patent,
an invention must be “new, useful and not obvious,” meaning
that it must be sufficiently different from what has come before. The idea of
international cooperation
in the area of intellectual property protection came about with the growth
of trade at the end of the 19th Century.
Headquartered in Geneva, Switzerland, WIPO was established in 1967 and became
a specialized agency of the United Nations in 1974, with its own director general.
Its membership currently stands at 179 countries, which pay dues scaled to their
ability to pay. WIPO works toward substantive harmonization of intellectual
property law by negotiating and administering treaties and by providing legal
and technical assistance to developing countries.
Considered by Mr. Smith to be the most successful of the WIPO treaties, the Patent
Cooperation Treaty allows applicants to begin a single filing process covering
a number of countries. There is, however, no such thing as a “world
patent”: applicants must still go to each country to have a patent come
into force. For the United States,
the U.S. Patent and Trademark Office handles the administrative functions for
patents, trademarks and industrial designs, while the Library of Congress handles
copyright. The protection of intellectual property in this country can be traced
back to the Constitution, with Thomas Jefferson serving as one of the first
administrators of the United States patent system.
Mr. Smith wrapped up his presentation with a look at some of the major issues
on the table at WIPO, such as whether and how patent protection could be extended
to genetic resources, traditional knowledge or folklore. WIPO has also been
involved with Internet domain name dispute resolution.
More information on WIPO can be found at http://www.wipo.int. The website
for the U.S. Patent and Trademark Office is http://www.uspto.gov.
Andrew W. Reynolds
Deputy and Chief of Staff to the Science and Technology
Adviser to the Secretary of State
U. S. Department of State
(January 14, 2004)
Topic: Science and Technology in 21st Century Foreign Policy
Mr. Andrew W. Reynolds is currently Deputy and Chief of Staff to the Science
and Technology Adviser to the U.S. Secretary of State. Mr. Reynolds joined
the State Department in 1990 as Deputy Director in the Office of Science and
Technology Cooperation, where he worked extensively in Europe, Russia, the
former Soviet Republics, India, Indonesia and Japan to facilitate bilateral
and multilateral cooperation in science and technology (S&T). He has also
served in the capacity of a counselor for basic S&T research, technology
policy, export controls, intellectual property rights protection, economic
analysis, health and sustainable development issues. From 1986-1990, Mr. Reynolds
was Deputy Director in the U.S. Department of Energy, Office of International
S&T Cooperation, where he worked on the United States/Soviet agreement
on peaceful uses of atomic energy to facilitate joint research in high-energy
physics, controlled thermonuclear fusion, nuclear reactor safety, environmental
restoration and waste management. He has also worked in private industry on
energy and public health issues.
Mr. Reynolds began the presentation by stating that S&T are the drivers
for economic development and are ubiquitous in the functioning of the modern
world and the framing and execution of domestic policies and international
relations. S&T, the engines of modern industrial economies, are seminal
to international cooperation and underpin the three pillars of national security:
intelligence, diplomacy and military readiness. Mr. Reynolds commented on the pervasive role
of science, technology and health in foreign policy as imperatives for the
Department of State. Science-based issues are increasingly prominent in the
foreign affairs agenda, from nonproliferation and arms control to global environmental
threats, such as ozone layer depletion and global climate change, HIV/AIDS,
and international S&T cooperative agreements.
In 2000, the Secretary of State, Madeleine K. Albright, established the position
of S&T Adviser to serve as the State Department’s principal liaison
with national and international scientific communities. Dr. George Atkinson
is the second Science Adviser to the Secretary of State. The Office of the Science
and Technology Adviser to the Secretary of State (STAS) operates under the Bureau
of the Under Secretary for Global Affairs. STAS has been successful in implementing
a departmentwide initiative, “Science and Diplomacy: Strengthening State
for the 21st Century,” to integrate science, technology and health issues
into foreign policy and to strengthen ties with S&T communities inside
and outside the government. The responsibilities of STAS are threefold: to promote
more effective integration of sound scientific and technical information, and
of scientifically literate personnel, into the United States foreign policy
process; to develop more effective foreign policy for United States S&T
that permits the United States to draw on and benefit from S&T strengths
and resources available abroad; and to enhance the use of United States S&T
cooperation with other countries, both to strengthen overall relations with
those countries and to address common problems of global concern.
In keeping with these responsibilities, the strategies for achieving the broad
objectives of the Science and Diplomacy Initiative are to ensure that sufficient
and sustainable capacity exists to address science, technology, and health
issues in foreign policy, by increasing the number of scientists in the Department
and by providing more training and exposure to S&T issues, and to conduct
outreach to the scientific community by building partnerships with the outside
S&T community in the United States and abroad. Since 2001, the number of Science
and Diplomacy Fellows in the Department has tripled. Forty Fellows are now
working in twelve Bureaus in the Department and at several missions abroad.
Mr. Reynolds mentioned various education initiatives at the State Department,
including student internships and scholarships as well as fellowship programs
such as Fulbright Scholars, Jefferson Science Fellows (JSF) and others for
scientists and engineers, that are designed to increase mutual understanding
between the people of the United States and people of other countries by providing
opportunities to study and teach in each other's country, exchange ideas, and
develop joint solutions to address shared concerns. He mentioned the establishment
of the JSF program as a three-year pilot program for tenured professors at
United States degree granting academic institutions to spend one year at the
U.S. Department of State for an on-site assignment in Washington, D.C. and/or
U.S. foreign embassies and/or missions. The JSF program is administered by
the National Academy of Sciences and is supported by grants from the John D.
and Catherine T. MacArthur Foundation and Carnegie Corporation. Mr. Reynolds
noted that 25 Fulbright master's degree candidates from Iraq would be coming
to the United States in June 2004 to be trained in public health programs for
two years. Mr. Reynolds
identified the United States as a nexus of science and diplomacy.
Examples of current efforts to promote S&T cooperation, education, and
economic stability in nations across the globe were also discussed, including
United States participation in the World Summit on Sustainable Development
held in Johannesburg in 2002, which had its roots in the 1992 United Nations
Conference on Environment and Development and the resulting Agenda 21 that
addressed ways to reach global sustainable development in the 21st Century.
The Millennium Challenge Account, an initiative to decrease poverty and increase
economic stability in developing nations that was proposed by President Bush
in 2003, was also discussed. Mr. Reynolds noted that having the United States
again as a partner in UNESCO opens up many important avenues for discussion
on basic science, mathematics and engineering education and cultural heritage,
and that the United States will make important contributions to UNESCO’s
mission to seek practical improvements in human life around the globe.
In closing, Mr. Reynolds noted that many challenges lie ahead for the STAS
and the United States, and that S&T will play a key role in the solutions.
He made reference to several global issues that must be addressed in the 21st
Century, including cloning, nanotechnology, biotechnology, cyber security,
global economic sustainability, energy (nuclear, hydrogen fuel cells, fossil
fuels, renewable sources), health (HIV/AIDS, SARS), and genetically modified
food. Mr. Reynolds emphasized the importance of both food and energy to world
stability, observing that science will be a central element in changes in food
production, energy use, and the ability to extend life. Most importantly, S&T
will determine how societies will be organized in the 21st Century.
Website: http://www.state.gov/g/stas/.
James L. Connaughton
Chairman
White House Council on Environmental Quality
(January 22, 2004)
Topic: Attaining Productive Harmony in Environmental Policy in the 21st Century
On January 22, 2004, the Resources for the Future (RFF) Policy Leadership
Forum hosted Mr. James L. Connaughton, Chairman of the White House Council
on Environmental Quality, to address future environmental policy. Mr. Connaughton
is the senior environmental advisor to the President, as well as Director of
the White House Office of Environmental Policy. The RFF Policy Leadership Forum
provides prominent public policy-makers with a neutral site to present and
discuss their ideas on important energy, environmental, and natural resource
issues.
Mr. Connaughton began by stating that the country has made great progress
in protecting the environment. He pointed out that new technologies have expanded
the United States economy since the 1800s while also enabling the United States
to implement sound environmental solutions. Over the past 30 years, air pollution
in the United States from the six major pollutants decreased by 48 percent,
even as population grew 39 percent, domestic energy consumption increased 42
percent, and the economy grew 164 percent. In just three decades, the number
of citizens who benefit from modern wastewater treatment doubled, from 86
million in 1968 to 165 million today. And, in the wake of celebrating
the 30th anniversary of the Clean Water Act, the United States has dramatically
improved the overall health of its marine waters, lakes, rivers, streams, and
wetlands.
He then reviewed the core drivers for the Bush Administration’s environmental
policies. First and foremost, he said, the Administration is focused on results.
Is the air cleaner? Is our water better protected? Are contaminated lands being
cleaned up? Are United States’ parks well-maintained and managed? Second,
Bush Administration decisions must be based on sound science and quality data.
Third, the government places a strong emphasis on innovation in technology
and policy. Fourth, the Administration also places a strong premium on understanding
the impact of a policy on the people directly affected by it, and fosters more
local collaboration to develop more local solutions. Fifth and finally, he
stated that the Administration looks for solutions that will harness and amplify
America’s ethic of personal stewardship and responsibility.
Mr. Connaughton then summarized some of the major environmental accomplishments
of the Bush Administration. He noted that one of today’s most compelling
challenges is the effort to restore the health and vitality of our cities.
He stated that two well-intentioned environmental policies converged to impose
significant barriers to new investment in our urban centers. The first is the
prospect of Superfund liability for taking on the task of redeveloping abandoned
industrial sites; and the second is the prospect of regulatory uncertainty
and high cost of expanding existing facilities, or siting new facilities to
meet increasingly stringent air quality standards. Mr. Connaughton pointed
out that the President’s Brownfields and Clear Skies initiatives, along
with new interstate air quality and diesel pollution regulations, are designed
to substantially turn this situation around while still meeting public health
and environmental protection objectives.
In his view, the next bundle of challenges occurs in the rural community.
Mr. Connaughton said that, whether addressing the impact of non-point source
run-off on water quality, responding to the need for wetlands and ecosystem
restoration, helping to preserve species, or finding ways to cost-effectively
sequester greenhouse gases, the Nation’s farmers and ranchers figure
prominently. He maintained that efforts to impose more classic regulatory
approaches to address these issues expose the practical, economic, and political
difficulty of completing and sustaining such measures. That is why the Administration
decided to look at these issues from the opposite perspective, in a way
that was certain to evoke a more meaningful response, that would help align
agricultural and environmental policies, and that would tap into the stewardship
potential inherent in the agricultural community. He noted that the Administration
pushed for, and Congress resoundingly enacted, an historic expansion of the
Farm Bill’s conservation programs to roughly $40 billion.
Mr. Connaughton then moved on to discuss climate change policy. He stated
that the most constructive path to progress on climate change policy for the
United States should begin with the following key factors: 1) slowing the increase
in emissions, and then, as the science justifies, stopping it, and then reversing
it; 2) assuring that the Nation operates within the practical, political, and
technological realities of its domestic greenhouse gas emissions profile; 3)
finding more constructive ways to partner with the developing world through
understanding its global profiles of greenhouse gas intensity; 4) concentrating
its effort on, and being realistic about, the timelines for deployment of the
transformational technologies; and 5) building on the substantial common ground
that exists among policies at the national level.
In concluding, Mr. Connaughton noted that, in order to achieve harmony, the
Bush Administration must harness the power of economic growth, better integrate
environmental objectives into other policy arenas, continue the work of constructing
human networks at the national, state, and local levels, capitalize further
on the investment in information and the powerful new technologies, place even
more of a premium on collaboration and consensus processes, and, lastly, simplify
to better enable the environmental stewards to produce real results.
The website for the White House Council on Environmental Quality is: http://www.whitehouse.gov/ceq and the website for Resources for the Future is: http://www.rff.org.
Alan I. Leshner
Chief Executive Officer
American Association for the Advancement of Science
(January 29, 2004)
Topic: The Role of Large Professional Societies in Shaping Scientific Policy
and Progress
As Chief Executive Officer of the American Association for the Advancement
of Science (AAAS), Dr. Alan I. Leshner provided his perspective on “The
Role of Large Professional Societies in Shaping Scientific Policy and Progress,” which
he dubbed “Lessons from the Drug Wars and Other Stuff I’ve Done.” With
a membership of 130,000 individuals and 272 affiliated societies, AAAS is the
oldest and largest multidisciplinary scientific and engineering society in
the United States. Dr. Leshner also serves as executive publisher of Science magazine, which is published by AAAS.
Prior to assuming his present position in December 2001, Dr. Leshner, a behavioral
endocrinologist, had served as Director of the National Institute on Drug Abuse.
In addition to senior positions with the National Institute of Mental Health
and the National Science Foundation, he was Professor of Psychology at Bucknell
University for ten years.
Dr. Leshner set the stage by describing two different forms of science
policy, policies about the conduct of science and policies that use or are
informed by science, and the different roles played by scientists in each category.
Regarding policies about science and its conduct, scientists are viewed as
advocates. In the second category, scientists act as advisors to the policy-makers,
providing scientific expertise.
He also outlined a number of principles for scientists acting in an advisory
role, making a distinction between scientific data and values, or beliefs.
While policy is made on the basis of facts and values, scientists should act
as the providers of facts on a given issue within their specific areas of expertise.
Scientists have an obligation to guard the integrity of science. They should
be bound by the data, while policy-makers may not be; Dr. Leshner views the
latter group as the “values” people.
Dr. Leshner also shared his thoughts on the role of scientific societies in
public policy. Unlike AAAS, specialized societies advocate funding for their
fields. AAAS represents the entire scientific enterprise and, therefore, needs
to be balanced across all fields. AAAS rarely advocates funding and does not
take agency-specific positions. AAAS has a long tradition of not joining group
causes. However, the AAAS Board will take a position on issues of sufficient
magnitude, such as cloning.
Noting the role of AAAS in public engagement, Dr. Leshner does see an obligation
to reach out to the general public on issues of priority to them, the
public agenda. AAAS has an array of science policy programs on issues such
as science, ethics and religion, scientific freedom and science and human rights.
AAAS also does an analysis of the federal research and development budget,
holds an annual science and technology policy colloquium and runs a science
and technology fellowship program.
Closing with some comments on Science magazine, Dr. Leshner pointed out that
its editorial operations are completely independent of AAAS, with its own staff
of reporters and editors. The content breakdown is about 55 percent on the
life sciences and about 45 percent on the physical sciences. Using a rigorous
review process, the magazine publishes about one out of every eleven of the
11,000 manuscript submissions received a year.
The AAAS website is: http://www.aaas.org.
James R. Zimbelman
Geologist and Department Chair
Center for Earth and Planetary Studies
Smithsonian Institution
(February 4, 2004)
Topic: Site visit to the Smithsonian Center for Earth and Planetary Studies
Since 1988, Dr. James Zimbelman has been Chair of the Center for Earth and
Planetary Studies at the Smithsonian Institution, which serves as a NASA-supported
Regional Planetary Image Facility. The Center, located in the Smithsonian’s
very popular National Air and Space Museum, also employs Dr. Zimbelman, whose
doctorate is in geology, as a staff geologist specializing in volcanic and
sand dune features.
The primary mission of the Center is to act as a reference library by providing
planetary science researchers with access to an extensive collection of image
data obtained from planetary missions. A small, but dedicated, staff of 16
works at the image facility, which houses and manages over 300,000 public domain
photographs and images of the planets and their satellites.
The Center is known to house excellent quality photos, particularly from the
early manned missions, such as Gemini. Photo images from the Center’s
extensive collection of space shuttle photographs, as well as selected images
from remote sensing experiments, are used for research in comparative planetology.
Most of the Center’s focus is now on other planets. With funding from
NASA, the Center is also obliged to institute proper care for the images and
perform outreach efforts to prospective users.
Dr. Zimbelman introduced the ComSci Fellows to the world of planetology by
presenting a wall mural map of the equator on Mars. Craters, volcanoes and
large canyons on Mars were easily found. He then allowed the ComSci Fellows
to see firsthand the photo images that are catalogued and used by visiting
researchers.
During the visit to the Center, NASA’s latest unmanned mission to Mars
was sending back encouraging photo images showing evidence that water had
once been present on the planet. This evidence is especially important to
scientists because it raises the possibility that life once existed on Mars.
The extremely sophisticated cameras sending back photo images from Mars produce
images whose large pixel counts strain current equipment. Dr. Zimbelman told
the ComSci Fellows that in 2005 there will be a mission with even more powerful
cameras that will need terabyte-scale computers in order to capture and store
the images.
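The jump from straining current equipment to needing terabyte-class storage can be made concrete with some illustrative arithmetic. Every number below is a hypothetical assumption chosen for the calculation, not a specification of any actual Mars camera:

```python
# Illustrative only: the pixel dimensions and bit depth below are assumed
# values chosen for arithmetic, not specifications of any actual Mars camera.
width, height = 20_000, 60_000   # hypothetical high-resolution image strip
bits_per_pixel = 14              # hypothetical raw sensor bit depth

bytes_per_image = width * height * bits_per_pixel / 8
images_per_terabyte = 10**12 / bytes_per_image

print(f"One raw image: {bytes_per_image / 10**9:.1f} GB")
print(f"Raw images per terabyte: {images_per_terabyte:.0f}")
```

Under these assumptions a single uncompressed image runs to gigabytes, so a few hundred images fill a terabyte, which is consistent with the storage needs Dr. Zimbelman described.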
The dazzling pictures from Rover sparked many questions from the group about
how the Center staff was involved with cataloguing the images from the mission.
Dr. Zimbelman told the ComSci Fellows that some of the early photo images are
so dense that the technology to digitize them fully does not yet exist.
An unexpected feature of the tour was a visit to Dr. Zimbelman’s office
that has a grand view of the Mall and the other museums. After talking with
Dr. Zimbelman, the ComSci Fellows toured the “Exploring the Planets” exhibit
and then were guests of Dr. Zimbelman in the state-of-the art, digital technology-equipped,
Albert Einstein Planetarium for "Infinity Express: A 20-Minute Tour of
the Universe."
The new digital all-dome system uses the Zeiss VI, a projector presented
by West Germany to the United States as a bicentennial gift, combined with 12
powerful projectors to pump seamlessly blended space imagery onto the entire
surface of the 70-foot-high planetarium dome. The images extend beyond peripheral
vision, creating the sensation of a three-dimensional journey as if zooming
through the solar system, past the Milky Way, to the very edges of the cosmos.
With this digital all-dome technology the Smithsonian meets the challenge of
providing solid educational content that is both engaging and fun.
The CEPS website is: http://www.nasm.si.edu/ceps/.
Mario Cardullo
Counselor, Technology and Entrepreneurism
Office of the Under Secretary
International Trade Administration
U.S. Department of Commerce
(February 11, 2004)
Topic: Department of Commerce International Science Issues
Mr. Mario Cardullo currently serves as the Counselor for Technology and Entrepreneurism
to Mr. Grant Aldonas, the Under Secretary of Commerce for the International
Trade Administration. An experienced engineer and management professional specializing
in technology management, Mr. Cardullo introduced the ComSci Fellows to his
world of the business side of technology by giving a presentation on the structure
of the United States venture capital industry.
He explained the importance of knowing and understanding each sector’s
investment philosophy as it applies to technology transfer. The investment
strategy of the semiconductor industry, for example, is very different from
that of the mobile power source industry. Mr. Cardullo went on to say that
not only is it important
to know the technical terms, but, to define policy, it is also necessary to
know the market and the competition it faces.
In the newly formed Office of Technology and Entrepreneurism, Mr. Cardullo
employs his entrepreneurial skills to lead a deputy and an office of detailees
who work with business angel networks and venture capitalists to sponsor technology
showcases for entrepreneurs around the world. Mr. Cardullo has focused on finding
sponsors for the international technology forums that he organizes. To date,
there have been several very successful forums in Japan, France, Russia and
Thailand. This year, there are plans for additional forums in Peru, China,
Chile, Jordan, Italy, Hungary and Romania. Much of what is organized is done
by employing “bootstrapping” techniques that utilize other agency
funds to support the technology forums. At the forum in Japan, Mr. Cardullo
was able to attract 25 entrepreneurs and several government officials, venture
capitalists and lawyers interested in growing companies around promising technology.
Mr. Cardullo takes a macro level view of his mission to stimulate technology
entrepreneurism and has proposed establishing a $10 billion venture capitalist
fund managed by a non-governmental organization tied to the Department of Commerce.
He views trade as a link to entrepreneurism and democracy, particularly in
the emerging markets of Eastern Europe and South America. Mr. Cardullo is also
currently writing a report listing the hot technologies and outlining where
the global opportunities exist.
Mr. Cardullo has been the founder or principal in a number of technology companies
and is the inventor of one of the basic patents for the soon-to-be ubiquitous
radio frequency identification tag (RFID-TAG). For this invention and for conceiving
the mobile communication satellite concept, Mr. Cardullo was nominated for
the 2003 Presidential National Medal of Technology.
To read more about Mr. Cardullo and his invention of the RFID tag, see “Genesis
of the Versatile RFID Tag” at: http://216.121.131.129/article/articleprint/392/-1/2/.
Harold Craighead
Co-Director for Research
Nanobiotechnology Center
Cornell University
(February 25, 2004)
Topic: Biological Applications of Micro and Nanoscale Devices
Dr. Harold Craighead is currently the Co-Director for Research at the Nanobiotechnology
Center at Cornell University. With a background in physics, Dr. Craighead
held positions at Bell Laboratories and Bellcore before joining Cornell as
a Professor in the School of Applied and Engineering Physics in 1989.
The main topics of Dr. Craighead’s presentation dealt with different
kinds of nanoscale devices that are used as biological probes. He emphasized
that his work has nothing to do with the popular concept of nanotechnology
being synonymous with “nanobots.” On the contrary, the tools used
for his research include fabrication technologies, such as photolithography,
electron beam lithography and scanning probes. All of these can be employed
to approach the molecular level and apply nanofabrication techniques to biological
systems.
The first part of Dr. Craighead’s talk dealt with single molecule studies
and the use of nanoprobes to gain insight into biochemistry and biological
systems in general. These probes include detector arrays that test cellular
functions, both spatially and temporally. For example, chemical patterning
has been used to examine antigen response at the cellular level by using fluorescent
tags on patterned proteins fixed on a substrate. The patterning aids in “seeing” the
antibody-antigen response. This technique has been used to study how toxins,
such as tetanus or cholera, enter a cell through the cellular membrane, here
simulated by a lipid bilayer model. Electrical and optical detectors were also
used to test surface topography models, such as rust spore entry into a leaf
stoma.
The second area of discussion was at the molecular level, i.e., using nanodevices
to probe molecular activity. Dr. Craighead described driving a DNA molecule
into an electric field, which stretches out the normally coiled molecule so
that its length can be measured. These data can serve as a kind of crude DNA fingerprint.
Another research area is to use DNA polymerase for sequencing the base pairs,
as the enzyme makes a copy of a test DNA molecule. Optical spectroscopy is
used to view the binding of single base pairs, which appears as a burst
of fluorescent activity. Another detection method uses a cantilever coated
with antibodies: the cantilever is dipped into a cell suspension, and a resonant
device detects the resulting shift in oscillation frequency. The frequency shift
is proportional to the mass of the bound cells, with a calculated sensitivity of 0.4 attograms.
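The arithmetic behind that kind of sensitivity figure can be sketched with the standard resonator relation: a small bound mass lowers the resonant frequency in proportion to the added mass. This is an illustrative example, not material from the seminar; the function name and all numbers are hypothetical.

```python
# Illustrative sketch (not from the seminar): for a resonant cantilever of
# effective mass m and resonant frequency f0, a small added mass dm shifts
# the frequency by df/f0 ≈ -dm/(2*m). Inverting gives the bound mass from a
# measured frequency downshift. All values below are hypothetical.

def added_mass(m_eff_g, f0_hz, delta_f_hz):
    """Estimate bound mass (grams) from a resonant-frequency downshift."""
    # dm ≈ 2 * m_eff * |df| / f0, from df/f0 ≈ -dm / (2 * m_eff)
    return 2.0 * m_eff_g * abs(delta_f_hz) / f0_hz

# Hypothetical nanocantilever: effective mass 1e-13 g, resonance at 10 MHz.
# A 2 Hz downshift then corresponds to roughly:
dm = added_mass(1e-13, 10e6, 2.0)
print(dm)  # ≈ 4e-20 g, i.e. 0.04 attograms (1 attogram = 1e-18 g)
```

The point of the sketch is only that attogram-scale sensitivities follow naturally once the resonator's effective mass is itself vanishingly small.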
Dr. Craighead also discussed the ethics of nanotechnology with the ComSci
Fellows. He noted that the ethical considerations should be the same as in any
branch of scientific research. Much of the apprehension of the general public,
such as decrying the ethics of “nanobots,” can be addressed by increased
public education to promote better understanding of what nanotechnology is
(and is not). He described the nano-oscillators designed by his research group,
whose vibration is measured using electro-optical-thermal-mechanical effects
induced by a laser. The shape of the oscillators can be modified, and the idea
of a benign “harp” or “guitar” shaped oscillator was used
to counteract the public fear of “nanobots.”
In summary, Dr. Craighead emphasized that Cornell, as an academic institution,
has goals oriented toward science and technology rather than toward specific
applications. The work he described will feed applications in the future, but
scientists, as well as the public, need to have realistic expectations of what
nanotechnology will do.
The Nanobiotechnology Center website is: http://www.nbtc.cornell.edu.
Visit to the U.S. Botanic Garden
(March 3, 2004)
The ComSci Fellows’ visit to the U.S. Botanic Garden (USBG) provided
a welcome respite from the cold spring of 2004. Dr. Christine Flanagan, Manager
for Public Programs, presented an overview of the history, policies and plans
of the USBG followed by an in-depth “behind the scenes” tour. The
visit truly helped the ComSci Fellows to understand the mission of USBG, which
is to demonstrate the ecological, economic, cultural, and aesthetic benefits
of plants, and to maintain a collection of rare and endangered plant species
through partnerships with other organizations and countries.
The idea of a national botanic garden first emerged in 1816 when the Columbian
Institute for the Promotion of Arts and Sciences in Washington, D.C. proposed
the creation of a garden for the benefit of the American people. In 1820, Congress
designated an area west of the Capitol grounds between Pennsylvania and Maryland
Avenues for the purpose of establishing the U.S. Botanic Garden. In 1842, the
idea of a national garden was further invigorated with the addition of a collection
of living plants acquired from the recently returned United States Exploring
Expedition to the South Seas (the Wilkes Expedition).
The USBG moved to its present location on Independence Avenue S.W. in 1933,
and includes a Conservatory and two acres of surrounding exterior grounds,
the outdoor display gardens in Frederic Auguste Bartholdi Park, and the Administration
Building. A plant production and support facility, opened in Anacostia in 1993,
includes 34 greenhouse bays and maintenance shops. In addition, there are plans
to build The National Garden, funded by the National Fund for the USBG on three
acres directly west of the Conservatory. Currently, the USBG maintains about
26,000 plants that are used for exhibition, study, and exchange with other
institutions. Plant variety is immense, including economic plants, medicinal
plants, orchids, cacti and succulents, bromeliads, cycads, and ferns. At any
one time, about 4,000 of these are on public display in the Conservatory and
around the grounds.
The Architect of the Capitol, through the Joint Committee on the Library, is
responsible for the maintenance and operation of the USBG, and
for any construction, changes, and improvements made to the buildings and grounds.
In the mid-1990s, the USBG initiated a major renovation and reorganization
of buildings and staff. Renovations required the Conservatory and other buildings
to be closed for four years. Staff changes were accomplished through buyouts
and retirements to address reorganization needs. The long-awaited changes have
provided not only state-of-the-art environmental controls for the benefit of
both plants and people, but also an educational living plant museum that will
help ensure long-term protection of our precious plant resources.
Additional information on the U.S. Botanic Garden can be found at: http://www.usbg.gov.
Kei Koizumi
Director, Research and Development Budget and Policy Program
American Association for the Advancement of Science
(March 8, 2004)
Topic: Federal R&D Investment in Fiscal Years 2004 and 2005
Offered as a special ComSci session on the federal budget, a presentation
on the research and development (R&D) segment was given by Mr. Kei Koizumi,
Director of the R&D Budget and Policy Program at the American Association
for the Advancement of Science (AAAS). A recognized authority on the Federal
Budget and R&D funding, he is the chief writer and editor of the annual
AAAS reports on R&D. Serving as a nonpartisan source of information, AAAS
has been doing budget analysis for 30 years and Mr. Koizumi, for the last nine.
R&D is very much a part of the discretionary portion of the federal budget,
accounting for one-sixth of it. While 25 agencies are involved, seven of them
fund about 96 percent of R&D. The Departments of Defense and Health and
Human Services, primarily the National Institutes of Health, have the largest
proposed budgets for FY 2005. The remaining five of the seven, in descending
order, are the National Aeronautics and Space Administration, the Department
of Energy, the National Science Foundation, the U.S. Department of Agriculture,
and the Department of Homeland Security.
Federal R&D funding proposed for FY 2005 is at $132 billion – a
4.3 percent increase over FY 2004. This increase would be spent on weapons
development and homeland security R&D. According to Mr. Koizumi, the FY
2005 Federal Budget, in line with recent trends, would provide flat or declining
funding for most R&D programs. Funding for non-defense basic and applied
research (“6.1” and “6.2”) would remain essentially
flat at $55.7 billion. Among multi-agency science and technology initiatives,
nanotechnology would receive the most funding, with the networking and information
technology initiative and the Climate Change Science Program coming in second
and third, respectively.
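As a quick arithmetic check, the FY 2004 base implied by the figures above can be backed out (a sketch; only the $132 billion proposal and the 4.3 percent increase come from the talk, the FY 2004 figure is derived):

```python
# Back out the FY 2004 federal R&D base implied by the seminar's figures:
# $132 billion proposed for FY 2005, described as a 4.3 percent increase.
fy2005_billions = 132.0
increase = 0.043  # 4.3 percent
fy2004_billions = fy2005_billions / (1.0 + increase)
print(round(fy2004_billions, 1))  # ≈ 126.6 (billions of dollars)
```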
Regarding trends in research by discipline, funding for the life sciences
has been increasing since 1970, while it has remained mostly flat for the other
disciplines, such as the physical sciences, engineering, environmental sciences
and mathematics/computer science. Mr. Koizumi did cite the sharp slowdown in
funding growth for the National Institutes of Health in the FY 2005 budget,
in contrast to recent years, with most of the institutes up only three percent.
Looking at R&D worldwide, Mr. Koizumi pointed out that the United States
funds the largest amount – approximately 37 percent of the total. Japan
comes in second at 14 percent and Germany, third at 7 percent. China and South
Korea are substantially expanding investment. Unlike other nations, the United
States sets R&D funding according to national priorities, such as defense
and health.
Over the next few years, Mr. Koizumi predicts that both defense and non-defense
discretionary spending will decline, as both the President and Congress try
to reduce the budget deficit by cutting domestic spending. And, in the nearer
term, none of the domestic appropriations bills may be passed before the elections
this year.
Information on the AAAS R&D Budget and Policy Program and its reports
can be found at: http://www.aaas.org/spp/rd.
David S. Trinkle
Program Examiner, Science and Space Programs Branch
Office of Management and Budget
(March 8, 2004)
Topic: The Federal Budget Process
Mr. David S. Trinkle, a program examiner with the Office of Management and
Budget (OMB) in the Executive Office of the President, focused his remarks
on the development of annual discretionary spending in the Federal Budget.
With OMB since 2000, Mr. Trinkle is the principal analyst for government research
and development (R&D), handling many R&D-related management, budget
and policy issues that cross agency lines. He oversees selected interagency
R&D efforts, including the National Nanotechnology Initiative, Networking
and Information Technology R&D, and coordination groups of the National
Science and Technology Council.
Mr. Trinkle began by pointing out that discretionary, as distinct from mandatory,
spending is controlled through the annual appropriations process. Each federal
agency submits a budget in the fall, with the respective OMB examiners providing
help in the months prior to this time. OMB will then provide feedback on the
submitted budget. Each agency then has a limited time to appeal funding differences
to OMB. After the appeal process, there is a “settlement” for each
agency, and the entire budget is then officially announced by the President
in February. The announced budget sets the tone for future discussions and
is quite detailed. It is then sent to Capitol Hill for review by the House
and the Senate.
The House and Senate Budget Committees develop their own versions of a budget
resolution. Mr. Trinkle described a budget resolution as basically an agreement
Congress makes with itself not to spend more than the specified amount of money.
If the traditional schedule holds, both are developed by early April and the
leading budget committee members from both chambers develop a consensus agreement
called a conference report that is typically adopted in April/May. The two
chambers arrive at a concurrent budget resolution, which is not formally a
law and does not require the President’s signature. The budget resolution
sets in motion legislation that, when enacted, has the force of law. The House
and Senate Appropriation Committees then report changes in law to comply with
the budget resolution and develop the appropriation bills for the government.
The House and Senate separately vote on 13 different appropriations bills.
The different versions of these bills must be reconciled between the House
and Senate and final approval given. The appropriations bills are then sent
to the President for either his approval or veto.
With some debates on appropriations becoming difficult to resolve, the different
appropriations bills are often not finalized prior to the start of the fiscal
year that the appropriation addresses. Congress must then pass a Continuing
Resolution for stopgap funding, which ties an agency to a specific spending
account. If the agency wanted to change it significantly, it would need OMB
approval. As Mr. Trinkle pointed out, there is not a lot of flexibility.
Going through this entire process, the activities related to a single fiscal
year usually stretch out over a period of at least two-and-a-half calendar
years. As the budget is being considered, federal agencies must deal with three
different fiscal years at the same time – implementing the budget for
the current fiscal year, seeking funds from Congress for the next fiscal year,
and planning for the fiscal year after that.
Mr. Trinkle also noted some special budget cases. One is the preparation of
a transition budget in an election year, such as 2004. The OMB
examiners are long-term, non-political career staff and, as such, serve as the “corporate
memory.” At the time of an election, the examiners prepare transition
papers summarizing relevant issues for the transition team. Another special
case is the reaction to severe and unforeseen events, such as September 11.
To handle such emergency instances, Congress may have to pass an emergency
relief bill. The question then becomes how to pay for this – whether
to shift the existing funds of an agency to handle the response or pass a supplemental
funding bill.
The OMB website is: http://www.whitehouse.gov/omb/.
H. Jeffrey Leonard
President
Global Environment Fund
(March 10, 2004)
Topic: Balancing Sustainable Development and Private Sector Investing
Dr. H. Jeffrey Leonard is currently the President of the Global Environment
Fund (GEF), an investment firm with a specific mission toward environmentally
sound business investments. His previous experience has been in public policy
(regulations, government, non-profit organizations), especially with respect
to the environment.
Dr. Leonard opened his presentation with an overview of investment strategies.
Venture capital, or VC, involves large investments, and GEF often sits back to avoid
getting caught up in technology hysteria; his firm works hard to avoid fads.
He also stressed that investments should be in emerging markets – NOT
in replacement markets. He cited nanotechnology as a replacement market. Dr.
Leonard reviewed two GEF fundamental beliefs. In the United States, GEF moves
from “end-of-pipe” solutions to “efficiency and new technology,” and,
internationally, it moves from local to global – “what’s
happening in India affects us.”
He also reviewed the GEF mission: to develop, finance and manage successful
businesses that promote sustainable natural resource management, a cleaner
environment and improved public health. GEF added the concept of “manage” over
time because just financing and then going “hands-off” was not
giving a return on investment. He reiterated this as a triple bottom-line philosophy:
- Economic return,
- Social development, and
- Environmental improvement.
GEF looks for a subset of investments that achieve all three. It also makes
an effort not to invest in money losers, such as environmental improvements
that won’t make any money, leaving those sorts of investments to government
regulators. Integrating multiple technologies is looked upon very favorably.
Responding to a question regarding the extent to which regulations are viewed
as a driver to open markets, Dr. Leonard said regulations are a double-edged
sword. They can help and hurt, but generally GEF avoids assets whose value
relies on regulations because the regulations can be changed and the assets
then become liabilities. Dr. Leonard said that his firm monitors the making
of regulations somewhat, but not to a large extent.
A brief discussion about the Small Business Innovation Research (SBIR) program
followed. Dr. Leonard said that they used to scour SBIR reports and track SBIR
closely, but found that almost all will never be more than “SBIR companies.” The
only time SBIR is good is when it helps provide a match to ideas that they
embrace, independent of the program. To put it another way, Dr. Leonard said
that the SBIR program is good at funding technology advancements, but not good
at forming companies.
Dr. Leonard agreed that it is very important for VCs to have broad portfolios.
He also acknowledged that investing overseas must be done in a country that
respects private property and has a judicial system.
Dr. Leonard then reviewed two examples of diversified investments they’ve
made – Essex and Athena – two companies that have had recent success,
one as a supplier of advanced optical technologies and the other as a developer
of avionic guidance and control systems. One of the overall themes that GEF
is building into its investments is modeled after the “Intel Inside” campaign.
In this case, GEF is marketing environmentally sound products and services
with the tag, “Environmental Inside.”
In closing, Dr. Leonard spoke about the degree to which his firm has had to
wrestle with balancing the environment with defense. Some of his investors
don’t want to invest in defense, but they must balance that with the
fact that so many technology advancements are based on investments in the defense
industry, such as those funded by the Defense Advanced Research Projects Agency
(DARPA).
Ways in which the government helps or hurts private sector venture capital
investment were briefly discussed. The government’s weaknesses include
unpredictable or frequently changing regulations, a tendency to favor the status
quo at the expense of technology advancements, a habit of choosing unpredictable
or unprofitable markets, and a limited ability to pick “winners” or
form companies. However, these weaknesses are offset by the government’s
strengths, which include administering the Overseas Private Investment Corporation
(OPIC) grants, picking up the financial slack on environmental money losers,
and supporting DARPA and SBIR as technology advancers.
Additional information about GEF can be found at:
http://www.new-ventures.org/resources.gef.html.
Richard Bissell
Executive Director for Policy and Global Affairs
The National Research Council
(March 17, 2004)
Topic: National Academies of Science (NAS) Overview, and Role of NAS in Shaping
U.S. Science and Technology Policy
Dr. Richard Bissell is the Executive Director for Policy and Global Affairs
of the National Research Council, as well as Director of the Committee on Science,
Engineering, and Public Policy (COSEPUP) of the National Academy of Sciences
(NAS), National Academy of Engineering (NAE), and the Institute of Medicine
(IOM). With a Ph.D. in international economics, Dr. Bissell has had extensive
experience in the area of economic development.
Dr. Bissell presented an overview of the structure of the National Academies.
There are four component organizations operating under the umbrella of the National
Academies – the National Academy of Sciences, the Institute of Medicine, the National
Research Council, and the National Academy of Engineering. The oldest component, the
National Academy of Sciences, was established in 1863 to advise the Federal
Government on technical and scientific matters. Each component has at least
1,000 members who are leaders in their fields; membership terms are for life.
Dr. Bissell spoke extensively about the National Research Council (NRC). The
NRC conducts reviews of questions of scientific or engineering interest, primarily
at the request of the Federal Government. The NRC operates according to three
fundamental principles when conducting studies: 1) independence, 2) balanced
perspective across the membership, and 3) objectivity. Consensus studies are
conducted by groups of scientists and engineers who agree to serve without
compensation and are selected to represent a broad range of opinions. Study
topics are negotiated with the requester and the NRC bases acceptance of study
requests on whether the information derived from the study will be of significant
value to the Federal Government or a scientific or engineering community. Panel
participants are both members of the academies and ad hoc participants. Meetings
are conducted according to Federal Advisory Committee Act rules. In addition
to consensus studies, the NRC conducts Convening Activities, such as workshops
and round tables, and Operational Programs, such as fellowships and surveys.
More information on NAS can be found at: http://www.nationalacademies.org/nas/.
James R. Shoemaker
Program Manager
Orbital Express Space Operations Architecture/ASTRO Program
Tactical Technology Office
Defense Advanced Research Projects Agency (DARPA)
(April 7, 2004)
Topic: DARPA and the Orbital Express Space Operations Architecture/ASTRO Program
Lt. Col. James Shoemaker introduced himself and provided a brief description
of DARPA’s organization and operations to the ComSci Fellows. He included
a short description of some DARPA programs in the aerospace field. He then
provided a summary of the Orbital Express program for which he serves as program
manager.
Program development ideas are often formulated at lower levels and then run
up through the office director to the DARPA Director who is a political appointee.
If the Director approves of the idea, he provides money to the appropriate
office to begin formulating a plan. One of the more interesting aspects of
DARPA is that most employees are temporary hires for a period of about four
years; DARPA believes that this short tenure fosters motivation and a sense
of urgency.
Lt. Col. Shoemaker’s Orbital Express program is located in the Tactical
Technology Office (TTO). TTO and the Special Projects Office (SPO) focus on
space but any DARPA office can pursue space technologies with approval from
the Director. TTO’s Orbital Express program is aimed at a number of technology
demonstrations. The key technology is on-orbit servicing (OOS). OOS promises
to extend the life and maneuverability of military satellites by allowing them
to be refueled. Current satellites end their useful lives when they deplete
their propellant, and they avoid significant changes to their orbital parameters
in order to conserve fuel. This technology would also open the possibility of
upgrading satellite electronics. OOS would provide an enormous tactical advantage
to the United States military.
The Orbital Express concept is to develop a service satellite capable of mating
to an existing satellite and transferring propellant and/or swapping out electrical
boxes. The current program schedule plans for an OOS demonstration in 2006
by launching two satellites (servicing satellite and mock satellite) on an
Atlas V launch vehicle. The two satellites would be used to demonstrate mating
operations using two different capture methods, fluid and electronic box transfer,
and uncoupling operations. The ComSci Fellows look forward to following the
results of yet one more unique DARPA technology demonstration.
The DARPA website is: http://www.darpa.mil/.
John Haskell
Senior Fellow
Government Affairs Institute
Georgetown University
(April 14, 2004)
Topic: The Congressional Budget and Appropriation Process
Dr. John Haskell, a Senior Fellow with the Government Affairs Institute (GAI)
at Georgetown University, spoke to the ComSci Fellows about how Congress develops
the annual budget and federal appropriations bills. This concentrated presentation
was scheduled at the request of the ComSci Fellows, who had heard many pieces
of the budget and appropriation puzzle during their congressional orientation
week, but still wanted a more coherent summary.
Dr. Haskell compared the congressional budget process to a jazz piece, incorporating
creativity and improvisation. With the Congressional Budget and Appropriation
Process Worksheet as the “sheet music,” Congress then improvises,
within certain rules, to get the budget passed through both houses.
He told the group that 80 to 90 percent of the budget items are non-controversial.
The catch is the remaining 10 to 20 percent.
The blueprint is the Congressional Budget Resolution, which is fairly general
in nature. It is not really binding, but is of political importance. Some of
the items will have detailed program information, but this is not always the
case. This resolution is where the majority party states its priorities in
spending, taxes and the deficit. In contrast to the previous administration,
the same party (Republican) controls the White House and Congress, so it is
more difficult to blame “someone else” when the economy takes a
downturn. Since the Budget Resolution is a majority statement, no filibuster
is allowed.
Each Congress handles deficit spending differently. There is a feeling that
something needs to be done, but any action is not politically rewarding, since
cuts would need to be made. In 1993, President Bill Clinton raised taxes and
reduced spending, and the Democratic-controlled Congress went along with
him. However, in 1994, the Democrats lost control of Congress, which may or
may not have been a reaction to these actions.
The current President has chosen a different route. He feels that tax cuts
are more politically important than reducing the deficit. From Congress’ point
of view, however, something more tangible, such as building roads, is more
politically advantageous.
Last year, the Budget Resolution was $786 billion. However, Congress could
not stick to this figure and had to go $6 billion over. This was due to actions
in the Gulf and other increased defense and homeland security spending that
had not been foreseen. This year, there is no extra money for defense, which
may cause the need for a supplemental defense bill before January 2005.
Dr. Haskell gave two reasons why it is difficult to “stick to the sheet
music.” First, there is not enough money in the allocation to get votes
on the floor, so politically important items, such as roads, are always added
in. Second, since every Member of Congress knows that pet legislative proposals
may not go anywhere on their own, amendments get added to the appropriations
bills. In other words, policy issues that should be dealt with elsewhere, and
are really unrelated, get added onto the bills. Therefore, the appropriation
bills are put together as larger “omnibus” bills, with the extra
programs included as incentives for votes in Congress.
The lesson learned in recent budget and appropriations cycles is that, even
with a “unified” government, the process has not been simplified.
29th Annual AAAS Forum on Science and Technology Policy
(April 22-23, 2004)
Each spring, the American Association for the Advancement of Science (AAAS)
sponsors the AAAS Forum on Science and Technology Policy in Washington, D.C.
The purpose of the forum is to promote discussion about current policy trends
and issues facing the science and technology community. The forum routinely
attracts top experts in various science-related fields as speakers and participants,
and this year was no exception.
Held at the Hyatt Regency on Capitol Hill, the Forum opened with a warm welcome
from Dr. Shirley Ann Jackson,
President of Rensselaer Polytechnic Institute and current President of AAAS.
She introduced the Forum Keynote Speaker, Dr. John H. Marburger, III, Director
of the White House Office of Science and Technology Policy. Dr. Marburger emphasized
the Bush Administration’s commitment to science, especially in the areas
of homeland security, defense, and key areas of science and technology related
to long-term economic growth. In addition, he covered science policy issues
and science programs he perceives as priorities including balancing science
and security, societal impacts of science and technology, health sciences,
National Science Foundation, National Aeronautics and Space Administration,
Department of Energy, energy and environment, economic vitality, and space
science and exploration.
Following the keynote address, Dr. Jackson moderated a Plenary Session on
Budgetary and Policy Context for research and development (R&D) in FY 2005.
Speakers included Senator Tom Daschle, Senate Minority Leader (D-South Dakota);
Mr. Kei Koizumi, Director of the AAAS R&D Budget and Policy Program; Dr.
Daniel Yankelovich, Founder and Chairman of Viewpoint Inc. and Public Agenda;
and Dr. Luke Georghiou, Professor of Science and Technology Policy and Management,
University of Manchester, United Kingdom.
Three concurrent sessions were held during the afternoon to address major
issues in science and technology policy. Session A examined policy implications
of converging new technologies, especially nanotechnology, biotechnology, information
technology, and cognitive technology. The speakers discussed the status of
new technologies, their promise and uncertainties and prospects for intersections
among them, and social and ethical implications. Session B addressed policy
and civic implications of information technologies. Topics included the merits,
risks, and vulnerabilities of new voting technologies; digital-divide issues;
IT and privacy concerns; control of the Internet; and using the Web to build
a more informed and engaged citizenry. Session C discussed the question – how
sustainable is the modern research university? Discussants reviewed a number
of issues including changes in institutional mission, funding profile, and
management structures; public expectations of universities; stakeholder interests;
reward structures; public universities and autonomy; and structural strains
between R&D and teaching, graduate and undergraduate education, and academic
departments and research centers.
The first day of the Forum came to an end in the early evening with the William
D. Carey Lecture and Reception. Dr. Harold Varmus, President of the Memorial
Sloan-Kettering Cancer Center, gave an address on “Science, Government,
and the Public Interest.” Dr. Varmus discussed six policy issues of strategic
importance to all scientists – financing research, immigration practices,
independence of peer review, separating religion and science, globalizing science,
and disseminating scientific knowledge.
The morning agenda for the second day of the Forum focused on the challenges
for the United States in the evolution of the global economy, with Ambassador
Ira Shapiro as moderator. The speaker list featured Dr. Catherine Mann (Institute
for International Economics), Dr. Ron Hira (Rochester Institute of Technology),
Ms. Diana Hicks (Georgia Institute of Technology), Mr. Dave McCurdy (Electronic
Industries Alliance), and Mr. William Bonvillian (Office of Senator Joseph
Lieberman). The topics for the session included the outsourcing of United States
jobs in high-technology fields, the maturing of science and technology (S&T)
education in developing nations, the United States’ use of foreign-born talent,
and the United States’ ability to lead the way in the global community through
innovation. Session highlights included information on the emergence of China
as a science and technology power. This is considered by many to be the greatest
challenge the United States has faced economically, even greater than that
posed by Japan in the 1960s and 1970s. It was felt that the Japanese S&T
economy is similar to that of the United States in providing a high-value,
high-salaried, highly trained work force, and having a rule of law to protect
commercial rights. In China, the United States faces a low-value, low-wage,
highly trained emerging work force and a nation that does not adhere to
international norms for the protection of commercial property rights.
According to many of the session speakers, one of the biggest weapons that
the United States has against the encroachment of developing S&T countries,
such as China, India, and Korea, is its power of innovation. Innovation is
the way to economic growth and will keep a technology-dependent country like
the United States competitive in the global marketplace, despite losses in
manufacturing capabilities. In order to foster innovation, it was felt that
the U.S. Government should invest in broad sciences and not limit research
and development spending to specific S&T applications that may be favorable
at the moment.
The closing session of the Forum was on the impact of post-9/11 policies on
science. The session speakers were Dr. Alice Gast (Massachusetts Institute
of Technology), Dr. John McGowan (National Institute of Allergy and Infectious
Diseases), and Mr. David Heyman (Center for Strategic and International Studies).
It was pointed out that one quiet but high-impact consequence of the 9/11 tragedy
was the strict review of visa applications, resulting in a reduction of foreign
scientists working in the United States. It was noted that the United States,
from the industrial age to today, has enjoyed the presence of the world’s
brightest minds, especially in our educational institutions. The reduction
in foreign talent in our educational system is directly related to the tight
restrictions on visa applications.
Many of the presentations can be found on the AAAS website: http://www.aaas.org/spp/rd/forum.htm.
Visit to the Smithsonian Environmental Research Center
Edgewater, Maryland
(April 28, 2004)
The Smithsonian Environmental Research Center (SERC) is the largest of five
research centers operated by the Smithsonian Institution. SERC is located in Edgewater,
Maryland, about 20 miles east of Washington, D.C., on the western shore of
the Chesapeake Bay. SERC has been in operation since 1965, on land donated
by a former dairy farmer. On the 2,600 acres of SERC, scientists conduct various
ecological studies, involving the Chesapeake Bay and its surrounding forests,
pastures, freshwater wetlands, tidal marshes and estuaries. In addition, there
are also a number of affiliated sites, including some in the Caribbean.
The first item on the tour for the ComSci Fellows was inspecting the mist
netting for birds in the gardens, accompanied by Dr. Pete Marra, a bird ecologist
from the Avian Laboratory. Due to the cold morning temperatures, the birds
were not very active and the nets were empty, but Dr. Marra described the function
of the netting and the network of volunteer “citizen” backyard
observers throughout the area, who help inventory the bird population. He also
described some of his own research interests, which included winter-limiting
conditions on bird population dynamics, specifically birds that overwinter in areas
of the Caribbean. One particular research area is developing an isotope and
genetic base map for American redstarts, which will aid in identifying the particular
sites where the birds overwinter.
Dr. Wayne Coats gave the ComSci Fellows a summary on protistan ecology and,
more particularly, aspects of phytoplankton blooms, which occur along certain
areas of the coastal United States, and also in the Chesapeake Bay. His research
is concerned with biological factors that regulate the cycles of the blooms.
Although the blooms occurring locally are not usually of a species directly
toxic to fish and other aquatic organisms, they can create adverse effects,
such as oxygen depletion. He illustrated his talk with short videos and slides
and discussed the effects that parasites have on bloom cycles. The ComSci Fellows
saw a video demonstration of how parasitism can affect the population dynamics
of dinoflagellate red tides.
In the Phytoplankton Laboratory, Ms. Sharyn Hedrick described the research
topics of this laboratory, and the ComSci Fellows were given a demonstration
comparing the optical properties of water, including light absorption and transmittance,
in a water sample from the Rhode River with one from a polluted site in the
St. Johns River in Florida. Such measurements will enable SERC scientists to
interpret parameters that are indicators of a variety of natural and human-induced
processes that determine the overall health of the system. Ms. Hedrick told
the ComSci Fellows that measurements of water quality in the Rhode River are
among the longest running observations at SERC, with concentrations of nutrients,
suspended sediments, and phytoplankton pigments measured at biweekly intervals
since the mid-1980s.
During lunch, Mr. Paul Fofonoff joined the ComSci Fellows and discussed the
impact of invasive species on aquatic areas. Of particular interest
was the issue of ballast water from ships introducing non-native marine species,
and the steps that have been implemented to reduce such introductions.
The ComSci Fellows were also given a general overview of forest ecology by
Dr. Jess Parker, who pointed out observable differences between old forest
growth and new growth during a hike through some of SERC’s grounds. The
Forest Ecology Laboratory studies the structure, growth and function of forest
ecosystems, particularly the canopy portion of deciduous forests. Dr. Parker
also described the purpose of the dam weirs, which help assess watershed status
through depth and flow measurements at the weir sites.
The final event of the site visit was canoeing along part of the Rhode River
estuary, with Mr. Fofonoff describing wetlands ecology. The canoeing part of
the exercise was challenging for some due to low tide conditions, but all agreed
that it was an interesting way to observe the wildlife.
Additional information on SERC can be found at: http://www.serc.si.edu.
Mihail C. Roco
Senior Advisor
National Science Foundation
(May 5, 2004)
Topic: National Nanotechnology Initiative
The ComSci Fellows received an interesting and comprehensive presentation
on nanotechnology and its future in science and industry. Dr. Mihail C. Roco
provided an overview of the National Nanotechnology Initiative (NNI), which
included some general definitions of just what nanotechnology is and where
it is going in the next 15 to 20 years. Dr. Roco is a senior advisor for nanotechnology
at the National Science Foundation (NSF), and chairs the U.S. National Science
and Technology Council’s Subcommittee on Nanoscale Science, Engineering
and Technology (NSET). He also formally proposed the NNI in a presentation to the
White House Office of Science and Technology Policy’s Committee on Technology
on March 11, 1999.
Dr. Roco began his presentation by stating that the NNI is the only initiative
of its kind based entirely on science, one that had no initial support from any government
agency. He went on to say that certain foundational nanotechnology documents,
such as Societal Implications of Nanoscience and Nanotechnology, were developed
and distributed to support NNI, which showed the effects of nanotechnology
on our society and gave a clear picture of just what this new science entails.
The main focus of NNI is to reveal how nanotechnology can change the face of
industry.
Nanotechnology involves working at the atomic, molecular and supramolecular levels,
at length scales of approximately 1-100 nm, in order to understand,
create and use materials, devices and systems with fundamentally new properties
and functions arising from their small structure. Work under this NNI definition
is also opening doors to scientific discoveries that were not previously
possible. Dr. Roco believes nanotechnology gives the scientific community
the ability to measure, control and manipulate matter at the nanoscale
in order to change its properties and functions.
The long-term societal implications of nanotechnology are staggering – comprehension
of nature, improved quality of life in health and environment, as well as an
economy that will see a $1 trillion market by 2015, creating more than two
million jobs worldwide. Dr. Roco also emphasized that the miniaturization process
to which most people relate when discussing or thinking about nanotechnology
is not the main scientific interest. In fact, the more important, radical improvements
center on the process of generating new property changes at the nanoscale.
For example, structures such as carbon nanotubes can be designed from raw materials
into products with desired properties and performance in large quantities.
Dr. Roco sees new generations of nanostructures and nanosystems developing
over the next 20 years or so. Instead of passive nanostructures, such as nanostructured
metals, polymers, and ceramics, he envisions three-dimensional nanosystems and
molecular nanosystems, where there will be complex, non-repeating systems of
complex nanosystems, as well as molecular processing that will allow each molecule
to act as a single device that forms other systems.
Coordinating the NNI involves many government, state and local entities. Partnerships
with industry, states and international organizations, as well as interaction
with the public and the media, are extremely important. Research and development
(R&D) funding by government agencies shows an encouraging growth pattern
from $270 million in 2000 to a possible $1 billion in 2005. Dr. Roco highlighted
the fact that, once the United States announced in 2000 that it was going to
invest more dollars in nanotechnology, other governments also began to invest
in it quite vigorously. Approximately 40 countries, including Japan, South
Korea and Germany, are now sponsoring comprehensive programs (over $100 million
annually) in nanoscience and nanotechnology.
Dr. Roco insists a strong nano-network infrastructure is necessary if nanotechnology
is to maintain the level of progress desired by industry and science. He declared
that NNI is already setting new goals by focusing on research, education, and
significant infrastructure. Research is reducing the time to reach commercial
prototypes by at least a factor of two for several key applications. In the
area of education, an early emphasis on nanotechnology is being achieved as
all science and engineering colleges have introduced courses related to nanoscience
and engineering. Over 60 universities have become part of the nano-network
infrastructure, which includes at least five major networks and a work force
totaling almost 40,000 individuals.
Information on the National Nanotechnology Initiative can be found at: http://www.nano.gov.
Visit to the Federal Bureau of Investigation (FBI) Laboratory
Quantico, Virginia
(May 12, 2004)
The FBI Laboratory is located in Quantico, Virginia, at a facility completed
in April 2003. The five-story building has about 500,000 square feet of laboratory
and office space, compared with the 100,000 square feet of retrofitted
laboratory and office space at its former location at FBI headquarters
in downtown Washington, D.C. Dr. Joseph DiZinno, Deputy Assistant Director,
gave the ComSci Fellows an overview of the Laboratory and its facilities. There
are about 650 employees, including support staff, in the FBI's Laboratory Division,
organized into nine case-working units. All FBI cases involving forensic analysis are
worked at the Laboratory. In addition, any United States law enforcement agency
and some foreign agencies can apply to submit evidence for evaluation by the
FBI at no charge. Dr. DiZinno described the change in priorities at the Laboratory
since 9/11 – from an emphasis on violent crime investigation to the current
emphasis on preventing terrorism. The scientists do not go through agent training,
but they must complete a rigorous laboratory training program, which includes very
specific protocols and instruction on presenting evidence for court submission.
The ComSci Fellows were able to get short glimpses of the work done in several
of the case-working units. For example, in the Explosives Unit, the ComSci
Fellows learned that much of the work done involved post-blast assistance,
where they need to analyze many pieces of possible evidence to aid in piecing
together what type of improvised explosives device was employed, and evidence
to trace the origin of the pieces. To this end, the Explosives Unit has extensive
cooperation with most of the other units in the Laboratory.
Work being done in the Chemistry Unit ranges from more general analyses, such
as drug quantification, to toxicology, paint and polymers, and elemental analysis.
For many of these analyses, the areas for trace and bulk analyses are separated
to minimize contamination.
Firearms identification is performed in the Ballistics Unit. There is a reference
firearms collection, which serves as both reference library and parts resource,
if a part is needed to obtain a “bullet signature” from a gun.
There is also a computerized bullet database, the National Bullet Information
Network (NBIN), which is used only as a screening device, as actual identification
must still be done with a microscope.
The two DNA units, Mitochondrial DNA Unit (DNA Unit II) and Nuclear DNA Unit
(DNA Unit I), serve different functions, and the physical state and amount
of evidence determines which type of analysis will obtain the most useful data.
DNA Unit II usually is involved in cold case work where the evidence is often
degraded, and where only small samples are available. Because mitochondrial DNA
is maternally inherited, this type of analysis provides more general, familial
identification rather than identification specific to a single individual. DNA
Unit I, on the other hand,
works more with bodily fluid samples, and determines the unique DNA signature.
This unit is connected with the Combined DNA Index System (CODIS) and the National
DNA Index System (NDIS).
The Trace Evidence Unit deals mostly with hair and fiber comparisons, which
are extremely useful, but not as concrete as DNA evidence. This Unit also does
work with fabric, cordage, feathers, wood, tape, and soils, as well as skeletal
remains analysis in conjunction with the Smithsonian Institution.
The Latent Print Unit deals not only with fingerprint analysis, but also with
latent prints in general, which are prints that require some kind of development
for visualization. Along with the demonstration of several techniques for print
development, it was explained that all prints are now photographed and digitized
for storage and can be computer-enhanced for ease of study, but never altered.
The ComSci Fellows were told that the friction ridges of an individual’s
fingers are totally unique, and even when worn down, grow back again with the
same pattern.
The final presentation was on the research arm of the FBI Laboratory, the
Counterterrorism Forensic Science Research Unit, which has a separate facility
from the case-working units. The research and development strategy has been
on applied research, and each year Laboratory needs are reassessed. Some of
the current projects include automation of various aspects of DNA analysis,
imaging improvements and scent analysis.
The ComSci Fellows agreed that the tour presented a fascinating look into
the range of forensic work being done at the Laboratory. The amount of information
and types of analysis presented the group with an overload of information,
but some of the more memorable stories included a description and inside story
of the “Shoe Bomber,” who used a novel paper fuse; the blood
stain pattern analysis from the “Table Saw Suicide,” which presented
evidence that it was no suicide; and the description of the “Dead Body
School” for specialized forensic analysis of skeletal remains.
30th Anniversary of the AAAS Science and Technology Policy Fellowship Programs
(May 13-14, 2004)
Topic: Vision 2033: Linking Science and Policy for Tomorrow’s World
This one and one-half day symposium was the focal point of the celebration
marking the 30th Anniversary of the American Association for the Advancement
of Science (AAAS) Science and Technology Policy Fellowship programs. The event
was held at the Carnegie Institution of Washington.
The opening remarks were made by Dr. Alan Leshner, Chief Executive Officer
of AAAS, followed by Dr. Albert Teich, Director, Science and Policy Programs,
AAAS; Dr. Richard Meserve, President, Carnegie Institution of Washington; Dr.
Maxine Singer, President Emeritus, Carnegie Institution of Washington; and Dr.
Richard Scribner, former Director of Science and Technology Policy Fellowship
Programs, AAAS.
Dr. Leshner mentioned that the Science and Technology Policy Fellowship Programs
began on Capitol Hill in 1973 with seven Fellows serving in congressional offices,
providing their scientific expertise to policy-makers facing increasingly technical
legislative issues. The success of the program has led to the establishment
of AAAS public policy fellowship programs in nearly a dozen agencies. AAAS
sponsors the Science and Technology Policy Fellowship programs for scientists
and engineers, from recent Ph.D. recipients to mid-career professionals, to
learn about policy-making while contributing scientific expertise to the Federal
Government. Thirty other scientific and engineering societies participate in
the program, but select and fund their own fellows.
Dr. Leshner further mentioned that the Fellows, representing a cross-section
of science and engineering fields, bring a common interest in learning about
the interface of science and government, and a willingness to apply their technical
training in a new arena. The host offices value the Fellows for their external
perspectives and critical thinking skills, as well as for their technical expertise.
Before beginning their fellowships, the Fellows participate in a comprehensive
two-week orientation program, which is followed throughout the year with seminars
that provide the opportunity to hear noted speakers on issues relating to science,
technology and public policy. At the conclusion of the fellowship year, some
Fellows remain in Washington, D.C., while other Fellows return to their previous
positions. In 30 years, nearly 1,600 scientists and engineers have spent a
year in Washington bringing good science to government decision-making through
the various fellowship programs.
The first session, “Science, Technology and the Human Condition,” explored
possible advances in science and technology over the next 30 years and highlighted
societal impact and policy implications of scientific discovery and advances.
The presentations reflected on issues such as bioethics, health, genomics,
personal privacy and information technology to help ensure that the impact
of science and technology is considered in advance of scientific discovery.
The past 30 years have demonstrated that science and technology can alter our
lives in many ways, some positive and some negative. In the session, potential
problems and benefits of future scientific discovery were examined, as were
ways to identify and minimize problems, and to maximize the benefits of advances
in science and technology. The session explored the juncture of science, technology
and humankind from both domestic and international perspectives.
Ms. Dee Perry, moderator, opened the session by noting that science and technology
constitute the prism through which one can observe the shaping of the world
for the better.
Dr. R. Alta Charo, Associate Dean for Research and Faculty Development and
Professor of Law and Bioethics at University of Wisconsin, discussed the role
of government by quoting several examples, including health law, food and drug
law, voting rights, environmental laws, abortion law and medical genetics.
She enumerated several personal and social issues related to stem cell research,
pornography and reproductive choices, which were adversely affected by public
policy and regulations. Dr. Charo further observed that it is incumbent upon
the scientific community to communicate with the general public so that people do
not fear the outcomes of scientific discoveries.
Dr. Kenneth F. Schaffner, Professor of Medical Humanities and Professor of
Philosophy at George Washington University, gave a presentation on “Behaving:
What Is Genetic and What Is Not, and Why Should We Care?” Dr. Schaffner
discussed ethical and philosophical issues in human behavioral and psychiatric
genetics (BPG).
Dr. Irving Wladawsky-Berger, Vice President, Technology and Strategy at IBM,
discussed IBM’s next-generation Internet efforts in supercomputing and
parallel computing, including the transformation of large commercial systems
to parallel architectures. He further discussed IBM’s grid and autonomic
computing efforts to make the Internet a self-managing, distributed computing
platform capable of delivering computing services on demand. He defined autonomic
computing as self-predicting, self-healing, self-optimizing and self-configuring.
The discussants were Dr. Patrick Hines, University of North Carolina at Chapel
Hill, and Mr. Bruce Sterling, author, journalist, editor and critic. Dr. Hines
expressed the hope that advanced research would lead to the use of a single gene
as an identifier of defects or diseases within the next 30 years.
On the second day of the Symposium, Dr. Albert Teich, Director, Science and
Policy Programs, AAAS, made some opening remarks regarding the day’s
agenda. Dr. Stephen Nelson, Associate Director, Science and Policy Programs,
AAAS, then introduced Dr. Ismail Serageldin, Director, Library of Alexandria,
Egypt.
Dr. Serageldin set the theme for the rest of the day by pointing out that
scientific knowledge and the resulting technological applications are accumulating
at an accelerating rate as a result of ever more powerful computers and lightning-fast
communications. The international community, unfortunately, has given inadequate
attention to the needs of capacity building in science and technology as the
engine that drives knowledge-based development.
He noted that business-as-usual will leave an ever-growing gap between “have” and “have-not” nations
in technological innovations. It is absolutely essential that developing nations
strengthen their science and technology capacity. And they must do so soon,
through their own efforts, and those of their friends. There is no time to
waste if the majority of humanity is not to fall farther behind the developed
countries. We, in the developed parts of the world, must bring the benefits
of science and technology to every human being on the planet so that he or
she too has a chance to live in dignity, comfort, health, and happiness.
The rest of the morning session was devoted to a panel addressing Science,
Technology and Global Security. Dr. Joel Primack, Professor of Physics, University
of California, Santa Cruz, moderated the panel consisting of speakers Dr. Frank
von Hippel, Professor of Public and International Affairs, Program on Science
and Public Affairs, Princeton University; Dr. Victor Utgoff, Deputy Director,
Strategy, Forces and Resources Division, Institute for Defense Analyses; Dr.
Julie Fischer, Biological and Chemical Weapons Analyst, Henry L. Stimson Center;
and Discussants Dr. Maureen McCarthy, Director, Office of Research and Development,
Science and Technology Directorate, U.S. Department of Homeland Security; and
Dr. George Fidas, The Elliott School of International Affairs, The George Washington
University.
Dr. Primack picked up on Dr. Serageldin’s comments on the need to globalize
science and technology by noting that science needs wise involvement in policy
formulation. Most of all, he said, humanity needs to transition from an emphasis
on growth to a sustainable relationship with the world’s finite resources.
Dr. von Hippel focused on nuclear terrorism by pointing out that highly enriched
uranium (HEU) might be coveted by terrorists. He urged greater funding
to convert research reactors from using HEU to using low-enriched uranium,
thereby removing a possible source of nuclear materials from terrorists. Dr.
von Hippel also urged the AAAS to expand its Fellowship Program by creating
an international component.
Dr. Utgoff followed by asking the question: What can humanity do to keep itself
from being destroyed by weapons of mass destruction? He answered by noting that
fewer nations have nuclear weapons than was first projected in the middle of the
last century, but that more still needs to be done. He cited the need for international
agreements making the creation of weapons of mass destruction illegal; stronger
action against proliferators; and increased emphasis on improving technologies
for detecting unauthorized nuclear, chemical, and biological materials.
Dr. Fischer took the podium from Dr. Utgoff. Dr. Fischer examined the threat
of global disease. She pointed out that infectious diseases primarily affect
the developing world. In those countries these diseases can have a security
impact through creating socioeconomic instability, weakening indigenous militaries,
and possibly resulting in the spread to United States personnel abroad. Dr.
Fischer said that measures needed to stem these diseases include: safe and
adequate water and food, availability of effective medical treatment, maternal
health and education, a comprehensive public health infrastructure, and a well-organized
strategic framework. She noted that science and technology may contribute to
improved worldwide health by: developing systems for global disease surveillance,
providing low cost, rapid, robust, and acceptable disease screening and diagnostic
equipment, and through improved vaccines and drug delivery systems.
Dr. McCarthy commented that the Department of Homeland Security has to develop
three core science and technology capabilities: 1) awareness of the future
threat; 2) deployment of countermeasures; and 3) scientific leadership.
This investment must be sustained for decades, not just the short term. It must focus
on people, infrastructure, and programs. She concluded by noting that we have
had the shock to our systems and it is now time to make sure that it does not
happen a second time.
Dr. Fidas concluded the morning’s agenda by discussing the impact of
infectious diseases on human beings and national security. He stated that infectious
diseases are a leading cause of death, accounting for a quarter to a third
of all deaths worldwide. The spread of infectious diseases results from human
behavior such as lifestyle choices, land-use patterns, increased trade and
travel, and inappropriate use of antibiotic drugs. He argued that the infectious
disease threat will complicate United States and global security over the next
20 years. Dr. Fidas concluded that these diseases will endanger United States
citizens at home and abroad, threaten United States forces deployed overseas,
and exacerbate social and political instability in key countries and regions
in which the United States has significant interests.
The afternoon session had the topic of “Energy, Environment and Global
Change.” The first of two panels to address this topic was moderated
by Dr. Neal Lane, University Professor and Senior Fellow, Baker Institute for
Public Policy, Rice University; and included as speakers Dr. Donald Boesch,
President, University of Maryland Center for Environmental Science; and Dr.
Mohamed El-Ashry, Former CEO and Chairman, The Global Environment Facility.
Dr. Lane began by quoting Dr. George Brown: “The path through the 21st
Century cannot be an extension of the 20th Century.” Dr. Lane likened
the energy issue to a “three body problem.” The three bodies are:
science, policy, and politics. He pointed out that we can deal with science
and policy but then there is politics. He sees energy as number one on the
list of humanity’s top ten problems of the next 50 years – we are
going to need two to three times as much energy in 2050 – and where are we
going to get it? The answer is unclear at this time, but one point is certain:
if there is any problem that science and technology should solve, it
is energy.
Dr. Lane introduced Dr. Boesch who addressed the topic of “Energy, Global
Change, and the Future of Coastal Waters.” Dr. Boesch noted that United
States ocean and coastal resources should be managed to reflect the relationships
among all ecosystem components, including human and non-human species and the
environments in which they exist. This will make it necessary to define relevant
geographic management areas based on ecosystem, rather than political boundaries.
He said that effective policies should be based on unbiased, credible, and
up-to-date scientific information. Such an approach must have as its goal to
enhance the Nation’s ability to observe, monitor, and forecast ocean
and coastal conditions in order to better understand and respond to the interactions
among oceanic, atmospheric, and terrestrial processes.
Next to speak was Dr. El-Ashry who spoke about the need to balance the pressures
for growth with environmental concerns. He pointed out that there is a need
to incorporate environmental issues into the broader agenda of poverty eradication
and sustainable development. People of all nations, rich and poor alike, are
experiencing the effects of ecosystem decline in one guise or another: from
water shortages in India to soil erosion in Russia, to fish kills off the coast
of North Carolina in the United States. Dr. El-Ashry concluded by stating that
strong scientific programs are needed to move the global change agenda forward
and that the public must put pressure on politicians to achieve these goals.
The final panel was moderated by Dr. David Rejeski, Director, Foresight and
Governance Project, Woodrow Wilson International Center for Scholars, with
speakers Dr. Theodore Gordon, Futurist and Management Consultant, and Dr. Mary
Evelyn Tucker, Professor, Department of Religion, Bucknell University.
Dr. Rejeski introduced Dr. Gordon, who reviewed the results of the “Millennium
Project – Future Issues in Science and Technology Management.” This
study was designed to identify science and technology developments of importance
in the next 25 years and how the evolution of these developments might be managed.
The central objective of the study was to seek a broad range of international
perspectives on the future of science and technology. Through a series of interviews
based on a set of scenarios, the project concluded that a spectrum of five science
and technology management levels exists simultaneously at any point, and that the
mix of these levels controls the course of research. These levels are:
global organizations, national advisory commissions, national agencies of government,
the disciplines themselves, and individual researchers. He said that the study
recommended an ongoing forecasting and risk assessment system to deal effectively
with future developments in science and technology.
The last speaker on the panel was Dr. Tucker. She noted that the environmental
crisis is well-documented in its various interlocking manifestations of industrial
pollution, resource depletion, and population explosion. Clearly religions
need to be involved in the development of a more comprehensive worldview, along
with ethics, to assist in reversing this trend. How to adapt religious teachings
to this task of revaluing nature so as to prevent its destruction marks a significant
new phase in religious thought. She concluded by saying that the time is right
for a broad investigation of the contributions of particular religions to solving
the ecological crisis, especially by developing a more comprehensive environmental
ethic.
The day concluded with Dr. Alan Hoffman, Office of Energy Efficiency and Renewable
Energy, U.S. Department of Energy introducing The Honorable Rush Holt, U.S.
House of Representatives (D-New Jersey), and a 1982-1983 APS Congressional
Fellow. Representative Holt, one of only two practicing physicists in Congress,
spent a few minutes reviewing the AAAS Fellowship Program. He noted that in
the 30 years of the Program nearly 1,600 scientists and engineers have spent
a year in Washington bringing good science to government decision-making. He
ended by challenging the AAAS to grow and make the next 30 years infinitely
better than the last.
Melvin Bernstein
Director of University Programs
Science and Technology Directorate
U.S. Department of Homeland Security
(May 19, 2004)
Topic: Linking University Research and Education to the DHS Mission
Serving as the Director of University Programs at the Department of Homeland
Security (DHS), Dr. Melvin Bernstein spoke briefly about the Department and
then focused his remarks on its Science and Technology Directorate, where his
office is located, and his mission. A materials scientist by training, he arrived
at DHS in June 2003 through an IPA with Tufts University, where he was a Research
Professor in the Department of Mechanical Engineering. He expects to serve
through June 2005.
DHS has been in existence for just over a year. Twenty-two federal agencies
have been rolled in, presenting logistics and administrative systems challenges.
The DHS strategic goals include awareness, prevention, protection, response
and recovery. Describing the approach to homeland security issues as a “new
take on national imperatives,” Dr. Bernstein pointed out that many of them
have been around for some time, such as drug interdiction, export controls,
and cybersecurity. More attention is also being given to biometrics and information-gathering.
DHS is looking to strengthen United States leadership in science and technology – both
as an enabler and in interaction with other organizations. The science and
technology (S&T) budget is about $1 billion out of the total DHS budget
of $40 billion. At DHS, S&T refers to the entire cycle of research, development,
test, and evaluation (RDT&E), with threats and responses defining critical
technology needs. The needs include critical infrastructure protection, border
and transportation security, and chemical/biological/radiological/nuclear countermeasures.
Dr. Bernstein was asked to build a university program for DHS. The function
falls within the Office of Research and Development under the Science and Technology
Directorate. Dr. Bernstein has already spoken with 500 universities, mostly
with their engineering and political science departments, to assess capabilities – a
way of “tapping the best of the best.” Dr. Bernstein has been involved
in setting up university-based Homeland Security Centers of Excellence, which
are mission-focused and targeted at research areas that leverage multidisciplinary
capabilities. Essentially aimed at filling scientific and knowledge gaps, these
centers will complement the project-focused research funded by the Homeland
Security Advanced Research Projects Agency, another organization within the
Science and Technology Directorate.
There are currently three centers – University of Southern California
with its focus on risk analysis and economic modeling, and Texas A&M University
and the University of Minnesota, both with a focus on agricultural biosecurity.
Dr. Bernstein expects additional centers to be established – more than
six but fewer than ten. The centers are funded initially for a three-year period,
which can be renewed.
Dr. Bernstein’s office has also established a Homeland Security Scholars
and Fellows Program, which currently is limited to undergraduate and first-year
graduate students. The first class has been announced. Out of some 2,500
applications, 100 students from disciplines such as engineering, math and
computer science, and the life and physical sciences were selected. Dr. Bernstein
expects to extend eligibility to post-doctoral students and expand the disciplines
to include the humanities, such as religion. He pointed out that, since there’s
no field called “homeland security,” there’s a need to
create people sensitized to think about it.
More information on the DHS can be found at: http://www.dhs.gov.
Visit to the U.S. Army Medical Research and Materiel Command
Fort Detrick, Maryland
(June 16, 2004)
The ComSci Fellows were welcomed by the Commanding General of the U.S. Army
Medical Research and Materiel Command (USAMRMC), Major General Lester Martinez-Lopez,
who gave the group an overview. To support the health and fighting ability
of soldiers, sailors, airmen, and Marines, USAMRMC conducts programs in medical
research, medical materiel development, medical logistics and facility planning,
medical information systems, and development of new technologies to improve
health care on the battlefield.
Speaking of his mission, Major General Martinez-Lopez, a physician, explained
that people don’t usually associate the military with medical research.
However, the United States armed forces go to places where they encounter diseases
such as malaria. Of all the drugs in the world today used to fight malaria,
only two are not the work of Fort Detrick. Clinical trials are conducted in
places where the disease is found.
USAMRMC is in the “discovery business,” which means doing the
early work and then letting industry produce the drug. Major General Martinez-Lopez
pointed out that the focus is on finding solutions rather than advancing science.
Most of the research is open source.
USAMRMC is engaged in a broad spectrum of activity – basic research
in the laboratory, product acquisition, and the fielding and life cycle management
of medical equipment and supplies for deploying units. The command focuses
on pre-hospital trauma care – treating the wounded on the battlefield.
USAMRMC has been in the forefront of telemedicine for the past ten years.
Major General Martinez-Lopez was followed by Colonel Gina Deutsch, Chief of
Staff, who showed a video and gave the command briefing. There are about 5,000
employees working in the USAMRMC worldwide – one-third are contractors;
one-third, military; and one-third, government civilians. The organization
has a $1.7 billion annual budget. Within this budget, $800 million is devoted
to medical research in areas such as prostate and breast cancers. Grants are
awarded to industry and academia. Submitted proposals are evaluated through
a peer review process. The review panels consist of external peers, who include
users such as soldiers and patients.
Lieutenant Colonel Harry Slife, Director of the Medical Chemical and Biological
Defense Research Program, gave an overview of the aspects of his program – intelligence,
education and training, and medical and physical countermeasures. The program
is threat and requirements driven. The goal is to produce a pharmaceutical
product that will combat the effects of a toxic agent at the pretreatment,
diagnostic or therapeutic stage. He described the medical research and development
process as a conveyor belt, starting at the technology base and proceeding
to discovery, development and production and deployment. Product or technology
insertion can happen anytime. For example, a product already on the commercial
market could be adapted to military use.
Colonel Erik Henchal, Commander of U.S. Army Medical Research Institute of
Infectious Diseases (USAMRIID), gave a briefing on his program. His mission
is to conduct basic and applied research on biological threats resulting in
medical solutions to protect the war fighter. Biological agents include bacteria,
viruses, and toxins. USAMRIID is involved in the full spectrum of medical product
development for biological defense – prevention, detection, and diagnosis
and treatment. Colonel Henchal also talked about Biosurety, which includes
security, safety, personnel reliability, and inventory management, and plans
to build the national interagency biodefense campus at Fort Detrick, which
will include agencies such as the National Institutes of Health and the Department
of Homeland Security.
In the afternoon, the ComSci Fellows visited the USAMRMC’s Telemedicine
and Advanced Technology Research Center (TATRC). TATRC is charged with managing
core and congressionally mandated advanced technology projects. Telemedicine
reflects a convergence of technological advances in a number of fields, including
medicine, telecommunications, space science, computer engineering, informatics,
artificial intelligence, robotics, materials science, and perceptual psychology.
In the TATRC, Captain Ed McDonough, who had recently returned from a tour
of duty in Baghdad, demonstrated some of the advanced computer and communications
technologies used on the battlefield – all aimed at improving the standard
of medical care. The Battlefield Medical Information System – Tactical
(BMIS-T), a handheld electronic device, enables information to be captured
at the point of care on the battlefield. Also, every soldier has an electronic
dog tag in addition to a regular one. The electronic one – Personal Information
Carrier (PIC) – is a portable storage card containing the soldier’s
medical history. Battlefield medics and medical doctors are given various levels
of access and functional capabilities for these electronic devices.
The ComSci Fellows were also shown the components of a combat support hospital – Forward
Deployable Digital Medical Treatment Facility – which is designed for
a 72-hour stay prior to a move to a regular medical facility. Each container
of components must be lightweight enough so that a maximum of four soldiers
can carry it. An intensive care unit is built into the patient’s litter.
The wounded soldier can then be evacuated on this litter.
The day’s series of highly informative briefings ended with an overview
of the Telemedicine and Advanced Medical Technology Program. The program’s
goals are to improve joint medical readiness (pre-deployment), battlefield
medical awareness (real-time), and effective employment of medical forces (care
management). The program includes both basic and applied research. The five
technical areas, or product lines, are information management/information technology,
medical informatics, mobile medical technology, bioinformatics, and biosurveillance.
Funding comes from a wide variety of sources, such as the Defense Advanced
Research Projects Agency (DARPA), Department of Homeland Security, Army and
the Special Operations Command.
The USAMRMC website is: http://mrmc.detrick.army.mil. Information on TATRC
can be found at: http://www.tatrc.org. The USAMRIID website is: http://www.usamriid.army.mil.
Mary Kavanaugh
Science, Technology, and Education Counselor
Delegation of the European Commission to the United States
(June 17, 2004)
Topic: European Science and Technology Policy
Dr. Mary Kavanaugh spoke to the ComSci Fellows about European science and
technology policy. A biologist by training, Dr. Kavanaugh is the Science, Technology,
and Education Counselor with the delegation of the European Commission to the
United States. Prior to coming to the United States in 2003, she was in the
European Commission headquarters in Brussels for ten years.
Dr. Kavanaugh began her presentation with a brief introduction to the European
Union (EU) and its history. She emphasized that each of the 25 individual countries
of the EU funds its own national science and technology (S&T) policies
and programs. She said that the joint cooperative programs of the EU are meant
to complement those of individual countries. The EU should be involved where
a regional effort could accomplish goals that individual countries could not.
Launched in 2002, the idea of a unified European research policy, the European
Research Area (ERA), is meant to be a plan for the future of scientific research
in Europe. It has broad support at the highest political, scientific and industrial
levels. It involves input from national programs, framework organizations,
and European organizations to develop a cooperative European research policy.
The instruments for implementing this policy are the EU Framework Programmes,
which are multiyear plans with specifically described objectives. The First Framework
Programme was begun in 1984, and the EU is currently in the sixth cycle, or
Sixth Framework Programme, running from 2002 through 2006. The budget for individual
programmes has increased from €3.27 billion to €17.5 billion (~$21 billion).
This can be compared to the current budget of the National Institutes
of Health (NIH) at $28 billion.
Dr. Kavanaugh described the three critical areas of the current framework
programme: 1) focusing and integrating community research; 2) structuring the
ERA; and 3) strengthening the foundations of the ERA. Under the first area,
there are several research topics where there is added value of European scientists
working together, and these are priority topics for funding cooperative research.
These include topics such as genomics and biotechnology for health; nanotechnology;
aeronautics and space; sustainable development; and food quality and safety.
Under structuring of the ERA, there are topics emphasizing the importance
of developing small and medium-sized enterprises (SMEs), with the goal that
15 percent of the budget should be spent on SMEs. Another item under this general
area is human resources, including scientist mobility grants under the Marie
Curie Programme. There is also a current emphasis on addressing science/society
issues, with efforts to address public skepticism by providing education
and information that foster a better understanding of science and the role it plays
in modern life. Other major efforts are to educate and encourage young people
to take up science as a career and to promote the advancement of women in science,
especially in positions of higher management. All of these issues, of course,
are being similarly addressed in the United States. The third area, strengthening
the foundations of the ERA, includes coordination of the national activities
of European countries and carrying out studies and benchmarking activities
for scientific research efforts.
One of the new instruments being developed to carry out the ERA is the Networks
of Excellence. These networks are virtual research centers, where institutions
working on similar research decide to combine efforts, instead of competing
for increasingly scarce resources. Such a network works on a broad research
area. This is to be compared with another newly developed tool, the Integrated
Project, which is not concerned with general research, but rather with specific
research objectives. In both these cases, a consortium of at least three different
countries is involved.
Dr. Kavanaugh discussed the issues of intellectual property rights (IPR) with
the ComSci Fellows. She compared the treatment of IPR in the United States
via the Bayh-Dole Act, where it is clear that the institution that receives
federal money owns the results, to the European case, where in the 25 countries,
there is not yet any single IPR regulation. It has been agreed that participants – individual
institutions – in the Framework-funded projects own the results they
have generated, and that any consortia that are funded must sign an agreement
to this effect.
Dr. Kavanaugh also discussed some of the difficulties of developing research
areas for the framework programmes. Some proposed research areas are controversial,
and sometimes no agreement can be reached on including them in the Framework.
One such topic is stem cell research, which was proposed to be included in
the current Framework Programme, and which is adamantly opposed by several
countries, including Italy, Austria, and Ireland. Since no decision could be
reached, and rather than hold up agreement on the Framework Programme,
it was finally agreed that during the first period, no projects involving stem
cell research would be funded. Following this period, cases will be reviewed
individually, with no single rule being applied.
A question regarding the mechanism for the evaluation of proposals was asked.
Dr. Kavanaugh stressed that evaluations were taken very seriously, with a goal
to be fair and transparent, and have clear criteria. The primary priority is
for scientific excellence, but proposals are also judged for management capacity
of the group, and the European added value. International panels are used,
from a “European database,” which is generated both from Member
States, who send in names of potential experts, and from individuals who propose
themselves and send credentials. The Commission then selects appropriate experts,
trying to achieve a mix of nationalities, men and women, and different ages.
It was noted that an exchange between this EU “database” and United
States expert databases, such as that from NIH, would be very useful for both
sides.
Dr. Kavanaugh concluded her discussions with the ComSci Fellows by telling
them that there is currently a big push for industrial participation in the
Framework Programme, with an emphasis on SMEs. The difference between EU efforts
and similar efforts in the United States, such as the Small Business Innovation
Research (SBIR) program, is that the EU does not fund individual companies
but rather consortia.
The website for EU research is: http://europe.eu.int/comm/research.
Masanobu Shinozuka
Distinguished Professor and Chair
Department of Civil and Environmental Engineering
University of California, Irvine
(June 21, 2004)
Topic: Homeland Security – Technologies Used to Detect Terrorist
Activities and Protect Our National Civil Infrastructure
Dr. Masanobu Shinozuka’s research focuses on continuum mechanics, micromechanics,
stochastic processes and fields, structural dynamics and control, and earthquake
and wind engineering. He also studies systems engineering, with an emphasis
on structural and system reliability; risk assessment of lifeline systems,
including water, electrical power and transportation networks; and analysis
of the socio-economic impacts of natural disasters. Dr. Shinozuka also is interested
in advanced technologies, specifically remote sensing and geographic information
systems (GIS) for disaster assessment and mitigation, smart materials and structures,
and nondestructive evaluation. The applications for his work are in the earthquake
engineering of buildings, bridges, and lifeline and environmental systems.
He is a member of the National Academy of Engineering.
The overall goal of Dr. Shinozuka’s work has been to develop risk-based
economic models for the consequences of terrorism and improved preventive measures.
The specific goals include: 1) generation of terrorist disaster scenarios,
2) assessment of infrastructure performance under disaster scenarios, 3) estimation
of consequences and risk, learning from past disasters, 4) development and
demonstration of risk-based economic models for consequences, and decision
support systems for cost-benefit analysis of countermeasures and response actions,
and 5) development of innovative education, training and communication technologies,
and nurturing of talented individuals to pursue academic and research careers.
Why are infrastructure systems important? Because they are distributed spatially,
they are among the most vulnerable targets for terrorist attacks.
Critical individual buildings could also be targeted by terrorists. The impact
of lifeline failures could be far-reaching and disruptive to society. The most
vulnerable systems are power systems, water systems and integrated water districts,
highways and ports, and emergency medical facilities. Integrated multidisciplinary
approaches are essential to prevent widespread lifeline failures. These approaches
should include:
- Preparedness planning, involving systems analysis for vulnerability assessment,
retrofit and rehabilitation, and monitoring for system integrity; and
- Emergency planning, using remote sensing, data fusion, and real-time
collection of data.
Dr. Shinozuka gave some examples of various applications areas. He first
talked about power systems using examples such as the Los Angeles Department
of Water and Power (LADWP) power system, the Southern California Edison
(SCE) power system, the Western Electricity Coordinating Council (WECC)
grid, and
the United States power grid. The LADWP’s transmission network handles
6,300 MW at peak hours for a population of 3.7 million. The WECC
grid covers 14 western states in the United States, 2 Canadian provinces,
and the northern part of Baja California. The United States power grid has
more than 6,000 power plants, 500,000 miles of aboveground and underground
transmission lines, 150 control area operators, and 3 main power grids. He
then showed the impact of a number of previous blackouts using the disaster
database. For example, he discussed the August 14, 2003 United States/Canada
blackout that affected New York City; Cleveland, Ohio; Detroit, Michigan;
and Toronto and Ottawa, Canada. Overall, 61,800 MW of load were lost, affecting
50 million people. Twenty-one power plants went off-line, including ten nuclear
plants. The “cascading blackout” destabilized the entire Niagara-Mohawk
power grid. This major blackout impacted transportation, shutting down airports,
subways, commuter trains, and roadways. It also caused a slowdown of the
Internet. “Boil-water” orders were issued and Lake Erie beaches
were closed due to sewage overflow. There was also an increase in trips
to the emergency room in New York City hospitals due to intestinal illness
related
to the consumption of spoiled food.
Regarding water systems applications, Dr. Shinozuka discussed several existing
systems, including the LADWP’s water system and the municipal water district
of southern California. The LADWP’s water system services 3.5 million
people with 12,000 km of distribution and trunk pipelines across a 1,200-square-kilometer
service area. He also discussed examples from the disaster experience database,
such as the January 17, 1994 Northridge earthquake (magnitude 6.6), which resulted
in trunk line damage at 74 LADWP locations and repairs needed at 1,013 locations
in the Metropolitan Water District of Southern California (MWD). Another example
was a 108-year-old water main that ruptured in New York City on October 16,
2003, and inundated two city blocks, flooding cars and basements, and closed
off a stretch of the Trans-Manhattan Expressway.
Dr. Shinozuka also showed transportation applications. Existing systems include
the transportation network at street level, the Caltrans freeway network in
Los Angeles and Orange County, and the ports of Los Angeles and Long Beach
in California. One example from the disaster database was a high-traffic bridge
in downtown Birmingham, Alabama, that collapsed after a devastating crash
and was rebuilt on a fast track; the bridge was back in operation 53 days later.
He showed a table, in the Federal Emergency Management Agency format, of the
cost-benefit analysis of new construction versus
retrofit costs. A table of the estimation of loss due to drivers’ delay
(based on network analysis) was presented. The loss due to drivers’ delay
without retrofit was significant ($56 million, versus $6 million with retrofit).
Finally, Dr. Shinozuka talked about medical facilities and public health applications,
in an effort led by the University of California at Irvine (UCI), along with
the Orange County Sheriff’s Department, the Orange County Emergency Medical
System (EMS), and the Los Angeles County EMS. The key tasks relating to medical
facilities are:
- scenario development with the consideration of significant impacts
on health, such as those associated with explosives or chemical, biological,
radiological, or nuclear (CBRN) weapons;
- development of medical system performance criteria in terms of hospital
surge capacity, including indicators such as available personnel and medical
supplies;
- performance modeling for the prediction of scenario-related health
hazards, and estimation of economic and social losses by developing a library of
injury models relating exposure to injury type;
- building on previous modeling efforts; and
- review of epidemiological
and other approaches for estimating the value of human life.
Dr. Shinozuka stated that UCI is the only academic medical center serving
Orange County, California (2.8 million people). The center has a Level I trauma
center, burn center, emergency medical services paramedic base hospital and
teaching center, and a comprehensive 24-hour emergency department. The UCI
medical center has experience and expertise in disaster medicine, public health,
emergency medical services, medical informatics, toxicology, trauma and burn
management, and health services. It also has disaster experience in community
disaster planning and management, medical disaster response, community and
hospital bioterrorism surveillance, and hospital and health care capacity in
disaster response situations.
More information can be found at: http://shino8.eng.uci.edu/Homeland_Security.htm
Ruth Greenspan Bell
Director, Program for International Institutional Development and Environmental
Assistance
Resources for the Future
(October 2, 2002)
Topic: Environmental Policy for Developing Countries
Ms. Ruth Bell is an attorney with considerable experience with issues
related to environmental policy for developing countries. Her talk outlined
the reasons why sophisticated environmental control tools, such as market-based
emissions credit trading, may not be the most appropriate strategy for
improving environmental quality in developing countries.
While there has been much progress in improving environmental policy
in North America, Western Europe, and Japan over the past 30 years, this
has not been the case in the developing world. Improvements have been
made over the past ten years in some of the countries of the former Soviet
Bloc, but much remains to be done. Much of this has to do with cultural
and legal differences that divide the developed and the developing world,
though, of course, disparities in wealth are also important. Thus, policy
instruments that might be useful in a wealthy developed country may not
work as well in a developing nation.
Environmental regulation succeeds in the developed world because the
rule of law works. There is also what is known as a "culture of
compliance"; the regulated entities for the most part obey the law,
and enforcement action is focused on outliers. Even so, the policy instruments
used for environmental protection involve a mix of approaches. These
include mandatory discharge limits, adoption of specific environmental
control technology, fees and fines, and emissions trading. Market-based
emissions trading is one of the more recent innovations, and its introduction
has been only partially successful in the United States.
Market-based tools such as emissions trading appear to work under the
following conditions:
(1) The presence or threat of mandatory environmental controls allows
industry to know the real cost of compliance. Thus, industries know how
to properly value emissions credits and the market can fix a fair price;
(2) The availability of real-time monitoring of emissions, so that
the emissions traded are tied to actual emissions in the field;
(3) Transparency in the transactions, so that all of the stakeholders
and the public know who is trading with whom; and
(4) Significant legal and regulatory safeguards to ensure that these
trades are trades of real emissions and that the overall emissions caps
are maintained.
Making such a system work requires a society familiar with how
the market system works and a legal system that can ensure the integrity
of the trades. Even then, these market-based tools have not always worked.
In one recent example, a NOx emissions credit trading market in New Jersey
was suspended amid allegations of false accounting.
In developing countries, these market-based tools may not always be
the most appropriate approach for pollution management. Many of these
countries lack strong environmental regulatory agencies,
a culture of compliance with the law, mechanisms in place to ensure transparency
and integrity in market deals, and sufficient economic resources to invest
in advanced real-time emissions monitoring technology. In some
cases, responsibility for environmental monitoring is combined with responsibility
for resource management, a merging of functions that encourages corruption.
Yet despite these problems, the development-aid community has insisted
on making market-based instruments the primary management tool for environmental
control.
Ms. Bell believes that suggestions from the aid community should not
be controlled by ideology or current fads among economists. Rather, they
should incorporate a flexible array of regulatory instruments that build
on what currently works, set realistic goals, and make steady
progress toward the ultimate goal of the clean air, water, and land that
are necessary for any country to prosper.
Her presentation pointed out the need to back up market-based solutions
to environmental issues with effective measures for checking on everyone’s
integrity. Or, as President Reagan liked to quote, "Trust, but verify."
Arthur L. Caplan
Director, Center for Bioethics
University of Pennsylvania
(October 9, 2002)
Topic: Ethical Issues of New Technology
Dr. Arthur Caplan, Director of the Center for Bioethics of the University
of Pennsylvania, delivered a lively and thought-provoking presentation
that touched on the ethical implications of advances being made in a wide
array of biomedical subject areas including cloning, biodefense, reproductive
health, and death.
Dr. Caplan opened the discussion by describing his background and interests
and how he got involved in the field of bioethics. In addition to serving
on the faculty of the University of Pennsylvania, he has been the Director
of the Center for Bioethics and the Chief of the Division of Bioethics
for the University’s Medical Center since 1994. The Center supports
a dynamic program made up of faculty members representing the disciplines
of medicine, philosophy, theology, the social sciences, and patent law.
The graduate program currently includes approximately 15 percent of the
medical students. Most recently, he helped to establish the first Department
of Medical Ethics in this country. In the University of Pennsylvania
program, medical students learn about topics such as the definition of
death, the ethics around genetic testing, and how to deal with clinical
ethical issues. The Center places a strong focus on education and outreach,
providing information and access with the help of the Internet to both
the public and the media. (The Center’s website can be found at
www.bioethics.net.) Partnerships are also being developed to foster support
for a standardized bioethics curriculum for high school students. As
the Center’s Director, Dr. Caplan’s opinion is routinely
sought on scientific and medical ethics issues.
Dr. Caplan has served on several committees, including the Hastings Center,
a think tank that publishes the Hastings Center Report, a key ethics
journal; the Department of Health and Human Services Advisory Committee
on Blood Safety; and most recently, on the United Nations Committee for
Human Cloning. He shared anecdotes from his experience with these committees.
To distinguish the practical field of bioethics with its focus on medicine
and the biological sciences from pure ethics and/or politics, Dr. Caplan
described a model that uses four key questions to help in formulating
an ethical analysis. He used case scenarios to outline the series of
fundamental questions that should typically be asked for each bioethics
case encountered. What are the facts? What are the value issues? What
are the ethical concerns around an issue? How can agreement be reached
by using existing areas of consensus? He also suggested that considerations
be made in looking at the law, religion, and philosophy for existing
traditions. He stressed that only as a last resort should someone seek
to establish a new principle.
As an example, he walked through the cloning debate and how the United
Nations Committee for Human Cloning used this process to come to consensus:
(1) What are the facts? Cloning isn’t purely a human activity.
In assessing the facts, the Committee agreed that clones in fact exist
in nature as either identical twins or twinned animals that have been
produced for medical research.
(2) What are the value issues? The difference with the cloning of Dolly
was that an adult cell was used in creating the clone (as compared to
an embryonic cell); thus, this procedure was "turning back" an adult
cell’s internal clock. Most societies do not support creating
new life outside of normal reproduction and value the ability of individuals
to have a choice in being reproduced.
(3) What are the ethical issues? Reproduction through cloning raises
safety questions and Dr. Caplan pointed out that, in fact, only 1 in
500 clones developed into fetal sheep. In addition, those clones that
did survive grew to develop tumors and aged abnormally. Thus, one can
conclude that Dolly was not normal and that old DNA does not appear to be safe.
Because of this, several risk questions must be considered before one
attempts to clone a human, including the risk of creating an unhealthy
human, the risk to the surrogate mother, and the risk to the egg donor.
(4) How do you reason out from areas of consensus? Dr. Caplan stated
that all members agreed that genetic manipulation should only be done
by permission. All members support the premise that you don’t reproduce
without consent.
Based on these considerations, the Committee endorsed an international ban
on human cloning.
Dr. Caplan fielded a variety of questions that enabled him to interject more
ethics considerations into the equation. Stimulating discussion ensued on such
topics as biodefense research; the stem cell research debate; the balance between
voluntary and mandatory vaccinations; the appropriate role of the Federal Government
in biomedical issues such as reproductive technologies being practiced by the
in vitro fertilization industry; the potential use of information generated
from genome projects; and the definition of human life versus permanent vegetative
state. In a number of cases, Dr. Caplan pointed to the need for a legal framework
and suggested the potential role to be played by the Federal Government.
Asha M. George
Senior Program Officer for Biological Programs
Nuclear Threat Initiative
(October 23, 2002)
Topic: Bioterrorism in the 21st Century
Dr. Asha George delivered an enlightening and thought-provoking presentation
on the prevention of bioterrorism. She began by mentioning that bioterrorism
dates back to the 13th Century when bodies of plague-infected victims
were catapulted over the city walls during the Tatar siege of Kaffa.
Reflecting on the historical use of disease as a weapon, Dr. George
emphasized the importance of thinking about the bioterrorism threat in
the context of infectious disease. Disease weapons, she noted, are different
in important ways from other weapons of mass destruction such as nuclear
bombs or chemical weapons. Among their more nightmarish qualities, they:
(1) can be incredibly lethal, comparable to or worse than nuclear blasts;
(2) can be developed with relatively cheap materials that are easily
disguised because they have benign, "dual-use" applications;
(3) can be used in a way that is difficult to trace to the actual perpetrators;
and (4) are well-adapted to the "asymmetrical warfare" of terrorism.
In the United States, offensive biological warfare research programs
were started in the 1940s. These programs remained active until the late
1960s. Since then, world leaders and private organizations have been
dedicated to reducing risks and preventing the spread of nuclear, biological,
and chemical weapons. One such organization is the Nuclear Threat Initiative,
founded in January 2001 by CNN founder Ted Turner and former Senator Sam Nunn.
Today, global concerns about bioterrorism have heightened awareness.
In addition to technical journal and newspaper articles, it is becoming
common to see biological warfare issues depicted in political satire
comics as well. This is a clear indication that public awareness of
and sensitivity to the topic are increasing.
Various records show that 12 countries currently possess biological
weapons capabilities. These countries are Iran, Iraq, Syria, Israel,
Libya, Egypt, Russia, China, North Korea, South Korea, Taiwan, and India.
Of growing concern surrounding the possession of these capabilities are
their intended use, security, proper containment, and disposal. Dr. George
emphasized that since the threat of a biological or chemical attack exists
throughout the world at variable and unknown levels, efforts should be
directed toward the prevention and deterrence of use. Fairly administered
prevention and deterrence methods are key.
The fact that bioscience-related research can be turned to destructive and
dangerous ends remains a serious matter. Government and private organizations
continue to struggle with the amount of information that is readily available
on the subjects of bioterrorism and biological weapons. The amount, completeness,
and accuracy of the information available to the general public are astounding.
Today, we live in a world where technology makes obtaining information
extremely convenient and easy. The Internet is a vast data bank of information
that contributes to the continued existence and capability of biological
threats. Even so, no entity, private or governmental, wants to prevent
the use of this information for legitimate research and scientific progress.
Intent and intended application are key. To date, however, no single
government agency has been made responsible for distinguishing which types
of bioterrorism-related research are legitimate.
In 1993, the Office of Technology Assessment (OTA) compared nuclear
and biological weaponry to assess their potential for destruction in
terms of possible casualties. According to OTA, a one-megaton hydrogen
bomb detonated in Washington, D.C., could claim 570,000 to 1,900,000
lives, whereas 100 kg of anthrax powder could kill one to three million
people. Clearly, bioterrorism has the potential to cause tremendous carnage.
The U.S. Government must address the gap between the threat of biological
warfare/terrorism and domestic preparedness. Terminology and differing
definitions associated with terrorism and biological warfare play a large
part in the struggle to create policy and regulation and to educate the
public. A tremendous amount of work still remains to be done in order
to clearly define the roles and responsibilities of various government
agencies, especially with the establishment of the Department of Homeland
Security.
With a background that includes stints as an intelligence officer during Desert
Storm and as a Special Forces paratrooper, Dr. George has a notable talent
for presenting the issues of apocalypse in a clear, concise, and thoughtful
manner.
Site Visit to the Bureau of Engraving and Printing
U.S. Department of the Treasury
Washington, D.C.
(October 30, 2002)
One of the first, lasting impressions of the U.S. Treasury Department’s
Bureau of Engraving and Printing (BEP) is that large sums of money can
take up a lot of space. The case in point is a drying room with shipping
skids stacked over four feet high with uncut sheets of dollar bills.
The piles are impressive, but what’s sobering is that the face
value of each skid (called a "tank") is only about $640,000.
If they’d been double sawbucks (twenties), the entire skid would
still come in at under $13 million.
Which makes one wonder about those spy thrillers where the character
is supposed to be toting around the $5- or $10-million ransom money
for the secret virus in an attaché case. What denomination bills
are they using?
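As a side note, the tour's back-of-the-envelope figures are easy to check. The sketch below is illustrative only: it takes the $640,000-per-skid figure quoted by BEP staff and simply scales the face value with the denomination.

```python
# Face value of one BEP shipping skid ("tank") of uncut $1 sheets,
# as quoted on the tour.
skid_value_ones = 640_000

# The same stack of sheets printed as $20 notes ("double sawbacks")
# carries face value that scales linearly with the denomination.
skid_value_twenties = skid_value_ones * 20

print(f"${skid_value_twenties:,}")  # $12,800,000 -- indeed under $13 million
```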
Torrential fall rains ushered us into the Bureau’s printing facility
at 14th and C Streets in Washington, D.C. Perversely handsome in a 19th-Century-Industrial
sort of way, the Washington establishment (together with its sister plant
in Fort Worth, Texas) is perhaps the most exclusive print shop in the
world. It handles, of course, the exclusive line of U.S. Government bank
notes, but also prints a number of other specialty items -- postage stamps
for the United States Postal Service, hand engraved invitations for the
White House, and sundry documents for other government agencies.
Printing United States bank notes is a complex and exacting process,
and the BEP takes justifiable pride in the product. And interestingly
enough, BEP employees -- and the Treasury -- don’t consider their
product "money." Rather it’s a very specialized print
job done for the Federal Reserve Banks, and not until the notes pass
into the Federal Reserve System do they become "money."
The process begins with high-quality cotton-linen currency paper, which
since 1879 has been provided by Crane & Company of Dalton, Massachusetts.
Starting with recycled blue jeans and underwear -- at least, to hear
BEP staff tell it -- Crane blends the fibers to achieve what it claims
as the most durable currency paper in the world. Even at that, the United
States $1 note has a combat life expectancy of only about 22 months,
and the $5 note’s lifespan is only about 24 months. As a result,
on any given day, you’re most likely to find the BEP’s presses
turning out $1 bills.
Giant high-speed rotary presses -- made in Europe -- print the basic
notes in two passes using high-pressure intaglio printing, the same process
used for fine art etchings. The reverse is printed in green ink, allowed
to dry for up to two days, and then the obverse is printed in black.
After the black ink dries (another stretch in the drying room), the
32-note sheets are cut in half and sent to an overprinting room where
the appropriate Federal Reserve District seal and code is printed in
black, and the U.S. Treasury seal and individual serial number is printed
in green. You can have fun searching your wallet for notes with a star
following the serial number. Star notes come from special sheets of currency
that are dropped into the process after the serial numbers are printed
to replace sheets that are rejected for poor quality.
Quality is an obsession at BEP. The intaglio printing process may date
to the 15th Century or earlier, but BEP is on the cutting-edge of automated
quality control procedures. The sheets are checked for printing errors
after every major step in the process. Before the sheets go to the overprinting
stage, high resolution digital cameras image both sides of each sheet,
automatically compare the images to a stored image of an ideal sheet,
and toss rejects on a pile to be destroyed. The entire digital inspection
happens on-line, as the sheets are being stacked, and happens faster
than the eye can follow.
The BEP may employ high-tech image inspection equipment, but they also
understand the proverb "Trust. But verify." At the very end
of the process, after all the presses have finished, and currency has
been inspected multiple times, diced into individual bills, gathered
into "units" of a hundred bills and banded -- at the very end
of a long row of automated machinery, a single trusted human inspector
sits on a stool randomly selecting bundles of bank notes and riffling
them one last time to look for errors.
The BEP is staffed with serious people doing a seriously good job, but
the Bureau is not without a certain institutional sense of whimsy, which
comes out in the gift shop. There you can buy small bags of shredded
money, notepads made from recycled currency, and "lucky" $1
bills for the Chinese trade. These are uncirculated bills (in handsome
presentation folders) that have lucky serial numbers according to Chinese
principles of arithmancy -- bills beginning with "8888" for
good fortune or "168," the combination meaning "prosperity
forever." Each $1 good-luck note retails for $5.95. Do the math.
Did we mention that the BEP is a self-funded agency?
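For anyone inclined to take the gift shop up on its invitation to do the math, here is a one-line check of the markup on a lucky note. The $5.95 retail price comes from the text; the percentage calculation is ours.

```python
face_value = 1.00    # face value of a "lucky" $1 note
retail_price = 5.95  # gift-shop price quoted on the tour

# Premium over face value, expressed as a percentage.
premium = (retail_price - face_value) / face_value
print(f"{premium:.0%}")  # 495%
```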
Alfred R. Berkeley III
Vice Chairman
NASDAQ Board of Directors
(November 6, 2002)
Topic: How Science and Research Affect the Economy and NASDAQ
Mr. Alfred Berkeley was appointed Vice Chair of the NASDAQ Stock Market,
Incorporated on July 27, 2000 after serving as NASDAQ President since
1996. Previously, he was Managing Director and Senior Banker in the Corporate
Finance Department of Alex, Brown & Sons, Incorporated, where he
financed computer software and electronic commerce companies.
In 1961, Congress authorized the Securities and Exchange Commission (SEC)
to conduct a study of fragmentation in the over-the-counter market. The
SEC proposed automation as a possible solution and charged the National
Association of Securities Dealers (NASD) with its implementation.
On February 8, 1971, NASDAQ began trading. NASDAQ surpassed the New
York Stock Exchange in annual share volume in 1994, and in 1999, it became
the largest stock market in the United States by dollar volume, repeatedly
breaking share and dollar volume records. In June of 1999, NASDAQ signed
an agreement in Tokyo with Softbank Corporation, jointly capitalizing
a new company called NASDAQ Japan. This proved to be the first leg in
NASDAQ's global strategy to link Asian markets with European and American
markets. In 2000, NASD membership voted overwhelmingly to restructure
the organization and spin off NASDAQ as a shareholder-owned, for-profit
company. NASDAQ continues to build capacity for the trading volumes of
tomorrow, with a capacity to trade 6 billion shares a day, a ten-fold
increase since 1997.
NASDAQ differs from the venerable New York Stock Exchange, according
to Mr. Berkeley, in that the NYSE was patterned after the European "bourse" model,
which in turn derived from ancient craft guilds. It is designed to preserve
monopolies -- in this case, trading monopolies. NASDAQ, on the other
hand, is modeled more after the international currency markets, highly
diversified, and highly decentralized.
Rather than forcing investors to go through a single financial firm
to buy or sell stocks, NASDAQ links up a variety of competitors and lets
participants choose with whom they are going to trade. Any number of
companies, large or small, may trade on NASDAQ and compete on an equal
basis with NASDAQ giants such as Intel, Microsoft, and MCI. Key to its
market structure is a core group of financial firms called market makers.
More than 500 market making firms trade on NASDAQ, acting as distributors
for NASDAQ-listed securities. Also known as dealers, market makers are
unique in that they commit their own capital to NASDAQ-listed securities
then turn around and re-distribute the stock as needed. By being willing
to buy or sell stock using their own funds, market makers add liquidity
to NASDAQ's market, ensuring that there are always buyers and sellers
for NASDAQ-listed securities, and enabling trades to be filled quickly
and efficiently.
As the world's largest electronic stock market, NASDAQ is not limited
to one central trading location. Rather, trading is executed through
NASDAQ's sophisticated computer and telecommunications network, which
transmits real-time quote and trade data to more than 1.3 million users
in 83 countries. Without size limitations or geographical boundaries,
NASDAQ's "open architecture" market structure allows a virtually
unlimited number of participants to trade in a company's stock. Today,
NASDAQ lists the securities of nearly 4,100 of the world's leading companies,
and each year, continues to help hundreds of companies successfully make
the transition to public ownership.
Mr. Berkeley offered several cautionary notes. The United States, he
argued, was rapidly losing ground in banking and finance to skilled foreign
interests, such as Deutsche Bank, which is "working on world
financial hegemony." The United States, said Mr. Berkeley, is reaching
the point where it will have to abandon populist laws on banking regulation
if it expects to survive in a competitive international market.
As for rationality in the market, Mr. Berkeley said, there are always
three games going on simultaneously: the game of chance, played by day
traders; the game of skill, played by professional traders; and the long-term
game of strategy. Only about five to ten percent of the transactions
in the market have anything to do with investing -- the game of strategy.
The vast middle ground belongs to the "skill" players who are
simply trying to guess what the herd will do. "The Market as a mathematical
model," observed Mr. Berkeley, "looks pretty much like a herd
of wildebeests on the Serengeti."
Finally, Mr. Berkeley was asked about the reasons for the information
technology and Internet market downturn. In his opinion, the Internet
bubble was speculation-based and promised a new cost curve, which was
not based on true deliverables. Regarding the challenges associated with
globalization of the stock market, Mr. Berkeley pointed out that different
countries have different investment challenges and philosophies. For
example, in Europe, only a few companies control the market. Markets
in Israel prefer performance and not prestige. India follows a safety-based
approach. He believes that the United States equity-based model is more
practical than the debt-based model used in other countries. Further,
laws and regulations for investor protection do not exist in several
countries, making trading difficult. In conclusion, Mr. Berkeley stressed
the importance of technological innovation for the advancement of the
global economy.
Jack Sobel
Senior Director, Ecosystem Protection Program
The Ocean Conservancy
(November 20, 2002)
Topic: The Health of Our Oceans
Mr. Jack Sobel began his talk by mentioning the recent oil spill in
Spain. He said that oil spills get the headlines, but such events typically
have little lasting impact on the ocean. Conversely, long-term changes
to the oceans are more profound and many have been influenced by technology.
Technological advances in fishing gear and boat refrigeration have made
it possible for historically rich fishing areas such as Georges Bank to
be threatened by overfishing. It was once thought that areas such as
Georges Bank had inexhaustible fish resources. That thinking no longer
prevails. Until the 1990s, very few marine species were included in the "Red
List" of threatened plants and animals. In 1996, a large number
of marine species were added to the "Red List" maintained by
the International Union for Conservation of Nature and Natural Resources.
While the listing was controversial, Mr. Sobel said, the results were
worrying. Somewhere between 50 and 100 United States species were listed
as ranging from "vulnerable" to extinct, including some common
fishery fish such as cod and haddock.
Mr. Sobel provided a number of facts about our oceans. The United States
manages an ocean territory of 4.1 million square miles -- nearly 20 percent
larger than our land area. While our land is both privately and publicly
owned, our ocean area is entirely a public resource. As a Nation, we
are protecting nearly 30 percent of our most spectacular lands by establishing
national monuments, national parks, national forests, and national wildlife
refuges. Nearly five percent are fully protected as wilderness, and cannot
be changed or altered by logging, mining, drilling, or development. According
to Mr. Sobel, marine protected areas account for less than 1 percent
of United States marine waters.
Ocean management in the United States reflects a lack of unity, leadership,
and vision. Jurisdiction over our ocean resources has been split among
a number of federal and state agencies with different -- and, at times,
conflicting -- mandates. Congress, moreover, has enacted a series of
federal statutes that vest different federal agencies with responsibility
for overseeing specific areas or marine resource extraction, or other
activities. The lack of a unifying federal agency or authority is in
part responsible for our disjointed approach to ocean management. Multiple
agencies with conflicting visions cannot effectively protect marine resources.
After a spirited question and answer session, Mr. Sobel showed a video
about marine reserves. He believes that marine reserves are only part
of the solution to the problems plaguing oceans, but they can be effective,
increasing fish populations, average size, and breeding rate. Although
he spent much of his time discussing negative impacts that can be caused
by fishing, he also mentioned coastal development and pollution as major
contributing factors in the decline of our oceans.
An intriguing new option for marine reserves, Mr. Sobel said, is the
concept of a "network" reserve. Rather than fence off a large
section of coast or ocean in a traditional marine reserve, some scientists
believe the same effect can be achieved by a network of much smaller
reserve areas that are carefully selected to protect key areas in the
life cycle of target species. This approach requires being able to accurately
predict the movement of fish and other aquatic species.
The California Fish and Game Commission recently approved a plan to
create such a network of marine reserves off the coast of California.
The joint state and federal plan will ultimately protect nearly 25 percent
of the waters in the Channel Islands National Marine Sanctuary, creating
the largest marine reserve network in the continental United States.
It will provide the greatest chance of survival for both the marine resources
and the industries dependent upon them.
The Channel Islands' Marine Reserve will provide refuge for the many
fish and wildlife species whose populations have been declining dramatically,
some by more than 90 percent. Giant sea bass, sheepshead, sharks, and
rockfish are some of the most affected fish. The Channel Islands have
been designated as a National Park, a National Marine Sanctuary, and
a United Nations Biosphere Reserve because of their beauty and spectacular
diversity of life. Yet before the action by the California Fish and Game
Commission, less than one percent of the sanctuary was off-limits to
fishing.
In concluding his talk, Mr. Sobel distributed copies of a recently published
report by the Ocean Conservancy called the Health of the Oceans. Health
of the Oceans is a yearly assessment of ocean resources and ocean management.
In many cases, the news is not good. Pollution has rendered 44 percent
of United States estuaries unfit for uses such as swimming and fishing.
Numerous species of marine mammals, sea turtles, and sea birds are in
danger of extinction. The status of over two-thirds of our fish stocks
is unknown. Clearly, as a Nation we have not performed well in managing
these resources.
In its report, the Ocean Conservancy has suggested solutions, and ways
that individuals, communities, and lawmakers can work together to reverse
the failing health of the oceans. The Ocean Conservancy believes that
our oceans can only be as healthy as an informed public demands.
Site Visit to the National Institute of Standards and Technology
U.S. Department of Commerce, Gaithersburg, MD
(December 4, 2002)
The National Institute of Standards and Technology (NIST) may not yet
have quite the name recognition of the National Aeronautics and Space
Administration, but it’s getting there and during a day-long visit,
our NIST hosts demonstrated why.
The oldest of the "national laboratories," NIST was established
as the National Bureau of Standards in 1901 when the economic importance
of reliable national measurement standards was becoming increasingly
clear. It was conceived from the start as a "neutral" partner
to United States science and industry, an advisory agency without regulatory
powers. Although its role as custodian of the national standards of physical
measurement -- the meter, second, volt and so on -- gave it a certain
amount of automatic authority, the new bureau soon built its reputation
and influence on excellence in research.
The modern NIST underwent a major transformation in 1989 when its name
was changed by congressional authorization, and its duties expanded to
include not only the existing eight science and technology research laboratories
and services, but also three dramatically new functions:
(1) the Baldrige National Quality Program, which promotes performance
excellence in U.S. manufacturing and service organizations and manages
the highly respected Malcolm Baldrige National Quality Award;
(2) the Manufacturing Extension Partnership, which co-funds and coordinates
a nationwide network of local centers to provide technology assistance
to small and mid-sized manufacturers; and
(3) the Advanced Technology Program, an industrial research and development
(R&D) program that co-funds with industrial sponsors innovative technology
development projects that have the potential for important economic benefits
to the United States.
Our visit began with a quick tour of the exhibits in NIST’s visitor
center and an overview briefing by Dr. Arden Bement, Director of NIST.
Organizationally, NIST is an agency of the Commerce Department’s
Technology Administration. NIST had an operating budget around $720 million
in 2002 (counting revenues from fees), employs more than 3,200 researchers,
technicians, and support staff, and hosts about half again as many visiting
researchers from universities, industries, and other research institutions
both American and foreign. NIST has two major facilities, the headquarters
and main labs in Gaithersburg, Maryland, and an additional lab complex
in Boulder, Colorado.
NIST’s international reputation rests on the technical expertise
of its labs, such as the Center for Neutron Research, which was the first
stop on our tour. NIST has operated a small research nuclear reactor
(basically a source of neutrons) since the 1960s, but the facility really
came into its own in the 1990s when the agency modified and expanded
it to create a world-class cold neutron research facility. Cold (low-energy)
neutrons have emerged as one of the premier tools of materials and biological
research, and the NIST center is the most versatile and well-instrumented
facility of its kind in the country. Used annually by more than 1,700 researchers
from companies, universities, and other agencies, the center supports
research in a broad array of areas including superconductivity, basic
physics, polymer science, archeology, biotechnology, and even the formulation
of stronger concrete.
Radiation is also a tool for the NIST Physics Laboratory, as we learned
at the next stop, which highlighted the agency’s traditional role
of providing calibrations to assure the accuracy of the Nation’s
measurement system. Among its other duties, the Physics Laboratory provides
the accurate radioactivity measurements necessary to assure the safety
and efficacy of a variety of radiation therapies. We saw the tiny basement
laboratories where the radioactive "seeds" used to treat prostate
cancer and to prevent the re-closing of arteries after angioplasty are
calibrated. NIST expertise in radiation is occasionally pressed into
use for national security, as when the agency advised the White House
and the U.S. Postal Service on the appropriate amount of radiation to
ensure that harmful pathogens in the mail are neutralized.
Nanotechnology is today what plastics were to The Graduate, and industry’s
drive to develop and exploit both materials and devices structured on
the level of individual atoms or molecules requires NIST to push measurement
capabilities to new heights, depths -- or whatever -- of smallness. One
place where this happens is the Surface and Microanalysis Science Division
in NIST’s Chemical Science and Technology Laboratory, where researchers
are using ions, photons, and electrons as sensitive probes both to measure
physical and chemical characteristics of small groups of atoms on surfaces
and to better understand how working at the scale of just a handful of
atoms affects both their physical properties and chemical behavior.
The latest ornament to NIST’s research capabilities isn’t
yet complete, so we were given an unusual work-in-progress hardhat tour
of the new NIST Advanced Measurement Laboratory (AML). When it opens
in 2004, the AML will be one of the world’s best facilities for
precision measurement, with laboratory temperature controlled to within
a tenth of a degree in some cases, humidity to within a percent, and
vibration in some places to less than three micrometers per second. Meeting
those specs and others required careful planning and design, and the
new facility will include two wings built entirely underground to help
isolate them from changes in temperature and vibration.
Our day concluded with two presentations that emphasized how NIST’s
role has expanded to meet national concerns. The September 2001 terrorist
attack on New York City brought to center stage NIST’s long experience
in fire and building research. NIST was assigned to conduct a detailed
scientific study of the collapse of the World Trade Center towers to
provide a basis for improved building and fire codes and practices, as
well as to provide better guidance for emergency personnel in responding
to major building disasters.
As an outgrowth of that effort, in October 2002, the National Construction
Safety Team Act was signed into law. The Act gave NIST authority similar
to that of the National Transportation Safety Board to assemble quick-response
teams of experts to investigate major building disasters; to establish
the likely technical cause of building failures; to evaluate the procedures
used for evacuation and emergency response; and if necessary, to recommend
specific changes to building codes, standards and practices, and any
necessary research, or other appropriate actions, needed to improve the
structural safety of buildings.
The 1989 Act that gave NIST its new name also gave it one of its most
controversial programs, the Advanced Technology Program (ATP). Often
mischaracterized as the "civilian DARPA," the ATP was designed
to advance United States economic competitiveness by spurring industry
to undertake path-breaking, high-risk R&D, providing cost-shared
federal grants to offset the increased risk. The ATP has been criticized
as a government attempt to lead industry (although the project ideas
come from industry, not government) and as "corporate welfare" for
big business (although nearly 60 percent of ATP projects are led by small
businesses). With its funding often in doubt, the ATP has nevertheless
managed a string of technical successes, including the development of
the DNA analysis chips that have played a central role in modern biotechnology
research, new manufacturing technologies for the semiconductor and automobile
industries, and a suite of new technologies that has been credited with "saving" the
United States printed-wiring-board industry.
John J. Hamre
President and Chief Executive Officer
Center for Strategic and International Studies
(December 18, 2002)
Topic: Science and Security at Risk
Dr. John Hamre has been the Chief Executive Officer of the Center for
Strategic and International Studies (CSIS) since January 2000. He has
also served on the Senate Armed Services Committee staff and as U.S.
Deputy Secretary of Defense. At CSIS, he directs a small, nonpartisan
group of independent scholars and analysts who partner with outside organizations
in need of analysis of Washington policy issues. The CSIS mission is
threefold: (1) to develop policy solutions for international and national
security; (2) to develop new methods of governance for the 21st Century;
and (3) to analyze regional dynamics for a better understanding of world
political, economic, and security policy issues.
In recent years, CSIS has partnered with:
-
The U.S. Department of Energy (DOE) -- CSIS assessed security policy
after alleged breaches in security at DOE laboratories. Dr. Hamre
shared his opinion that DOE's "zero tolerance" rule exacerbated
the problem by establishing an atmosphere of fear and mistrust. CSIS
recommended that DOE attempt to establish an environment of trust
through organizational change and a streamlined chain of command.
This incident prompted CSIS to stop accepting government funds for
studies, since such funding allows the government to control the
disposition of the results.
- Johns Hopkins University -- CSIS helped run the "Dark Winter" scenario,
designed to develop a suitable policy for distribution of smallpox
vaccines in limited supply. Dr. Hamre explained why experts believe
that smallpox is the most likely weapon in the bioterror arsenal:
it has been weaponized, it is highly contagious, and there is very
little protection against a smallpox infection, with no effective
therapy and a 30 percent fatality rate.
- The National Academy of Sciences -- the two organizations are working
with science journals on security procedures for publishing scientific
results.
- The U.S. Department of Homeland Security -- CSIS is working to
develop a viable post-9/11 security policy.
Dr. Hamre's presentation focused on the process and problems of developing
effective policies for protecting science and scientific ideas in the
current global environment. The old notions of "security" dating
from World War II depended heavily on geographic and ethnic boundaries,
he observed, an approach that was of limited value even then and has
become increasingly useless in the modern world. Today's diverse and
complex threats make "security" more difficult than in the
days of a well-defined threat from Communist regimes.
According to Dr. Hamre, the biggest challenge for establishing effective
security practice is communication, understanding, and teamwork between
scientists and security personnel. Because the security team does not
understand the science, they don't know what to protect or how best to
protect it; and because scientists don't understand how security works,
they tend to view security policy as stifling the scientific enterprise.
Ultimately, both sides need to take more responsibility. Scientists
need to practice constructive, not destructive, censorship. Without self-censorship
they invite government intervention. This intervention has led, and could
further lead, to: (1) limited access of foreign scientists to United States scientific
meetings; (2) fear of government criticism and loss of government grant
support resulting in a decline in scientific publication; and (3) invocation
of a deemed export rule to limit flow of scientific information out of
this country. Security personnel, for their part, must learn that you
have to trust some people. Because security personnel don't understand
the degree of political risk in complex scientific decisions, security
policy implemented without input from the scientists will be symbolic
rather than sensible.
Only with dialogue between these two groups can we identify what constitutes
genuine risk and develop workable ground rules to guard against it. This will
be the major battleground in development of effective security policy
for protection of science in the 21st Century.
Richard H. L. Marshall
Principal Deputy Director
Critical Infrastructure Assurance Office
U.S. Department of Commerce
(January 8, 2003)
Topic: Protecting the Nation's Infrastructure
The Critical Infrastructure Assurance Office (CIAO) helps coordinate
the development of the Administration's national strategy for Critical
Infrastructure Protection to address threats to the Nation's communications
and electronic systems, transportation, energy, banking and finance,
health and medical services, water supply, and key government services.
The CIAO also helps federal departments and agencies identify their
dependencies on critical infrastructure under the Project Matrix Program
and coordinates national awareness, education and outreach efforts to
private industry and state and local governments.
Mr. Richard Marshall, a confessed Grateful Dead fan and expert in international
communications law, transferred to the CIAO from the "white-hat
side" of the National Security Agency, the part entrusted with protecting
information systems from attack. He led a free-flowing discussion of
the CIAO and what the office hopes to accomplish, changing direction
in response to the Fellows' questions and comments.
A couple of interesting events relating to communications security
were presented. The first one, named "Eligible Receiver," was
a wake-up call on information infrastructure security. It was a war game
conducted by the Department of Defense. (Mr. Marshall served as the legal
counsel.) The "red" team was able to successfully shut down
the "blue" team's ability to fight by hacking into their command
and control computer system. The hacking was conducted by using readily
available information found on the Internet. After more study, it was
found that numerous vulnerabilities could be used to compromise telecommunication
networks (both public and private).
Even more worrisome was the event called "Solar Sunrise," a
coordinated cracking attack on Department of Defense (DOD) computers
in February 1998. Reportedly, some of the machines breached were working
on troop deployment for Iraq. The attackers managed to gain root access
-- the computer equivalent of God. The post-mortem, which traced the
attack to teenagers in California and an Israeli hacker who guided them,
confirmed the results of Eligible Receiver: important DOD computer systems
lacked effective protections against a cyber attack. As a result, the
CIAO was formed.
The CIAO was created in response to a Presidential Decision Directive
(PDD-63) in May 1998 to coordinate the Federal Government's initiatives
on critical infrastructure assurance. The CIAO's primary areas of focus
are to raise issues that cut across industry sectors and ensure a cohesive
approach to achieving continuity in delivering critical infrastructure
services. CIAO's major initiatives are to: (1) coordinate and implement
the national strategy; (2) assess the U.S. Government's own risk exposure
and dependencies on critical infrastructure; (3) raise awareness and
educate public understanding and participation in critical infrastructure
protection efforts; and (4) coordinate legislative and public affairs
to integrate infrastructure assurance objectives into the public and
private sectors.
The CIAO has 37 employees. Half are Schedule "A" contract
employees. About a quarter are on loan from other government agencies.
CIAO does most of its work by setting up partnerships with private industry
and other government agencies, although Mr. Marshall prefers the term "relationships." The
Nation's critical infrastructure is 95 percent owned by private companies
(e.g., phone lines, utilities, railroads, etc.); therefore, developing
relationships with private industry is essential.
A significant policy issue for CIAO was the problem of developing relationships
with private industry, given that most companies do not trust the government.
Companies were concerned that any information they provided, which would
almost certainly involve trade secrets, would be released under an FOIA
request. Also, companies believed they could not work together without
violating anti-trust regulations. CIAO needed (and eventually received)
FOIA exemption and the companies received an anti-trust exemption on
issues related to CIAO work.
Mr. Marshall closed by saying that the goal of CIAO is to prevent accidents
from happening by finding vulnerabilities in the infrastructure and fixing
problems before a breakdown can occur. The three-pronged process includes
technological improvements, sound policy decisions, and education.
Henry C. Kelly
President
Federation of American Scientists
(January 15, 2003)
Topic: Learning Technology Research and Development
New advances in information technology have afforded new opportunities
for learning and education, but the changes have not yet diffused through
society. Technology can make learning more productive, compelling, personal,
and accessible, yet current implementations use only a fraction of the
hardware's potential. Dr. Henry Kelly, President of the Federation
of American Scientists, sees many exciting possibilities on the horizon,
given proper support from the research community in both the public and
private sector.
How is learning facilitated by technology? Providing clues on how
to organize information into a logical structure is the key to acquiring
expertise; this must include practical experience to fix the logic in
the learner’s mind, since knowledge is not retained as long-term
memory if there is no practical application for it and the information
doesn’t make sense. Dr. Kelly endorses what he terms a "revolution
in learning," where ancient forms such as apprenticeships and internships
integrate learning and assessment and make best use of teachers and experts.
Advances in technology have the potential to do one better than one-on-one
tutoring by structuring immediate feedback and assessment.
Focusing on tool building through simulated instruments and environments
is what Dr. Kelly calls "enabling technology." There is a huge
investment required for systems to dispatch questions to teachers, FAQ
(frequently asked question) lists, tools that allow group formation,
and methods to monitor learning sessions for advancement. Such approaches
have been tried outside of the education sector and have been found to
work. However, there is a continuing need to streamline and aggregate
research approaches, which is challenging given the relatively low level
of spending on learning technology -- a mere $50 million out of an estimated
$900 billion in total education spending. Why is investment so low? Dr.
Kelly thinks it is because it doesn’t fit into anyone’s jurisdiction,
and foundations that might have taken an interest have had their assets
severely diminished by the stock market collapse.
Looking down the road, Dr. Kelly sees a future redefinition of the generalist
occupation of "teacher" to more specialized ones such as learning
specialist, curriculum designer, subject matter specialist, simulation
and virtual environments engineer, software engineer, and evaluation
and certification expert. Such expert personnel will know how to foster
learning in small group environments through teamwork -- something the
marketplace has requested.
Progress has been slow in Dr. Kelly’s view because of lingering
skepticism about the possibility of progress, along with the view that
users are trapped in a cottage industry model with a weak infrastructure
for innovation. Management has not been prepared for change, support from
traditional education lobbyists has been lukewarm, and a clearly articulated
and exciting research program has been absent.
What Dr. Kelly calls a revolution in education might be approached at
the state and local levels where education spending is largely rooted.
However, teachers and other learning specialists often must overcome
attitudes voiced by parents that they want their kids learning, "not
playing games or cruising the Internet." The grades K-12 have been
the most politically difficult terrain for innovation, suggesting that
better platforms for progress might be adult or other vocational education,
community colleges, or private sector or military-government training
programs.
Indeed, medical education may be a fertile ground for new learning technologies,
according to Dr. Kelly. Medical schools, he said, are finding it increasingly
difficult to find good teachers for first- and second-year classes such
as anatomy. There’s no glory or bucks in it to attract the upcoming
generation of medical teachers, and the older generation is dying off. "Homeopathic
medicine starts looking better and better the more you look into medical
education," Dr. Kelly observes.
Federal public-private partnerships brought us such innovations as global
positioning systems, parallel computing, computer graphics, the Internet,
agricultural research successes, and jet engines. Perhaps they will step
up to usher in a technology-fueled revolution in learning.
Rebecca Hanmer
Director
Chesapeake Bay Program Office
U.S. Environmental Protection Agency (EPA)
(February 5, 2003)
Topic: Intergovernmental Partnership and Science-Based Environmental
Restoration
The Chesapeake Bay restoration effort is noted for its partnerships.
Although Congress authorizes the program, EPA maintains regulatory authority
of the Chesapeake Bay. More than 600 people (including federal, state
and local governments, marine scientists, fishermen, farmers, environmental
activists, and others) are involved in program committees. One of the
keys to the partnership's success, according to Program Director Ms.
Rebecca Hanmer, is its transparency -- all the way from meetings to decision-making
to operations. The process is collaborative and driven by stakeholder
participation.
Collaboration has not led to restoration, however. Only one jurisdiction
(the City of Washington, D.C.) has met the goal set in 1987 of reducing
nitrogen and phosphorus by 40 percent, primarily through voluntary means.
Yet in 2000, the partners agreed to even stricter goals, laying out 93
new commitments. Faced with this wish list, the Chesapeake Bay Program
is focusing on the top priorities: working with communities to develop
watershed management plans and correcting nutrient and sediment problems,
both with a target date of 2010. Other goals in the queue include preserving
20 percent of the land from development, developing management plans
for specific species of fish and wildlife, and increasing the native
oyster population tenfold.
The stakeholders have acknowledged that the Chesapeake Bay's nutrient
problems can't be solved through reduction of nitrogen streams alone,
but require reductions in air deposition and sediment loads along tributaries
as well. The Chesapeake Bay Program is pursuing multiple avenues to meet
these objectives -- involving additional states; studying the relationship
between sediment, algae, and the food chain; regulating animal feeding
operations; and developing standards on water clarity.
As with many environmental regulations, EPA's plans to publish federal
guidance for the states on water quality standards have met with resistance
from industry and municipalities. (The states are responsible for adopting
and enforcing water quality standards.) EPA is considering various schemes
(such as flexibility in state standards, the use of technology and best
management practices, or even a trading program) to help states and stakeholders
meet water quality goals. The Chesapeake Bay Program is the first United
States program to pursue innovative methods to meet nutrient and sediment
goals. Its collaborative nature (over 600 people from different federal
and state agencies serve on the Program’s committees) calls for
managerial finesse. "You must exercise power subtly, adroitly, and
never in public," said Ms. Hanmer. Actual progress, she said, is
more important than requirements that exist only on paper.
EPA's overall strategy is to pursue regulatory authority as far as possible,
then rely on voluntary programs to meet environmental goals. The agency
uses funding incentives (i.e., "purchasing behavior") to coax
industry and other stakeholders to pursue voluntary opportunities in
advance of regulation.
One of the Chesapeake Bay's most visible problems, the native oyster's
decline, has generated many ideas, including a controversial plan to
introduce a foreign oyster. This alien species, though projected to be
more disease-resistant than the natives, could potentially decimate the
native population even further. A pilot program was approved by the partnership
for 2003.
Patricia O'Connell Ross
Team Leader for the Mathematics and Science Partnership Program
U.S. Department of Education
(February 12, 2003)
Topic: Filling the Pipeline with Young Scientists
Ms. Patricia O'Connell Ross has a background in educational administration
and policy, and is Team Leader for both the Mathematics and Science Partnership
and the Javits Gifted and Talented Students Programs at the U.S. Department
of Education. She presented provocative data showing a slide in math
and science competency of United States students from fourth grade through
high school, and discussed differences in curriculum and teacher preparation
between the United States and other countries that likely account for
our continuing poor performance in these subject areas. She concluded
her presentation with a discussion of the "No Child Left Behind" Act
of 2001.
Although there was variability among United States school districts,
United States fourth graders -- overall -- ranked near the top for achievement
in math and science while twelfth graders ranked at the bottom. What
happened during the intervening eight years? The Third International
Mathematics and Science Study (TIMSS) of 1996 identified some significant
differences between our education system and that of better performing
countries. First, our curriculum is "a mile wide and an inch deep" and
focused on procedural rather than developmental knowledge. Whereas "fourth
grade" Japanese students are taught 20 topics in math and sciences,
United States students are taught 32. By the eighth grade, the gap widens
from 8 topics in Japanese schools to the 36 topics taught in this country;
by twelfth grade, the gap extends from 3 (Japan) to 25 (United States).
This unfocused curriculum, targeted to the lowest common denominator
among students, and lack of common standards for how to teach it are
seen to be key elements in the failure of United States students to be
globally competitive in math and the sciences.
Teacher preparation is another significant problem. In other countries,
a teacher is a respected professional who likely performed in the top
quartile during their education. In this country, most teachers scored
in the bottom quartile on their SATs. While primary education in math
and sciences is highly variable, depending on each teacher's comfort
zone, by middle school it gets worse, with less than 50 percent of math
and science teachers holding a major or minor degree in those subject
areas. In some districts, up to 25 percent of high school math and science
teachers do not have major or minor degrees in these subjects; however,
this varies widely, being more of a problem in inner city schools. Although
teacher retention is also a problem, many states and local districts
have initiated programs to bring more qualified individuals into the
classroom. These include the "troops to teachers" program,
which would transition retiring military personnel to teaching positions,
education programs specifically designed for individuals looking for
a mid-career change, and programs sponsoring active teachers to earn
a graduate degree in a related discipline.
Despite these obvious differences between our educational system and
that of other countries, achievement gaps continue to increase. To address
this problem, the "No Child Left Behind" policy was proposed
in 2001. This program is designed to make local school districts accountable
for the academic performance of their students. Students will be required
to pass competency tests at key points in the education process. By 2014,
all students will be required to perform at the "proficient" level.
If students fail to make "adequate yearly progress," a series
of escalating sanctions will be invoked, culminating in closure of the
under-performing school.
Sharon L. Hays
Deputy to the Associate Director for Technology
Office of Science and Technology Policy (OSTP)
Executive Office of the President
(February 12, 2003)
Topic: Science Policy Under the Bush Administration
Dr. Sharon Hays described why OSTP exists and what it does. She explained
that Congress established OSTP in 1976 with the mission of providing
the President with science and technology analyses on important issues.
She used examples of how technology policy is made in order to make two
points: (1) there is no one answer; and (2) only rarely does anyone
sit down and deliberately make policy.
The White House asks and the OSTP responds. Issues are chosen usually
because they fit in one of the President's three top priorities -- the
war on terrorism, homeland security, and strengthening the economy.
There are about 15 permanent people in the organization with many more
detailees and fellows. The OSTP portfolio includes everything under the
science and technology sun, and the detailees and fellows are particularly
useful in bringing specific backgrounds to the issues being considered.
Rapid turnaround on unexpected issues is common. For example, the Space
Shuttle crash required immediate briefings for folks who had never worked
on space issues, but were the spokespersons for the Administration and
were presenting the Administration’s response.
OSTP is in the Executive Office of the President. There are two or three
divisions depending on how you look at things. There is a Science Division
and a Technology Division each led by an Associate Director who is confirmed
by the Senate. The "third division" is the Office of the Chief
of Staff. This office carries out administrative functions and has an
increased policy role since 9/11 including homeland and national security.
OSTP provided technical support to the Office of Homeland Security (the
predecessor to the Department of Homeland Security).
How does OSTP advise the President? Sometimes OSTP Director Dr. John H.
Marburger III, a physicist, sits down with the President and gives
advice directly to him; and sometimes staff members are tasked to brief
the President. Memos are common devices for presenting advice, as well
as meetings with other White House offices.
Another major mechanism of providing advice is through interagency efforts
on science and technology. OSTP has a lot of connections with research
and development efforts in the various departments. Most often, OSTP
has people present at the discussions at intra/interagency working groups
to stay on top of issues and provide input on efforts being led by groups
outside the White House. In some cases, OSTP instigates and acts as the
lead group directing an effort on an Administration core science and
technology issue. Coordination and awareness of issues and efforts concerning
science and technology is a key goal of OSTP.
There are two major advisory councils that work on science and technology
policy. The National Science and Technology Council sets the plan. The
President chairs the Council, the Vice President is the co-chair, and
its membership includes cabinet level people from departments and agencies
with significant science and technology missions. The work of the Council
is mostly accomplished at lower levels, in subcommittees and working
groups.
The other major council is the President's Council of Advisers on Science
and Technology (PCAST). It includes people from industry and academia,
usually of a very high profile. PCAST provides an outside-of-government
perspective on science and technology issues.
The OSTP website is at www.ostp.gov.
Mortimer L. Downey III
Principal Consultant
PB Consult Incorporated
(March 5, 2003)
Topic: Countering Terrorism in Transportation
Mr. Mortimer Downey is a principal consultant at PB Consult, a firm
that provides advisory and management consulting services to public and
private owners, developers, financers, and builders of infrastructure
projects worldwide. Mr. Downey is a member of the National Academy of
Sciences’ (NAS) Committee on Science and Technology for Countering
Terrorism and, as a Committee representative, he explained to us how
science and technology could be used as an efficient countermeasure to
terror.
The NAS Committee published reports on the role of science and technology
in "making the Nation safer" and in deterring, protecting,
and preparing the United States for terror attacks. Following a post-9/11
NAS conference to inventory resources, assess the threat, and identify
countermeasures, the Committee was formed. The Committee received the
support and sponsorship of the President’s Office of Science and
Technology Policy. When one starts examining the systems of modern life
for their vulnerability to terrorist attack, one quickly discovers how
fragile a technological civilization can be. As a simple example, Mr.
Downey observed, there are almost no spare parts for the transformer
stations in the Nation’s power grid, because big transformers are
basically custom-built for each application.
A sub panel on transportation, chaired by Mr. Downey, defined the strategy
for one element of the countermeasure project. The study would focus
on catastrophic terrorism, the combination of the likelihood and severity
of a terrorist event. Mr. Downey reported that science and technology
could play a role in the various levels of terrorism responses -- prediction,
prevention, protection, interdiction, response and recovery, and attribution.
The sub panel concentrated on threat, infrastructure, and integration
of information as general strategies and research needs were defined.
Weapons of mass destruction are an obvious concern. Whereas a "dirty
bomb" may not be life threatening, it would have enormous environmental
and cleanup consequences. Continued research on chemical/explosives sensors and
filters is needed, as is research on biological weapons, including preparation
and response distribution. Weaknesses or vulnerabilities in information
technology, communications, and energy and power infrastructures require
attention. Additional areas needing attention include infrastructure
(stronger buildings), emergency responder support (better communications
and deployment capabilities), transportation (layered security system),
trusted spokespersons, complex systems (data fusion/data mining/red-teaming),
and cross-cutting technology (sensors/robots/SCADAs/systems analysis).
Finally, the sub panel identified the deployment of a Homeland Security
Institute as a need, as well as partnering and information sharing between
all appropriate agencies and institutions.
The sub panel clearly outlines the vulnerabilities of the transportation
system as a target of terror. The system is open and accessible by design,
which makes it equally easy for terrorists to penetrate. It is extensive
and ubiquitous, exposed at every node. Diverse and institutionally divided,
the transportation system carries federal, state, and private regulatory
responsibilities that cut across all agencies. Through global linkages
to society and the economy, the transportation system reaches every country,
with people and goods moving constantly. This system is a prime target
for terror attacks and also serves as a vehicle for transport of terrorists
and weapons.
The sub panel has set forth optimum security systems for countering
terror. It was suggested that the transportation industry establish technologically
sophisticated, yet operationally feasible security systems. For example,
they should ensure that screening equipment works in all (harsh) environments.
A layered approach to security would provide multiple challenges to terrorists
(i.e., ticket check, metal detector, and baggage search). Security should
maintain "curtains of mystery" by not giving away the screening
protocol, keeping terrorists guessing through random checks. Whereas "gates,
guards, and guns" are important impediments and deterrents, additional
security measures are critical. Finally, the sub panel suggests taking
into account economic consequences of both the terrorist action and the
countermeasure.
Mr. Downey summarized the sub panel’s findings by confirming that
methods to counter terror need to be researched and initiated. There
is a need for faster, better, cheaper, smaller, and "usable in the
real world" technology. It is important to understand how systems
work and design security accordingly. Because human factors have to be
considered, it is necessary to recognize that even perfect systems are
run by imperfect people. Lastly, a cadre of unconventional thinkers could
help security to stay ahead of the terrorist, particularly by anticipating
what might be in the terrorist’s mind by attempting to think like
the terrorist.
David S. Trinkle
Program Examiner
Science and Space Programs Branch
Office of Management and Budget
(March 12, 2003)
Topic: The Federal Research and Development (R&D) Funding Process
Diana Espinosa
Deputy Assistant Director for Management
Office of Management and Budget
(March 12, 2003)
Topic: The President's Management Agenda
Mr. David Trinkle opened the presentation with an overview and an organization
chart of the Office of Management and Budget (OMB) within the White House.
He described the multiple roles of OMB as: (1) developing the budget;
(2) assessing budget-related policy issues; and (3) overseeing the implementation,
coordination and management of agency programs.
Mr. Trinkle’s remarks focused on the process that the White House
uses to develop the federal budget. At any time of the year, budgets
for three different fiscal years are in various stages of preparation.
During the year, examiners from OMB will meet with their agencies to
develop budgets, and each agency will submit its budget to the OMB by
the late fall. OMB will provide feedback to each agency shortly afterward,
and each agency has a limited time to appeal funding issues after receiving
this feedback. For FY 2004, the R&D budget is approximately $122.7 billion,
about one-half of which is for defense.
On May 30, 2002, President Bush identified the following R&D priorities:
· Homeland security and antiterrorism R&D
· Networking and information technology R&D
· National nanotechnology initiative
· Molecular level understanding of life processes
· Climate change science and technology
· Education research
President Bush also provided new criteria for R&D programs, which
are: relevance (why), quality (how), and performance (how well).
In her presentation, Ms. Diana Espinosa highlighted the Program Assessment
Rating Tool (PART) and provided an overview of the President’s
Management Agenda.
PART was developed to: (1) measure and diagnose program performance;
(2) evaluate programs in a systematic, consistent, and transparent manner;
(3) inform agency and OMB decisions for management, legislative or regulatory
improvements, and budget decisions; and (4) focus program improvements
and measure progress (e.g., compare with prior year ratings).
To date, PART has been tested on 67 programs during 2002 and rated 234
FY 2004 program budgets. PART revealed that more than one-half of these
programs could not demonstrate results and needed significant improvement.
Ms. Espinosa also discussed the President’s Management Agenda,
which was launched in August 2001. The President's Management Agenda
is an aggressive strategy for improving the management of the Federal
Government. It focuses on five areas of management weakness across the
government where improvements and the most progress can be made. Ms.
Espinosa described in detail five cross-cutting initiatives: (1) strategic
management of human capital; (2) competitive sourcing; (3) improved financial
performance; (4) expanded electronic government; and (5) budget and performance
integration.
Ms. Espinosa also discussed one of the nine program-specific initiatives
highlighted by President Bush, which was better R&D criteria for
federal programs. The results of the President’s Management Agenda
may be found at the website: results.gov.
Charles T. Owens
Chief Executive Officer
U.S. Civilian Research and Development Foundation for the Independent States
of the Former Soviet Union
(March 19, 2003)
Topic: Science and Technology Cooperation with Scientists and Engineers
in the Former Soviet Union
The U.S. Civilian Research and Development Foundation (CRDF) is a non-profit
charitable organization created by the U.S. Government (USG) in 1995.
This unique public-private partnership promotes scientific and technical
collaboration between the United States and the countries of the former
Soviet Union (FSU).
The CRDF's goals are to:
· support exceptional peer-reviewed research projects that offer scientists
and engineers alternatives to emigration and help prevent the dissolution of
the scientific and technological infrastructure of the countries of the FSU;
· advance the transition of weapons scientists to civilian work by funding
collaborative non-weapons research and development projects; and
· help move applied research to the marketplace and bring economic benefits
both to the countries of the FSU and to the United States.
The genesis of CRDF dates to the early 1990s, when philanthropist George
Soros grew concerned about unemployed weapons scientists and engineers
in the FSU being wooed by unfriendly nations in order to utilize their
knowledge and skills. As a way to keep these scientists and engineers
from emigrating from the FSU, and engaged in peaceful research endeavors,
Mr. Soros provided $10 million to the USG to establish a program of scientific
collaboration between the United States and FSU with the intent that
at least 80 percent of the funding go to the FSU side. More than 3,000 proposals
were submitted in the initial grant competition and all $10 million was
quickly awarded. Given this success, Mr. Soros was willing to provide
an additional $5 million for another competition, but only if the USG
was willing to match this amount. With former Congressman George Brown
championing the cause, a number of federal agencies offered to contribute
funding. Consequently, it was decided that a non-profit organization
should be formed in order to establish a grants program and raise additional
funding. Thus, the CRDF was born and was given a ten-year life span.
The CRDF offers a number of programs: Cooperative Grants; Industry;
Nonproliferation; Centers and Institution Building; and Grant Assistance.
The typical grant from CRDF is for two years and the average award size
is $60,000. It is a requirement that at least 80 percent of the
funding be used in support of the FSU side of the collaboration. Since
1995, more than 600 awards (totaling $30 million) have been made. Approximately
95 percent of the researchers continue their collaboration after the
grant has expired.
Mr. Charles Owens, Chief Executive Officer of CRDF, provided a number
of anecdotal success stories from the program. One involved
former weapons scientists who switched their research focus from nuclear
explosions to volcanic eruptions. The scientists' findings are expected
to be useful in the detection and assessment of oil and gas prospects,
and the prediction of natural disasters.
Mr. Owens was asked about the future prospects of the CRDF and whether
or not it would survive beyond ten years. Mr. Owens responded that, in
his view, there will be a need for CRDF beyond 2005, and that given current
events (the war with Iraq) there may also be a need for CRDF to do its
work outside the borders of the FSU.
Michael Rodemeyer
Executive Director
Pew Initiative on Food and Biotechnology
(April 2, 2003)
Topic: Benefits and Concerns of Genetically Modified Foods and Other
Agricultural Biotechnology Products
The Pew Initiative on Food and Biotechnology was launched in March 2001
as a response to increasing global polarization over the relative risks
and benefits of bioengineered foods, according to Executive Director,
Mr. Michael Rodemeyer. The Pew Initiative is neither for nor against
the technology and seeks to establish an objective, independent, and
credible source of information about agricultural biotechnology.
Since introduction of the Flavr Savr tomato in 1994, use of genetically
modified (GM) crops has exploded in this country. Pest resistant and/or
herbicide tolerant cotton, soybeans, and corn have been particularly
successful because they make work easier for the farmer and have the
additional environmental (and cost) benefit of reducing chemical pesticide
use. However, current GM technology benefits the farmer, not the consumer.
This, said Mr. Rodemeyer, stands in stark contrast to other products
of biotechnology, including drugs and other pharmaceuticals, where consumer
concerns and suspicion are far outweighed by the obvious and potential
benefit(s) to the end user.
World market acceptance of GM foods is limited, with 99 percent of GM
foods planted and consumed in the United States, Canada, Argentina, and
China. Resistance to GM foods does not seem to be a simple matter of
ignorance (consider the Frankenfood revolt in the European Union); rather
it is a complex issue involving both safety and environmental concerns,
muddied by projection of local social values onto the technology and
variable faith in a country's regulatory system. Some concerns about
safety are reasonable: while this technology is "under control" we
can't guarantee it will be 100 percent safe forever. Environmental questions
are more difficult, said Mr. Rodemeyer, since "gene flow happens." Pollen
from GM crops is able to spread widely and we don't have the infrastructure
to isolate GM foods from farm to fork. Finally, complex issues such as
product liability have yet to be resolved.
The next five to ten years will be critical to determine the ultimate
success of bioengineered foods. Clearly, good science is not enough.
It will be incumbent upon agricultural biotech companies to communicate
effectively the relative risks and merits of their products to the end
user. Because United States-produced GM foods are sold in the global
marketplace, the needs and concerns of the global community must be considered.
Introduction of new plant forms that may be both food and drug
(or pesticide) stretches our current regulatory definitions, necessitating
a new sophistication in our regulatory system, which must also be transparent,
credible, and effective.
More information about the Pew Initiative on Food and Biotechnology
can be found at their website (www.pewagbiotech.org).
28th Annual American Association for the Advancement of Science
(AAAS) Colloquium
on Science and Technology Policy
(April 10-11, 2003)
Following a welcome by Mr. Floyd Bloom, Chairman and Professor of the
Department of Neuropharmacology at the Scripps Research Institute and
Chair of the AAAS Board of Directors, we heard from Dr. John Marburger,
the Director of the White House Office of Science and Technology Policy,
who gave the keynote address.
The next item on the agenda, a plenary session, addressed "Budgetary
and Policy Context for Research and Development (R&D) in FY 2004."
Dr. Kei Koizumi, Director of the Research and Development Budget and
Policy Program, AAAS, provided an overview of the Federal Government’s
FY 2004 budget priorities and the implication of these priorities for
R&D.
For FY 2004, R&D accounts for 5 percent of the total proposed United
States' budget. Much of that ($1 billion estimated FY 2004) will be allocated
to the Department of Homeland Security. Dr. Koizumi noted that there
will be competition for slim resources and that the President’s
budget is attempting to further restrain domestic discretionary spending,
which includes R&D. In addition, the government is now facing a record
budget deficit with no surplus in sight. Most importantly, the budget
does not include the cost of the war with Iraq and is therefore likely
to change.
Mr. Moises Naim, editor of Foreign Policy magazine, was next to speak. He offered
his perspective on the effect of the current war in Iraq on global policies.
He noted that after the war four reconstructions will be needed.
Reconstruction of Iraq will be the first step, both early infrastructure
needs and later institutional systems, such as the judicial system and
trade. Next, the United States must reconstruct its relationship with
the United Nations and its agencies/organizations. Third, said Mr. Naim,
the War has damaged our relationship with much of Europe. He noted that
stability in both regions has been dependent on a collaborative relationship
between both continents. The United States, he argued, should engage
Europe in discussions of who gets contracts to build infrastructure and
access to oil and other resources. He noted that our approach of "high-risk
gambles" is very different from the European approach. We need to
be aware of these differences in our discussions and plans. Lastly, he
said, Europe needs to be "reunited." The United Kingdom, Italy,
and Spain have supported the War and are at odds with countries like
France, Germany, and Portugal. Even in the countries that have supported
the United States, the divide is profound between governments and their
electorate. Mr. Naim cautioned that the cost to the United States in
terms of loss of goodwill will be very significant and not easily regained
if these four issues are not addressed.
Dr. Karen Holbrook, President of Ohio State University, introduced several
topics described in more detail in the remaining Colloquium. She described
the Ohio State experience of working in an environment in which scientific
knowledge is expanding at an unprecedented rate, with remarkable opportunities
for new advances enabled by new technology, but is hampered by uncertainty
in funding and vacillating social and ethical policies. She noted that, from
her experience, there were three essential ingredients for good university
R&D.
First, universities need stability in funding sources and in policies.
She noted that federal R&D has fallen to its lowest point (measured
as a share of GDP) in 50 years. She described her experience in dealing with new
and changing regulatory policies such as those governing select agents,
stem cells, and clinical trials.
The second challenge is balancing conflicting needs and priorities.
For example, she discussed the difficulty that Ohio State is having in
switching from basic science to more product-focused research. Lastly,
a clearly articulated vision is required to keep progressing. She suggested
that a universal definition for basic research be developed. She noted
the importance of a shared vision in space research and reminded
us how this inspired new scientists. Likewise, we are at a point now
where a shared vision can galvanize the disciplines of science, information
technology, ethics, and technology. She concluded that shared vision
must include energy and enthusiasm to engage new scientists in this field.
Dr. Elias Zerhouni, Director, National Institutes of Health (NIH), closed
the plenary session with his thoughts on issues on the horizon for the
NIH. He reminded us that NIH funds nearly half of all science and research
in the United States and the bulk of research worldwide. Of this money,
83 percent of the NIH budget goes out to scientists at universities and
organizations around the country. Dr. Zerhouni said that, particularly
after a period of growth, it's important to harmonize interactions between
various functions. To achieve a so-called "soft landing" after
the doubling of the NIH budget during the past five years, Dr. Zerhouni
said he would advocate as strongly as he could to defend the value of
continued investment in biomedical research. He noted that the opportunities
in science have never been greater. Dr. Zerhouni is concerned that public
recognition of the agency is not as high as one would think. And yet,
all of the major advances in healthcare, and in discovery, over the past
30 years have come from NIH. He wants to promote NIH as being in the
vanguard in healthcare and research progress.
Dr. Zerhouni described the evolving challenges that NIH is facing. These
include: a shift from acute to chronic diseases; new issues with an aging
population and more disparate ethnic groups; emerging and re-emerging
diseases; and biodefense. He closed with an overview of the NIH Strategic
Roadmap that will guide NIH over the next three to five years.
Concurrent sessions were held in the afternoon of the Colloquium. One
such session was entitled, "Universities and Their Aspirations:
How Much Excellence Can We Afford?"
The first of the panel members to address us was Dr. Irwin Feller, AAAS
Senior Visiting Scientist, and Professor Emeritus of Economics from Pennsylvania
State University. He provided "An Overview of Current National Trends." Dr.
Feller stated two rules: (1) There is no cap on the number of universities
that may be ranked very high in quality, and (2) only 50 percent can
be in the top 50 percent.
Aspirations to improve, advance, and excel are reflected in the open,
competitive nature of the United States' university system. These aspirations
-- striving for Research I status, membership in the Association of American
Universities, and an overall top ranking -- have propelled the number
of "research intensive" universities from only six during World
War II to twenty by 1960 and then to over a hundred by 2000, with most
of the growth occurring in the public university sector. While aspirations
are important -- "the wannabees are the gonnabees" -- there
has been a lot of churn in the top 50 percent over the past sixty years.
R&D funds, more dispersed now than they were in the mid-20th Century,
still go mainly to the universities ranked in the top 100.
To break into the top ranks, a university must be able to answer several
key questions. How can it establish the right blend of teaching, research,
and outreach activities? How will it achieve the right balance among
different fields of study? What strategies can it use to increase research
grants to the faculty? How will it support the institutional cost of
the research infrastructure? Finally, what are the effects of
these aspirations on the overall competitiveness of the university?
Unfortunately, expectations of excellence are rising at a time when
funding is falling. While state support is limited, universities have
to make major up-front investments to compete for research funding in
order to improve their research ranking. This leads to the final question:
how important is the relative ranking of one research university to another?
Dr. Feller concluded that, while the numerical value of the rank itself
is not a key issue, how that ranking affects allocation of R&D resources
is.
Next on the panel to speak was Dr. Lydia Villa-Komaroff, Vice President
for Research and Chief Operating Officer at Whitehead Institute for Biomedical
Research, and AAAS Board of Directors' Member, who addressed the growing
research competitiveness of United States' universities. "Strategies
for Moving Up: One University, One Story" draws from Dr. Villa-Komaroff's
seven years as Vice President of Research at Northwestern University.
In order to strengthen Northwestern's research competitiveness, they
chose to build on existing academic and research strengths, identified
opportunities in those areas, and then strengthened the elements critical
for excellence. This required a commitment from university leadership,
commitment of resources applied to appropriate targets, and a motivated
faculty who were able to articulate their visions for the future. How
did they make it happen? They encouraged a climate of responsibility,
rewarded the audacious, celebrated success, matched expectations to resources,
and discouraged complacency.
During Dr. Villa-Komaroff's tenure as research Vice President, Northwestern
University enjoyed a significant increase in research funding, although
the overall rank of the university remained approximately the same.
Another panelist, Dr. David Ward, President of the American Council
on Education spoke to "The Moving Target: Actions by 'Established'
Universities."
The best universities are in the top 100 out of 2,400 institutions (or
more). Variation in quality among the top twenty is subtle and the exact
ranking is not necessarily significant. However, outside the top 100,
there are many good institutions that can't compete for comprehensive
excellence. Instead, they are best served by a policy of "targeted
excellence," where they find their niche then express excellence
in an appropriate way.
How do they do this? First, resources are important. Revenues from tuition,
endowment growth, and state funds are all combined to support their development
plan. However, state supported schools are increasingly becoming "state
located" schools due to dwindling financial resources at the state
government level. This poses a continuing financial challenge. Second,
they need to create a critical mass of talent in key areas. Third, the
quality of the undergraduate class is important. Finally, geography matters
when attracting the best faculty and students -- there is a huge coastal
advantage.
How many public universities can we afford? Dr. Ward calculates that
one research intensive public university with a medical school requires
a tax base of six million people in order to generate adequate revenue
to support 20 percent of its operating costs (the average level of state
commitment to a research university). Thus, by his estimation we can
afford 70-75 public research universities in addition to the private
schools.
Panelist Dr. Debra W. Stewart, President, Council of Graduate Schools,
spoke to "The Effects on Graduate Education."
The 450 member institutions in the Council of Graduate Schools grant
90 percent of the doctoral and 85 percent of the master's degrees in
this country. The growth in doctoral programs has been substantial: between
1980-2000, the number of universities granting doctoral degrees rose
from 325 to 426. How much excellence can we stand? We can afford less
than we've been trying to achieve, and the potential for excellence has
diminished with diminishing resources.
Does striving trump excellence in graduate education? Aspirations trump
excellence if all universities share one aspiration. However, we're seeing
an increase in the diversity of graduate programs. New programs such
as professional master's degrees (the baccalaureate degree-equivalent
of the current generation) are growing in number. They allow universities
to enter the prestige economy without trying to put together a mediocre
Ph.D. program. This is one of the most promising elements of academic
diversity. Aspiration trumps excellence if it skews funding on non-merit-based
criteria. This is difficult to assess. It hasn't caused a pork barrel
(but the reverse might be true). Aspirations trump excellence if there
are no students to fill newly created graduate programs or if there's
no market for the offered degrees. This does not seem to be true. The
demand for graduate degrees continues to increase although doctorates
granted to United States' students are down. It will be a challenge to
develop the domestic science pool and limit the high attrition rate (20
to 50 percent of all Ph.D. students are lost to the programs before they
complete their degrees).
Overall, aspirational behavior is probably good rather than bad. It
advances institutionally articulated visions of excellence and drives
development of the science and mathematics talent pool.
The final panelist, Mr. Dan Pearson, Minority Professional Staff Member
on the House Committee on Science, spoke to "The Realities of Distributing
Federal R&D Funds."
The overall federal budget situation is dire, a sea of red ink. We are
not going to see the National Science Foundation budget double in the
near future, growth at the National Institutes of Health will be reduced,
and growth at Research I universities will be limited to four to seven
percent per year. How do we get around this problem? Universities increasingly
rely on academic earmarks, a directed appropriation of funds, to supplement
their operating expenses. Earmarks were unheard of before the late 1980s,
but by FY 2002 totaled $1.9 billion.
An earmark can temporarily level the playing field between well-endowed
universities and smaller schools with aspirations of improving their
quality. The trick is to turn an earmark into an ongoing revenue stream.
Loma Linda University in California tops the list for the number of earmarks
in recent years, but it's still not in the top 100 universities and has
not established an ongoing revenue stream. On the other hand, Oregon
Health Sciences University used earmarks to break into the top 100. They
used the support to build infrastructure and to recruit faculty with
potential to bring in their own research funding. OHSU has not used earmarks
since 1996.
How does a university get an earmark? They either have a friend or alumnus
on the Committee, a subcommittee member from their state (and district),
or they use lobbyists (at $150,000/earmark) and other connections.
Another concurrent session on the first afternoon of the Colloquium,
addressed "Developments in Homeland Security and Science and Technology
(S&T)."
Mr. William Bonvillian, Legislative Director to Senator Joseph Lieberman,
described the history of the Department of Homeland Security. He noted that
this was the first new science and technology agency in 45 years. The
new department will oversee integrated research, development, testing,
evaluation, and deployment of biodefense products. The structure will
allow for the coordination of scientific research in security topics,
which he believes will enhance cross-discipline collaboration. The remaining
challenges will be to encourage industry’s participation in this
effort and to develop an agreed upon, common research roadmap for biodefense,
particularly since most of the R&D will be done by other agencies.
Following Mr. Bonvillian, Dr. Lawrence D. Kerr, Assistant Director for
Homeland Security at the White House Office of Science and Technology
Policy described the mission of the new Department and its plans for
science and technology. Part of the Department of Homeland Security's
(DHS) mission will be to stimulate, conduct and enable research for securing
our Nation; develop partnerships with industry; and develop a research
capacity dedicated to homeland security. The bill creating the agency
was signed into law on November 25, 2002, and it has an operating budget
of $37 billion.
Dr. Kerr noted that there are several challenges to implementing this
new agency. For example, it will be responsible for not only science
and new product development, but also the monitoring of over 2,800 power
plants, 190,000 pipelines, and 18,000 flights per day. Part of this agency
will include the new Homeland Security Advanced Research Projects Agency
(HSARPA), which will have a capability to solicit, develop, and demonstrate
technologies that will meet the operational needs.
The third panelist in this session was Dr. John Y. Killen, Assistant
Director for Biodefense Research at the National Institute of Allergy
and Infectious Diseases (NIAID), National Institutes of Health (NIH).
He described the NIH biodefense research agenda. He noted that since
the fall of 2001, the NIAID has moved quickly to accelerate basic and
clinical research related to the prevention, diagnosis and treatment
of diseases caused by potential agents of bioterrorism.
Dr. Killen also noted that for FY 2003, President Bush has proposed
$1.75 billion in biodefense research funding for NIH, which
will enable the NIAID and other NIH institutes to expand ongoing projects
and establish new initiatives as part of a comprehensive and sustained
biodefense research program. The NIAID biodefense research agenda focuses
on studies of microbial biology and host responses to microbes; the development
of new vaccines, therapies, and diagnostic tools; and the development
of research resources such as appropriate laboratory facilities.
Of particular note are plans to establish eight to ten regional Centers
of Excellence for Biodefense and Emerging Diseases Research and to construct
more biosafety level (BSL)-3 and BSL-4 laboratories. These resources
will not only provide state-of-the-science research capacity, but also
will link to the Centers for Disease Control and Prevention and to state
and local health departments to provide permanent, regional expertise
on agents of bioterror and other emerging and re-emerging diseases.
Dr. Killen noted that NIAID research on organisms with bioterror potential
will almost certainly lead to an enhanced understanding of other more
common and naturally occurring infectious diseases that afflict people
here and abroad. The work should have enormous positive impact on our
ability to diagnose, treat, and prevent major diseases such as malaria,
tuberculosis, HIV/AIDS, and a spectrum of emerging and re-emerging diseases
such as West Nile fever, dengue, influenza, and multi-drug resistant
microbes. He concluded by noting that NIH has made substantial progress
in the biodefense research effort; however, much remains to be accomplished.
Another panelist, Dr. Robert Popp, Deputy Director for the Information
Awareness Office at the Defense Advanced Research Projects Agency (DARPA),
U.S. Department of Defense (DOD), gave an overview of how DOD is developing
an intelligence network to help integrate, focus, and accelerate counter-terrorism
through information technology. He noted that by nature, terrorist threats
do not lend themselves to easy identification. Moreover, intelligence
collection is different than it was in the past and requires new models.
New technology, he said, allows the government to exploit all information
sources and to efficiently and systematically use information. The network
uses pattern recognition and predictive modeling to assess threats. Dr.
Popp cautioned that it was not a "collections" technology,
but merely a means for connecting dots using information that already
exists.
The final panelist for the day, Dr. John C. Crowley, Vice President
for Federal Relations, Massachusetts Institute of Technology, provided
an historical background on the tensions between science and security.
He observed that beginning in World War II, classified information was
defined as information that was strictly of a military nature -- it did
not include basic research. In 1950, the House Un-American Activities
Committee began to challenge this and attacked civil liberties for scientists
with what the Committee considered questionable loyalty.
The 1980s were an era in which openness was equated with vulnerability
and universities were seen as points of leakage and targets for national
security. Several editorials during that time questioned the handcuffing
of science. This led to the Corson Panel, which examined technology leakage
to other nations. It concluded that security by secrecy would weaken
United States capabilities and that it was inappropriate to stop international
scientific communication. This led to a National Security Decision Directive
under President Reagan, which stated that the only mechanism for restricting
the dissemination of fundamental research results was formal classification.
More recently, editorials have again reminded government officials that
ultimately relationships between government, academia, and industry depend
on trust. Dr. Crowley expanded on recent science and security debates
(e.g., select agents, the Patriot Act, censoring of publication) and
cautioned that government must engage and include universities in these
policy discussions.
The first day of the Colloquium concluded with an address by Dr. Shirley
Ann Jackson, President of Rensselaer Polytechnic Institute, and President-Elect
of AAAS. Her topic for the William D. Carey Lecture was "Standing
on the Knife Edge: The Leadership Imperative."
Friday morning's plenary session of the 28th AAAS Colloquium was entitled, "Who
Owns Science? Issues of Intellectual Property." Mr. Reid G. Adler,
General Counsel, J. Craig Venter Science Foundation and Member, AAAS
Committee on Science Engineering, and Public Policy, moderated the session.
Prior to introducing the panelists, Mr. Adler commented that six experienced
speakers would bring up six real life issues with intellectual property
(IP). He questioned the experts about whether this is a time to revisit
the IP policy and commented that while some believe that there is reduced
dissemination of science and technology knowledge by IP filing, others
don't agree.
The first panelist to speak, Mr. Robert P. Charrow, Partner, Greenberg
Traurig LLP, explained the challenges and opportunities of ownership
and access to science and technology from the university perspective.
He commented that the Bayh-Dole Act has provided universities unique
opportunities in partnerships. However, there are administrative, ethical,
and FDA regulatory obligations (e.g., inspections). Therefore, a balance
between collaboration and patenting and negotiation of the intellectual
property is critical.
Mr. Carl E. Gulbrandsen, Managing Director of the Wisconsin Alumni Research
Foundation (WARF), provided an overview of the Foundation. He also discussed
how WARF has struck a balance between technology transfer and knowledge
dissemination with benefits to the public such as transfer of the human
embryonic stem (ES) cell technology discovered by Dr. James Thomson
at the University of Wisconsin-Madison (UW) in November 1998. Two patents were
awarded to WARF for this discovery. The research team has demonstrated
that cardiac cells from human ES cells can contract just like human heart
cells. WARF allowed early non-exclusive research access to these cells
to UW researchers as well as to other universities. WARF also negotiated
license agreements with the Public Health Service (PHS) to provide these cells to NIH at low cost
and with very few restrictions. The patent rights were given to PHS for
any discoveries based on ES cell research. So far, 130 groups have received
these ES cells. Geron provided critical funding for this project since
this type of research cannot be conducted with federally funded dollars.
Geron then received an exclusive commercial license for select cell treatments
and other companies got exclusive patent rights for other uses. Could
the Federal Government and Geron have done a better job in distribution
of this technology than WARF? Mr. Gulbrandsen also described different
court cases where patents in dispute were for DNA research. In one case,
a university claimed education as a business, and in another, a university
used its license to regulate rights for use of the technology. He questioned
whether regulatory wording in the licensing agreement is a reasonable
approach and is in the public interest. But, who will be responsible
for the language for appropriate regulatory wording?
The next panelists, Mr. Edward T. Lentz, Patent Attorney, Oneonta,
New York, and Dr. Mildred Cho, Co-Director of the Center for Biomedical
Ethics, Stanford University, addressed the topic, "Perspectives
from Genomics Technology as a Case Study."
Mr. Lentz and Dr. Cho provided a technology transfer perspective from
the industry side. Mr. Lentz commented that there should be a balance
between: (1) patentability and novelty, and (2) actual technology contributions,
claims, and experimental use. Mr. Lentz provided some cases as examples
to support his views. He added that broad claims in some cases might
be justified.
Dr. Cho discussed the effects of gene patenting and licensing on genetic testing
performed by the clinical labs. She discussed the data from the two surveys
of the genetic testing laboratories to address two questions: (1) Do patents
reduce the research and development of genetic tests; and (2) Are patents required
for incentive? She concluded that there was a negative impact of patents on
clinical labs' ability to provide genetic tests, and that information sharing
was negatively impacted. It was not clear whether the total volume of the tests
was decreased due to the patents. Dr. Cho questioned whether genetic tests
should be available for the public good and commented that companies are not
interested in developing genetic tests for orphan diseases. Dr. Cho recommended
a more rigorous regulation of genetic tests by the Food and Drug Administration.
The final panelists of the morning session, Mr. Joseph P. Allen, President
of the Robert C. Byrd National Technology Transfer Center, and Dr. Marie
Thursby, Director of the Technology and Innovation Program at Dupree
College of Management, Georgia Institute of Technology, addressed "Outstanding
Policy and Legislative Issues."
Mr. Allen provided a history of the Bayh-Dole Act and discussed how universities
could use technology transfer to balance research and partnership with
industry. The Bayh-Dole Act was established by Congress to enhance utilization
and commercialization of inventions made with government support. From
1991 to 2000, patent numbers increased while the number of United
States publications remained the same. Thus, the Bayh-Dole Act has helped
to channel research to practical applications without adversely affecting
the dissemination of knowledge through scholarly publication. More university-industry
partnerships are now evident. However, universities have to be reasonable
in exercising their patent rights. It takes five to seven years for good
university technology to get to market, and for life science technology,
it can take 12 years. There are risks involved in technology development.
Who pays for a failed technology? Mr. Allen had a message for universities:
regulate yourself, balance your world, and don't kill the golden goose
(i.e., the Bayh-Dole Act).
Dr. Thursby discussed the policy issues associated with the Bayh-Dole
Act. Dr. Thursby asked whether exclusive licensing is necessary and cautioned
that such licensing might change the direction of faculty research and
restrict the dissemination of research results. From 1991 to 2000, there was a substantial
increase in invention disclosure, number of patent applications, licenses
executed, and royalties. However, this positive effect may not be exclusively
due to the Bayh-Dole Act, since there was also an increase in software
licensing during this time. Dr. Thursby asked whether exclusive licensing
delays or restricts publication. Half of the companies require that information
be deleted from publication and the other half tend to delay publication.
Is there a change in direction of research due to the Bayh-Dole Act?
The 1997 article by Blumenthal et al. in the Journal of the American
Medical Association (JAMA) provides evidence that the individuals or
organizations involved in technology development are less likely to share
information. This could be due more to competition at the cutting edge
of academic research than to patents. A faculty database constructed with NSF funds indicated
that there was no change in the pattern of research after enactment of
the Bayh-Dole Act. Dr. Thursby's conclusion was that, in general, Bayh-Dole
facilitates federally funded research; however, in some cases, Bayh-Dole
tends to favor industry-restricted research.
The Colloquium's afternoon plenary session focused on "Regulating
the New Biology: Cloning and Bioethics." It covered issues such
as reproductive and research cloning, human embryonic stem cell research,
political considerations, and ethics and policy.
The first speaker of the afternoon was Mr. Eric Cohen, Resident Scholar
of the Ethics and Public Policy Center, and consultant to the President's
Council on Bioethics. His presentation focused on a recent issue in the
cloning debate. In late 2002, Stanford University announced the creation
of a $12 million research center that would produce cloned human embryos
for biomedical research. This research involves the insertion of a person's
DNA into an enucleated human egg. This produces a living, dividing, developing
human embryo -- a genetic copy or clone of a living individual, which
researchers plan to destroy in order to extract its stem cells.
Over the past few years, such cloning experiments have been the subject
of widespread public debate. In July 2001, the U.S. House of Representatives
passed, by more than 100 votes, a ban on all human cloning, including
the procedure now embraced by Stanford. In July 2002, the President's
Council on Bioethics recommended a four-year moratorium on the production
and use of cloned human embryos for biomedical research, so that the
Nation might debate the moral and scientific issues fully and fairly
before deciding whether or not to cross this moral boundary. Stanford's
announcement is important: In a country still weighing the significance
and moral dangers of taking the first steps toward human cloning, a major
research university has decided to plunge ahead. Stanford seems to believe
that the question of whether to harvest and exploit cloned human embryos
-- and perhaps eventually cloned human fetuses -- is one for scientists
and internal university review boards, not citizens and their democratic
institutions.
Second to speak was Dr. R. Alta Charo, Professor of Law and Bioethics,
Schools of Law and Medicine; and Associate Dean for Research and Faculty
Development, Law School, University of Wisconsin.
Dr. Charo addressed the ethics and politics of stem cell research as it
applies to the development of new therapeutics for neurological disease. She
covered the history of ethical and political debate over embryo research
and the way in which this history has come to affect current international,
federal, and state proposals for the management and funding of embryonic
stem cell research. She also speculated about the direction of future
funding and regulatory decisions.
Last to address the Colloquium was Dr. Kathy Hudson, Director of Genetics
and Public Policy Center at Johns Hopkins University, Washington, D.C.
Genetic testing to detect carriers of cystic fibrosis (CF) is being
routinely offered to many couples seeking prenatal care. Recent reports
in the medical literature indicate that guidelines for safe and appropriate
testing are not always being followed. Inadequate oversight of genetic
testing and failure to adhere to professional guidelines has resulted
in misleading information being given to some patients. As a result,
couples have undergone needless medical procedures including abortion.
This experience with CF genetic testing reveals cracks in the system
of oversight of genetic tests that must be repaired. Patients have already
fallen through. Federal agencies exercise very little oversight of genetic
tests. In the absence of federal regulations, testing laboratories, physicians,
and patients rely on testing and reporting standards crafted by professional
societies. The current problems with CF genetic testing may be occurring
with other genetic tests and, as genetic tests proliferate, these problems
will only grow unless swift action is taken to strengthen oversight. "Genetic
testing should offer families the chance to receive information to guide
their health care and reproductive decisions. When conducted properly,
it does just that. Flaws in the system, however, can have dangerous,
expensive and painful consequences," explained Dr. Hudson.
James S. Gordon
Founder and Director
Center for Mind-Body Medicine, and
Clinical Professor, Departments of Psychiatry and Family Medicine
Georgetown University School of Medicine
(April 17, 2003)
Topic: Benefits of Complementary and Alternative Medicine for the General
Public
As a medical student, Dr. James Gordon did not like the way he and his
fellow students were treated. He was disturbed at the way patients were
often depersonalized, being referred to as "the gall bladder" or
by room number instead of their name. He found the typical M.D.'s approach
to science -- regurgitation instead of reasoned analysis -- to be unsatisfying.
These concerns have driven his life-long commitment to understanding
the world from the patient's viewpoint, and to consider the spiritual
and emotional needs of the individual, not just the physical ones, as
key elements in their health and treatment.
Is there another way to work with biology? Yes. There are spiritual
approaches, other systems of healing (e.g., Chinese medicine), exercise
(e.g., yoga, tai chi), and nutrition. Holism -- derived from the Greek
holos, or "whole" -- is based on the concept that the whole
is greater than the sum of its parts. Holistic medicine -- now more commonly
called integrative medicine -- thus uses the whole of the world's healing
traditions to treat the patient. Dr. Gordon suggested that opening our
thinking to alternative medicine systems could greatly enrich our understanding
of health. A good Chinese herbalist, he noted, would recognize
30 or 40 different aspects of depression in designing a treatment.
The Center for Mind-Body Medicine was founded by Dr. Gordon to create
a more comprehensive and compassionate approach to health care. The Center
combines the science of conventional medicine with the wisdom of ancient
healing practices. The Center not only addresses the physical needs of
the individual, but the emotional, social, and spiritual needs as well.
Most of the Center's missions are educational, and focus on three areas.
Its primary mission is the education of health care professionals --
medical students, physicians, and other health care workers -- in integrative
medicine. They create models for how to use self-awareness and self-care
as an engine to shift from a reliance on treatment to taking care of
one's self. Several courses of study are offered by the Center, including
an 18-month MindBodySpirit Professional Training Program, Mind-Body Skills
Groups, and symposia on the same healing techniques.
Another goal is the transformation of cancer care so that each person
has individualized and comprehensive treatments integrating complementary
therapies with conventional oncology. Three years ago, the CancerGuides™
Program was established to meet this goal. Health care professionals
are trained to help cancer patients choose the best combination of conventional
and complementary therapies for their individual physical needs, and
to provide continuing emotional support for both patients and their families.
Since 40 to 70 percent of cancers are to some degree nutritionally related,
the Center's Food As Medicine Program is also important for this work.
Nutritional education is traditionally neglected in the medical school
curriculum, so a pilot program was established at Georgetown University
School of Medicine to fill the void. The Program has now been extended
to five other local medical schools and is available to other medical
professionals around the country.
Finally, the Center trains individuals in techniques needed to heal
the wounds of war and civil strife. For the past six years, they have
worked at both the international (Kosovo and other Balkan states, with
plans to be involved in Iraq) and national (post-9/11) levels. The effort
is based on the knowledge that once the physical wounds of war have been
treated and basic physical needs have been met (e.g., shelter, water)
it is important to treat the psychological wounds. The Center's staff
not only treats the victims of war, they train local caregivers to continue
the work and integrate it into their own healing practices.
Not only health care professionals, but also people suffering from cancer,
chronic illness, and psychological trauma, benefit from comprehensive
programs in mind-body healing. The programs offered by the Center for
Mind-Body Medicine are designed and intended to be models and catalysts
for change in the global health care system. More information about the
Center and its programs can be found at their web site (www.cmbm.org).
Dr. Gordon observed that "alternative" medicine has in fact
become more mainstream. A 1997 survey found that 42 percent of all Americans
used some form of the broad category of "complementary or alternative
medicine" (CAM). Even more telling, studies indicate that 69 percent
of all cancer patients use CAM.
As Chair of the White House Commission on Complementary and Alternative
Medicine Policy, established in 2000, Dr. Gordon not only raised the
visibility of CAM in public policy circles, but also argued for more
public funding to investigate CAM with the same rigorous methodologies
and standards applied to other medical research, and to enable research
into CAM medications and techniques that are clearly not patentable and
so would languish for lack of corporate interest.
David L. Evans
Under Secretary for Science
Smithsonian Institution
(April 23, 2003)
Topic: Who We Are at the Smithsonian and How Science Works
The Smithsonian Institution is not a federal agency, but a unique entity.
Two-thirds of its funding comes from the Federal Government, with the
remaining one-third coming from donations, advertising, and profit-making
activities, such as gift shops and courses. The Smithsonian is governed
by a board of regents, which has representatives from all three branches
of the government, plus several private citizens. Since it is not a government
agency, many government regulations do not apply. For instance, the Smithsonian
does not have to abide by Freedom of Information Act (FOIA) rules. However,
they do voluntarily abide by most federal regulations.
For the first 100 years of its existence, the Smithsonian was mainly
a scientific organization, not a museum or tourist destination. In the
mid-1960s, the museums became much more prominent, although the scientific
mission of the organization did not change. Historically, most Secretaries
of the Smithsonian were scientists.
The scientific mission of the Institution generally falls in one of
four thematic areas: the origin and nature of the universe; the formation
and evolution of the Earth and similar planets; discovering and understanding
life's diversity; and the study of human diversity and culture change.
The Office of the Under Secretary for Science has six science units
and four offices to support its activities. The science units include
the National Museum of Natural History; the National Zoological Park;
the Smithsonian Astrophysical Observatory; the Smithsonian Tropical Research
Institute; the Smithsonian Environmental Research Center; and the Smithsonian
Center for Materials Research and Education. The offices include: Fellowships;
International Relations; National Science Resource Center; and the Smithsonian
Press. The largest unit is the Astrophysical Observatory based in Cambridge,
Massachusetts, which employs about 1,000 people.
The majority of scientific work is basic research. A weekly newsletter
highlighting current research topics can be found at www.si.edu/research/spotlight.
The Smithsonian conducts many collaborative research projects with other
government agencies -- working with the National Oceanic and Atmospheric
Administration, for example, on global climate change, invasive species,
and ocean and coastal resources projects.
The Smithsonian is currently experiencing budgetary problems. The one-third
of its budget that comes from non-federal funds is highly dependent on
tourist dollars, and tourism has been down sharply since September 11,
2001. They currently have a $1.5 billion deferred maintenance requirement
covering basic structural needs, including plumbing, electrical systems,
and building maintenance.
Site Visit to the District of Columbia Water and Sewer Authority
Blue Plains Wastewater Treatment Plant, Washington, D.C.
(April 30, 2003)
The District of Columbia Water and Sewer Authority (DCWASA) runs the
Blue Plains Wastewater Treatment Plant, located on the banks of the Potomac
River at the southern tip of the District of Columbia. Mr. Michael Marcotte,
General Engineer of DCWASA, detailed the Plant's operation to us.
After Mr. Marcotte’s introduction, we toured the Plant. We were
able to witness all aspects of the Plant and gain a first-hand knowledge
of its peculiar fragrance and operation. We also learned that we should
all be thankful for that burst of chlorine we detect each time we turn
on our home faucets!
Blue Plains is the largest advanced wastewater treatment plant in the
world. It processes 330 million gallons of wastewater per day, and has
an annual budget of $250 million. In addition to treating all of the
District of Columbia’s wastewater, Blue Plains also treats wastewater
from portions of Maryland and Virginia. Wastewater is raw sewage plus
storm water collected by pipes and culverts. Within the District of Columbia,
a third of the wastewater is collected with a combined system (sewage
and storm water sharing the same pipes) and the other two-thirds is collected
with a separate system that uses two independent piping systems, one
for sewage and one for storm water. The combined systems are older and
have the serious flaw that during heavy storms, the Plant can be overloaded,
and the combined system will discharge raw sewage into local rivers.
The District of Columbia will invest $1.2 billion in plant upgrades over
the next ten years, including construction of underground detention tanks
to minimize these overloads and improve the quality of the discharge
into the Potomac River.
Wastewater treatment starts with debris and grit removal through a screening
process. Debris from this process may range from objects as small as dead
leaves that have washed down a storm drain to a full-sized motorcycle
-- the largest object our guide reported ever seeing trapped by the screens.
The remaining liquid flows into primary sedimentation tanks that separate
about 50 percent of the suspended solids and much of the suspended fats
from the liquid. The liquid then flows to secondary treatment tanks and
is oxygenated to allow bacteria to break down the organic matter. In the
next stages, bacteria convert ammonia into other forms of nitrogen and
then into harmless nitrogen gas. Residual solids are settled out and
the water is filtered with sand and other substances to remove the remaining
suspended particles. The water is disinfected, treated to remove chlorine,
and finally discharged into the Potomac River.
The solids from primary sedimentation tanks go to processing units where
the dense sludge settles to the bottom and thickens. Biological solids
from the secondary and nitrification reactors are thickened separately
using flotation thickeners. All thickened sludge is dewatered, lime
is added to reduce pathogens, and the remaining organic bio-solids are
applied to agricultural land in Maryland and Virginia. We learned an
important lesson: wash your vegetables before you eat them!
Michael Oppenheimer
Milbank Professor of Geosciences and International Affairs
Princeton University
(May 7, 2003)
Topic: The Science of Climate Change
Dr. Michael Oppenheimer is the Albert G. Milbank Professor of Geosciences
and International Affairs at Princeton University and also serves as
the Director of the Program in Science, Technology and Environmental
Policy (STEP) at the Woodrow Wilson School of Public Policy and International
Affairs, Princeton University; and Associated Faculty of the Princeton
Environmental Institute and the Atmosphere and Ocean Sciences Program.
Dr. Oppenheimer provided an extensive overview of global warming and
its effects on climate around the world. CO2, methane, and other greenhouse
gases absorb infrared radiation, which results in warming. He
commented that some natural greenhouse effect (e.g., due to forest fire)
is good, since it keeps the Earth from freezing; however, too much greenhouse
gas build-up can lead to global warming.
Dr. Oppenheimer discussed the analysis of ice cores in Antarctica dating back
100 to 400 years. These data indicate a consistent build-up of
CO2 and other greenhouse gases over the years that correlates with an increase
in the Earth’s mean temperature. The effect of clouds on the temperature
change is interesting since the low level clouds reflect sunlight and reduce
the temperature; however, the high level clouds act like greenhouse gases and
increase temperature. Dr. Oppenheimer commented that ozone build-up in the
lower atmospheric layer (the troposphere) can be harmful to human health.
However, ozone in the outer layer of the atmosphere (the stratosphere)
blocks UV rays. Chlorofluorocarbons can deplete ozone from the stratosphere,
leading to greater penetration of UV. However, the 1990 Clean
Air Act Amendments have reduced chlorofluorocarbon levels in the atmosphere
and thus slowed ozone depletion in the upper atmospheric layer. Dr. Oppenheimer
commented that the location of warming may also be important -- for example,
warming in Alaska may not be a problem. Because of the interactions of multiple
factors, the development and interpretation of the global temperature prediction
models are complex and can provide different conclusions. However, there is
a clear indication that the Northern Hemisphere has been getting warmer in
the last 140 years with a distinct step increase since 1980. The data strongly
indicate that most warming in the last 50 years is a result of human
actions such as the growing number of cars on the roads, home heating, and
deforestation.
Developing the proper national policy on climate change and global warming
is important but difficult, Dr. Oppenheimer observed. Difficult for five
basic reasons: (1) there is no necessary upper limit on global warming
if fossil-fuel use continues and increases; (2) there is a significant
lag between the build-up of greenhouse gases and the consequences of
that build-up; (3) greenhouse gases persist in the upper atmosphere for
a long, long time; (4) the uncertainties in our models are very large,
almost guaranteeing surprises; and (5) our rate of knowledge accumulation
is slow.
Dr. Oppenheimer commented that low-level warming may be manageable;
however, a high level of warming may be disastrous. He concluded
by saying that new technologies to counteract global warming as well
as changes in life style in order to reduce air pollution will be necessary
to reverse the adverse effects of global warming in the future.
Richard H. Moss
Director
U.S. Global Change Research Program
(May 21, 2003)
Topic: Climate Change Research
Dr. Richard Moss directs the U.S. Global Change Research Program, an
interagency venture that lies at the interface between policy and science.
The Program is tasked with providing unbiased scientific information
on climate change and global warming to policymakers and the public.
The researchers strive to remain objective by upholding high standards
of scientific credibility and trying not to be influenced by policy.
In Dr. Moss' view, it's the job of policymakers to weigh the scientific
data against the values of society, and the risks society wants to take,
before deciding how to respond.
THE SCIENCE AND ITS IMPACTS -- Examining the science of climate change
is not as simple as measuring greenhouse gas emissions or how much carbon
is stored in a "sink." Carbon dioxide persists in the atmosphere
for centuries, and methane for 10 to 15 years. But, they also move in
and out of earth, forests, and water. The researchers model entire systems:
water cycle, carbon cycle, atmospheric conditions, solar radiation, biogeochemistry
of the oceans, changes in sea ice, changes in land use, and other human
actions, even the role of clouds.
The United Nations Framework Convention on Climate Change, signed by the first
President Bush, commits the United States and other nations to avoid "dangerous
anthropogenic actions," so one question the research helps answer
is what actions are "dangerous." This has led to studies on
the sustainability of the water supply, how energy is used, and sustainable
forestry practices.
The science team claims some successes. It was the climate change research
program that showed CFC emissions were depleting the ozone layer, leading to
an eventual ban on the chemicals. It also succeeded in modeling the El
Niño Southern Oscillation, a weather phenomenon that affects low-lying
areas, including some developing countries. The climate change researchers
greatly improved predictions of this event, to the extent that futures
markets are influenced by the predictions.
ORGANIZATION -- The U.S. Global Change Research Program (USGCRP) was
created by the first President Bush and by Congress to look at "changes
in the environment that may alter the capacity of Earth to sustain life." The
second President Bush created a new organization, the Climate Change
Research Initiative (CCRI), to pursue more focused areas of research
and to narrow uncertainties in the science.
The $1.8 billion pool of research money is dispersed throughout multiple
agencies where the research is performed or funded. Under the second
Bush Administration, there is a greater emphasis on interagency coordination
on setting priorities and objectives, and on external guidance. The research
program has a small central staff office.
A parallel technology program looks at technology solutions, such as
carbon sequestration (capturing and storing carbon emissions from individual
facilities or from the atmosphere), energy efficiency, and clean energy
products.
STRATEGIC PLAN -- The USGCRP conducted two strategic planning efforts:
one before the Bush election, and a second at the new Administration's
request. To avoid the perception that the new plan might be skewed by
politics, the planning process was open and transparent. A public meeting
in December 2002 drew more than 1,000 people, and the team requested
public comments and an NRC evaluation. NRC recommended a clearer vision
of goals and priorities, better program management, clearly defined products/deliverables,
and external review.
Some of the strengths of the new plan, Dr. Moss says, include managing
data to better support national and regional decisions; producing definable
products -- including scientific articles in peer-reviewed journals and
reports to Congress; and improved communication of the science. Thus,
the research program has been expanded to include not just basic research,
but also what use can be made of the research.
The new research plan includes not just mitigation (reducing greenhouse
gases) but also adaptation -- responding to the impacts of climate change --
for example, by providing data that help local governments prepare for
extreme weather events.
KYOTO -- According to Dr. Moss, the Kyoto Protocol contained some useful
provisions, such as packaging multiple gases together, flexible schedules,
and regional incentives. The Bush Administration rejected the Protocol,
citing the scientific uncertainties and the fact that developing countries
were not required to participate. Dr. Moss believes it had other drawbacks
as well: arbitrary, rather than science-based, targets for emissions
reductions, which meant few actual reductions over the long term; complexity;
lack of positive incentives that encourage behavior change; and lack
of "decarbonization" -- research on ways to encourage reductions
in energy use.
Daniel Charles
Author of The Lords of the Harvest: Biotechnology, Big Money, and the Future
of Food
(June 3, 2003)
Topic: Agricultural Biotechnology
Mr. Daniel Charles was for many years a science writer at New Scientist
magazine and at National Public Radio. He grew up on a small farm in
Pennsylvania -- where his brother still farms -- and came to understand
the complexities of agriculture in ways that few reporters do.
He has witnessed first-hand the growth and controversies of the agricultural
biotechnology industry to the present day.
Currently, the agricultural biotechnology issue is back in the news,
with a European trade case being heard in the World Trade Organization
over genetically modified (GM) grain. In addition, Zambia, gripped by
famine, recently refused shipment of emergency food aid from the United
States on the grounds that the GM seeds might be dangerous to native
strains, and transgenes in maize in Mexico were seen as threatening traditional
strains and future seed stocks.
According to Mr. Charles, the first agricultural biotechnology boom
occurred around 1983, and it represented the collision of the patient
world of plant breeding and the fast and optimistic world of genetics.
Many promises were made. Agricultural biotechnology (Ag biotech) was
hailed as a "second Green Revolution," which may eventually
lead to nitrogen-fixing (that is, self-fertilizing) cereals. It also
met criticism, particularly from activist Jeremy Rifkin who raised the
alarm about the implications that biotechnology had on the relationship
with the natural world. Jeremy Rifkin asked where it all would lead,
inciting fears by conjuring up the potential for meddling with human
genetics. The anti-GM forces were driven more by their concerns about
corporate control of food -- particularly Monsanto’s legal devices
to ensure ownership -- than strictly by fears of Ag biotech.
The history of Ag biotech reveals that what is new is not necessarily
the biology but more the legal and commercial arrangements that brought
the seeds to market, especially when the seeds were produced by chemical
companies and patented as intellectual property. Following the lesson
of "what is new is less important than what is old," Mr. Charles
told the story of the Calgene tomato -- its failure was due less to genetic
tinkering than to breeding, growing, storage, and transport; i.e., the basics
that the Ag biotech companies had never before mastered, but which were
well-known to the food processing and distribution industries.
Do GM crops pose a threat to the food supply? Mr. Charles maintains
that the regulatory process has its place, but the threat is rather minuscule
compared with other threats such as bacteria and chemical contamination
of the food chain.
Since many forms of cancer are purported to have food-related causes,
people should pay more attention to what they eat, Mr. Charles says.
Industrial agricultural methods in the United States, including greater
and more intensive production and overuse of chemical fertilizers, may
have deleterious health effects, and yet this is very difficult to pin
down in test cases. At the same time, food standards have improved, though
consumers may not know whether they are eating GM foods since the Food
and Drug Administration will not label foods as such unless they are
recognized as a threat. The Europeans remain stricter, requiring warning
labels if the soy oil came from genetically modified soybeans. And yet
chemical sprays do not require labels.
Lords of the Harvest is, at its heart, a story about the environmental movement’s
aversion to the corporate control of nature as well as fears about ownership
of what is thought to be common property. In the wake of Mad Cow and any number
of other food related crises, the ideas espoused by Jeremy Rifkin and others
in the early 1980s took hold. Above all, Ag biotech has failed because there
has been no perceived consumer benefit to having GM foods. The fight is by
no means over -- the U.S. Government continues to advocate for United States
farmers in maintaining grain exports including those headed to hunger fighting
programs in Africa -- but the financial hits taken by companies such as Monsanto
spell an uncertain future for agricultural biotechnology.
Site Visit to the Chesapeake Bay Foundation
Phillip Merrill Environmental Center, Annapolis, Maryland
(June 4, 2003)
The Chesapeake Bay Foundation (CBF) is a non-governmental organization "dedicated
to resource restoration and protection, environmental advocacy, and education" in
the Chesapeake Bay ecosystem. The Chesapeake Bay is the largest estuary
in the United States, a valuable economic and recreation resource for
millions of people in the Mid-Atlantic Region, and just happens to have
the Nation’s Capital located within its watershed.
The CBF located its headquarters on the waterfront southeast of the
central part of Annapolis, and decided to make the site a demonstration
project showing how buildings can be designed to minimize impact to the
environment, in general, and the Chesapeake Bay, in particular. Acquisition
of the site itself preserved a large tract of undeveloped land and shoreline;
the building occupies only a small fraction of the total property. We
were taken on a tour of the facility and shown the highlights of the
building design that features extensive use of recycled materials. Significant
attention was paid to energy conservation, renewable energy sources,
and a very low-impact water supply and waste-disposal system.
Of particular note are the parallel-strand lumber, made from scrap
wood chips yet strong enough for structural beams; the cisterns (made
from recycled pickle barrels) that collect rainwater for all non-potable
uses on the site; and the composting toilets. Water
is heated by solar panels, photovoltaic cells supplement the electrical
supply, and heating and cooling are provided by a groundwater-source heat
pump. The building uses a fraction of the energy consumed in a conventional
office building of similar size and requires only a gallon of water per
day to operate the composting toilets. The site is also designed to minimize
run-off to the Chesapeake Bay and the nearby tidal creek. Parking lots
are unpaved gravel to maximize infiltration, and run-off is directed
into retention ponds and other constructed wetlands that support native
vegetation. The landscaping consists of native plants that require
low maintenance and have low environmental impact.
We received two briefings from CBF officials. Senior Naturalist, Dr.
John Page Williams provided us with background about the CBF’s "State
of the Bay" report -- an annual summary of the condition of the
Chesapeake ecosystem. Vice President for External Affairs, Mr. Chuck
Fox gave us a history of the Chesapeake Bay restoration effort and led
a discussion on the uses and limitations of scientific information to
guide public policy. In general, despite almost two decades of intense
effort to restore the Chesapeake Bay, most indicators show only modest
improvements or even slight declines. This is largely because nitrogen
loadings have increased continuously during this period. The nitrogen
comes primarily from agricultural run-off
and atmospheric emissions from motor vehicles and power plants. Increased
population, suburban development, and motor vehicle use in the watershed
have all led to continued stress on the Chesapeake Bay ecosystem despite
the substantial restoration efforts.
Following the presentations, we were given an opportunity to board canoes
and paddle the waters of Black Walnut Creek, a tidal tributary of Chesapeake
Bay that borders the CBF property. Despite the somewhat chilly weather
and persistent drizzle, we were able to see some of the valuable functions
of wetlands, including providing habitat for aquatic life, filtering
sediments carried by storm water run-off, and, in the anoxic bottom sediments,
providing a geochemical environment in which excess nitrogen can be removed
from aquatic ecosystems and returned to the atmosphere through denitrification.
In addition, abundant wildlife was visible, most notably great blue heron,
osprey, and mallard ducks. Finally, we were shown different strategies
for shoreline development and their effects on environmental quality.
John J. Doll
Director, Technology Center 1600
United States Patent and Trademark Office (USPTO)
(June 18, 2003)
Topic: Patent Application and Review Process Focusing on Intellectual
Property Rights in Biotechnology
Mr. John Doll is the Director of "Technology Center 1600" at
the USPTO. The Center deals with biotechnology, organic chemistry, and
pharmaceuticals -- one of the hottest fields of patent law. Mr. Doll
provided an overview of the legislative history of patenting with a focus
on biotech patents and described the major changes to the patenting process
and patent rights.
The USPTO traces its lineage to the Constitution, which explicitly grants
to the Congress broad powers to "promote the progress of science
and useful arts, by securing for limited times to authors and inventors
the exclusive right to their respective writings and discoveries." Mr.
Doll pointed out that a patent does not provide to the inventor a right
to sell the product. It only excludes others from marketing the patented
invention. The famous 1980 U.S. Supreme Court ruling in Diamond v.
Chakrabarty was a major breakthrough in biotech patenting because it allowed
patenting of genetically modified organisms. Since then, over 400 transgenic
animals have been patented. One good example of such animals is Oncomouse,
a genetically engineered mouse used as a drug screening model for cancer.
The patent law is codified in Title 35 of the U.S. Code. Anyone who invents
or discovers any new and useful process, machine, manufacture, or composition
of matter, or any new and useful improvement thereof, can obtain a patent. Mr.
Doll discussed the four key patent criteria: (1) utility and eligibility;
(2) anticipation or novelty; (3) obviousness (or lack thereof); and (4)
enablement. He commented that the "utility" guidelines of January
5, 2001, now require substantial, credible and specific utility of the
invention. For example, new genes without any substantial and specific
utility will not be patentable subject matter, but a therapeutic method
for treating diseases, an assay method to identify useful therapeutic
compounds, or an assay to detect a material that has correlation with
a disease state will qualify as substantial and specific utility and
may be patentable. More than 7000 patents have been issued by USPTO in
the discovery genetics area.
Mr. Doll discussed at some length changes to patent law that affect
how long the patent actually protects the inventor -- dry stuff unless
you consider that in the pharmaceuticals field a difference of a month
in "exclusivity" could translate to several million dollars
in profits. In 1995, for example, the patent term was changed from 17 years
from the date of grant to 20 years from the time of application. Since the average life of a
biotech patent is 11 years, he observed, this change might not mean much
to the biotech industry.
Mr. Doll’s overview of the life cycle of a patent application
at Technology Center 1600 was to some extent a portrait of an overworked
agency dancing at the edge of meltdown. Filings are increasing dramatically
-- over 41,000 in FY 2002 -- and resources are not.
Mr. Doll noted that the number of patent applications has increased
exponentially from 1996 onwards, but because of a hiring ceiling, the number
of hires at USPTO has dropped significantly from FY 2002 to FY 2003.
The loss of experienced patent examiners is also a problem since it is
difficult to replace these trained examiners with new hires. USPTO is
thinking about contracting parts of the review that do not involve any
decision-making on the patent application, Mr. Doll said, but contracting
out the entire patent review is not an option since the patent bar is
against it, fearing that contracting may diminish the quality of the
process.
Bruce P. Mehlman
Assistant Secretary for Technology Policy
Technology Administration
U.S. Department of Commerce
(October 24, 2001)
Topic: Science and Technology Policy in the Bush Administration
Mr. Bruce Mehlman provided insights into the Bush Administration’s
views on science and technology policy. He indicated that the Administration
cares very deeply about science and technology, but believes they have
not been effective at communicating their interest in this area. Examples
of this Administration’s interest in science and technology include:
-
Secretary of State Colin L. Powell's seat on the Board of Directors
of America Online and his activities with the Power-Up Foundation,
which brings Internet access to clubhouses in inner cities.
-
Secretary of Defense Donald H. Rumsfeld’s former position
as CEO of General Instrument.
-
National Security Advisor Condoleezza Rice’s interactions
with technology leaders during her tenure as Provost at Stanford
University.
-
The dramatic growth of the high tech industry in Texas while President
Bush was Governor, which was fueled by his bi-partisan approach to
technology development issues.
The high tech agenda has changed as a result of the September 11th attacks.
Prior to this date, the agenda included: (1) investing in knowledge,
(2) promoting growth and innovation, especially trade efforts, (3) building
a 21st Century infrastructure, (4) critical infrastructure protection,
and (5) empowering citizens.
Now combating terrorism is the top priority, followed by the economy,
trade, education and energy. It is clear that technology is a crucial
tool to achieve all of these goals.
Mr. Mehlman firmly presented the case that technology generates economic growth.
There are plans in the formative stage to use commercial instruments in the
Middle East to promote prosperity and combat the hopelessness and despair that
leads to a willingness to commit acts of terrorism, including suicide attacks.
This will help people in poor and depressed areas by "plugging them in" globally.
Technology will aid us in obtaining greater productivity. The Bush Administration
has both a respect and a passion for technology, but the extent of their
concern is not well-known. Mr. Mehlman gave the Administration an "A-" in
policy, and a "C" in marketing that policy. Since the September
11th attacks, market adjustments have made some overdue corrections and
the technology industry as a whole is doing fine.
Mr. Mehlman concluded by stating that there are some positive occurrences
as a result of September 11th. There is a real appreciation for the federal
work force at the highest levels, and they are more valued. This is a
great boost in attracting employees. Mr. Mehlman went on to praise the
professional civil service and encouraged more collaboration with the
Legislative Branch work force professionals. He also predicted additional
stimulus package legislation, but no more specific industry bailouts
as was done for the airlines.
Nancy E. Foster
Coordinator for Quality Activities
Agency for Healthcare Research and Quality
(October 31, 2001)
Topic: Reducing Errors in Medicine by Utilizing Emerging Technologies
Dr. Nancy Foster opened her presentation by commenting that we live
in an era where technology plays a key role in our lives. We have experienced
its beneficial impact in the growth of our economy and have also experienced
its unfortunate misuse. Therefore, Dr. Foster is gratified to know that
our government seeks more opportunities to exploit the potential benefits
of technology rather than the potential dangers. The Agency for Healthcare
Research and Quality (AHRQ) is one of these opportunities.
The AHRQ (www.ahrq.gov) is the Nation’s lead federal agency for
research on health care quality, costs, outcomes, and patient safety.
The AHRQ’s goal is "to be sure that people who make decisions
about health care have good information on which to base their decisions." This
message permeated Dr. Foster’s presentation to us. She emphasized
the AHRQ activities supporting improvements in health outcomes, strengthening
quality measurements and identifying strategies to accomplish the latter.
The AHRQ has seen its budget increased from $125 million in FY 1996 to
$270 million in FY 2001. Congressional concerns over patient safety drove
this increase.
Among the AHRQ’s activities is the Quality Interagency Coordination
Task Force (QuiC), which enables federal agencies with health care responsibilities
to coordinate activities to improve quality by working together. These
activities include measuring and improving the quality of care, providing
beneficiaries with information to make choices about care and developing
an infrastructure to improve the health care system through knowledgeable
and empowered employees and well-designed information systems. The QuiC
includes a number of work groups aimed at evaluating different facets
affecting the quality of care. Among the items considered is the concept
of "value-based purchasing," which seeks to purchase health
care services in a way that increases their quality.
The AHRQ interacts with a number of public and private sector work groups
to find ways to improve the delivery of health care services and patient
safety. These efforts include an analysis of organizational causes of
accidents. The model employed by the AHRQ involves four components: (1)
managerial decisions and organizational processes in the delivery of
services; (2) factors influencing clinical practices that can trigger
medical errors; (3) care management problems resulting from unsafe acts
or omissions by medical personnel; and (4) defense barriers available
to the medical personnel that may limit liability.
In its efforts to find solutions to patient safety issues, the AHRQ
goes beyond reviewing traditional practices in the health care community.
The primary interest of the agency is in finding technology that can
help minimize or eliminate the risks in providing health care services.
The AHRQ has looked at other industries, such as the aviation industry,
that address similar safety-related problems. It has also reviewed the
use of technologically related tools and is focusing attention on the
use of computer technology, such as computerized Physician Order
Entry with Decision Support and Personal Digital Assistant systems.
Some of these systems have been available for 20 years, but have not
found favor with the health care community due to cost or lack of confidence
in the system. In addition, the systems are only as dependable as the
data they contain and hospitals have not always been forthcoming with
information, except as legally required. Further, not all health care practitioners
follow the same procedures in treating the same ailment.
The lack of a standard treatment for some medical conditions complicates
the sorting of data available to the physician and can lead to increased
risks to the patient.
Site Visit to the National Zoological Park
(November 7, 2001)
Our first field trip was to the National Zoo in Washington, DC. The
Zoo Deputy Director, Dr. McKinley Hudson, greeted us. Dr. Hudson stated
that the National Zoo consists of 163 acres of land with some 3,000 animals,
including 500 different species. The Zoo receives between two and three
million visitors annually, and has an annual operating budget of $20
million. In addition, the National Zoo carries out animal operations
on 3,200 acres near Front Royal, Virginia, primarily for research purposes.
Interestingly, as Dr. Hudson was greeting us, the day began with an "escaped
animal drill." Dr. Hudson explained that this is when one of the
park employees, usually wearing a green outfit, runs through the park
traversing the path that a live escaped animal might pursue. In this
drill, the "escaped animal" was a zebra.
After an interesting start to our morning, we visited the National Zoo's
most famous residents, the Giant Pandas. The four-year-old male Giant
Panda is named Tian Tian, which means "more and more" in Chinese.
The three-year-old female Giant Panda is named Mei Xiang, meaning "beautiful
fragrance." Panda Head Zookeeper, Ms. Lisa Stevens, described
the intense research, care, and feeding that are such a critical part
of keeping Tian Tian and Mei Xiang healthy and happy. These Giant Pandas
are actually on a ten-year loan to the National Zoo from the People’s
Republic of China. Approximately 20 volunteers work in the Panda facility,
helping to care for the Pandas and collecting research data on their
behavior, feeding preferences, and general health. As one might suspect,
about 90 percent of the Giant Panda's diet consists of bamboo.
Following the Giant Panda exhibit, we visited the elephant and hippopotamus
facilities. We were fortunate enough to see all three of the National
Zoo's elephants, introduced to us by Elephant Zookeeper, Ms. Marie Galloway.
The oldest of the elephants was a 25-year-old Asian female named Shanthi,
who was pregnant at the time of our visit. (Subsequently, two weeks after
our visit, Shanthi gave birth to a male, named Kandula.) Ms. Galloway
explained that Shanthi’s pregnancy was the result of artificial
insemination. She said that most zoos do not keep male elephants because
they are such large and aggressive animals. At the other end of the elephant
facility is where the hippos live, a mother and son pair. Hippos are
the third largest terrestrial animals, but spend 80 to 90 percent of
their time in the water. Hippos can outrun a man, reaching speeds of
an astonishing 20 to 35 mph on land. Incidentally, hippos kill more people
in Africa than any other animal.
We then visited the Conservation Research Center, where Center Director,
Dr. Daryl Bonness, greeted us. The Research Center consists of four separate
laboratories used to perform advanced research in various areas of zoology
and biology. The four labs are the Nutrition Lab, which studies ways
to improve animal nutrition and health; the Geographic Information Systems
Lab, used to examine the relationship between animals and their environment;
the Molecular Genetics Lab, for performing DNA analysis and "family
tree" typing; and the Scanning Electron Microscopy Lab, which houses
a 60,000-power scanning electron microscope used to examine animal cells
and other tissue samples. In addition to the large laboratory facilities
located at the National Zoological Park, the Conservation Research Center
performs numerous research projects at the Front Royal, Virginia operations.
Dr. Bonness stated that his team also performs research all around the
world, including South America, Asia, and China.
The final stop on our tour was the National Zoo Animal Hospital. Zoo
Veterinarian, Dr. Suzanne Murray, took us on a spectacular tour of the
animal hospital. The National Zoo has two full time veterinarians, one
resident veterinarian, as well as veterinary dentists. The facility
consists of a pharmacy, pathology laboratory, x-ray laboratory, and surgery
facility. The animal hospital also contains a "recovery room," where
we saw two monkeys recovering from injuries they had received in the
park. Dr. Murray indicated that both veterinarians would be active in
the upcoming birth of the baby elephant, and that the facility was being
used to store blood plasma in anticipation of the event. With some 3,000
plus wild animals to care for, covering 500 species, one might suspect
that two full time veterinarians and one intern would be quite busy and
completely challenged. A tribute to the talent, vigor, and professionalism
of the National Zoo Animal Hospital staff was the successful pregnancy
of Asian elephant, Shanthi, and the birth of her calf, Kandula.
Dayton Maxwell
Special Advisor to the Administrator
U.S. Agency for International Development (USAID)
(November 14, 2001)
Topic: USAID Foreign Policy, Conflict Prevention, and Humanitarian Emergencies
Mr. Dayton Maxwell has a distinguished background in foreign policy
and conflict prevention, working with USAID and other international policy
groups. Currently, Mr. Maxwell serves as a special advisor to the Administrator
of USAID. Mr. Maxwell initially retired from USAID in 1995, but returned
in April 2001 to assume his present position. During his first tenure
at USAID, Mr. Maxwell opened the USAID office in Bosnia in 1994, and
served in Niger (1983-1988), Chad (1976-1979) and Laos (1967-1975). He
also held the position of Deputy Director of the USAID Office of U.S.
Foreign Disaster Assistance, where he led disaster assistance response
teams to the Philippines, northern Iraq, the former Soviet Union, Laos,
and the Republic of Georgia. Additionally, during his short retirement
from USAID, Mr. Maxwell worked with World Vision, George Mason University’s
Program on Peacekeeping Policy, and the Institute for Defense Analyses.
At World Vision, he opened a conflict resolution and reconciliation portfolio
for World Vision’s global programs.
Mr. Maxwell began his presentation by providing a historical perspective
of USAID and its mission. United States' foreign assistance has the dual
purpose of furthering our Nation’s foreign policy interests in
expanding democracy and free markets while improving the lives of the
citizens of the developing world. USAID grew out of the Marshall Plan
reconstruction of Europe after World War II and the Truman Administration’s
Point Four Program. In 1961, President John F. Kennedy signed the Foreign
Assistance Act into law and officially created USAID by executive order.
USAID, with policy guidance from the State Department, is the principal
agency responsible for extending foreign aid assistance to countries
recovering from disaster, trying to escape poverty, and engaging in democratic
reforms. USAID is now actively involved in over 70 countries. Although
most Americans believe that the United States contributes a disproportionately
large amount of funding to foreign aid, in actuality, the United States
is last among donor countries in per capita assistance, spending less
than 0.5 percent of its federal budget on foreign aid.
The initial focus of USAID was to help rebuild and develop the physical
infrastructure of war-torn European countries. However, as USAID expanded
its reach to developing countries, the focus changed to basic human needs,
such as health and poverty issues. From the late 1970s to early 1980s,
foreign policy reform, led by the World Bank and the International Monetary
Fund, encouraged developing countries to focus more on self-governance
issues, helping to establish democratic rule and free market economies.
As a result, foreign aid became increasingly tied to political reform
mandates. USAID became more concerned with designing safety nets to raise
general populations above the poverty level.
The post-Cold War era of the 1990s brought a whole new set of problems
to the foreign policy arena. During the Cold War, there existed a bipolar,
relatively stable political world in which the United States would support
certain governments in order to keep them on "our side." Similarly,
the former Soviet Union supported other governments to maintain their
allegiance. However, the collapse of the Soviet bloc, beginning in 1989,
created a power vacuum allowing the ascendancy of warlords in many developing
nations. This led to political instability and the "failed state" problem,
as exemplified in Somalia, Sudan, and Angola. The instability and power
struggles between these and other nations continue to the present day.
While the United States adopted a hands-off policy to these situations,
believing that it was not in our national interest to become responsible
for "policing the world," politically unstable situations continued
to grow. The present political crises in Bosnia and Iraq can be traced
back to this lack of diplomatic involvement during the post-Cold War
period.
As a result of these changes in the political climate, the role of USAID
under the present Bush Administration increasingly includes an emphasis
on conflict prevention. Moreover, after the events of September 11, prevention
of terrorism has become an overriding priority throughout the government.
Since abject poverty is generally considered one of the root causes of
terrorism, USAID’s mission of working to establish poverty safety
nets in less developed nations becomes even more critical.
An engineer by training, Mr. Maxwell stressed the importance of science
and technology in helping to overcome poverty. He emphasized that this
critical tool is often overlooked by those responsible for foreign policy
and diplomacy. Conventional wisdom fostered the attitude that developing
countries do not have the capacity to use technology; however, that is
incorrect. Even in the poorest of countries, some people are capable
of using, or of being trained to use, technology. Technology may be used
to prevent conflict effectively. For example, information technology
can be employed to share up-to-the-minute information and help
defuse emerging conflicts. Diplomatic negotiators and the military
can also proficiently use computer simulation to predict the outcomes
of pressing conflicts based on various options for action. Furthermore,
giving the general population a means to rapidly communicate with each
other and the outside world can be an effective check on a government
that may otherwise take oppressive action.
Other productive uses of technology as part of foreign aid include agricultural
satellite imagery such as the Famine Early Warning System. This type
of imaging predicts loss in crop production due to drought and other
climatic stresses. Advance warning of potential famine situations is
helpful in planning the allocation of humanitarian aid and preparing
for related political conflicts.
By increasing the use and effectiveness of available technological tools,
Mr. Maxwell hopes that USAID can further its mission to provide relief
despite limited resources.
Robert P. Lyons, Jr.
Director
Joint Strike Fighter Engineering and Manufacturing Development
U.S. Air Force
(November 14, 2001)
Topic: Joint Strike Fighter Program
Colonel Robert Lyons presented to us an overview of the development
of the Joint Strike Fighter (JSF) -- an aircraft not only new technically,
but in its raison d’être as well. It is the first military
aircraft designed with economy as a major consideration, superseding
performance as the first priority.
The JSF is the next generation strike fighter, intended to replace the
aging inventory of F-16 Falcons and early versions of the F/A-18 Hornets.
It is also intended to replace the vertical take-off Harrier aircraft
in service with the U.S. Marine Corps and the British Royal Navy and Air
Force. The project’s vision is to develop and produce a family
of affordable multi-mission fighter aircraft using matured, demonstrated
21st Century technology, and to sustain it worldwide. Colonel Lyons explained
that the development of aircraft in the past was based on a vision of
maximum performance regardless of the cost. While that is understandable
given the Cold War arms race, the result was a host of different airframes,
each completely unique and each having separate attendant maintenance
requirements. Maintenance and refurbishment comprise the greatest cost
of an aircraft over its predetermined life cycle. Of course, ever-tighter
budgets have spurred a rethinking of this approach.
Thus, a new process, considering the economics of the total life cycle
of the product, was introduced. In many ways, the approach is reminiscent
of the first use of interchangeable parts during the 19th Century. In
the present case, a fighter has been developed which has exceptionally
good performance (at least as good as the planes it is replacing, which
is still superior to that of any potential adversary), coupled with new
technologies including stealth signature, advanced countermeasures, the
latest avionics and data link architectures, and state-of-the-art prognostics
and health management. The theme of the development project is "lethal,
survivable, supportable, affordable."
Colonel Lyons further explained that the development is an international
program. This is important in that several countries outside the United
States will be the end users of the product and their input into its
utility and design must be considered. These countries include the United
Kingdom, Denmark, The Netherlands, Norway, Italy, Israel, Canada, Turkey,
and Singapore. Each of these nations views their potential combat operations
from a different tactical standpoint; thus, the new fighter must meet
certain performance and operational criteria that accommodate all of them.
This was a significant challenge because such broad requirements
could have resulted in a "compromise" product useful to none.
This was avoided by designing not a single aircraft, but a family of
three, using as much modularity and commonality as possible. The family
includes a conventional take-off and landing version (CTOL), a carrier
variant (CV), and a short take-off and vertical landing (STOVL) version.
These are suitable for the U.S. Air Force, Navy, and Marines, respectively.
Each member of the family shares a common engine, structural geometry,
and electronics system, but differs in secondary factors such as wing
area, landing gear strength, or payload versus fuel capacity. For example,
the carrier version has a larger-area, folding wing arrangement and tougher
landing gear, while the Marine version has a vertical take-off/landing
propulsion design with less fuel (range) capacity and a differing air-to-ground
combat capability for supporting infantry. None of the variants are lacking
in performance by any measure -- they only perform differently.
It was a particularly good time to hear from Colonel Lyons -- the contract
was awarded to Lockheed-Martin only three weeks prior to his lecture.
He went on to lay out the required milestones in the awarded project.
As many as 6,000 aircraft will be built during the manufacturing phase
of the contract. Most importantly, the manpower, operation, and support
costs will be 30 percent less than they would have been had a conventional
design and development approach been used. Colonel
Lyons imparted to us an important lesson in balance -- the deployment
of advanced science and technology weighed against economic constraints,
and how such an approach has succeeded.
Eamon M. Kelly
Chairman
National Science Board
National Science Foundation
(November 28, 2001)
Topic: The National Science Board
Dr. Eamon Kelly was appointed by President Clinton to serve on the National
Science Board (NSB) in 1995. He was elected NSB Chairman in 1998 and
re-elected in 2000.
Dr. Kelly spoke on the origin, purpose and roles of the NSB. He spoke
on the NSB’s responsibilities as an advisory resource to the President
and its charter as the governing body of the National Science Foundation
(NSF). He also spoke briefly on his own beginnings and his diverse
experience, which included serving as Professor of International Development
and Technology Transfer, working on issues of world development,
predominantly focused on Africa.
The NSB came about as a result of World War II, when scientific development
from the war effort resulted in enormous breakthroughs such as RADAR
and in new research organizations like the Atomic Energy Commission and
Office of Naval Research. The recognition of the need to mobilize science
and research to benefit a post-war environment led to the establishment
of the NSF. Dr. Kelly remarked that the scientific community was against
the NSF being a federal agency; however, President Truman did not want
the NSB. Compromises were made culminating in the creation of the NSF,
with the NSB as a governing board. There was also to be one Executive
Committee, led by the NSB chair, legislatively mandated for oversight.
Today the NSB operates under that same structure and mandate. In fact,
the President appoints all 24 members of the NSB. Dr. Kelly explained
that the NSB has two major activities: (1) stewardship of the NSF, and
(2) advising the President and Congress on science and engineering policy.
He pointed out that for the NSF, this means guidance of the organization’s
major area of focus (i.e., the health and well-being of the sciences
and engineering). In general, scientific research is underfunded as a
percentage of the Gross National Product. This is a major concern given
the impact of science and technology on the economy. Another concern
is maximizing the value-added outcome of grants. Dr. Kelly spoke of the
difficulty in the continuous funding of research, which for the most
part comes in one- or two-year grants. The lengths of such grants permit
only partial results and often leave the balance of the research underfunded
or with no continuing funding at all.
Dr. Kelly mentioned several areas where the NSF is the lead agency and
has ongoing major initiatives -- nanotechnology, biocomplexity in the
environment, and science and math education. He expressed the opinion
that the failure of our urban education system is the country’s
biggest problem.
Dr. Kelly conveyed how he has seen a measurable growth in interest over
the last six years in the general attendance and breadth of congressional
participation at NSF hearings. He spoke of how support on both sides
of the aisle in Congress has been good and interest in funding basic
research has continued to grow.
Dr. Kelly concluded his talk with a description of the NSB’s role
as advisor to Congress and the White House. The NSB’s role grew
out of the recognition that there was a vacuum of science policy at the
national level. The NSB saw a need to provide the White House with the
analytical reviews required for developing policies in a comprehensive
manner.
Dr. Kelly commented on how the Board is actively engaged in priority
policy issues and continues to aid in setting and reviewing the federal
research portfolio. Dr. Kelly reiterated that in this role, the NSB undertakes
studies for research budget priority setting, using methodologies
that reach across the vast science and engineering fields in both the
United States and other countries. He spoke of the "globalization
of science" and the ability of the information age, with its wireless
information technologies, to help other countries make great leaps in
both social and economic status as a result of making scientific discoveries
more readily accessible.
Karen Cleary Alderman
Executive Director
Joint Financial Management Improvement Program
(December 5, 2001)
Topic: The Impact of Technology in the Financial Systems Environment
The Joint Financial Management Improvement Program (JFMIP) is a joint
undertaking of the Department of Treasury, the Office of Management and
Budget (OMB), the General Accounting Office, and the Office of Personnel
Management to improve financial management policies and practices. It
supports a wide range of programs mandated by Congress and is primarily
interested in issues that affect a large number of agencies. The JFMIP
goes back over 50 years and draws its authority from the Budget and Accounting
Procedures Act of 1950, OMB Circular A-127, and the Federal Financial
Management Improvement Act of 1996. The JFMIP vision centers on supporting
a partnership between program and financial management to assure integrity
of information for decision-making and performance measurement.
The fiduciary success of some federal agencies is directly related to
recent improvements in technology. Several Federal Financial Management
Offices/Agencies have implemented Enterprise Resource Planning applications
to enable them to take advantage of the integrated nature of these packages.
However, implementing financial management packages has been difficult
because of the wide variety of data needs and the difficulty of defining
requirements before contractor implementation.
Ms. Karen Alderman eloquently discussed the role of technology in improving
performance, and alluded to the problem that the Federal Government as
an entity does not currently produce an auditable financial statement.
To resolve some of the inherent problems, JFMIP has partnered with the
Chief Financial Officers (CFO) council to deal with crosscutting issues,
such as functional systems requirements responsibility. JFMIP identifies
and issues financial system requirements documents for core financial
systems and for the financial portions of mixed systems. Also, under
OMB policy, JFMIP must certify vendor core financial system software
prior to its implementation to ensure uniformity and conformance to data
integrity within the Federal Government. Some strategic drivers of JFMIP
activities include:
- Increase in information needed by stakeholders,
- Increase in accountability and oversight,
- Decrease in budgetary resources,
- Decrease in human resources, and
- Increase in information technology.
These congressional mandates and CFO requirements were initiated by
new needs for providing adequate financial management information, accountability
and audit trails. Successful implementation of new financial management
systems is clearly correlated to the adequacy of the requirements definition
and to management commitment.
Ms. Alderman mentioned the Bush Administration’s new Management
Agenda, which contains budget/performance integration initiatives and
a strategic management of human capital element, and which targets increased
competitive outsourcing, improved financial performance measurement,
expanded E-Government initiatives, and the provision of high-quality
financial data in program areas. Future demands for financial information
will require improved timeliness, enhanced usefulness, and greater reliability
of these data.
Ms. Alderman discussed the cardinal sins of systems implementation,
which result in failure. These include lack of qualified project managers,
poor sponsorship, scope creep, and poor milestone management.
Norman P. Neureiter
Science and Technology Advisor to the Secretary of State
U.S. Department of State
(December 12, 2001)
Topic: Science and Technology in 21st Century Foreign Policy
Dr. Norman Neureiter is the Science and Technology (S&T) Advisor
to the Secretary of State. This job was created partially in response
to an October 1999 National Academy of Sciences' report entitled The
Pervasive Role of Science, Technology, and Health in Foreign Policy:
Imperatives for the Department of State. As a result of this report,
the Department of State developed a policy and implemented a program
to strengthen its capacity with respect to S&T. Currently, the United
States has 16 stated foreign policy goals with 13 directly involving
science, technology, or health. Many decisions are made on the basis
of concerns and priorities other than S&T and the role of a science
advisor is to describe scientific opinions. In cases where scientists
disagree, Dr. Neureiter believes that they must work back through the
evidence until they agree on the facts, even if they still reach different
conclusions.
Intellectual property (IP) rights are an issue in forging scientific
agreements with other countries. The Office of the U.S. Trade Representative
requires IP language in our agreements with other nations, even though
only one percent of our agreements involve IP issues. This has sometimes
resulted in unwillingness on the part of other nations to execute agreements
with us. Additionally, the United States does not have a stellar reputation
as an international partner. For example, because of cost overruns and
management problems, NASA is planning to delay expanding the crew on
the International Space Station from three to seven astronauts. With
this smaller crew size, the scientific utility of the station will be
significantly reduced. This decision will violate specific agreements
with Japan, Germany, Russia, and all the European Space Agency countries.
Canada has already requested formal government-to-government negotiations
to resolve this issue. Dr. Neureiter believes that the United States,
as a world leader, should show integrity in maintaining relationships
and, thus, there is a need to plan on an international basis from the
project initiation.
Dr. Neureiter points out that there are a large number of issues (e.g.,
HIV/AIDS, clean water, energy, global warming, human cloning, and terrorism)
for which an understanding of S&T is important. He expressed considerable
concern over the HIV/AIDS situation in Africa. The capacity to see where
S&T will lead us does not exist within the government. The National
Intelligence Council released a publication, Global Trends 2015: A Dialogue
About the Future with Non-government Experts, which outlines seven drivers
of the future. These drivers are: (1) demographics, (2) natural resources
and the environment, (3) science and technology (information technology,
biotechnology, materials technology and nanotechnology), (4) global economy/globalization,
(5) national and international governance, (6) future conflict, and (7)
role of the United States.
Dr. Neureiter believes that the Department of State should be considered
a national security agency involved in three parts of national security
-- intelligence, diplomacy, and military force. The goal should be to
prevent conflicts (diplomacy). In the past, the American people have
demonstrated isolationist tendencies. The government needs to do a better
job communicating to the American people that we have a world leadership
role.
Dr. Neureiter seeks to improve the science literacy at the Department
of State by providing temporary positions for personnel from other federal
agencies and from scientific societies. The work of the Department of
State is to build a world where we all get along, with S&T as a key
part of this effort.
George P. Smith, II
Professor
Columbus School of Law
The Catholic University of America
(January 8, 2002)
Topic: Bioethics and Health Law
Health has become a pivotal issue for the Nation. The rise of health
maintenance organizations (HMOs) and the impending retirement of the "Baby
Boom" generation have generated interest in issues ranging from
appropriate health care to the right to die. This interest has fueled
the need to develop criteria for a decision-making process that is consistent
with societal values. Professor George Smith, a widely published and
recognized national and international scholar in the field of Bioethics
and Health Law, provided some insights on the various options that address
these issues.
Professor Smith focused his presentation on the development of a decision-making
construct related to bioethics and health-related issues. The proposed
construct defines a balancing test or cost-benefit analysis that weighs
the gravity of the harm against the utility of the good resulting from
the given decision. While these two components are significant in identifying
the decision’s cause and effect, the key to the construct is the
fulcrum on which they are balanced. In formulating this presentation,
Professor Smith discussed bioethics principles and the various theories
that can impact the decision-making process.
Bioethics is defined as moral reasoning relating to health care. In seeking
to understand the role of bioethics in health-related issues, one must be mindful
of the principles that dictate its very own function. These principles include
autonomy, non-maleficence, beneficence, confidentiality, distributive justice,
and truth telling. The principle of autonomy states that independent actions
or choices of individuals should not be constrained. Non-maleficence demands
that we "do no harm." It is a reminder that one has the duty not
to inflict evil, harm or risk of harm on others. Beneficence relates to the
duty to help another by doing what is best for that person. Confidentiality
imposes the duty of not divulging information of one to another. This duty
can be a result of an implied promise. Distributive justice brings into consideration
the distribution of scarce resources. Finally, truth telling means simply to
be truthful.
These principles are significant in weighing the two components of the
proposed construct because bioethics is based on the theory of relativism.
Under this theory, bioethics takes into account the customs of the society
in which a person lives. Thus, it holds that the "wrongness" of
an action cannot be determined apart from its cultural setting. By looking
at the culture in which the question presents itself, the educational
level and "feelings" become the basis of the decision. These
principles provide a means for the person to exercise a decision. From
this basic definition, a number of different types of bioethics have
evolved.
- In Narrative Bioethics, the basis for decision-making comes from stories of those involved in bioethical disputes.
- Virtue Bioethics relies on what a virtuous person would do under the same circumstances.
- The Ethics of Caring analyzes and evaluates the strength of the relationship between human beings.
- Religious Bioethics centers on the person’s beliefs.
- Casuist Bioethics evaluates and tests each bioethics case separately with moral judgment.
- Pragmatic Bioethics looks at practical solutions that avoid waste.
- Posnerian Bioethics applies an economic model to the decision-making process.
- The Critical Race Theory of Bioethics looks at the racial background to make the evaluation.
- Liberal Feminist Bioethics looks at the similarities between men and women.
- Cultural Feminist Bioethics focuses on the differences between men and women.
While these types of bioethics provide helpful perspectives, they do
not define the elusive fulcrum. The fulcrum allows one to objectively
and reasonably evaluate the components against a set standard. But, this
standard is not singular. It also has its variants. The most practical
standard is an economic analysis standard, where the cost of the good
is weighed against the cost of the harm. While this standard is clear
cut and simple, it can be perceived as cold and calculating. Another
standard is the deliberative democracy standard, which involves the public
in weighing these components. Finally, a third standard asserts that
the decision-maker must be educated with help from outsiders. This standard
is referred to as the judiciary standard. Under this standard, the judge,
who is also a product of an environment, may be unable to come to grips
with this and, thus, is in need of continuing education. Yet, by bringing
bioethicists into courts, one risks bringing moral issues into the court
as well.
The moral issues that may be brought into the court tend to center on
the physician’s role as a gatekeeper for the allocation of health
care services. As gatekeeper, the physician has direct influence on whether
these services are allocated positively (i.e., for the benefit of the
patient) or negatively (i.e., when death is conflicted). How can one
find a balance? Generational equity? That is, rationing resources based
on a person’s age or prospect of societal contribution? What is
the fulcrum?
In the end, science has a duty to exercise self restraint in matters
that can result in dangerous cataclysmic results to mankind. While there
should be freedom of investigation, it is important that people be reasonable.
The question of reasonableness, and perhaps the answer to the much-needed
fulcrum, ultimately resides on determining what would a "reasonable" person
do under similar circumstances. This person would think and not let emotions
control the outcome. While this may not necessarily be the ultimate definition
of the proposed construct, it is important to continue seeking a common
understanding involving a democratic consensus. After all, the law reacts
to social need and does not seek out technology.
Lana R. Skirboll
Director
Office of Science Policy
National Institutes of Health (NIH)
(January 16, 2002)
Topic: Policy Issues at the National Institutes of Health
Dr. Lana Skirboll is a neuroscientist and the author of more than 75
scientific publications. She received her Ph.D. in the Department of
Pharmacology, Georgetown University Medical School, and conducted her
postdoctoral training in psychiatry and pharmacology at the Yale University
School of Medicine. She served as a Fogarty Fellow at the Karolinska
Institute in Stockholm, Sweden, and conducted research in the intramural
research program of the National Institute of Mental Health (NIMH). Subsequently,
she served as Associate Administrator for Science in the Alcohol, Drug
Abuse, and Mental Health Administration (ADAMHA). In 1992, Dr. Skirboll
was appointed Director of the Office of Science Policy in the NIMH. In
1995, she joined the Office of Science Policy, NIH, as Director.
In her current position, Dr. Skirboll advises the NIH Director and the
NIH Institute and Center Directors and provides national leadership on
science policy issues, including the ethical, legal, social, and economic
implications of medical research. She is responsible for the development
of NIH’s Guidelines for Research Using Human Pluripotent Stem Cells,
and NIH’s oversight of gene therapy research, including the activities
of the Recombinant DNA Advisory Committee (RAC). Her responsibilities
also include the activities of the Secretary’s Advisory Committee
on Genetic Testing and the Secretary’s Advisory Committee on Xenotransplantation.
The Office of Science Policy is also the focus for a variety of program
activities, such as the NIH-wide initiative on the development of biomarkers
in clinical trials as well as economic studies on the Cost of Illness
and the annual Biomedical Research and Development Price Index (BRDPI).
In her address to us, Dr. Skirboll discussed the social, ethical, and
political issues of stem cell and cloning research. She said in the past
nine years, two administrations have held two different positions on
this issue. The role of a civil servant in the Federal Government is to
support the changed position of the current Administration. Dr. Skirboll
said that federally funded embryo research has been banned since 1995; there has been
a strong pro-life contingent and the Federal Government does not want
to put money into such a controversial area. However, there is also a
strong patient advocacy contingent that supports this research. President
Bush has taken the middle ground, allowing research to be conducted using
embryos that have already been harvested, but not allowing any new embryos
to be used in federally supported research.
Dr. Skirboll explained that there are two types of cloning: reproductive
and therapeutic. She agreed that reproductive cloning should be banned.
However, therapeutic cloning is being pursued to derive cures for a variety
of diseases. At this time, the President had not issued a statement regarding
cloning, but there is an objection to therapeutic cloning that requires
an embryo, which is destroyed if not used. The entertainment industry in Hollywood
has become involved in the issue of cloning, both as an advocate and
a way to educate the public.
Dr. Skirboll concluded that the NIH finds itself walking a fine line
between the beliefs and desires of the public and the expectations of
the science community. In the end, NIH must do this while holding to
its mission to save human lives and to find cures for the diseases that
threaten public health. To fulfill this mission, NIH has established
accountability goals with reasonable targets that integrate performance
and budget.
Eric D. Green
Chief, Genome Technology Branch and
Director, NIH Intramural Sequencing Center
National Human Genome Research Institute
National Institutes of Health (NIH)
(January 31, 2002)
Topic: The Human Genome Project (HGP)
The HGP is a coordinated international effort to decode and elucidate
the genetic architecture of the human genome and the genome of several
other organisms. Politically, the HGP began in 1990 with the goal of
fully mapping the human genome by the end of 2003. However, to obtain funding
from Congress, the HGP emphasized the mapping and sequencing of the human
genome. The HGP started with the study of single-cell organisms, such
as yeast, and then progressed to worms, mice, and finally humans.
Dr. Eric Green began his presentation by explaining the basic structural
hierarchy of human biology, from the cell to the chromosome to deoxyribonucleic
acid (DNA), which forms the basic genetic blueprint of all living things.
In 1953, Watson and Crick discovered that DNA forms a double helix built
from four bases: Guanine (G), Adenine (A), Thymine (T), and Cytosine (C),
in which G always pairs with C and A always pairs with T. In the 1990s,
the genome revolution became possible because of the research of Dr. Kary
B. Mullis, who in the 1980s developed the Polymerase Chain Reaction (PCR),
now the most prevalent method for amplifying DNA, and because fluorescent
dye labeling of bases largely replaced radioactive tagging. Dr. Fred Sanger
also played an important role in shaping DNA sequencing technology. Both
Dr. Mullis and Dr. Sanger received Nobel Prizes in Chemistry for their
research.
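The complementary pairing rule described above (G with C, A with T) can be sketched as a small reverse-complement function. This is an illustrative example of my own, not something presented in the talk:

```python
# Watson-Crick pairing rules: G<->C and A<->T.
PAIRS = {"G": "C", "C": "G", "A": "T", "T": "A"}

def reverse_complement(strand: str) -> str:
    """Return the complementary strand, read in the opposite direction,
    as it would appear on the other half of the double helix."""
    return "".join(PAIRS[base] for base in reversed(strand))

print(reverse_complement("GATTACA"))  # -> TGTAATC
```

Because the two strands run in opposite directions, applying the function twice returns the original sequence.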
The mapping and sequencing of the human genome began with the cytogenetic
map of the human genome, which includes the 24 distinct chromosomes (22
autosomes, plus the X and Y). Using a literature analogy, Dr. Green described
each chromosome as representing a volume in the human gene map. The two
main methods applied in the mapping and sequencing of human genes are
the hierarchical (clone-by-clone) shotgun and the whole-genome shotgun
approaches. A third, hybrid method of sequencing exists, which employs
the best aspects of the first two. This massive effort was conducted in
five main gene centers, commonly referred to as the G5: Washington University
(St. Louis), the Massachusetts Institute of Technology, Baylor College
of Medicine, the Sanger Centre, and the Joint Genome Institute (JGI).
The results of this effort were then consolidated in the NIH GenBank,
a central repository.
The main impact of the HGP lies in the study of human diseases and in
finding therapeutic applications, such as the functional and positional
cloning of disease genes. On February 15, 2001, the results of the genome
project were published in both Science and Nature. In the "what
next" category, efforts are being made to finish the human genome
sequence, which is now 67 percent complete, and projected to be finished
by April 2003. Once completed, efforts will be directed at sequencing
genomes from other organisms.
Dr. Green is concerned about the Ethical, Legal, and Social Implications
(ELSI) of the HGP. Scholars are studying these impacts by carefully examining
all the facets associated with genetic results since all diseases have
a genetic component. The results of the genome project have contributed
to better diagnostics and therapeutics.
Dr. Eric Green received his M.D. and Ph.D. from Washington University
School of Medicine (St. Louis, Missouri), and is a leader in the genomics
mapping and sequencing endeavor.
Site Visit to Celera Genomics
(February 6, 2002)
Celera Genomics is a private company that has earned national recognition
for its contributions to mapping the human genome. Our host for this
visit was Dr. Scott Patterson, Vice President for Proteomics at Celera
Genomics.
In a brief presentation, Dr. Patterson outlined the company’s
structure, history, and primary research activities. He explained that
Celera Genomics’ activities are focused in two areas. One is an
on-line information business and the other is therapeutic drug discovery
and development. The on-line information business is a leading provider
of information based on the human genome and related biological and medical
information.
Dr. Patterson described the company as being in transition. Previously, it
had been focused on the mapping and sequencing of the human genome. Now the
company is attempting to leverage the technology and analytic tools developed
in the genome project to focus on therapeutic drug discovery and development.
Celera Genomics intends to use its capabilities in the study of genes and proteins
and their relationship to diseases to identify diagnostic markers and to discover
and develop novel therapeutic drug candidates.
The efforts to map the human genome involved the analysis of over three
billion base pairs of data. In support of this effort, the company invested
in sophisticated DNA sequencing and analysis equipment, a super-computer
center, and algorithm development. Celera Genomics has established the
world's largest sequencing facility, powered by 300 DNA analyzers. They
used the first DNA sequencer designed for production scale sequencing.
Celera Genomics was able to sequence and assemble the human genome in
just nine months. The key to Celera Genomics' sequencing speed and productivity
is its whole genome "shotgun" sequencing approach. This technique
involves breaking the entire genome into small, random fragments, then
sequencing them with no foreknowledge of their location on the chromosome.
Proprietary algorithms were used to assemble the fragments into contiguous
blocks and assign them to the correct location in the genome.
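As a rough illustration of the whole-genome shotgun idea (a toy sketch of my own, not Celera Genomics' proprietary algorithms), overlapping random fragments can be greedily merged back into a contiguous sequence by repeatedly joining the pair with the longest suffix-prefix overlap:

```python
def overlap(a: str, b: str) -> int:
    """Length of the longest suffix of a that is also a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(fragments: list[str]) -> str:
    """Greedily merge fragments, largest overlap first, into one sequence."""
    frags = list(fragments)
    while len(frags) > 1:
        # Find the ordered pair of fragments with the largest overlap.
        n, i, j = max((overlap(a, b), i, j)
                      for i, a in enumerate(frags)
                      for j, b in enumerate(frags) if i != j)
        merged = frags[i] + frags[j][n:]
        frags = [f for k, f in enumerate(frags) if k not in (i, j)] + [merged]
    return frags[0]

print(assemble(["TTACGG", "GGATG", "ATGCC"]))  # -> TTACGGATGCC
```

Production assemblers must also cope with sequencing errors, repeated regions, and billions of fragments; this sketch only conveys the overlap-and-merge idea.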
Celera Genomics also has a powerful super-computing facility, featuring
800 interconnected computer systems, each of which is capable of performing
more than 250 billion sequence comparisons per hour. This facility is
being used to assemble DNA fragments into complete genomes, and will
be used for comparative analysis. It also provides linkages to customer
computers on site, and will act as an electronic delivery system for
Celera Genomics' database use over the Internet.
Dr. Patterson explained that Proteomics is the protein equivalent of
the human genome project. By mapping the proteins in the human body,
coupled with knowledge of the human genome, this research could lead
to new advances in disease control.
Proteomics enables high-throughput biology through protein-based analysis
integrated with genomics. Proteomics, the study and understanding of
the role and function of the protein products of genes, is one of the
core programs at Celera Genomics. Its accurately assembled and annotated
human genome, coupled with its bioinformatics capability, allows proteomic
knowledge to be harnessed in a systematic fashion.
The goal of Celera Genomics' proteomic-driven therapeutic discovery
process is to deliver proteins that are differentially expressed
in diseased tissue compared to normal tissue. This work focuses on: (1)
identifying disease candidate markers, (2) identifying therapeutic antibody
targets, (3) identifying cellular immunotherapy candidates, and (4) identifying
differentially expressed, biochemically activated proteins from which
small molecule drugs can be developed.
As part of the company’s experience in large-scale biological
research efforts, they have developed an industrial scale facility capable
of analyzing hundreds of samples per day. Their research strategy involves
discovering disease diagnostic markers and targets for therapeutic intervention.
This approach focuses on a liquid chromatography/mass spectrometry-based
program that allows for simpler automation and higher efficiency than
conventional 2-D gel electrophoresis technology.
Celera Genomics' scientific team has integrated high-throughput mass
spectrometry into this workflow. The company has also developed proprietary sample preparation
procedures employing both high-speed cell sorting (up to 50,000 cells
per second) to enrich for specific types of cells, and proprietary protein
capture methodologies that enable enrichment of specific protein families
(i.e., drug targets). Celera Genomics' bioinformatics group has developed
matching algorithms that tie specific peptide fragmentation patterns
identified by mass spectrometry back to proteins and the complete human
genome.
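That last step can be pictured as a simple lookup: peptide sequences identified by mass spectrometry are matched back to the proteins whose sequences contain them. The code below is a minimal sketch with hypothetical names and a made-up two-entry database, not Celera Genomics' proprietary matching algorithms:

```python
PROTEINS = {  # hypothetical two-entry reference database
    "P1": "MKWVTFISLLFLFSSAYS",
    "P2": "MALWMRLLPLLALLALWGPD",
}

def match_peptides(peptides: list[str]) -> dict[str, list[str]]:
    """Map each identified peptide to the protein IDs containing it."""
    return {pep: [pid for pid, seq in PROTEINS.items() if pep in seq]
            for pep in peptides}

print(match_peptides(["FISLL", "LLALLA"]))  # -> {'FISLL': ['P1'], 'LLALLA': ['P2']}
```

Real matching works on fragmentation spectra and mass tolerances rather than exact substrings, but the mapping from observed peptides back to genome-annotated proteins is the same in spirit.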
After the technical presentation was completed, we toured the company’s
facilities, visiting the proteomics laboratory, gene sequencing facility
and computing center.
S. Fred Singer
President
Science and Environmental Policy Project
(February 27, 2002)
Topic: Global Climate Change
Dr. Fred Singer is an atmospheric physicist, professor emeritus of environmental
sciences at the University of Virginia, and distinguished research professor
at George Mason University. Currently, he serves as the president of
the Fairfax, Virginia-based Science and Environmental Policy Project
(SEPP) -- an organization he founded in 1990. The SEPP is a nonprofit,
policy institute whose mission is to provide a sound scientific base
for environmental policies. Dr. Singer began his presentation by outlining
the three major elements involved in understanding the problem of global
climate change: (1) science, (2) economics, and (3) politics. Throughout
his presentation, Dr. Singer emphasized that global warming is a controversial
subject and he illustrated which points were generally accepted and which
areas were disputed.
Dr. Singer explained that the geological record provided well-accepted
data indicating historical climate conditions. Historical climate data
are collected from a variety of sources, such as ice core samples in
Greenland and Antarctica, ocean sediments, and coral and stalagmite records.
Beginning around 1880, there are also records of surface air temperatures
measured by thermometers. He noted that, on a millennial scale, climate
changes occurred well before there was any significant human intervention.
For example, it is well accepted that there were "ice ages" over
10,000 years ago, interrupted by "inter-glacial" warm periods.
The data indicate that there have been several warming and cooling periods
throughout history, including a Medieval Climate Optimum about 1,000
years ago, followed by a "little ice age" around 1200-1400
A.D., and another warming trend beginning in 1850. Accepted scientific
explanations for these long-term climate variations include changes in
the Earth’s orbit around the sun and axial precession (or wobble)
in the Earth’s orientation, which cause portions of the Earth to
be closer or farther away from the sun.
While climate change on a millennial scale is well-documented, theories
on decadal climate changes produce differences in opinion. Interpretation
of short-term climate fluctuations may lead to different conclusions.
In the 1960s, there was great concern about a possible coming ice age
due to falling temperatures. This was followed by concerns raised by
a sudden unexplained warming in 1975. The climate data throughout the
1980s led to general apprehension about global warming trends due to "greenhouse
effects" produced by human activity. While official NOAA records
show increases in temperature over the last 20 years, Dr. Singer disputes
that this is indicative of an actual global warming trend. Rather, he
argues that the warmest years in recent United States' history were in
the 1940s, and that there has been no perceptible increase in temperature
since. Dr. Singer suggests that the urban heat island effect is contaminating
recent temperature data, producing a false warming trend. Additionally,
data from weather balloons and satellites give a different picture of
environmental conditions, calling into question the global warming conclusion.
Dr. Singer also explained that climate models give widely differing
predictions of the effects of global greenhouse gasses. The differences
among the major climate models, as reported by a United Nations' science
group, are as large as 300 percent in the predicted effects and climate
outcomes. Furthermore, while various reports predict dire impacts of
increased greenhouse gas emissions, the predicted regional results of
these models vary widely.
With respect to the economic assessment of potential climate changes,
Dr. Singer notes that the current United States and United Nations' predictions
are that global warming will have a disastrous economic effect. Various
existing proposals, including the Kyoto Protocol, attempt to mitigate
the economic and environmental effects of potential global warming. The
Kyoto Protocol is an international treaty requiring member nations to
reduce their greenhouse gas emissions, which are widely regarded as a
leading cause of global warming. According to Dr. Singer, the cost of
implementing the Kyoto Protocol, requiring a seven percent reduction
in greenhouse gasses from the 1990 level, would be upwards of two to
four percent of the United States' Gross National Product. Dr. Singer
asserts that, according to the British climate model, the effects of
such a small reduction would be negligible, resulting in a decrease of
only 1/20th of a degree by 2050. In order to stabilize the effects of
greenhouse gasses, Dr. Singer asserts that emissions must be reduced
by approximately 60 to 80 percent from the 1990 base level. In 1997,
the Senate rejected the Kyoto Protocol.
According to Dr. Singer, the most important factor affecting global
climate change policy is the political aspect. The United States' environmental
policy will be driven largely by the political concerns of the current
Administration. For example, the Clinton Administration proposed policies
favoring mandatory emissions reductions, asserting that the science indicating
global warming was certain and that such measures would be effective.
In contrast, the present Bush Administration has referred to the science
as "uncertain," and has rejected mandatory emissions on the
grounds that they will be economically harmful to the United States.
The present Administration views the Kyoto Protocol as fatally flawed,
partially because not all countries are signatories to the treaty.
Dr. Singer proposes that the best way to reduce emissions and to improve
the environment is not through mandatory treaties, but rather, by offering
incentives to do the environmentally sound thing. For example, energy
efficiency should be encouraged by imposition of a fuel tax, rather than
congressionally mandated emissions standards taking decades to implement.
Other ideas are to provide incentives for the development of hybrid electric
cars and alternative fuel cells. Dr. Singer further predicts that the
issue of global warming will fade in coming years, similar to the now
forgotten fears of nuclear winter and the predicted ice age of the 1940s.
Susan E. Cozzens
Chair and Professor
School of Public Policy
Georgia Institute of Technology
(March 6, 2002)
Topic: Science and Technology Policy Professionals:
Jobs, Work, Knowledge, and Values
Dr. Susan Cozzens, Chair and Professor of the School of Public Policy
at the Georgia Institute of Technology, engaged us with a presentation,
which echoed the keynote address she delivered at the American Association
for the Advancement of Science Workshop on Science and Technology Policy
Careers in May 2001. She emphasized three topics: (1) what kinds of career
options are available for a science and technology (S&T) policy professional,
(2) what one needs to know to be an S&T policy professional, and
(3) the benefits of a career in science and technology policy.
Dr. Cozzens distinguished between the "amateurs" of science
policy, those who are often leaders in their field but whose "day
jobs" are in science rather than the policy of science, and the
S&T policy professionals, who are steeped in the knowledge base
of the field, trained in the techniques of analysis, and carry a special
public responsibility that is different from that of the amateurs.
They may find themselves in academia, in government, or both.
To give a flavor of the type of work an S&T policy professional
may find, Dr. Cozzens described the careers of several people in the
field. First, these people demonstrated the range of disciplinary backgrounds
brought into this field, from the sciences and engineering through
the social sciences and humanities. Second, they show the movement among
positions, in different sectors, including an in-and-out-of-government
pattern that places them in many different kinds of jobs across government.
Third, they illustrate the wide range of jobs outside of government that
this field opens in private industry, nonprofit organizations, and universities.
Dr. Cozzens explained that the job of the S&T policy professional
involves a combination of administration and management, analysis, and
research. In their administrative and management roles, S&T policy
professionals participate in the decision processes of various research-related
organizations. They use their knowledge, skills, and concepts to shape
the world. For example, when Dr. Cozzens was at the National Science
Foundation in the 1990s, she was part of the senior leadership team,
participating in strategic planning and program development, which was
combined with designing a planning and assessment system for the foundation
and exploring options in the peer review process. The budget examiner
job is different -- quiet, behind-the-scenes, yet highly influential
through the quality of information gathering and advice to budget decision-makers.
The second job, analysis, brings systematic information to bear on policy
decisions. Because analysts are generally asked to do this on a short
turnaround basis, they must work largely with existing data and research.
A key skill is being able to take the fuzzy, ill-formulated question
in the mind of an amateur or a decision-maker and translate that into
a problem that can be addressed with data. The outputs of the analysis
process may be written or presented orally and visually. A policy analyst
must therefore be able to write and speak simply and be able to present
data. Dr. Cozzens stressed the importance of analysts presenting data
with the conceptual framework and vision of those they work for. If you
work for someone who shares your vision, you may be able to change the
conceptual framework. The S&T policy world has been moved by the
combination of talents such as this.
Policy research builds the knowledge base that S&T policy analysis
uses in its short-term work. Policy research develops concepts and methods,
and is necessary for the analytical tasks that require long-term empirical
studies. Thus, the lifestyle of the policy researcher differs from that
of the policy analyst. The projects are longer-term and have a larger
scale, and the knowledge and academic base is more cumulative, requiring
footnotes into a literature. The projects also require funding, whether
from the United States or governments such as those in Europe, Japan,
and South America, which have been investing heavily in the human resources
and knowledge of S&T policy over the last two decades.
Dr. Cozzens shared a number of resources that should be part of the
knowledge base of an S&T policy professional. These included a few
good overviews, histories of science policy and of the research university,
a set of books that represent new thinking and critical perspectives,
readings in innovation theory, writings on science for policy (i.e.,
the use of expertise in government decision-making in other areas), and
new growth theory. The list could be expanded with readings in high-technology
development strategy and technology assessment and careers and human
resource development for S&T.
In addition to this knowledge base, an S&T policy professional needs
the skills of the analyst -- setting up a problem for analysis; evaluating
and presenting data; and clear, simple communication, in written and
oral form. The difference between a good policy professional and a great
one, she added, is a habit of mind to keep the big picture in view, to
watch trends, and to develop a sense of underlying dynamics in the enterprise.
Dr. Cozzens closed with her thoughts about what the field of S&T
policy stands for. She said that S&T policy professionals believe
in the transformative power of science and technology. They believe that
S&T can make life better for all of humankind. But, she stressed
that having a little "gee whiz" in us does not relieve us of
the need to be critical of the S&T enterprise. It is essential that
S&T policy professionals maintain professional standards in data
analysis, remain true to the data, and are able to stand outside the
enterprise and give it a hard look.
Dr. Cozzens posed the question: "What barriers are there to the
goal of S&T policy that keep it from living up to its full potential
for making life better for all humankind?" She suggested that the
outcome of this work is shaped by the struggle among various interests,
played out through public and private decision-making processes. These
interests include those of private firms, special interest groups, elected
representatives, researchers and research organizations, and the people
representing the interests of the world's poor. All S&T policy professionals
work for an organization that represents at least one of these interests,
and it is their job to represent that organization's interests. At the
same time, S&T policy professionals have a special responsibility
to the public to make sure that S&T will produce as much benefit
as possible for all of humankind through the actions and activities of
their organization.
Dr. Cozzens recommended that those persons interested in a career in
S&T policy study the mission statement and values of any organization
they consider working for to see whether it matches the objectives they
want to accomplish in their own lives. She concluded, "The joy of
this work -- the pleasure of this very rich and fascinating professional
community -- is that the combination of these values produces a great
diversity of specific views that can be carried out in a wide variety
of jobs."
Charles E. Yocum
Senior Group Patent Counsel
Black & Decker Corporation
(March 13, 2002)
Topic: Intellectual Property Issues
Mr. Charles Yocum has a Bachelor's degree in Electrical Engineering from
Georgia Tech, a Master of Science in Administration from George Washington
University, and a Juris Doctor from the University of Baltimore. He is
an adjunct professor of law at the University of Baltimore and a faculty
member for Maryland's and Virginia's continuing legal education (CLE)
programs. Mr. Yocum is also a guest lecturer to the United States Patent
and Trademark Office Examining Corps.
The presentation was divided into two parts: (1) intellectual property,
and (2) the DeWalt story. He indicated that his company is at the low
end of high tech and deals in high volume as a way of doing business,
making patent rights and protections very important to them. He also
indicated that Black & Decker is the only major United States' power
tool company currently in operation. He elaborated on his point about
patent rights with a product marketed in 1994 called the "Snake
Light." This single product turned things around for Black & Decker,
selling over 12 million units and generating over $300 million in sales
in the first two years. It received sundry awards from various trade
magazines. By the spring of 1995, Black & Decker had filed 25 lawsuits
against 55 defendants all over the world to protect the product under
intellectual property law; most of the cases were settled out of court.
Mr. Yocum indicated that oftentimes offshore "knockoffs" would
copy the product and claim they had enhanced a feature, making it a
different product not subject to the patent restrictions. He explained
the different types of intellectual property protection: utility patents;
design patents; trademarks; trade dress; copyright; and unfair competition
law. He indicated that the more specific a patent was, the more difficult
it would be to copy the product. Yet, the way to approach it was to improve
on the product. Ninety-nine percent of patents are on the most specific
aspect of a product; therefore, the improvement needs to occur within
a very specific area.
Patent protections afford Black & Decker the opportunity to invest
significantly more money in research and development (R&D)
than otherwise would be spent. This activity is critical for businesses
to prepare for future growth. The major costs of a new product are from
personnel and R&D. Patents are filed in a specific country in accordance
with the Paris Convention, under which an applicant who files in one
member country has one year to file corresponding applications in the
other member countries.
A patent cannot be enforced while it is pending; therefore, no recovery
is possible. Most often, suits deal with products that are specifically
designed around another patent without infringing on it. Generally, lawyers
take these suits on a contingency basis. If a court finds in favor of
the plaintiff, it will issue a cease and desist order. If the plaintiff
prevails in a patent infringement case, they may also be awarded the
defendant’s profits for the product.
The DeWalt story: In 1990, Black & Decker had eight percent of the professional
power tool market. In 1992, they spent $50 million to introduce the "Yellow/Black" DeWalt
power tools to professional users. In 1992, they had 20 products, and by 1998,
they had 250 products. They focused their marketing efforts toward the end
users both where they worked and played. The company had 250 teams going from
job site to job site promoting the products. Black & Decker (DeWalt) sponsors
Matt Kenseth’s No. 17 NASCAR racecar. Recently, they had a trademark
for the colors yellow and black upheld. Part of the argument was the result
of a survey, which indicated that 85 percent of professional users identified
those colors with DeWalt products. DeWalt sales grew from $36 million
in 1991, to $120 million in 1992, to $1.2 billion in 1998. DeWalt currently
has over 50 percent of the United States' market for professional power tools.
Black & Decker attributes a large part of their success to their ability
to patent their products. This in turn enables them to invest large sums of
money into R&D to produce better products in the future.
Norman E. Lorentz
Chief Technology Officer
Office of Management and Budget (OMB)
(March 27, 2002)
TOPIC: Technology and E-Business in the Federal Government
Mr. Norman Lorentz recently joined the Office of Management and Budget
(OMB) as the Chief Technology Officer. He has focused his efforts on
implementing E-Government initiatives and on planning and creating governmentwide
architecture and deploying the needed technologies.
The General Services Administration (GSA) established the FirstGov Web
portal in an effort to streamline the Federal Government's provision
of services across the various agencies. To improve this process, the President’s
Management Council (PMC) established the E-Government Task Force, headed
by OMB -- the agency responsible for defining E-Government initiatives
including formulating and promulgating policies and procedures to better
serve the people.
Mr. Lorentz outlined and discussed four main E-Government initiatives:
(1) Government-to-Citizen (G2C), (2) Government-to-Business (G2B), (3)
Government-to-Government (G2G), and (4) Internal Efficiency and Effectiveness
(IEE).
Twenty-four E-Government initiatives were selected from a priority list
of over 350 identified by a task force. The task force selected those
initiatives with the most promise and a high likelihood of deployment
within 18 to 24 months. The intent of these initiatives was to reduce
some of the redundancies and overlaps that currently plague the FirstGov
website. Mr. Lorentz also discussed the use of performance measures to
track the progress of E-Government initiatives across the agencies.
Mr. Lorentz supports the use of the Quicksilver processes employed by the
task force -- a management tool applied across agencies. An example is
employment and hiring processes, which, Mr. Lorentz argued, could be made
much easier by implementing these initiatives. The E-Government initiative
is geared towards improving the lives of citizens and federal employees.
The OMB is responsible for implementing these 24 initiatives by working
closely with E-Government leaders and managing partners in other agencies.
The terrorist attacks of September 11th tremendously impacted the implementation
of these initiatives because homeland security is now a high priority.
Information Technology is a key tool in counterterrorism efforts because
it will improve communication for homeland security. The need to improve
performance in the government is paramount in the President’s agenda,
and employing wireless technology, or Internet-related technologies will
accelerate and streamline government delivery of services to citizens.
Levin Lauritson
Office of Satellite Data Processing and Distribution
National Environmental Satellite Data and Information Service
National Oceanic and Atmospheric Administration (NOAA)
(March 27, 2002)
Topic: Use of Earth Observation Satellites for Hazard Support
The mission of the National Environmental Satellite Data and Information
Service (NESDIS) is to provide timely global environmental data to promote,
protect, and enhance the Nation’s economy, security, environment,
and quality of life. To accomplish this substantial mission, NESDIS builds
and operates environmental satellites and provides data, information
services, and analytical research. Space-based remote sensing and observations
make possible improved weather, climate, and disaster assessments and
predictions. The two primary satellite types used by NESDIS are the
Geostationary Operational Environmental Satellite (GOES) and the Polar-orbiting
Operational Environmental Satellite (POES). NESDIS manages satellite
data at three centers: (1) the National Climate Data Center, (2) the
National Oceanographic Data Center, and (3) the National Geophysical
Data Center. These centers all conduct data processing and distribution
as well as applied research. Raw data is made available to government,
university, and public users without charge.
GOES provides continuous weather imaging and data to feed the models
for forecasting, severe weather warnings, and for the monitoring of global
surface conditions. POES, combined with the Defense Meteorological Satellite
Program, conducts visible and infrared cloud cover observations, performs
aerosol and snow coverage assessments, constructs sea surface and ice
coverage maps, and collects data important to land observations such
as vegetation coverage and fire observations. NOAA NESDIS leverages the
information from many other satellites in orbit to provide near real-time
observations of the earth.
One of the most well-known uses of satellite information is hurricane
forecasting. Accurate and timely forecasting can save the Nation millions
of dollars by allowing for evacuation and property protective measures
in the hurricane’s destructive path and avoidance of unnecessary
evacuations when the path shifts. Improved prediction capabilities have
made significant societal contributions, reducing the cost of damages
and loss of life from severe storms, hurricanes, tornadoes, and volcanic
ash distributions. Other uses of satellite data have included tracking
the mammoth icebergs in polar oceans and monitoring the Great Lakes'
ice coverage. These activities contribute to safe navigation and commerce.
GOES is also an integral part of the U.S. Coast Guard’s search
and rescue emergency location system and provides critical information
to their environmental hazard management duties such as oil spill response.
Satellites have a useful life of about four years and need to be flown
accurately to maintain proper orbit and orientation. A series production
program is in place, which continually requires planning for new instrumentation
development, deployment, and satellite launch needs. Over a dozen agencies
cooperate in the management of satellite operation and information gathering
with the common goal of creating a sustainable society resilient to natural
hazards.
Lynne A. Osmus
Deputy Associate Administrator for Civil Aviation Security
Federal Aviation Administration
(April 3, 2002)
TOPIC: Airline Security
In response to the tragedy of September 11th, legislation was passed
in November 2001 establishing a new security organization, the Transportation
Security Administration (TSA), within the Department of Transportation.
This organization is currently being developed, with major portions of
TSA being transferred from the Federal Aviation Administration (FAA).
Among the FAA functions being transferred to the TSA are security research
activities. Ms. Lynne Osmus has primary responsibility for orchestrating
the development of the TSA. She has a major task ahead of her as the
legislation dictates that this must be completed by November 2002.
Prior to the legislation, approximately 1,200 FAA employees were involved
in airport security. The number of airport screening personnel in the
TSA will be greater than 22,000. These employees, coupled with 1,300
screener supervisors, additional federal law enforcement officers and
other support staff could result in an approximate total of 50,000 TSA
employees.
The TSA will be taking over all airport security screening operations.
Airport security personnel will be federal employees. This will provide
the government with more flexibility and control of the screening operation.
The screeners will be better trained and also better compensated. Security
screening had been an airline responsibility, with the work generally
performed by contract personnel reporting to the airlines. As an initial
step following September 11th, the security contracts for the screeners
were transferred to the government. A significant effort is now underway
to hire permanent security screeners. The hope is that many of the contract
employees can be hired; however, citizenship concerns may disqualify
some current personnel. The TSA is also currently involved in hiring
Federal Security Directors for all the Nation’s major commercial
airports. Additionally, the TSA trained over 10,000 National Guard troops
to provide airport security.
If airport security operations were all that needed to be done, the
TSA responsibilities would be daunting enough. However, the TSA is also
responsible for deploying additional screening technology. Massive numbers
of additional screening systems have to be deployed at the Nation’s
airports. Research and development of enhanced screening equipment is
also necessary. After September 11th, a Broad Agency Announcement was
released soliciting improved inspection technologies. This resulted in
an overwhelming number of ideas and concepts being proposed. The TSA
has the additional responsibility of evaluating these proposals.
Ms. Osmus discussed some of the new technologies becoming available,
including biometrics and trace detection technology. The concepts of
a "trusted traveler" and the use of more intrusive inspection
technologies that may be used to focus security screening efficiently
were also explored. The requirements for explosive detection system resolution
capability, alarm resolution technology, the probability of detection
and probability of false alarm associated with the technology, and their
impact on the efficiency of baggage processing were addressed. Technical
discussions are being held within TSA on the best technological approach
to meet the congressional mandates.
Other issues being addressed by the TSA include remodeling airport security
checkpoints to increase both security and efficiency. In our discussion
with Ms. Osmus, questions concerning the recent security incidents that
resulted in the clearing of airport concourses and the re-screening of
travelers were raised. The impact of the new security requirements on
general aviation and business aviation was also discussed.
Other questions raised during our discussion involved progress toward
the hardening of aircraft cockpit doors, arming pilots, the air marshal
program, and pilot training in aircraft maneuvers to disrupt hijackers.
Additional questions involved the use of intelligence information and
coordination between the FAA and the military.
We were left with the overall impression that the TSA has many demanding
tasks in front of it. The challenges put before it by Congress are difficult;
however, meeting them is of vital importance.
27th Annual American Association for the Advancement of Science
(AAAS) Colloquium on Science and Technology (S&T) Policy
(April 11 and 12, 2002)
The theme for this year's AAAS Colloquium was "Science and Technology
in a Vulnerable World: Rethinking Our Roles."
The Colloquium began with welcoming remarks by Dr. Alan I. Leshner,
AAAS Chief Executive Officer. In his opening remarks, he commented that
science and technology are embedded in all aspects of life, that scientific
progress was being made at an outstanding pace, and that the Colloquium
would address such timely topics as terrorism, cloning and the economy.
Dr. John H. Marburger, Director of the White House Office of Science
and Technology Policy, gave the keynote address. His speech began with
a description of the current activities of the Office of Science and
Technology Policy, including its role in supporting Homeland Security.
He described the initial government response to the technical needs of
the "war on terrorism." He pointed out that institutions that
produce science and technology are not only resources, but also potential
targets of terrorism. He stressed that he felt the biggest challenges
in the "war on terrorism" were in the implementation of technology.
With the exception of bioterrorism, the terrorist threat does not demand
advances in basic research. He stressed the need to establish the Nation’s
scientific policy on the basis that new scientific discoveries are needed
on a broad front, not specifically targeted to a societal need. We cannot
predict which scientific research will significantly impact our economy
in the future. He also addressed the topics of balance of funding for
basic research, the President’s agenda for managing science, and
the importance of the social sciences. He concluded his talk by discussing
work force issues, particularly the need to increase the number of American
students pursuing advanced degrees in science and technology.
The next item on the agenda was a plenary session, addressing "Budgetary
and Policy Context for Research and Development (R&D) in FY 2003." The
first address during this session was provided by Mr. Scott Lilly, Minority
Staff Director of the House Committee on Appropriations, who painted
a picture of increasing competition for increasingly limited resources
given the rise in spending on security. He stressed the significant contribution
that investment in science and technology made to economic growth in
the 1990s, and the concerns over the need to develop a more educated
work force.
Mr. Lilly’s presentation was followed by a speech given by Dr.
Kei Koizumi, Director of the AAAS R&D Budget and Policy Program.
He provided an overview of the federal budget proposals for R&D in
FY 2003 and indicated that, given the projected budget deficit and the
demands of homeland security, the non-Department of Defense R&D budget
would see minimal cuts. He noted that counter-terrorism R&D had nearly
tripled. Increases to the NIH budget would be funded despite the fact
that the NIH budget now almost equals the total of all other non-defense
R&D.
The next speaker was Dr. G. Wayne Clough, President of the Georgia Institute
of Technology. He stressed that effective scientific research requires
stable funding, a balanced portfolio and an expanding talent pool. He
discussed the impact of the Bayh-Dole Act and the decline in the number
of American students in advanced degree programs in science and engineering.
The plenary session ended with an address by Ms. Deborah L. Wince-Smith,
President of the Council on Competitiveness. She stressed the need to
keep the economy moving in our new environment. She stated that the credit
for recent productivity gains should be given to the previous investments
the country made in science and technology. She also mentioned the shortage
of technical talent. She stressed that government had the responsibility
for funding high-risk, crosscutting R&D activities.
Other events on the first day of the Colloquium included a luncheon
address by Mr. Benjamin H. Wu, Deputy Under Secretary for Technology,
U.S. Department of Commerce, and a series of concurrent sessions addressing
the following topics:
- Technical Challenges to Governance;
- The Regulatory Environment for Science: Conflict-of-Interest Issues; and
- Rethinking the U.S. S&T Policy System: Greater Responsiveness, Continuing Excellence.
Following the concurrent sessions, policy roundtable discussions with
agency officials from the Department of Defense, the National Institutes
of Health, the National Science Foundation, and the Department of Energy
were held.
The first day concluded with an address by Dr. M.R.C. Greenwood, Chancellor,
University of California, Santa Cruz, on "Risky Business: Research
Universities in the Post 9/11 Era." Her presentation included discussions
of the following concerns of the research community:
Risk 1 -- Proposed limitations on researchers' access to data and methodologies.
Balancing the perceived risks of open access with the risks to the health
and vitality of the research community is exactly the kind of issue that
calls for a new partnership between the research community and the government.
The traditions and structure of research in the United States today
depend on replication and refutation, and that means sufficient data
and information on methods used must be published in peer-reviewed journals.
Risk 2 -- Proposed allocation of federal tax dollars in the Administration's
FY 2003 budget. In particular, the changing allocation of R&D with
an increased emphasis on "missiles and medicine."
Other disciplines such as engineering and the physical sciences now
account for far smaller shares of total academic R&D than previously,
and are now at only 15 and 9 percent, respectively, of the total university
R&D portfolio.
This kind of imbalance may mean that we might not be training the right
mix of scientists and engineers and other scholars that we will need
to work for national security in the next generation. This is not to
say that we need less funding for health R&D; rather, we need more
non-defense R&D in many other disciplines. Perhaps it is even time
for the science and technology community to help out its other colleagues
and call for increases in federal funding for specific areas in the humanities.
Risk 3 -- The move to increase the tracking of foreign students in universities.
This risk relates to the role of foreign students in United States' universities.
Our government is concerned that potential terrorists may abuse our student
visa process as a mechanism of entering our country. As a consequence,
Homeland Security Presidential Directive 2 states that, "The Government shall
implement measures to end the abuse of student visas and prohibit certain
international students from receiving education and training in sensitive
areas, including areas of study with direct application to the development
and use of weapons of mass destruction."
If the United States decides to restrict access to foreign students,
we would have to develop new policies to prevent a net loss of science
and engineering personnel at all levels in the next generation.
Dr. Greenwood proposed a contemporary version of the National Defense
Education Act (NDEA). The NDEA was a direct result of an increase in
the perceived risk to national security by the launch of Sputnik. NDEA
marked a change in national science policy in response to national security
concerns, and it increased support for large numbers of students who
became scientists and engineers from the late 1950s throughout the 1970s.
All sectors of our society -- ranging from science and technology, security
and intelligence, defense, foreign relations, and economic development
-- need well-educated students in these areas for the future.
This is another important area that should be discussed within the framework
of a new partnership between government and universities.
Friday morning’s session of the 27th Annual AAAS Colloquium was
entitled, "Science and Technology’s Roles in the War on Terrorism
and Homeland Defense." This session began with a discussion by Dr.
Lewis M. Branscomb, Professor Emeritus, John F. Kennedy School of Government
of Harvard University entitled, "The Changing Relationship Between
Science and Government, Post-9-11." Dr. Branscomb discussed types
of terrorists, sources of vulnerability, and the general asymmetric threat.
He identified five terrorist threats: (1) bioterrorism, (2) nuclear radiation,
(3) infrastructure, (4) cities, and (5) borders. Also, he discussed the
S&T tools needed to counter these threats, including sensors, data,
vaccines, and biometrics (identification techniques). The central problem,
according to Dr. Branscomb, is that our government’s science agencies
are stove-piped, with very little cross coordination among the agencies,
leading to competition for limited resources.
Next, Dr. Donald A. Henderson, Director of the Office of Public Health
Preparedness, Department of Health and Human Services, discussed "Public
Health Preparedness," indicating that we are facing a new era in
biology and biological threats. Today, terrorists desire biological weapons
because they are easy to obtain and deploy, and most local "first
responders" (i.e., police, rescue, hospitals) are simply not prepared
to deal with this threat. He further stated that our Nation’s water
supply is probably at a low risk of biological attack. Today, the health
community has a network of over 80 laboratories dealing with biological
research.
The third speaker was Dr. Eugene H. Spafford, Professor of Computer
Science from Purdue University, who led a discussion about "Protecting
Our National Information Structure." There is approximately one
new computer virus every 75 minutes, and by the end of the decade this
number is expected to rise to one new virus every 20 minutes. Currently, the
information technology field is woefully understaffed; thus, there is
a large demand for qualified personnel. Part of the underlying problem in this area
is that policy decisions are being made by people lacking the technical
expertise to understand the problems, combined with large corporations
that are pushing their own special interests into the legislative process.
Dr. Eugene B. Skolnikoff, Professor of Political Science, Emeritus,
from the Massachusetts Institute of Technology gave the final talk of
the morning on "Research Universities and National Security: Can
Traditional Values Survive?" He identified four points surrounding
these traditional values as: (1) a commitment to openness and information
exchange, (2) data classification issues, (3) open relationship between
the government and universities, and (4) foreign students. He summarized
his talk by indicating that it is the policy decisions in Washington,
DC, which will ultimately determine the fate of these traditional values.
Prior to introducing the speakers for the afternoon's session, Dr. Albert
H. Teich, Director of the AAAS Science and Technology Policy Programs,
asked the question "Should some research be outlawed if the research
results would be destructive to society?" He acknowledged some of
the challenges to answering this question. He argued that we must balance
ethics with potential benefit and that we must distinguish legitimate
science from pseudo-science (i.e., science with a political agenda).
Dr. Teich went on to introduce this session’s speakers. Unfortunately,
one speaker, Dr. Howard Taylor, was unable to attend the session. He
was originally scheduled to speak on the topic of race and IQ.
Consequently, the first afternoon speaker was Dr. Ronald M. Green, The
Eunice and Julian Cohen Professor for the Study of Ethics and Human Values,
who spoke on the topic of human reproductive cloning. Human reproductive
cloning is a subject of great interest to many people. This has led to
congressional debates on whether to impose criminal punishment for this
activity. It has also led to rumors such as unsubstantiated reports of
human clone pregnancies. Dr. Green presented some "pros and cons" on
the subject. The arguments for banning human cloning research are that:
-
It threatens grave and catastrophic risks to society. For example,
a tyrant may use cloning for creating humans for a specific purpose,
such as for slavery or military combat. Dr. Green rejects this argument
because of the time lag that would be required for such a race of
humans to reach an age that would allow them to fill such a role.
-
It threatens grave psychological risks to the cloned offspring.
Can the children live up to preconceived standards? Dr. Green rejects
this argument because it is based on the erroneous reliance on genome
determination. Genes do not make the organism! Dr. Green points out
that identical twins are different even though they share the same
genome. Thus, cloned humans will have personalities and aptitudes
different from the "parent."
-
It threatens physiological risks to cloned individuals. Evidence
for this argument includes the fact that Dolly, the cloned sheep,
has suffered from defects during development and that cloned mice
tend to have an obese phenotype even if the "parent" does
not. In contrast, some researchers claim that cloned cattle are generally
healthy and normal. Dr. Green accepts the argument that cloned humans
could suffer physiological damage.
The arguments against a ban on human cloning research include:
-
Some people, such as infertile couples lacking gametes or lesbian
couples, may benefit. Dr. Green accepts this argument.
-
A cloned child would not be harmed as long as the life is worth
living. This position argues that even if a child is disabled as
a result of the cloning, the worthiness of the child’s life
outweighs the negatives. Dr. Green disagrees with this argument because
he feels that just being born is not itself a benefit; therefore, birth
defects cause harm that is not outweighed by merely being born.
-
The ban would be ineffective because it could be evaded. Dr. Green
rejects this argument because it assumes a 100 percent ban on all
cloning and related research is possible.
-
The ban would create a research precedent that would impede other
important related research. Again, Dr. Green rejects this argument
because this assumes that the ban would apply to all related research.
At the end of this exercise, Dr. Green is left with two conflicting
arguments he feels are valid, the first is the argument for a cloning
research ban due to physiological risks and the second is against a ban
due to the benefit to infertile or lesbian couples. Dr. Green agrees
with the June 1997 recommendations of the National Bioethics Advisory Commission
that called for a ban, but with the provision that Congress should review
the issue after a set period of time. Additionally, Senators Edward M.
Kennedy (D-MA) and Dianne Feinstein (D-CA) have called for a ten-year
moratorium on human cloning research. Dr. Green agrees with the limited
ban on human cloning research. He feels that more time and research on
animals are needed before the procedure should be applied to humans.
He also feels that we must continue stem cell research where cells, but
not a viable human, are cloned because of the high potential for benefits,
such as treatments for Parkinson’s disease.
Dr. Green pointed out that some organizations have attempted to use
this issue to further other political agendas. For example, the New Hampshire
Act (H.B. 1464) prohibiting human cloning in that state has a line stating, "life
begins at conception." "Pro-life" proponents commonly
use that statement in their efforts to have abortions banned.
Following Dr. Green, we heard from Dr. David A. Kay, Director of the
Center for Counterterrorism Technology and Analysis, and Vice President
of the Science Applications International Corporation. Dr. Kay’s
topic was genetically engineered bio-weapons. He started his discussion
by contrasting biological weapons R&D with nuclear weapons R&D.
He listed the nine essential elements that served as control mechanisms
for the nuclear weapons R&D program. These were: (1) the weaponization
was secret, (2) the R&D was federally-funded and controlled, (3)
the information was secure, (4) the R&D was done at government facilities,
(5) there was an overwhelming threat, (6) there was global conflict (i.e.,
United States versus the U.S.S.R.), (7) the R&D was large and expensive,
(8) it had limited non-military benefits, and (9) the R&D took place
within an "old science" paradigm that called for a leisurely
pace. However, by the 1950s, nuclear research knowledge, with the exception
of engineering, was in the open literature. By the 1960s, all essential
knowledge on thermo-nuclear research was in the open literature.
In the biological weapons research arena, there is a lack of a defined
threat and a lack of agreement on the seriousness and likelihood of the
threat. In contrast to the nuclear weapons R&D, the biological weapons
research is following a new paradigm. The science has become more international
and is being conducted at a much faster pace. The research is commercially
and scientifically dominated instead of having a primarily military focus.
As such, the biological weapons research offers significant non-military
benefits, especially in the areas of genetic engineering. Dr. Kay asked
the question: "What should we use as control mechanisms for biological
weapons R&D?" He suggests that we must utilize international
controls and track the research being done. We must control the dissemination
of the research results allowing only limited use. Dr. Kay thought that
we should follow international lines of research, but did not think this
would be likely because of political and financial hurdles. He feels
that the international community should criminalize certain research,
but acknowledged that it would be hard to get consensus on defining the
crime. He feels that we should use tools of attribution and detection
(i.e., forensic science) to track the origin of biological agents. However,
he is pessimistic that these ideas will ever be implemented because he
feels that the international community will not do what is necessary
to successfully control biological weapons R&D. The lack of agreement
on the threat and conflicting political motivations are the primary obstacles
to developing adequate control mechanisms.
Dr. Daniel Kevles, Stanley Woodward Professor of History at Yale University,
also participated in the session. He sought to bring the issues discussed
into a larger context. He explained that Western attitudes towards biological
and physical innovations have always been tested. Historically, even
the Copernican and Darwinian revolutions were met with apprehension.
We tend to greet biological innovations with fear, but eventually they
become commonplace. For example, the first experiments on artificial
insemination were denounced as leading to an assembly line for humans.
Today, artificial insemination is a widely accepted means of reproduction.
With new innovations, the issues of ethics, consequences and economics
inevitably arise. Cloning challenges our image of conception. As we pursue
this issue, we need to consider how parents will treat their cloned children.
However, we have no experience in this arena. The fear of physiological
problems developing is real but we need to not discount therapeutic cloning
because of the potential benefits. We also need to consider the legal
aspects of cloning. Dr. Kevles feels that a ban on all cloning will drive
reproductive cloning underground where there will be no control mechanism.
Currently, religious motivations have helped restrict reproductive cloning
and embryonic research and have served as a control mechanism.
On the issue of biological weapons, Dr. Kevles agrees with Dr. Kay.
He feels that we need to recognize that nations and terrorists possess
biological weapons. The threat of retaliation will deter some nations
from utilizing their weapons. This threat would probably not work on
terrorists or rogue states. Dr. Kevles feels we need a code of research
ethics condemning biological weapons research. Unfortunately, this code
would be ineffective because: (1) biological agents are easy to buy,
(2) they are a popular combat option in some parts of the world, and
(3) there is plenty of money available to purchase these weapons.
John T. Everett
Manager, United Nations Atlas of the Oceans; and
National Oceanic and Atmospheric Administration (NOAA) Fisheries Office of
Science and Technology
Chief, Division of Research (Retired)
(April 18, 2002)
TOPIC: Oceans and Climate Change: What We Think We Know
Dr. John Everett comes from a commercial fishing family and has been
a commercial fisherman in Massachusetts. He worked for over 31 years
in responsible positions including Senate Staff, Staff to the NOAA Administrator,
Manager of Tuna and Dolphin research, Chief of Fisheries development,
and most recently as Chief of the Research Division of the National Marine
Fisheries Service. He has chaired several impact analyses by the Intergovernmental
Panel on Climate Change (IPCC). The IPCC is organized to provide authoritative
statements and scientific opinion on climate change. Several hundred
scientists provide broad perspective and peer review. The culprit greenhouse
gases -- CO2, methane, and N2O -- are undoubtedly increasing. With these changes
will come changes in temperature, ocean and air circulation, sea level,
ice coverage, and weather. Ten years ago, there was some doubt about global
warming, but now there is broad consensus that it is occurring. Is it
all bad? Dr. Everett explained the magnitudes of change expected. Temperature,
for example, is uneven, but generally rising. The ocean deepwater circulation
system (conveyor belt), which drives the ocean temperature in many areas,
as well as many nutrient and salinity levels, could be affected by global
warming. This will bring about bad news for some, but opportunities for
others.
The postulated changes in sea level, ice cover, and the frequency and
intensity of storms seem devastating to coastal regions. However,
Dr. Everett points out that tree and plant species may benefit from warmer
climes and higher levels of CO2 and that other species may thrive. Increases
in temperature could mean lower heating costs for people living in regions
closer to the poles. Shifting populations toward more hospitable climates
may bring about some savings not currently considered in the equation
when the disasters are predicted. Some species can move and occupy new
global niches. Others may die out. The increased ocean temperatures currently
threaten large areas of coral reefs. However, mixes and symbiotic relationships
will re-establish when the change stabilizes.
Global warming should benefit fresh water fisheries and aquaculture
at higher latitudes. Fishery locations in the oceans should be able to
produce just as much, but they will move and the species mix will change.
Fishermen and their countries will suffer the nuisance of political and
distance boundaries that are not consistent with current practices. Subsistence
and small-scale fishermen worldwide will suffer the most, and important
food supplies for local regions may become more difficult to raise or
catch.
The important findings of the IPCC study conclude that there could be
harsh consequences because of the already existing stresses of overfishing,
loss of wetlands, and pollution. The instability in world fisheries will
be exacerbated by a changing climate. Globally, economic and food supply
impacts should be small. Nationally, they could be large. Dr. Everett
believes that overfishing is more important and critical than climate
change today. That relationship should reverse in 50 to 100 years. His
take away message was to look at the data with an open mind. Try to put
all the observed and forecast costs and benefits into the model predicting
change. Be mindful of the objectives of the organizations conducting
research and studies, as they may have a natural intended or unintended
bias.
Donald Scavia
Chief Scientist
National Oceanic and Atmospheric Administration (NOAA)
U.S. Department of Commerce
(May 1, 2002)
TOPIC: Action Plan for Reducing, Mitigating, and Controlling Hypoxia
in the Northern Gulf of Mexico
Dr. Donald Scavia described NOAA’s vision, structure, and activities.
He explained how the northern Gulf of Mexico Hypoxia study fits into
NOAA’s programs and presented the results of the Mississippi/Gulf
of Mexico Watershed Nutrient Task Force, for which the White House Office
of Science and Technology Policy took the lead.
Scientific investigations have documented a zone on the Gulf of Mexico’s
Texas-Louisiana Shelf with seasonally low oxygen levels (<2 mg/l)
called the "hypoxic zone" or "dead zone." Between
1993 and 1999, the zone of mid-summer bottom water hypoxia in the northern
Gulf of Mexico was estimated to be larger than 10,000 km2, with a
five-year (1996 to 2000) average of 14,128 km2. The hypoxic zone results
from complex interactions involving excessive nutrients, primarily nitrogen,
carried to the Gulf by the Mississippi and Atchafalaya Rivers; changes
in the basin; and the stratification in the waters of the northern Gulf
due to the interaction of fresh river water and the saltwater from the
Gulf. The changes in the basin result from channelization, loss of natural
wetlands and vegetation along the banks, and wetlands conversions.
Nutrients are essential for healthy marine and freshwater environments.
However, an overabundance of these nutrients can trigger excessive algal
growth (or eutrophication), resulting in several possible ecosystem responses.
In the near-shore Gulf, excessive algal growth, resulting from excess
nitrogen, depletes dissolved oxygen in the bottom water, causing a corresponding
loss of aquatic habitat in the water column as well as in the benthos.
Mobile organisms can leave the hypoxic zone, but those that cannot leave
are either weakened or die. Fish, shrimp, crabs, zooplankton, and other
important fish prey are significantly less abundant in bottom waters
in areas that experience hypoxia.
Additionally, excess nutrient levels have degraded water quality throughout
the Mississippi and Atchafalaya Rivers basin. Most states in the Basin
have significant river areas with high nutrient concentrations, primarily
phosphorus, so they are not fully supportive of aquatic life. In some
areas excess nitrates, which can be a human hazard, are also threatening
groundwater supplies.
A large portion of the nutrients entering the Gulf come from human activities,
such as sewage and industrial treatment plant discharges, and storm water
runoff from cities and farms. Nutrients from automobile exhaust and fossil
fuel power plants also enter the waterways through air deposition to
the vast land area drained by the Mississippi River and its tributaries.
About 90 percent of the nitrates in the Gulf come from non-point sources.
About 56 percent of nitrates enter the Mississippi River above the Ohio
River with the Ohio River Basin adding 34 percent. High nitrogen loads
come from basins receiving wastewater discharges and draining agricultural
land in Iowa, Illinois, Indiana, southern Minnesota, and Ohio.
The primary approaches to reduce hypoxia in the Gulf of Mexico appear
to be:
-
Reduce nitrogen loads from watersheds to streams and rivers in the
basin, and
-
Restore and enhance denitrification and nitrogen retention within
the Basin and on the coastal plain of Louisiana.
Annual water quality measurements and stream flow records indicate that
a 40 percent reduction in total nitrogen flux to the Gulf is necessary
to return to loads comparable to those during 1955 through 1970. Model
simulations suggest that, short of the 40 percent reduction necessary
to return to levels in the mid-century, nutrient load reductions of about
20 to 30 percent would result in a 15 to 50 percent increase in bottom
water dissolved oxygen concentrations. Any oxygen increase above the
critical 2 mg/l threshold will have a measured positive effect on marine
life, thus even small reductions in nitrogen loads are desirable.
While the proposed strategy is focused on reducing nitrogen loads to
the northern Gulf of Mexico, many of the actions proposed through this
plan will also achieve basin-wide improvements in water quality by reducing
phosphorus as well. Likewise, actions taken to address local water quality
problems in the basin will frequently also contribute to reductions in
nitrogen loads to the Gulf.
The goals of this strategy are based upon five principles: (1) encourage
actions that are voluntary, practical, and cost-effective; (2) utilize
existing programs,
including existing state and federal regulatory mechanisms; (3) follow adaptive
management; (4) identify additional funding needs and resources during the
annual agency budget process; and (5) provide measurable outcomes as outlined
in the following goals and strategies.
The goals are threefold:
-
Coastal goals -- By 2015, reduce the five-year average of the hypoxic
zone to less than 5,000 km2. This can be achieved through implementation
of specific, practical, and cost-effective voluntary actions by all
categories of parties within the Mississippi/Atchafalaya River Basin
to reduce the annual discharge of nitrogen into the Gulf.
-
Within the basin goals -- To restore and protect the waters of the
31 states and tribal lands within the Mississippi/Atchafalaya River
Basin through implementation of nutrient and sediment reduction actions
to protect public health and aquatic life as well as reduce negative
impacts of water pollution on the Gulf of Mexico.
-
Quality of life goals -- To improve the communities and economic
conditions across the Mississippi/Atchafalaya River Basin, in particular
the agriculture, fisheries, and recreation sectors, through improved
public and private land management and a cooperative incentive based
approach.
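The coastal goal above is defined in terms of a five-year running average of the mid-summer hypoxic-zone area. A minimal sketch of how that metric could be tracked against the 5,000 km2 target, using hypothetical annual figures chosen only so that their 1996-2000 average matches the 14,128 km2 cited earlier:

```python
# Illustrative check of the Action Plan's coastal goal: the five-year
# running average of the mid-summer hypoxic-zone area should fall below
# 5,000 km2 by 2015. The yearly areas below are illustrative assumptions,
# constructed so the 1996-2000 average equals the 14,128 km2 cited above.

GOAL_KM2 = 5_000

# Hypothetical annual mid-summer hypoxic-zone areas (km2), keyed by year.
areas = {1996: 17_920, 1997: 15_840, 1998: 12_480, 1999: 20_000, 2000: 4_400}

def five_year_average(areas_by_year, end_year):
    """Average the hypoxic-zone area over the five years ending at end_year."""
    window = [areas_by_year[y] for y in range(end_year - 4, end_year + 1)]
    return sum(window) / len(window)

avg = five_year_average(areas, 2000)
print(f"1996-2000 average: {avg:,.0f} km2; goal met: {avg < GOAL_KM2}")
# -> 1996-2000 average: 14,128 km2; goal met: False
```

Because the goal is phrased as a multi-year average rather than a single season's measurement, one anomalously small zone (as in the hypothetical 2000 figure here) does not by itself satisfy the target.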
There are no simple solutions that will reduce excessive nutrients in
the Gulf. An optimal approach would take advantage of the full range of
possible actions to reduce nutrient loads and increase nitrogen retention
and denitrification. This should proceed within a framework that encourages
adaptive management in a cost-effective manner. This national effort
to reduce Gulf Hypoxia will be implemented within the existing array
of state and federal laws, programs and private initiatives. Hypoxia
in the northern Gulf of Mexico is a critical environmental issue that
occurs far beyond the geographic jurisdiction of individual states and
tribes. Additional federal funding is needed to support the collaborative
approach agreed upon by the Mississippi River/Gulf of Mexico Watershed
Nutrient Task Force.
Thomas E. Mann
W. Averell Harriman Senior Fellow in American Governance
The Brookings Institution
(May 8, 2002)
TOPIC: Campaign Finance Reform
Dr. Thomas Mann addressed us on the topic of campaign finance reform.
He provided us with copies of an article, which he co-authored with Dr.
Norman J. Ornstein, Resident Scholar, American Enterprise Institute for
Public Policy Research, entitled, Myths and Realities about the Bipartisan
Campaign Reform Act of 2002. Senators John McCain (R-AZ), Russ Feingold
(D-WI), Olympia Snowe (R-ME), and James Jeffords (I-VT), as well as Representatives
Christopher Shays (R-CT) and Marty Meehan (D-MA) drafted the Bipartisan
Campaign Reform Act (BCRA). The constitutionality of the BCRA is currently
being challenged by a diverse set of opponents including the California
Democratic and Republican parties, the National Voting Rights Initiative,
and the National Rifle Association. Generally, these groups are challenging
different parts of the BCRA. For example, the National Voting Rights
Initiative is challenging the increase in the amount of regulated contributions
allowed. The BCRA calls for a change in the amount of regulated contributions
from $1,000 to $2,000. A three-judge panel of the DC District Court will
hear these challenges and render a decision by December 2002. The decision
will probably be appealed to the U.S. Supreme Court in the spring.
Dr. Mann stated that this is the first major change in campaign finance
at the federal level since 1974 and that the legal challenges and administrative
proceedings to implement the law are going on simultaneously. In describing
how the BCRA came to pass and its potential impact, Dr. Mann quoted the
Grateful Dead line, "What a long strange trip it’s been." He
stated that Congress has been struggling with campaign finance reform
since the late 1970s. This has been partially due to the various hurdles
encountered. Dr. Mann identified these hurdles as:
-
Large money contributions -- "Money flows like water."
-
Constitutional limitations (i.e., right to free speech).
-
Self-interests of incumbents.
-
No presidential leadership -– President Bush has avoided the
issue.
-
Policy process has multiple veto points -– this enables a
Member of Congress to vote for the BCRA with the knowledge that it
will be vetoed later; therefore, it is politically safe to do so.
-
Philosophical differences.
-
Partisan calculations -- Democrats and Republicans do not agree on the
size of campaign contributions.
-
Interest group opposition -– this has created "strange
bedfellows" with groups normally opposed to each other joining
forces against campaign finance reform.
-
Low level of public interest.
Dr. Mann explained that the campaign finance system established by the
1974 Federal Election Campaign Act amendments, as modified by the Supreme
Court's Buckley decision, was unstable and has gradually been changed.
He described the efforts
of former President Bill Clinton and his Senior Political Advisor Dick
Morris that damaged the "spirit" of campaign finance reform.
In the 1996 election campaign, former President Clinton and Dick Morris
orchestrated a move to use "soft" money to distribute candidate-specific "issue
ads," which undermined the disclosure and contribution limitation
provisions of federal election law. Because these "issue ads" promoted
a particular issue but did not explicitly promote a specific candidate,
they could be financed with "soft" money. "Soft" money
is unregulated funds that can be used for federal, state, or local elections.
This was later expanded to include national committees. State law regulates
this money; however, some states do not limit "soft" money
contributions. In contrast, "hard" money is regulated and limits
are imposed on the amount that can be contributed to federal elections.
The campaign finance system has become increasingly unstable because
unregulated money has started to swamp regulated money.
Dr. Mann identified events and activities that have become agents of
change for campaign finance reform. These are:
-
A number of foundations are supporting research on the "issue" ads
and other parts of finance campaign activities.
-
The sponsors of the BCRA have been committed and keep pushing the
issue.
-
The McCain presidential campaign increased his credibility and clout
and put the issue of campaign finance reform in front of the American
public.
-
The 2000 Senate election has resulted in a net gain of five seats
for campaign finance reform proponents.
-
The Enron collapse has publicized campaign finance activities in
the press.
-
Senator Jeffords’ conversion from a Republican to an Independent
has given the Democrats control of the Senate.
-
President Bush has not threatened to veto the BCRA.
The BCRA is actually a very modest bill intended to try to get us back
to the campaign finance system before the 1996 election when things went
out of control. Dr. Mann thinks that we are currently encouraging our
leaders to lie in order to utilize "soft" money. The BCRA will
lower the incentive for politicians to lie. Additionally, because corporate
contributions will be limited, there is less potential for a conflict
of interest when politicians have to address issues dealing with corporations
(i.e., Enron, Microsoft). Finally, the BCRA provides incentives for national
parties to utilize "hard" regulated money and rely less on "soft" unregulated
funds.
Gary Gardner
Director of Research
Worldwatch Institute
(May 9, 2002)
TOPIC: Sustainable Development Concepts
Mr. Gary Gardner presented an interesting, though alarming, view of
global environmental conditions and how advancement of technology has
failed to secure options for sustainable development. He pointed out
the declines in water resources, reefs, and forests, and the concurrent increases
in species extinction and population growth. The inference is that humans
are not using natural resources wisely to provide for future generations.
Mr. Gardner outlined five links in a chain of thinking directed towards
achieving sustainable development. The first is the assertion that economics
is a subset of the environment. He alleges that prioritizing consumption
of raw materials and ignoring the production of waste shows that environmental
considerations are economically ignored. The second assertion is that
economic accounting must tell the whole truth. That is, the value of
nature’s services both in the production of raw materials and in
the recycling of products (which he estimates represents an economic
input of $33 trillion) must be considered. Third, use of renewable resources
cannot exceed the rate of resource regeneration. He noted water supply
diminishment as a primary example. Fourth, materials should circulate
or recycle. Mr. Gardner estimates that 90 percent of raw materials are
used once and then discarded -- an unacceptable amount if sustainable
development is ever to be reached. He cited the concept of an "eco-industrial
park," where the waste of one factory process is the immediate input
to another, as an example of an efficient use of resources. He also commended
companies for attempting to use super-efficient processes that approach
zero waste production. Finally, we must look at whole environmental systems,
not just subsystems, when we make our evaluations and considerations
for development. For example, we must look at the entire nitrogen cycle,
not just fertilizer production.
Mr. Gardner concluded his presentation by defining development as "increasing
one’s options." He rhetorically questioned whether we have
too many options. For example, one can go to the supermarket and choose
from some 140 different cereals. Food is available in this country any
time, any place, and cheaply -- is this the reason 65 percent of Americans
are overweight, leading to at least a 12 percent increase in health care
costs? He closed by suggesting that restraint is an integral part of
development, and that keeping environmental restraint as part of the
equation can help attain sustainable development.
Brown Bag Lunch: "Transforming Inventions Into Biomedical Products"
(May 16, 2002)
This brown bag lunch was the second in a series of discussions on technology
transfer-related issues, co-sponsored by the Office of Technology Policy,
Technology Administration, U.S. Department of Commerce, and the DC Chapter
of the Technology Transfer Society.
New drug development, drawing as it does on the ongoing advances in
molecular biology and other biosciences, is now one of the most dynamic
arenas for technological innovation. Two members of the Technology Transfer
Branch (TTB) of the National Cancer Institute (NCI) spoke on the topic
of "Multiple Parties, Multiple Interests, One Goal -- Transforming
Inventions into Biomedical Products."
Dr. Karen Goldman, a Technology Transfer Specialist at the TTB, presented
a number of case studies and discussed some of the unique intellectual
property issues that arise in the increasingly complex partnerships needed
for cancer drug development. Some of the cases illustrated the Intellectual
Property (IP) management solutions that have worked in the past and some
suggested the new challenges just now emerging. Dr. Goldman explained
that the more collaborators there are in the process, the more health
innovation is possible. The collaborations supported by the NCI include
multiple grantee sites and multiple industry collaborators with issues
of overlapping rights.
Dr. Kathleen Sybert, Chief of the TTB, explained that the TTB provides
a complete array of services to support the NCI's technology development
activities. To ensure that these activities comport with federal statutes,
regulations and the policies of the National Institutes of Health (NIH),
a large part of TTB's responsibilities includes the day-to-day negotiations
of transactional agreements between the NCI and outside parties, including
universities, pharmaceutical and biotechnology companies. These agreements
provide for: (1) the exchange of research materials under the Simple
Letter of Agreement (SLA), (2) collaborative research conducted under
cooperative research and development agreements (CRADAs), (3) pre-clinical
and clinical studies of the safety and efficacy of new pharmaceuticals
under clinical trial agreements (CTAs), and (4) the exchange of confidential
information under confidential disclosure agreements (CDAs).
TTB also reviews employee invention reports, generates patentability
reports and makes recommendations to the NIH's Office of Technology Transfer
(OTT) concerning filing of domestic and foreign patent applications.
The NCI TTB staff participates in meetings, discussions and conferences,
as appropriate, to stay apprised of and monitor the scientists' needs.
In addition, the NCI TTB staff negotiates and secures execution of license
agreements for information technology products under authority of the
National Cancer Act. The TTB conducts trademark registration and licensing
for the Public Health Service.
Dr. Sybert identified a number of challenges that her office faces,
namely that data sharing can be difficult, but so can sharing intellectual
property rights. The NIH CRADA mechanism addresses these challenges.
When multiple institutions are involved in collaborative research, another
challenge is determining who has rights to the inventions. It can take
over a year to negotiate IP rights just within the NIH departments. Dr.
Sybert concluded that throughout the existence of the TTB, they have
learned to listen and look at all counterproposals.
Site Visit to the National Institute of Standards and Technology
(May 22, 2002)
With more than a century of experience, the National Institute of Standards
and Technology (NIST) is one of the Nation’s oldest laboratories.
For all of its tenure, NIST has developed and promoted measurements,
standards, and technology to enhance productivity, facilitate trade,
and improve the quality of life.
NIST Director, Dr. Arden Bement, explained that NIST is a non-regulatory
agency of the U.S. Commerce Department’s Technology Administration.
With a FY 2002 budget of about $819 million, including income from other
agencies and the sale of measurement services, NIST is staffed by about
3,000 scientists (including two recent Nobel Prize winners in Physics),
engineers, technicians, and support personnel. About 1,600 visiting researchers
complement this staff. In addition, NIST partners with 2,000 manufacturing
specialists and staff at affiliated centers around the country. NIST
has four programs that it uses to serve its customers.
-
The NIST Laboratories, which conduct research that advances the
Nation's technology infrastructure and is needed by United States'
industry to continually improve their products and services.
-
The Baldrige National Quality Program, which promotes performance
excellence among United States' manufacturers, service companies,
educational institutions, and health care providers; conducts outreach
programs and manages the annual Malcolm Baldrige National Quality
Award, which recognizes performance excellence and quality achievement;
-
The Manufacturing Extension Partnership, a nationwide network of
local centers offering technical and business assistance to smaller
manufacturers; and
-
The Advanced Technology Program, which accelerates the development
of innovative technologies for broad national benefit through funding
partnerships with the private sector.
Although NIST is organized around these four programs, Dr. Bement explained
during a tour of the "NIST and Your Community" display that the
agency focuses on industry sectors and affects nearly every aspect of
daily life -- from autos, to semiconductor electronics, law enforcement,
health care, science research, home, and work.
A tour of the NIST Center for Neutron Research highlighted how the NIST
laboratories work with other organizations -- ranging from the Smithsonian
Institution and the National Institutes of Health to auto and oil companies
as well as universities. This national user facility is exploited to
better understand the makeup of materials, furthering both basic research
and industrial production.
Nanotechnology is one of the hottest areas of technical interest. It
is the subject of a major multi-year federal initiative, and scores
of companies are pursuing nanoscale applications in an effort to reap the
benefits of developing devices on the scale of very small particles.
NIST has a major effort under way in this area, as explained by Dr. Eric
Steele in his laboratory that advances the state-of-the-art in nanoscale
measurements.
The world’s most advanced measurement laboratory is under construction
at NIST. It is a project that promises to give researchers the super-controlled
environment -- in terms of temperature, humidity, vibration, and cleanliness
-- needed to do the most sophisticated research. Director of Administration
and Chief Financial Officer at NIST, Mr. Jorge Urrutia, described the
Advanced Measurement Laboratory. Mr. Urrutia is leading a team of NIST
staff and private contractors to keep the $235 million project on time
and within budget.
The World Trade Center disaster is the subject of a pending NIST investigation
and related research. Dr. John Gross and Dr. Ronald Rehm explained NIST's
background and capabilities in performing building and fire disaster
investigations. With a range of technical experts available in the various
NIST laboratories and a reputation for neutrality, NIST has been selected
by President Bush and Congress to help improve our understanding of how
the World Trade Center buildings performed after the impact of the aircraft
from the terrorists’ September 11th attack. The goal is to improve
public safety, including that of emergency responders and occupants,
by better understanding design, construction, maintenance, and use issues.
Finally, reinforcing the role of NIST in promoting pioneering technical
research that leads to new technologies and industrial innovations, Mr.
Marc Stanley, Acting Director of NIST's Advanced Technology Program (ATP),
provided a briefing on the program’s history, accomplishments,
plans, and controversies -- and how the Administration’s initiatives
aim to stabilize a program which has been the subject of political debate.
Barney S. Graham
Chief
Viral Pathogenesis Laboratory and Clinical Trials Core
Dale and Betty Bumpers Vaccine Research Center
(June 5, 2002)
Topic: Vaccine Research and Production
Dr. Barney Graham’s presentation, "HIV Vaccine Development,
a Global Perspective" provided us with an opportunity to discuss
efforts to develop a vaccine for HIV. Dr. Graham has spent most of his
professional career performing virology and vaccine development research.
He is a highly respected researcher, with many academic and professional
awards.
In 1981, the first AIDS cases were identified. Dr. Graham treated his
first AIDS patient in 1982. The statistics and information presented
were alarming. Dr. Graham believes that AIDS and HIV will destabilize
our society, thus AIDS should be a bigger national concern than the current "war
on terrorism."
Globally, some 25 million people have died from AIDS, and roughly 7,000
people die of the disease each day. Approximately 70 percent of the cases are in sub-Saharan
Africa. Additionally, the number of infections is growing in India, Southeast
Asia, and southern China. Currently, 49 million people are living with
the disease and there are 14,000 new infections every day. Remarkably,
only 5 percent of the infected people have access to AIDS drugs. Most
of those receiving treatment are in the United States and Europe; however,
only 50 percent of these people can tolerate the drug’s side effects.
The disease disproportionately impacts young adults, removing what should
be the most productive members of society, thus reducing life expectancy
in, and the Gross National Product of, the countries impacted. AIDS has
orphaned 13.2 million children worldwide.
Dr. Graham discussed possible strategies to control the epidemic. He
supports the current emphasis on education and avoiding contact with
blood and bodily secretions. He pointed out that in those countries where
education was stressed, the epidemic was not as widespread. He believes
that the best long-term strategy is the development of an AIDS vaccine.
This is how other epidemics, such as smallpox and polio, were controlled.
A vaccine offers the opportunity to control the epidemic even before
the entire susceptible population is protected, since it makes it more
difficult for the disease to spread.
Dr. Graham discussed how a vaccine works, how the body reacts to an
infection and how HIV is particularly difficult for the body to defeat.
It takes time for the body to recognize the disease and try to control
it. HIV spreads so rapidly that the body can’t eliminate the infection
because it cannot overcome the head start the virus has. At best, the
body can maintain the infection at a constant level, but eventually the
immune system fails.
The basic idea is to develop a vaccine that would allow the body to
develop the proper antibodies to fight the disease. If the antibodies
are in the system as a result of the vaccine, then the time delay is
eliminated. The extent of the infection when the antibodies begin acting
is much smaller and the body can more easily control the disease.
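The "head start" argument above can be sketched with a toy model. This is purely illustrative: the daily growth and clearance factors and the recognition delay are hypothetical values chosen to show the shape of the argument, not figures from Dr. Graham's talk.

```python
# Toy model of the "head start" argument: virus replicates daily, while
# immune clearance only begins after a recognition delay. A vaccine, by
# pre-arming the body with antibodies, removes that delay.
# All parameters are hypothetical and for illustration only.

def viral_load(days, delay_days, growth=1.5, clearance=2.0):
    """Return a list of daily viral loads, starting from a unit infection."""
    load = 1.0
    history = []
    for day in range(days):
        load *= growth              # virus replicates every day
        if day >= delay_days:
            load /= clearance       # immune response clears virus once active
        history.append(load)
    return history

# Naive infection: the immune system needs ~10 days to mount a response,
# so the virus peaks at a high level before clearance begins.
naive = viral_load(days=20, delay_days=10)

# Vaccinated: antibodies are already present, so clearance starts at once
# and the infection never gains a foothold.
vaccinated = viral_load(days=20, delay_days=0)

print(max(naive), max(vaccinated))
```

With these (made-up) rates, the naive host's peak load is dozens of times higher than the vaccinated host's, which is the essence of why eliminating the recognition delay lets the body control the infection.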
While significant efforts are being made to develop an AIDS vaccine,
one of the major stumbling blocks is evaluation of the vaccine through
clinical trials. Here, fear is a major factor since human subjects are
needed to evaluate the vaccine. Historically, during the testing of a
vaccine, a weakened or whole killed virus was used and injected in the
volunteer. Consequently, the general public fears that as part of the
research on the AIDS vaccine, they would risk exposure to HIV. For the
new AIDS vaccines being developed, HIV exposure is not possible. Modern
DNA techniques are used to extract portions of the viral DNA, which are
then cloned to develop the vaccines. It is the technical details of this
process that most people do not understand. That makes it extremely difficult
to find volunteers to participate in the clinical trials. Dr. Graham
provided us with information concerning the need and search for volunteers,
particularly in the Washington, DC area.
We then had discussions about the status of research on smallpox. It
is extremely risky to use the traditional smallpox vaccine on a previously
unexposed or immuno-compromised population. This vaccine causes serious
reactions and side effects and could result in hundreds of deaths. A
research effort to develop a safe, but still effective, vaccine is currently
underway.
Site Visit to the Carnegie Institution of Washington
(June 13, 2002)
We were welcomed to the Carnegie Institution of Washington by its President,
Dr. Maxine F. Singer. She provided details about the Carnegie Institution,
and discussed the important role of basic science in science policy and
education.
The Carnegie Institution of Washington is a private nonprofit organization
engaged in basic research and advanced education in biology, astronomy,
and the Earth sciences. It was founded by Mr. Andrew Carnegie in 1902
and incorporated by an Act of Congress in 1904. Mr. Carnegie, who provided
an initial endowment of $10 million and later gave additional millions,
conceived the Carnegie Institution's purpose "to encourage, in the
broadest and most liberal manner, investigation, research, and discovery,
and the application of knowledge to the improvement of mankind."
From its earliest years, the Carnegie Institution has been a pioneering
research organization, devoted to fields of inquiry that its trustees
and staff consider among the most significant in the development of science
and scholarship. Its funds are used primarily to support investigations
within its own research departments. Recognizing that fundamental research
is closely related to the development of outstanding young scholars,
the Carnegie Institution conducts a strong program of advanced education
at the pre-doctoral and post-doctoral levels. The Carnegie Institution
also conducts distinctive programs for elementary school teachers and
children in Washington, DC. At First Light, a Saturday "hands-on" science
school, elementary school students explore worlds within and around them.
At summer sessions of the Carnegie Academy for Science Education, elementary
school teachers learn interactive techniques of science teaching.
Following our discussion with Dr. Singer, we heard from Dr. Paul G.
Silver of the Department of Terrestrial Magnetism (DTM) at the Carnegie
Institution. He spoke to us about seismological studies of earthquakes
and earth structure.
He explained to us that the seismology group at DTM is broadly focused
on understanding the physics of earthquakes and imaging the Earth’s
interior. He detailed:
-
Physics of the Earthquake Process -- Earthquakes are central to
the field of seismology and while scientists have made substantial
progress, our understanding is incomplete. Nothing illustrates this
better than our inability to predict them. The key to understanding
earthquakes may lie in the aseismic part of the earthquake cycle.
Studies at DTM include observing the evolution of these aseismic
motions, and connecting them with earthquake occurrence. This same
approach is used to study magmatic processes, where precursors have
been observed. DTM is also interested in the mechanism of deep (> 30
km) earthquakes, whose physical process remains elusive. This is
an active area of research that includes both seismological studies
of these events, as well as experimental evaluation of candidate
mechanisms in collaboration with the Geophysical Laboratory.
-
Mantle Imaging: Beyond Plate Tectonics -- Seismology is the most
effective tool for imaging the Earth’s interior. DTM uses the
imaging capability to push the limits of plate tectonics, both back
in time and to great depth. With portable seismographs and global
data sets, DTM has performed imaging experiments to address fundamental
questions, such as the evolution and deep structure of the Earth’s
oldest continents, mountain building dynamics, the origin of mantle
plumes feeding volcanic chains, and the deep convective flow that
accompanies and interacts with plate tectonics. DTM has also focused
on the base of this convective system, the core-mantle boundary region,
whose complexity and activity rivals that of the Earth’s surface.
-
Instrumentation and Observations -- Instrumentation development,
and the unique observations it enables, have always been major
strengths at DTM. DTM's scientists have designed and deployed, throughout
the world, ultra-sensitive strain meters to measure slow deformation
associated with earthquakes and volcanic eruptions. Subject to funding,
these instruments will be deployed as part of a major NSF initiative,
called the Plate Boundary Observatory, which seeks to observe earthquake-
and magmatic-related aseismic deformation over much of western North
America, with a special emphasis on the San Andreas Fault System.
DTM has also played a leading role in the design of portable broadband
seismographs to be used for a national program and by DTM scientists.
The DTM has been in the forefront of this field of portable broadband
seismology for more than a decade.
Following Dr. Silver's presentation, we heard from Dr. Steven B. Shirey,
also of the DTM at the Carnegie Institution. He spoke about diamonds
-- how they form and what they tell us about ancient continents.
He explained that the central portions of our continents are composed
of ancient (up to 3.8 billion years old), stable crustal terrains known
as cratons, which are underlain by equally old mantle keels. Cratons
are important as storehouses of much of the world’s mineral wealth,
including gold, platinum, diamonds, nickel, chrome, and titanium. They
also serve as a record of the Earth’s early geologic history and
a platform for early life development.
The study of Earth’s cratons has long been a focus of DTM’s geochemistry
and seismology groups. For the first time, DTM has a regional picture of craton
structure that explains diamond chemistry. Recent work shows that the 2.5 to
3.8 billion-year-old Archean mantle root beneath the Kaapvaal-Zimbabwe craton
of Southern Africa has ±1 percent variation in seismic P-wave velocity
at depths within the diamond stability field (150 - 225 km) correlating with
differences in the age and composition of diamonds and their syngenetic inclusions.
The ages of diamond formation indicate that the mantle keels that became
continental nuclei were created by severe middle Archean (3.3 billion-year-old) mantle
depletion events with high degrees of melting and early peridotitic diamond
formation. Late Archean (2.9 billion-year-old) events involving the closing
of an ancient ocean basin stabilized the craton and contributed a later generation
of eclogitic diamonds. Subsequent Proterozoic (0.544 to 2.5 billion-year-old)
tectonic and magmatic events altered the composition of the continental lithosphere
and added new peridotitic and eclogitic diamonds to the already extensive Archean
diamond suite.
The DTM Cosmochemistry/Geochemistry Group has long been a pioneer in
the application of long- and short-lived radioactive decay schemes to
problems of solar system, lunar and terrestrial evolution. Researchers
employ mass spectrometry, coupled with other micro-characterization methods,
to analyze the isotopic composition of a wide variety of elements. The
focus is on absolute age determination or the use of isotope ratios as
tracers of geological processes. Areas of interest include the identification
and analysis of meteorites, the formation of chondrules and chondritic
meteorites, the chronology of achondritic asteroidal sources of interplanetary
dust particles, the meteoritic component of terrestrial impact crater
ejecta and melt rock deposits, the geochemical evolution of the Earth's
crust and mantle, the history and development of the continents and ocean
basins, and the large-scale flow and nature of chemical heterogeneity
in the mantle. Laboratory studies are coordinated with field studies
in areas as diverse as the Pacific and Atlantic Ocean basins, Brazil,
central Canada, Southern Africa and the western United States.
Dr. Christopher McCarthy, of the DTM, followed Dr. Shirey's presentation.
Dr. McCarthy presented current information on observations of planets
and planet formation.
He began his presentation by telling us that, for centuries, astronomers
have been interested in the origin of planetary systems -- especially
our own. While we cannot turn back the clock to witness our own creation,
we can observe the creation process in other parts of the universe. Carnegie
Institution Astronomer, Dr. Alycia Weinburger, has acquired images of
young stars to search for signs of planet formation. Utilizing new infrared-sensitive
detectors, giant "pre-planetary" disks are now visible. Their
composition, particle sizes and temperatures all provide clues to the
process of planet formation. Astronomers are making very precise observations
of stars for periods of years, or even decades, to trace out the slow
orbits of detectable planets. Such measurements, considered impossible
a few years ago, are now ushering in a new era in planetary discovery.
In addition to studying planets as they are forming, we can investigate
the possible outcomes of the formation process. For over a decade, Carnegie
Institution Astronomer, Dr. Paul Butler, has searched for planets outside
our solar system. A flurry of discoveries, beginning in 1995 with the
first "exoplanets" has produced surprising results, challenging
long-standing theories of planet formation. Dr. Butler and others have
found planets orbiting far closer to their host suns than expected, in
some cases completing "annual" orbits in just three days. Furthermore,
a majority of discovered systems consist of massive planets (hundreds
of times more massive than Earth) in elongated, oval orbits. In such
a solar system, Earth as we know it could not exist as it would collide
with a larger planet or be ejected. Is our solar system common or is
it a rare or even unique example? A recent discovery by Dr. Weinburger,
announced today for the first time, has confirmed the existence of a
solar system similar to ours.
Our next presentation was about extrasolar planetary systems and theoretical
challenges from Dr. Nader Haghighipour. Dr. Haghighipour is a National
Aeronautics and Space Administration's (NASA) Astrobiology Institute
Fellow, and a NASA Associate at the Carnegie Institution.
Dr. Russell J. Hemley, of the Geophysical Laboratory of Carnegie Institution
spoke next about forming novel materials under pressure.
He went on to tell us that until recently, material science has only
fully utilized two of its three fundamental tools -- temperature and
chemical composition. Pressure, the third fundamental variable altering
all states of matter, is in many ways the most remarkable as it spans
some 60 orders of magnitude in the universe. Recent advances in generating
very high pressures with single-crystal diamonds, allow materials to
be subjected to, and observed at, millions of atmospheres of pressures.
Materials can also be heated to thousands of degrees or cooled to milli-Kelvin
temperatures at these extreme pressures. Moreover, samples can be examined
under these extraordinary conditions using a wide variety of techniques
such as intense laser, x-ray, and neutron beams.
These experiments reveal a "brave, new world" of materials
under extreme pressures. The field is providing fertile ground for the
formation of new materials, possibly tripling the number of known substances.
Even more significant, entirely new classes of materials are appearing.
New forms of common and putatively simple substances such as hydrogen
and water (and their mixtures) occur when compressed. Other gases and
liquids are not only solidified under pressures but can be turned into
metals and superconductors. The highest temperature superconductivity
on record (164K) and new kinds of superconductors have been produced
under high pressures. Chemical bonds and affinities of otherwise familiar
elements and compounds are totally changed. Inert gases relinquish their
noble status to form compounds and normally unreactive transition metals
form new alloys. The common silicate and oxide minerals found at and
near the Earth's surface transform to dense, strong ceramic substances
that are now believed to make up the bulk of our planet. Even at pressures
of several thousand atmospheres, strong effects on organic and biochemical
reactions are observed. The variable of pressure is, in effect, adding
a new dimension to the venerable Periodic Table of the Elements. The
implications span chemistry, physics, Earth and planetary science, materials
science and technology, and biology.
Following Dr. Hemley, we heard from Dr. Robert M. Hazen of the Carnegie
Institution's Geophysical Laboratory. He spoke to us about minerals and
the origin of life.
He explained that no one yet knows how life arose on the desolate, primitive
Earth, but of one thing we can be sure: life’s origin was a chemical
event. The first living entity must have been crafted from air, water
and rock -- the same raw materials that sustain life today. Of
these three ingredients, rocks, and the minerals of which they are made,
have often received little more than a footnote in Origin of Life theories.
The atmosphere and oceans enjoy the starring roles in origin scenarios,
while rocks and minerals sneak in and out as bit players -- or
simply as props -- and then only when all other chemical tricks
fail.
A flurry of fascinating experiments promises to change that misperception.
Origin of Life researchers have begun to realize that minerals must have
played a sequence of crucial roles in the synthesis of life’s molecular
building blocks, as well as during their subsequent assembly into growing
and evolving structures.
Minerals surely played a passive role in providing sheltered environments,
such as overhangs, porous rocks, or deep hydrothermal environments at
ocean-ridge systems, for the isolation, concentration, and self-assembly
of organic molecules. However, mineral surfaces may have also served
as templates for the assembly and storage of organic macromolecules.
The common minerals quartz and calcite, for example, have been cited
as possible surfaces for chiral (left- versus right-handed) selectivity
of organic molecules. This early chiral selectivity is very important
when you consider that all living organisms have evolved to be selective
for certain chiral compounds.
The most frequently cited role of minerals in Origin of Life scenarios
is as catalysts that enhance the rate and yield of reactions essential
to the production of biomolecules. The Carnegie Institution's research
demonstrates a variety of mineral-catalyzed reactions at hydrothermal
conditions. Of special interest is the possible catalysis of carbon-carbon
bond formation on iron-nickel-sulfide surfaces. Recent experiments at
the Geophysical Laboratory suggest that the sulfides are not passive
catalysts, but rather partially dissolve and are reactants in the formation
of organic sulfides and transition metal carbonyls.
Ultimately, these studies will help answer several fundamental questions
regarding minerals and life's origins: Can minerals mimic biochemistry?
Can minerals mediate the synthesis of relevant biomolecules, especially
organic macromolecules? Under what circumstances do minerals inhibit
the formation of life? And, where else in the Solar System do appropriate
mineralogical environments for the origin of life exist?
Our last speaker at the Carnegie Institution was Dr. George D. Cody
of the Geophysical Laboratory. His topic provided information about biogeochemistry
and molecular spectroscopy at nanoscales.
He began by telling us that recent advances in X-ray micro-focusing
techniques coupled with brilliant, synchrotron derived X-ray sources
have led to the development of the soft X-ray scanning transmission X-ray
microscope (STXM) and microspectrophotometer. This instrument enables
researchers to analyze organic functional group distributions in bioorganic
structures at spatial resolutions approaching 50 nanometers (nm). A recent
biogeochemical application of this new technology involves the investigation
of the mode of chemical attack of vascular plant material by fungal microorganisms.
Fungal degradation of wood and forest litter plays an integral role in
the health of forest ecosystems as well as providing the primary route
for carbon cycling back into the atmosphere. Remarkably, there remains
considerable uncertainty in the details of exactly how fungal microorganisms
are able to metabolize structural polysaccharides. STXM analysis provides
a window into this critical process. A second application involves assessing
the biomolecular constituents present in Earth's first land plants; i.e.,
were the structural biopolymers ubiquitous in modern vascular plants
also present in Earth's first vascular plants? Carnegie Institution's
researchers have explored the chemistry preserved in ancient (~ 400 million-year-old)
vascular plant membranes in some of the most primitive plants (now extinct)
using carbon near edge X-ray absorption spectroscopy (C-XANES) with 30
nm resolution and are able to shed light on ancient biochemistry.
James Vrabel
Senior Manager
Product Line
ORBIMAGE
(October 4, 2000)
Topic: Remote Sensing Principles and Applications
Mr. James Vrabel, an imagery scientist with computer science training,
is deeply involved in the remote sensing community. At ORBIMAGE, he
is responsible for the development of product definitions for panchromatic,
multispectral (MSI), and hyperspectral (HSI) imagery products for Orbview
3 and 4. He also provides technical support to the Orbview 4 hyperspectral
imagery program, including testing, ground systems, and product generation
systems. Mr. Vrabel was formerly under contract to the National Imagery
and Mapping Agency where he was the resident expert on MSI and HSI
for the Office of Technology. He provided project management, image
science and evaluation design expertise in relation to MSI, HSI, and
imagery fusion. He provided image science and program management support
to the Civil and Commercial Applications Project, which evaluated all
commercial sensors in the areas of image interpretation, feature extraction
for mapping, geopositional accuracy, and radiometric fidelity.
Mr. Vrabel presented the basics of remote sensing including applications.
Remote sensing provides the trained analyst an integrated view of a
portion of the world that field survey cannot provide because of high
expense or lack of accessibility. It is important to understand how
to set the optimum combination of spectral bands, collection parameters
and collection conditions to reveal the conditions of the site of interest.
The combination and contrast of spectral bands can reveal such things
as stressed vegetation, turbidity of water, and pollution problems.
Resolution, azimuth and elevation are parameters that can be critical
for analysis as can time of day and time of the year. Remote sensing
is a valuable tool for the private sector and government agencies planning
for and evaluating the environment, agriculture, forestry, urban growth,
natural disasters, lines of communications, and others.
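One widely used instance of the band-combination idea Mr. Vrabel described is the Normalized Difference Vegetation Index (NDVI), which contrasts near-infrared and red reflectance: healthy vegetation reflects strongly in the near-infrared, while stressed vegetation and water do not. The sketch below uses hypothetical pixel reflectances; it is not drawn from any ORBIMAGE product.

```python
# Sketch of a spectral band combination: the Normalized Difference
# Vegetation Index (NDVI). Reflectance values below are hypothetical
# per-pixel values in the range [0, 1].

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); ranges from -1 to +1."""
    denom = nir + red
    return (nir - red) / denom if denom else 0.0

healthy = ndvi(nir=0.50, red=0.08)   # dense, healthy canopy -> high NDVI
stressed = ndvi(nir=0.30, red=0.20)  # stressed vegetation -> lower NDVI
water = ndvi(nir=0.02, red=0.05)     # water absorbs NIR -> negative NDVI

print(round(healthy, 2), round(stressed, 2), round(water, 2))
```

Mapping this index across a scene is one way an analyst can reveal stressed vegetation or turbid water from the spectral bands alone, without a field survey.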
Arthur L. Caplan
Director of the Center for Bioethics
University of Pennsylvania
(October 18, 2000)
Topic: Ethical Implications of New Biomedical Technology, Including
the Human Genome Project and Cloning
In addition to serving on the faculty of the University of Pennsylvania,
Dr. Arthur Caplan has been the Director of the Center of Bioethics
and the Chief of the Division of Bioethics for the University's Medical
Center since 1994. Dr. Caplan delivered a lively and thought-provoking
presentation that touched on the ethical implications of advances being
made in a wide array of biomedical subject areas; most are being addressed
at some level at the University of Pennsylvania. But first, he described
the nature and purpose of the Center for Bioethics.
The Center supports a dynamic program made up of 14 faculty members
representing the disciplines of medicine, philosophy, theology, the
social sciences and patent law, and a current enrollment of 100 graduate
students. The Center places a strong focus on education and outreach,
providing information and access with the help of the Internet to both
the public and the media. (The Center's website can be found at www.bioethics.net.)
Partnerships are also being developed to foster support for a standardized
bioethics curriculum for high school students. As the Center's Director,
Dr. Caplan's opinion is routinely sought on scientific and medical
ethics issues.
In order to give us a sense of the breadth of the Center's areas of
involvement, Dr. Caplan touched briefly on a number of ongoing projects
and highlighted the kinds of ethical questions that arise. Project
areas include: genetic testing; an activity by the company Framingham
Genomics to use the 50-year medical records database generated by the
Framingham Heart Study to locate genetic markers and behavioral correlates,
comparable to a government-endorsed study conducted in Iceland; gene
therapy; the subjects of who owns life and whether we should try to
create artificial life; end-of-life health care issues; the use of
narcotics to relieve pain; treatment decisions for infants and young
children; physician-assisted suicide; reproductive technologies; cloning
as it is being done in the plant and veterinary sciences; and transplants
and the policies for the allocation of tissue, blood, and organs.
Dr. Caplan turned the discussion to the national blood supply to highlight
an area at the intersection of science, medicine, and public policy.
It is a topic with which he is well versed in his capacity as the Chairman
of the Advisory Committee to the Department of Health and Human Services,
the Centers for Disease Control, and the Food and Drug Administration
for Blood Safety and Availability. He described the various interest
groups with a stake in this area and the trade-off decisions inherent
between ensuring the safety of the blood supply and maintaining adequate
supplies. Currently, stronger support exists in our country for ensuring
a safe blood supply.
He fielded a variety of questions from us, which enabled him to interject
more ethics considerations into the equation. Stimulating discussion
ensued from such topics as restrictions placed on blood donors, utility
standards being considered for patents for genetic diagnostic tests,
the appropriate role of the Federal Government in biomedical issues
such as reproductive technologies being practiced by the in vitro fertilization
industry and the use of genetic records by the insurance industry,
the potential use of information generated from the Human Genome Project,
and the calculation of the cost associated with a human life. In a
number of cases, Dr. Caplan pointed to the need for a legal framework
and suggested the potential role to be played by the Federal or State
Government.
Finally, to distinguish the practical field of bioethics with its
focus on medicine and the biological sciences from pure ethics, he
suggested as analogous the relationship between engineering and physics.
Dr. Caplan also outlined the series of fundamental steps typically
taken for each bioethics case encountered: establish the facts; classify
the value issues; look at the areas of consensus that have already been
established; look to law, religion, and philosophy for existing traditions;
and, as a last resort, seek to establish a new principle.
Site Visit to the Remote Sensing Division, National Geodetic Survey,
National Oceanic and Atmospheric Administration
(October 25, 2000)
"Imagine bridges not meeting in the middle, planes landing next to
-- rather than at -- airports, ships frequently running aground, and
the north-bound commuter train on the same track at the same time as
the south-bound freight train." This is what our lives could be like
without geodesy. Geodesy is the science of measuring the size and shape
of the Earth and precisely locating points, or coordinates, on the
Earth.
The hardworking employees of the Remote Sensing Division (RSD), part
of the National Geodetic Survey (NGS), can be found all over America
mapping the Nation's shoreline from aircraft and ensuring the flight
paths to our country's airports are safe. The United States has approximately
95,000 miles of coastline. One of the missions of the NGS is to survey
these coastal regions to provide accurate, consistent, up-to-date shoreline
information. The method used to delineate the shoreline is stereo photogrammetry
using tide-coordinated aerial photography controlled by kinematic Global
Positioning System (GPS) techniques.
Coastal Mapping
Shoreline data is considered authoritative when determining the official
shoreline for the United States. Public law passed by Congress in 1998
provides the National Oceanic and Atmospheric Administration (NOAA,
the parent agency of NGS) with explicit authority to promulgate national
standards for all information acquired for nautical charting purposes.
The shoreline on NOAA's nautical charts approximates the line where
the average high tide, known as Mean High Water intersects the coast.
The Nation's shoreline changes rapidly with the demographic changes
taking place in our coastal areas as well as through natural forces
such as hurricanes, storms, erosion, etc. These changes place an ever-increasing
demand on the NGS coastal mappers to keep up. NGS' Research and Development
section is exploring the use of new technologies and new methodologies
to map the shoreline. One new technology, interferometric Synthetic
Aperture Radar (SAR), transmits microwave energy and measures the portion
scattered back to the sensor. Unlike the photogrammetric approach using
cameras, the microwave signals of SAR can pass through clouds, smog,
and dust, allowing data to be collected at any time of day and in less
than ideal weather and atmospheric conditions. SAR technology can be
deployed from aircraft or satellites.
LIDAR, Light Detection and Ranging, is another new technology being
tested for surveying the coastline. LIDAR uses pulses of light to illuminate
the terrain. LIDAR data collection involves mounting an airborne laser
scanning system onboard an aircraft along with a kinematic GPS receiver
to locate an x and y position, and an inertial navigation system to
monitor the pitch and roll of the aircraft. By accurately measuring
the round trip travel time of the laser pulse from the aircraft to
the ground, a highly accurate spot elevation can be calculated. LIDAR
has been tested in a wide variety of applications including assessing
post storm damage to beaches, mapping the Greenland ice sheet, and
measuring heights within forest timber stands.
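The round-trip timing calculation described above can be sketched in a few lines. This is an illustrative simplification, not NGS software: the altitude and timing values are hypothetical, and a real system would also correct for scan angle, atmospheric refraction, and the aircraft attitude reported by the inertial navigation system.

```python
C = 299_792_458.0  # speed of light, m/s (vacuum value; a real system
                   # applies an atmospheric correction)

def spot_elevation(aircraft_altitude_m: float,
                   round_trip_time_s: float) -> float:
    """Elevation of the illuminated ground spot, assuming a nadir-pointing
    pulse. Real processing corrects for scan angle, pitch, and roll."""
    slant_range = C * round_trip_time_s / 2.0  # one-way distance to ground
    return aircraft_altitude_m - slant_range

# A pulse whose round trip covers 2 x 1,000 m from 1,500 m altitude
# implies a ground elevation of 500 m.
elev = spot_elevation(1500.0, 2 * 1000.0 / C)
```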
Imaging Spectrometers gather data over a wide band of the electromagnetic
spectrum in many small band pass channels. Such data (often called
hyperspectral data) can be used to accurately determine the composition
of the ground cover in a scene. When the images are acquired at high
spatial resolution, the resulting data provide a robust characterization
of the Earth's surface. This is what the Airborne Visible and Infrared
Imaging System (AVIRIS) hyperspectral imaging is about, and the instrument
has been flown from NOAA Twin Otter aircraft in several test cases.
With the increased capability of each new technology, the amount of
data collected and the need for processing those data increase exponentially.
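As an illustration of how hyperspectral data can indicate the composition of ground cover, one common simple approach (offered here as a sketch, not necessarily the method used with AVIRIS data) compares each pixel's spectrum against a library of reference spectra by spectral angle. The reference spectra below are invented for illustration.

```python
import math

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum;
    smaller angles indicate more similar composition."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # Clamp to guard against floating-point values just outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, dot / (norm_p * norm_r))))

def classify(pixel, library):
    """Assign the pixel to the library material with the smallest angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# Hypothetical three-band reference spectra
library = {"water": [0.9, 0.2, 0.05], "vegetation": [0.1, 0.8, 0.6]}
label = classify([0.12, 0.75, 0.55], library)  # closest to "vegetation"
```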
Aeronautical Survey
The NGS has been performing aeronautical surveys since the 1920's.
The survey data provides critical runway, obstruction, navigation aid,
and airport feature information needed to safely fly into airports.
The Federal Aviation Administration (FAA) uses the data to develop
instrument approach and departure procedures, to determine maximum
takeoff weights, to update aeronautical publications, and for airport
planning and engineering studies. The FAA has commenced a program to
provide GPS approaches throughout the National Airspace System. NGS
supports this program by performing Area Navigation Approach (ANA)
surveys at airports selected by the FAA. In addition to the technologies
described above, NGS uses trucks mounted with GPS systems to accurately
map airport approaches, confirm navigational aid positions, and to
determine obstructions to flight paths.
In addition to some contracting for aircraft services, NGS uses NOAA
aircraft to fly the instruments that map the Nation's shoreline. Officers
of the NOAA Commissioned Corps, who are heavily integrated in these
programs on a managerial as well as operational level, fly the aircraft.
Joseph Bordogna
Deputy Director
National Science Foundation
(November 1, 2000)
Topic: Overview of the Mission and Future Objectives of the National
Science Foundation
The National Science Foundation's vision is clear and simple -- enabling
the Nation's future through discovery, learning, and innovation. Dr.
Joseph Bordogna, Deputy Director of the National Science Foundation
(NSF), pointed out that the way one does this is by constantly pushing
the frontier; by being "disruptive." NSF wants to be "creatively" disruptive.
Fond of quoting Peter Drucker (a futurist and forecaster) and Joseph
Schumpeter (an Austrian economist), Dr. Bordogna said that according
to Schumpeter, a normal healthy economy was not one in equilibrium,
but one that was constantly being disrupted by technological innovation.
Disruption is an important characteristic of innovation. The innovation
process is naturally disruptive. Indeed, it is easy to see the rapidity
with which new and emerging technologies force us into change and dramatically
affect our economy; from the way we manufacture and deliver goods,
to changing our very social order. The Internet is a fabulous example
of how innovation has literally changed the retail industry, as well
as how we think and act on a daily basis. The instant access to a wealth
of knowledge and information in real time is a change that for many
of us has been profound.
Dr. Bordogna pointed out that it is necessary for NSF to be on the
farthest frontier. Industry, once very involved and on the cutting
edge of discovering new knowledge, is now coming to NSF. Industry goes
to academe for knowledge and is paying for this "intellectual property." New
knowledge is worth billions. NSF wants to satisfy this need for knowledge
through people, ideas, and tools as noted in their strategic plan and
goals. Peter Drucker said that knowledge is a source of wealth, that
productivity is knowledge applied, and that innovation is knowledge
applied to new tasks. NSF knows that you need people to do this and
that people need to be prepared. Consequently, NSF is heavily involved
in education and training from K-12 through the postgraduate level.
A well-taken message from Dr. Bordogna was that in this year when
the Earth's population stands at six billion, only technology could
help to provide for such a magnitude of people. University educated
people help push the frontier and develop the new technology that will
be needed. NSF works hard to create partnerships with elementary schools,
community colleges, universities, and post academe. An interesting
note was that the United States infrastructure of community colleges
provides an important technical base for innovation and new technology.
Community college students get into the workforce early and are trained
mainly by people who have been active members of the workforce who
have turned to teaching.
Returning to the strategic goals of people, ideas and tools, NSF wants
to develop intellectual capital, that is, people who are diverse, internationally
competitive and globally engaged. Ideas will be generated by discovery
across the frontiers, and connected to learning, innovation and service
to society. Tools to enable this discovery will be accessible state-of-the-art
facilities that are shared for research as well as educational purposes.
NSF, which turns 50 this year, provides research dollars based on
a merit review system. A peer review process selects about 10,000 new
projects from the 30,000, mostly unsolicited, research and education
proposals received each year. This system works so well that NSF trains
other countries in the NSF system. NSF's budget will increase 17 percent
in FY 2001. Using a holistic approach to convince Congress to provide
this resource increase, NSF promoted four new initiatives: Information
Technology Research; Nanoscale Science and Engineering; Biocomplexity
in the Environment; and the 21st Century Workforce. It's easy to see
from these initiatives that the social sciences are as important as
the physical sciences in the work that NSF does. NSF's budget strategy
is to strengthen core activities, support major initiatives, identify
unmet opportunities, and to diversify their portfolio. Projects funded
for a period of up to ten years are cast aside to "refresh the frontier." NSF
gathers new ideas on what should be supported from a large variety
of sources such as Nobel Prize winners, the travel NSF program managers
do all around the world, the National Research Council, and many others.
In closing, Dr. Bordogna described the rungs in "climbing the knowledge
ladder" as a progression from bits to data, to information, to knowledge,
to wisdom, to the meaning of life.
Site Visit to the Mid-Atlantic Coca Cola Bottling Company
(November 7, 2000)
Mr. William Dubolino, Production Manager, provided a briefing on the
Coca-Cola manufacturing process and Coca-Cola Enterprises at the Alexandria,
Virginia facility. Mr. Lawrence Omene, Quality Manager, explained the
quality control assurance aspects of the Coca-Cola manufacturing process.
After the informative session, we were treated to a tour of the facility
ending with a beverage tasting session.
The Alexandria bottling facility is part of Coca-Cola Enterprises
(CCE), which is the largest Coca-Cola bottler in the world. It is 47
percent owned by Coca-Cola USA. Coca-Cola USA supplies the requisite
syrup concentrates necessary for making the Coca-Cola soft drink. The
Alexandria facility is a one-line soft drink production plant that
produces one- to two-liter plastic (PET) bottles. The plant produces,
on average, ten million cases per year. The facility
has an attached sales center. The entire facility houses about 200
employees.
The Quality Assurance Division has an analytical laboratory. The quality
of each batch of the syrup from the parent company and the finished
beverages are routinely analyzed. The specification for syrup, carbon
dioxide and water blending is set by the parent company. The laboratory
also houses a water treatment plant. Both the Alexandria City water
used for manufacturing the beverages and the wastewater (prior to its
release into the city sewer system) are treated in the plant. For manufacturing
beverages, the city water is treated with chlorine (sanitation), lime
(alkaline reduction), and a ferrous coagulant (particle removal), and filtered through
sand and activated charcoal columns. The waste liquid is neutralized
prior to releasing into the city sewer.
The Coca-Cola manufacturing is an automated process wherein the PET
bottles are transported from one location to another through an Airveyor.
The empty bottles are fed from a Depalletizer (PET bottles are coded),
rinsed by a Bottle Rinser, filled with the syrup-carbon dioxide-water
mixture by a 98-valve Filler in the Filling Room, capped by a Capper,
and warmed by a Bottle Warmer (to expand the bottle) so that the labels
can be properly applied by the Labeler. The labeled bottles are packed
into cases by a Case Packer, loaded onto a Palletizer, shrink wrapped,
and transported to the warehouse. Shelf life of the Coca-Cola
beverages is approximately 13 weeks for an optimum taste. Product degradation
depends on both storage temperature and length of time. Diet drinks
containing aspartame degrade faster than the non-diet Coca Cola.
Site Visit to the Product Quality and Safety Laboratory, Agricultural
Research Service, U.S. Department of Agriculture
(November 15, 2000)
The quality and safety of the United States food supply is on the
minds of many consumers. Our visit to the bucolic Beltsville Produce
Quality and Safety Laboratory, formerly the Horticultural Crops Quality
Laboratory, allowed a glimpse into research on fresh and "fresh cut" fruits
and vegetables being done as part of the President's Food Safety Initiative.
Many factors have contributed to concerns about food safety and the
safety of produce in particular. Americans are eating more fresh fruits
and vegetables than ever before. More meals are eaten away from home.
Consumers now expect year-round access to fresh produce. The globalization
of the produce market means Americans are eating fruits and vegetables
grown in other countries, some of which may have sub-standard agronomic
practices. Finally, more produce is being eaten after initial or "fresh
cut" processing. Consumers find these products in salad bars -- peeled,
precut carrots; diced tomato; celery sticks; and sliced melons. "Fresh
cut" produce is used increasingly in restaurants where the shredded
lettuce or minced onion in your burrito or burger are processed far
from the point of purchase. It is predicted that more and more of these
types of products will be introduced to the institutional and consumer
marketplace. The drawback to the increased convenience of "fresh cut" produce
is that once it is cut, any microorganisms on the outer surface will be
spread through the cut surfaces where they will flourish.
One method in development to control these microorganisms is biocontrol
-- the use of naturally occurring organisms or chemicals that target
pathogenic bacteria. Bacteriophages ("phages," pronounced FAY-jez) are ubiquitous
viruses that attack specific bacteria. Research into the use of phages
to combat bacteria dates to the 1920's and 1930's, but fell from favor
due primarily to development of antibiotics. Now, facing increasing
antibiotic resistance in bacteria, research into phages as an alternative
to antibiotics as well as pathogen control on produce, continues. Dr.
Craig Bloch presented research on yeast species that may also offer
some utility against bacterial pathogens on produce.
We were also introduced to more basic research on the mutagenesis
of bacteria. By finding the genes responsible for a pathogen's ability
to survive, it may be possible to find ways to combat them.
Our host, Dr. Arvind Bhagvat, shared his research on a handheld device
that may be used in the field to detect the presence of bacteria using
luciferase, an enzyme derived from fireflies that glows in the presence
of adenosine triphosphate (ATP). By measuring the intensity of the glow, or lumens, with the
machine, a producer could get a sense for the gross number of bacteria
on samples of one's crop. The presence of large numbers of bacteria
may be of concern, but only if the bacteria are pathogenic. The problem
comes from the fact that traditional methods for identifying the species
of bacteria take days, which means produce languishes or may be passed
through marketing channels to the hands and mouths of consumers. This
points out the need for the development of a "real-time," cost-effective
method of pathogen analysis. Polymerase Chain Reaction (PCR) is a method
whereby a microorganism, among other organisms, can be identified by
its DNA fingerprint. The experiment we were shown was developing a
single test that can be used to identify several common food-borne
pathogens: Salmonella typhimurium, Escherichia coli O157:H7 and Listeria
monocytogenes. This new method may some day be used to obtain results
in a matter of hours instead of days.
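The order-of-magnitude reasoning behind the luciferase readout can be sketched as follows. Both calibration constants are purely illustrative assumptions, not values from the laboratory's work; a real instrument would be calibrated against ATP standards.

```python
# Illustrative only: rough conversion from a luminometer reading to an
# estimated bacterial count. Both constants below are hypothetical.

RLU_PER_MOLE_ATP = 5.0e18     # assumed instrument response per mole of ATP
ATP_MOLES_PER_CELL = 1.0e-18  # assumed ATP content per bacterial cell

def estimate_cell_count(rlu: float) -> float:
    """Estimate the gross number of bacteria from measured light output
    (relative light units). As the text notes, this says nothing about
    whether the bacteria present are pathogenic."""
    atp_moles = rlu / RLU_PER_MOLE_ATP
    return atp_moles / ATP_MOLES_PER_CELL

# Under these assumed constants, 50,000 RLU suggests roughly 10,000 cells.
count = estimate_cell_count(5.0e4)
```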
Lastly, consumers want not only safe food but appealing foods as well.
This laboratory also has state-of-the-art produce quality testing facilities
where instruments assess the physical properties of texture and color,
and human panels assess the taste of produce.
Sarah Horrigan
Program Examiner
Science and Space Branch
Energy and Science Division
Office of Management and Budget
(November 21, 2000)
Topic: Federal Science Research Funding Process
Dr. Sarah Horrigan briefed us on the Office of Management and Budget
(OMB), its organization, the type of people it hires, and its review
process.
The OMB assists the President in developing and executing his policies
and programs, and is organized by agency, program area, and functional
responsibilities. Of the approximately 500 employees in OMB, about
90 percent are civil servants, 10 percent are appointees, with only
the OMB Director being subject to Senate confirmation.
The OMB is organized into Resource Management Office (RMO) branches
with examiners who are assigned to portfolios. In the science and technology
(S&T) domains, employees are hired straight out of policy schools or
they may have substantive backgrounds in science.
The budget review process tracks agency and accounts baselines. OMB
also reviews the corresponding agency yearly financial statements.
Examiners prefer agency budgets that can answer a set of basic, simple
questions, listed in priority order, so that the budgets are understandable
and defensible before Congress.
OMB receives agency budgets in September. Then, it holds budget hearings
at which the various departments explain how programs are put together.
The result of this approach is a series of ongoing discussions with
each agency. Examiners are able to act as advocates for the various
agencies, and at the same time present other viable options for their
programs. Information going to Congress must go through OMB to ensure
that agencies are: (1) consistent with the President's budget, and
(2) able to make the best case possible for their programs.
In answer to a question on how new initiatives get through the
budget process, Dr. Horrigan stated that agencies are successful when
they have champions within the Administration. In addition, the initiatives
are presented as interagency programs, which have the potential for
societal pay-offs -- often measured as social rates of return. It also
helps for agencies to have just a small number of initiatives in any
given year. Individuals working on initiatives need to understand that
measures of success must include something other than just increasing
budgets. They should be able to answer how well programs are being
managed and whether or not agencies are developing greater contacts
outside government.
Christopher T. Hill
Vice Provost for Research
Professor of Public Policy and Technology
George Mason University
(November 29, 2000)
Topic: Economic Growth, Technology and Technology Policy
Why do economies grow? What nurtures economic growth? What is technology,
and why should there be technology policy, and for that matter, what
is technology policy? These are some of the questions posed by Dr.
Christopher Hill, Vice Provost for Research and Professor of Public
Policy and Technology at George Mason University. In addressing these
questions, Dr. Hill drew from 25 years of experience researching and
creating science and technology policy in institutions such as the
Rand Institute, the Congressional Research Service, the Office of Technology
Assessment, the Massachusetts Institute of Technology, and Washington
University in St. Louis, Missouri.
Once upon a time, growth was tied to the growth in population. New
population required new land, which was obtained by clearing forests
to create farmland, or by subjugating neighboring people and appropriating
their territory, or by exploration and appropriation of newly discovered
lands. Modern economic growth did not begin until an appropriate social/political
framework had developed. This framework included a strong contract
system backed by a means of enforcement, political stability, and a
cultural mindset that supported change and was not rooted in acceptance
and endurance of the status quo. As the elements of this framework
developed, the basis of economic growth shifted from geographic expansion
to the development of better ways to do things. Technological change
became paramount.
The contribution of technology to economic growth is what's left over
when the contributions of other elements -- capital, labor, land and
so on -- have been determined. It is usually calculated to be about
50 percent, but measuring the contribution directly has proven to be
difficult. Technology means different things to different people; new
products and processes to some, and increased efficiency and ways to
reduce cost to others. Increased efficiency can have widespread effects,
because it frees up resources to be used somewhere else. The downside
of increased efficiency was seen in the Great Depression, when the
technological development of the 1920's made significant portions of
the workforce redundant. A necessary adjunct to increased efficiency
is a mobile workforce. Mobility is deeply ingrained in the American
workforce because of a long history of moving to find a better life
somewhere else. In Europe, mobility has been historically discouraged
in favor of loyalty to the employer, and growth has lagged behind that
seen in this country. Even here, however, mobility in the labor force
is inhibited by a general failure to adequately train and/or reeducate
the workforce.
Government involvement in technology and technology policy goes back
to at least World War II, when large amounts of money were put into
research in support of the war effort. The Cold War justified continued
large expenditures on national defense. Vannevar Bush determined that
the government should support research in universities without trying
to direct what research should be done. Bush's concept was that pure
research would provide the maximum opportunity for developing new technologies.
Government spending on research is still very much driven by a belief
in pure research, although new intellectual property laws were enacted
in 1980 and the years following to help push the results of this pure
research into a commercial development pipeline so that the technology
would benefit society and the economy.
Government activity since World War II has varied greatly from one
Administration to another. In the mid-1960's, the emphasis was on identifying
and aiding what were seen as lagging sectors. Politically the program
did not succeed, but it did result in the first real government analysis
of technology. President Nixon, concerned with a perceived decline
in technology, commitment and leadership, commissioned a report that
led to a National Science Foundation program to study technology policy,
as well as a program to promote technology development. The terrible
economy during the Carter Administration resulted in yet another study
of technology policy, this time integrating technology policy with
antitrust, environment and regulatory policy. A number of recommendations
from this study were put into place during the Reagan administration,
including the changes in the patent laws alluded to above. Technology
steadily increased in importance during the 1980's, driven at least
partly by competition from Japan and Europe. The Clinton years were
characterized by a booming economy coupled to a desire to reduce the
national deficit. The result was a decreased emphasis on technology
programs. The end of the Cold War and its concomitant reduction in
the importance of national defense as a technology driver contributed
to the decreased emphasis on technology spending.
It is not clear what technology policy the new Bush Administration
will develop or follow. Programs that support applied research and
development, such as the Advanced Technology Program and the Natural
Gas Vehicle Technology Program, are likely to die.
Michael MacCracken
Executive Director
National Assessment Coordination Office
U.S. Global Change Research Program
(December 6, 2000)
Topic: Process of Generating and Content of the First National Assessment
of Climate Change Impacts on the United States
Dr. Michael MacCracken has an extensive background in climate modeling
and the science of climate change, involving seven years with the U.S.
Global Change Research Program (USGCRP) and a long-term career in atmospheric
sciences at Lawrence Livermore National Laboratory. The depth of his
scientific background was evident as he began to explain to us the
scientific evidence for global warming, and how changes caused by global
warming might affect society.
A number of questions must be addressed in the effort to understand
climate change and its impacts: Can we see if the climate is changing?
Can we determine if it is due to human activities? What are the effects
of these changes? What will be the impacts on society and the environment?
What are the consequences? And, what should be done, if anything, in
response to this knowledge? Looking specifically at the issue of energy
derived from fossil fuels and the associated increase in atmospheric
carbon dioxide, a greenhouse gas, Dr. MacCracken walked us through an analysis
of each question.
Dr. MacCracken also spoke of the enormous international effort underway
to develop a global scientific consensus on climate change. He believes
that arriving at a consensus will help generate within the international
community collective action to put in place effective international
environmental policy. The Intergovernmental Panel on Climate Change
(IPCC) has approximately 150 member countries involved in evaluating
cutting-edge climate change science. The IPCC periodically produces
reports on the state of our understanding of various aspects of climate
science.
The United States chairs the IPCC working group on impacts and adaptation.
This led naturally to the USGCRP congressional mandate to produce the
first national assessment. This assessment, performed through public-private
partnerships between the ten USGCRP member federal agencies and regional
and sectoral stakeholders throughout the country, has produced volumes
of detailed work looking at all aspects of how climate change might
affect life in the United States, and how adaptation and mitigation
strategies might be employed. A lengthy synthesis report and a brief
overview report summarizing key findings were published. The overview
report, which Dr. MacCracken kindly distributed to us, has been mailed
to all Members of Congress.
Alfred R. Berkeley III
Vice Chairman
Nasdaq Board of Directors
(December 13, 2000)
Topic: How Science Research Affects the Economy and the Nasdaq
Mr. Alfred Berkeley began by convincing us of his technical expertise.
We persuaded him to discuss the finer points of leading a cow up the
stairs and onto the top of the Rotunda at the University of Virginia,
an exploit, now famous, that he acknowledges from his student days.
Mr. Berkeley has an extensive financial as well as technological background,
and as such, has a very interesting perspective on the role of technology
in the markets. As President of Nasdaq from 1996 to 1999, and now,
as Vice Chair of the Board of Directors, Mr. Berkeley presided over
the world's first and still the premier electronic stock market. Nasdaq,
since its 1971 inception, has set a precedent for technological trading
innovation that is unrivaled. It is now poised to become the world's
first truly global market, and is the market of choice for business
industry leaders worldwide.
Starting with the history of markets, Mr. Berkeley reminded us that
without the development of economic markets, we would still be farmers,
living subsistence lifestyles. The U.S. stock market performs an essential
role in making our economy the envy of the rest of the world. AT&T
made a landmark decision several decades ago, which has made our stock
market significantly different from other world markets. They decided
to price their shares low enough to allow essentially anyone to own
stock. These days, as a result of the U.S. market going in this direction,
nearly half of all U.S. families own stocks or equity-based mutual
funds, whereas in Europe, Japan and other developed markets, only a
small percentage own equity. The U.S. Securities and Exchange Commission
also ensures that information is available to investors freely and
fairly, making it possible for investors to make informed choices.
The Nasdaq brought the stock market to a new level of efficiency with
the advent of on-line quotes and trading. Electronic markets help ensure
transparency and liquidity for a company's stock while maintaining
an orderly market, functioning under tight regulatory controls. By
providing an efficient environment for raising capital, Nasdaq has
helped thousands of companies achieve their desired growth and successfully
make the leap into public ownership. Mr. Berkeley told us that delegations
from other countries frequently visit the Nasdaq in order to learn
how to develop similar markets for their own countries. Most other
countries have economies that are significantly more debt-based. Mr.
Berkeley believes that to survive in the long term, they must develop
equity-based economies more like the United States.
Several trends in the market are troubling to Mr. Berkeley. He is
particularly frustrated over networks making the financial news into
entertainment, like sports. The current emotion-laden reporting of
the financial world probably encourages gambling on stocks and adds
unhealthy volatility to the market. People aren't investing for long-term
productivity and growth, but are trying to beat the system on the day-to-day
fluctuations. Thus, industries that are necessary and add value (e.g.,
shipbuilding) have a difficult time in today's economy because they
are not in the exciting sectors that investors like. Difficulty in
attracting investors makes their cost of capital very high, forcing
consolidation and depressing industries that should be viable.
Mr. Berkeley also said that increasing pressure for profit, as stable
domestic companies compete in the global economy, has made it impossible
for firms to invest as before in research and development. Too much
shareholder pressure to plow profits into immediate returns on investment
makes long-term research investment essentially impossible. However,
since sustained economic growth requires continued basic research to
fuel advances, Mr. Berkeley thinks we must collectively (i.e., federal
labs and the private sector) invest in research. The role of government-funded
research will be even more critical in the global economy.
Site Visit to the David Taylor Model Basin, Carderock Division,
Naval Surface Warfare Center
(December 20, 2000)
The David Taylor Model Basin (DTMB) of the Naval Surface Warfare Center
(NSWC) is one of the largest, most advanced facilities in the world
for building and testing ship models. It is also one of the oldest.
It was the "brainchild" of a brilliant young naval engineer and ship
designer, [the future] Rear Admiral David Watson Taylor. Born in Louisa
County, Virginia in 1864, Admiral Taylor was successful in convincing
Congress of the importance of towing tanks to support model testing
in the design and construction of warships. In 1896, Congress appropriated
$100,000 for the construction of an Experimental Model Basin (EMB)
at the Washington Navy Yard on the Anacostia River. Admiral Taylor
designed and supervised the EMB's construction and for 15 years was
the facility's supervisor. By the standards of the time, the EMB was
state-of-the-art and evolved over nearly a century to remain so today.
The original Navy Yard EMB, completed in 1898, had a carriage driven
by four 450-horsepower motors. This carriage towed model hulls and
mounted photographic equipment so that engineers and naval architects
could study the generation of eddies and wave-making resistance and
their impact on vessel efficiency. The EMB itself was 14 feet deep,
42 feet wide and 470 feet long -- the longest in existence. The basin
was filled with 1,000,000 gallons of water -- about half its total
volume. A grillwork of wooden strips gradually descended below the
basin's water level to form a beach that served as a wave break to
smooth the water. A similar device is still employed at the Carderock
facility nearly 90 years later, a testament to the quality of Admiral
Taylor's experimental design. During the
period 1914 to 1922, more than 1,000 ship designs were tested including
all Navy ships and many civilian vessels. The testing of commercial
ship designs established a precedent that endures today. The DTMB is
a national engineering resource that is available on a fee-for-service
basis to commercial shipbuilders and naval architects. Interestingly,
during our visit, the U.S. America's Cup Team was conducting "classified" tests
of hull designs for their latest 12-meter racing yacht. Unfortunately,
neither our Carderock host, Mr. Tom Warring, nor any of us had the
necessary security clearances to visit the America's Cup facility
(notwithstanding the fact that at least three of us are avid yachtsmen).
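As an aside, the basin figures quoted earlier for the original EMB (14 feet deep, 42 feet wide, 470 feet long, filled with 1,000,000 gallons, said to be about half its total volume) can be sanity-checked with a few lines of arithmetic:

```python
# Sanity check of the EMB figures quoted above: a basin 14 ft deep,
# 42 ft wide, and 470 ft long, holding 1,000,000 gallons of water,
# described as "about half its total volume".
GALLONS_PER_CUBIC_FOOT = 7.48052  # US gallons in one cubic foot

depth_ft, width_ft, length_ft = 14, 42, 470
total_volume_ft3 = depth_ft * width_ft * length_ft
total_volume_gal = total_volume_ft3 * GALLONS_PER_CUBIC_FOOT

fill_gal = 1_000_000
fraction_filled = fill_gal / total_volume_gal

print(f"Total capacity: {total_volume_gal:,.0f} gallons")
print(f"1,000,000 gallons is {fraction_filled:.0%} of capacity")
```

The fill comes out to just under half of the roughly two-million-gallon capacity, consistent with the account above.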
Admiral Taylor also designed and supervised construction of the Navy's
first wind tunnel at the EMB. Thus, the laboratory took its (and perhaps
the Nation's) first step toward institutionalized aeronautical research
and development.
By 1929, it became apparent to both Admiral Taylor and the EMB's
Officer-in-Charge at the time, Captain Ernest F. Eggert, that the facilities
were no longer adequate due to increasing demands placed on the laboratory
for both ship and aeronautical testing. In addition, the EMB's proximity
to the Anacostia River made the basin subject to periodic flooding,
and the action of natural springs was undermining the basin's sandstone
foundation. These were facts that David Taylor fully realized at the
time of the original design effort. From 1929 through 1933, Admirals
George H. Rock and Emory S. Land, successive Chiefs of the Navy Bureau
of Construction (after David Taylor) considered the problem of building
a new facility. In 1933, Admiral Land began an aggressive lobbying
effort to enlist the support of politicians, federal agency executives,
the scientific and engineering communities, and the commercial shipbuilding
industry to build a new basin as a public works project. As a senior
advisor, Admiral Land relied upon his mentor, Admiral Taylor, who had
suffered partial paralysis in 1932. By the end of 1933, President Roosevelt
signed the Model Basin Bill with the new facility to be sited at Carderock
and named in honor of Admiral Taylor. Groundbreaking took place on
September 8, 1937 and the dedication of the David Taylor Model Basin
was held on November 4, 1940, attended by Admiral Taylor, his wife
and daughter, and many professional colleagues.
Captain Eggert, together with Captain Harold Saunders, worked on the design details
for the new facility. The requirements included the following:
- Minimal noise and ground vibration.
- Easy access to Navy's District of Columbia Offices.
- Doubling of the facility's work area (over the previous site).
- Firm bedrock foundation for the basins (the actual site is on a
granite shelf above the Potomac River).
- Multiple basins designed for varying specific functions.
- Basins sufficient to tow a model for eight seconds at constant
speed.
- Adequate fresh water supply for filling basins.
Needless to say, the DTMB has undergone many improvements in instrumentation,
facilities, and experimental capabilities since 1940, and will no doubt
continue to do so in the future. As its Commanding Officer, Captain
Steven Petri, and its Technical Director, Mr. Richard E. Metrey, pointed
out: "The lab [DTMB] is unquestionably unique in the world. It is an
important national resource for both the Navy as well as the civilian
maritime industry. Although the number of capital ships in the Navy
has decreased over the years, and military shipbuilding occurs at a
low rate, the DTMB is vital to maintain the country's technological
capability to produce warships should they be needed in the future.
If the laboratory were to cease operations and eventually be eliminated,
an irreplaceable body of scientific and technical knowledge would disappear
along with it." In order to maintain stewardship of this knowledge,
the DTMB is considering expanding its fee-for-service offerings to
the private maritime industry. Additionally, the lab's technical staff
is routinely involved in cutting-edge basic and applied research and
development projects through CRADAs with both universities and industrial
organizations to advance the technical knowledge base in ship design,
propulsion systems, hydraulics, computer-aided design and fluid mechanics.
Richard E. Bissell
Executive Director
Committee on Science, Engineering, and Public Policy
National Academy of Sciences
(January 10, 2001)
Topic: Mission, Goals, and Science Policy of the National Academy
of Sciences
Dr. Richard Bissell joined the National Academies in June 1998 as
Executive Director of the Policy Division and concurrently as the Director
of the Committee on Science, Engineering, and Public Policy. His most
recent assignment was head of the organizing secretariat of the World
Commission on Dams, a joint initiative of the World Bank and the World
Conservation Union. During 1994-1997, he was a founding member and
chair of the Inspection Panel, an independent accountability mechanism
established by the Executive Directors of the World Bank to ensure
compliance with World Bank policies by its management. Between 1986-1993,
Dr. Bissell was Assistant Administrator at the U.S. Agency for International
Development, first as head of the Bureau of Policy and Program Coordination,
and later, as head of the Bureau of Research and Development. He received
his B.A. from Stanford University and his Ph.D. from Tufts University.
The National Academy of Sciences (NAS) was established by an act of
Congress in 1863. The function of the NAS is to give advice on scientific
matters, and it is often called upon by government agencies to advise
on scientific questions of national interest. The NAS receives no compensation
for its services and is able to attract the best experts in the world
to offer their services free of charge; however, expenses are covered.
Most countries restrict their academies to the physical sciences, but the
U.S. NAS includes the social sciences as well.
The National Academies are: (1) the National Research Council, (2) the
National Academy of Sciences, (3) the National Academy of Engineering, and
(4) the Institute of Medicine. There are 1,927 members of NAS and 91
emeritus members, along with 317 foreign associates. The membership is
organized into 25 (soon to be 31) sections grouped in six classes. Members
and foreign associates are elected annually in April.
The four priorities of NAS are to: (1) increase the effectiveness of the
scientific enterprise; (2) improve public understanding of science and
the scientific enterprise; (3) improve science and mathematics education
for all Americans; and (4) bring the wise use of science and technology
to the center of national and international efforts.
Internationally, NAS collaborates with foreign National Academies.
Worldwide, there are 74 national science academies. The Inter-Academy
Council allows NAS to enlist other national academies for studies on
global issues, such as mad cow disease.
The President of NAS, Dr. Bruce Alberts, serves five years and is
also head of the National Research Council. The NAS website address
is www.nas.edu. All NAS reports are on the website and available for
use by the public.
The current challenge for NAS at the grass-roots level is the issue
of biotechnology and genetically engineered food. The use of stem cells
for research is another.
How is NAS influenced by political pressure? The charter's perspective
is that NAS provides absolutely objective advice and conclusions
on all issues that it undertakes.
There have been instances of issues that have become politicized and
in which NAS has been asked to take a position. An example of this
would be the Government Performance and Results Act (GPRA). Since there
is no quantitative outcome to GPRA, NAS dealt with the issue by focusing
on getting the two sides to talk to each other about the positive aspects
of GPRA. Another example of a politicized issue was the 1992 budget.
The NAS stepped in to help shape the federal expenditure for research
because research, by its very nature, cannot be done from year to year
but must be done on a long-term basis. People with conflicts of interest
do not serve on committees at NAS. In the area of teaching, non-competitiveness
is becoming a problem. NAS has offered a legislative proposal creating
incentives for math and science graduates who do not want to go the
academic route to return and teach K-12. There is also a drop-off in
interest among girls between grades 5 and 7, and NAS is looking at ways
to address this problem.
Examples of NAS involvement include: (1) reshaping the graduate education
of science and engineering, (2) mentoring/advising (e.g., teacher,
role model, friends), (3) enhancing the postdoctoral experience for
scientists and engineers (e.g., paying them more). [There are more
graduates than there are academic positions.]
A study by NAS typically starts with someone calling up and saying, "we
need a study on . . ." This accounts for 45 percent of NAS studies. In
other cases, a member of Congress inserts the request into an appropriations
bill; this accounts for five percent of NAS studies. NAS may also initiate
a study of broad interest on its own; an example is a 1992 study, which
concluded that government should fill science positions with highly
qualified people. Lastly, private foundations and states can request a
study. When a request is made by a state, NAS proposes a budget for the
study and the state funds the research. Examples of recent reports
are: (1) science education for
establishing an education standard for science, and (2) experiments
in international benchmarking of United States business methods in
the fields of immunology, material science, and mathematics.
NAS also started the science, technology, and law program two years
ago. NAS works closely with the judiciary because of recent decisions
recognizing the use of state-of-the-art science methods in courtrooms.
Other topics studied by NAS are food and nutrition, assessing genetic
risks, pest-protected plants, and arsenic in drinking water.
Kathie L. Olsen
Chief Scientist
National Aeronautics and Space Administration (NASA)
(January 17, 2001)
Topic: NASA's Research Directions
Dr. Kathie Olsen, a biologist, was selected in 1998 as NASA's Chief
Scientist primarily to lead NASA in the development of an integrated
biology plan, coordinating all aspects of NASA's research in fundamental
biology, astrobiology, and biomedical research as NASA transitions
into the era of the International Space Station (ISS). Dr. Olsen's
background includes an early career as a research scientist and Assistant
Professor at the State University of New York at Stony Brook, several
years as a research program manager with the National Science Foundation,
and a two-year assignment to the office of Senator Conrad Burns of
Montana, Chair of the Senate Subcommittee on Science, Technology and
Space. This broad experience equipped her to lead NASA in developing
a robust plan for biological and biomedical research, which ultimately
resulted in the creation of NASA's fifth strategic enterprise, the
Office of Biological and Physical Research. Dr. Olsen doubles as the
temporary leader of this new organization, until a permanent senior
administrator can be selected.
Dr. Olsen introduced NASA's science as four space-related research
thrusts: (1) From Space -- understanding the Earth using satellite
remote sensing; (2) About Space -- solving the mysteries of the universe
and discovering if we are alone; (3) In Space -- using the microgravity
environment of space to do fundamental and biomedical research; and
(4) Enable Space -- technology development to enable human and robotic
exploration of space. The new strategic enterprise, which she temporarily
leads, concentrates on the In Space opportunities of fundamental research
in the physical and biological sciences, and on biomedical research
and countermeasures. Part of this research, of critical importance
to NASA for its mission of human space exploration, is research directed
toward the health and safety of the astronauts.
There are a number of major health and safety issues facing astronauts
involved in long duration space flight. In the absence of gravity, muscles
atrophy and bone mass decreases. The exercise programs developed to
help prevent this, and tested on the Shuttle and on Mir, have not been
sufficient to halt the processes. Radiation is a major problem. The
radiative environment on the ISS or during an interplanetary mission
presents the risk of extensive tissue damage. Countermeasures must
be developed to observe and correct damage before health problems develop.
The vestibular system is disrupted in the absence of gravity, causing
the astronauts to experience motion sickness during the first few days
of any flight, and the circadian rhythms are disrupted, making sleep
difficult. Gene expression changes and neuro-plasticity have been observed
in animals exposed to the microgravity environment. We don't understand
the process of evolving in the absence of gravity, or what this could
mean for very long-term human presence in space.
Possibly the most challenging astronaut health and safety issues involve
human factors. Astronauts will face long tours of duty in cramped quarters,
lacking privacy, in unfamiliar situations, requiring peak performance
while experiencing multiple stresses. Dr. Olsen related that astronauts
have historically become astronauts because they are aggressive, highly
competitive risk takers -- not the type of people generally known for
excellent interpersonal skills. While a five- to ten-day flight can
be successful, even if the crew is competitive and tense with each
other, a multi-month or multi-year mission will not succeed if interpersonal
issues and team dynamics are not satisfactorily addressed.
Dr. Olsen concluded by reminding us of the importance of the involvement
of scientists and engineers in public policy. The number of scientists
and engineers in Congress or working on Capitol Hill is woefully small,
as is the number of practicing scientists or engineers willing to become
engaged in public policy. However, our country needs people who understand
both perspectives and live in both worlds. To prove the point, she
related hearing an elected representative on Capitol Hill questioning
why the government needed to fund NOAA, since one can always simply
turn on The Weather Channel to find out the forecast! She said that
our involvement is sorely needed and that the ComSci Fellowship is
providing the training we need to be able to influence policy and make
a difference.
Robert L. Buchanan
Senior Science Advisor
Food and Drug Administration
(January 24, 2001)
Topic: Food Safety Issues
Dr. Robert Buchanan began his presentation with an overview of the Food
and Drug Administration (FDA). At its core, FDA is a science-based, public
health agency. However, it is also a medically oriented regulatory agency.
FDA's primary duty is the pre-approval of products on the marketplace.
It is one of the oldest consumer protection agencies in the world and
in many ways represents the "gold standard." FDA is actually made up
of seven different centers based on whether their activities are pre-
or post-market and the type of activity they monitor.
- Center for Food Safety and Applied Nutrition
- Center for Veterinary Medicine
- Center for Drug Evaluation and Research
- Center for Biologics Evaluation and Research
- Center for Devices and Radiological Health
- Office of Regulatory Affairs
- National Center for Toxicological Research
Pre-market activities involve approval of food or drugs and make up
the bulk of work for most of the centers. Post-market activities involve
inspection of facilities, lab analysis to identify if adulterated food
was produced and introduced into interstate commerce and evaluation of
the implementation of regulations. Dr. Buchanan's center, the Center
for Food Safety and Applied Nutrition, concentrates on post-market activities.
Stepping back to the larger picture of agencies that monitor United
States food and safety regulations and laws, there are three main agencies
involved: (1) FDA; (2) U.S. Department of Agriculture, Food Safety and
Inspection Service, and the (3) U.S. Environmental Protection Agency.
These agencies are given their authority by Congress. None has strong
enforcement powers, so they depend more on providing proactive guidance
and finessing the rest. There are four regulatory actions FDA can take:
(1) recommend a voluntary recall; (2) seize any adulterated product;
(3) enjoin the manufacturer against further production; and (4) assess
civil and criminal penalties for willful introduction of adulterated
products. When thinking about regulation, it is important to stress that
the final responsibility for producing safe food lies with the organizational
components of a product's manufacturing and distribution chain.
In Dr. Buchanan's opinion, good public health policy is a combination
of good science and good law. One aspect of good science that leads to
good public health policy is the process of soliciting comments from the
public on pre-market approvals before undertaking any new policy. The "public" in
this case consists of consumer groups, the industry affected, even other
federal agencies, etc. It is critical to both the good science and the
good public policy aspects that all these people and the groups they
represent have a chance to weigh in on the issues. In another example
of external feedback honing policy, the public also has the right to
petition that regulations be changed, and the FDA must respond to those
concerns.
The Center for Food Safety and Applied Nutrition has current activities
in a wide variety of areas. For instance, in the area of pre-market approvals,
the agency is involved in the irradiation of foods and genetically modified
organism (GMO) debates. The irradiation debate centers on the safety
of irradiating food to kill organisms. Dr. Buchanan explained that
the use of radiation is treated as a food additive by law. As a food
additive, it must be proved, with a high degree of certainty, to be a
safe process before FDA will approve it. The genetically engineered corn
that found its way into taco shells was a case of adulterated food, since
the protein engineered into the corn is regulated as a pesticide and had
not been approved for human consumption. In the area of nutrition, the
agency is reviewing
infant formula and examining the use of "probiotics" in food. Probiotics
are used to add good bacteria to foods. FDA recently issued a public
health advisory on the levels of methylmercury in fish and published
an assessment and action plan in the Federal Register that pertains to
Listeria monocytogenes risk. In addition, FDA is examining the health
claims made by the multitude of dietary supplements on the market.
Finally, Dr. Buchanan talked about how FDA assures that the "best science" standard
is met. This standard is maintained through three mechanisms: (1) the
existence of a qualified cadre of scientists on staff for advice and
review; (2) procedural requirements that are clearly outlined and rigorously
followed; and (3) the extensive use of "advisory" committees. In addition,
he stressed that FDA does not promote any single technology. Rather the
agency seeks to write regulations around a "performance standard." This
ensures that the regulations do not stifle technological innovation,
but encourage it.
Site Visit to the Bureau of Engraving and Printing
(January 31, 2001)
Mr. Thomas A. Ferguson was appointed as the Director of the Bureau of
Engraving and Printing on December 7, 1998. Prior to his appointment
as Director, he served as the Deputy Director. Mr. Ferguson holds a B.A.
in economics from Lafayette College in Easton, Pennsylvania and a Masters
degree in Public Administration from the University of Southern California.
The Bureau of Engraving and Printing (BEP) comprises two facilities, one
in the District of Columbia and one in Fort Worth, Texas. The BEP produces
all currency, government securities, and more than half of the Nation's
postage stamps. The BEP has 2,600 employees and is a self-funded agency. It
is funded in a novel way -- by selling currency to the Federal Reserve.
In FY 2000, sales accounted for 520 million dollars in revenue. BEP does
not operate to make a profit -- simply to recover costs.
Current issues with the Bureau are: durability (e.g., polymer substrates
are now being used in notes, and have been found to last four to five
times longer than the traditional notes); counterfeit deterrence; and
making it easier for the visually impaired and the blind to use currency.
The average life of a one-dollar note is 18 months. For a five- and ten-dollar
note, the average life is three to five years. Fifty and one hundred
dollar notes last for a long time.
Copiers are now made so that they do not copy or print currency. For
the last three to four years, the big developing problem has been
counterfeiting with computer ink-jet printers and digital cameras. The
punishment for the crime of counterfeiting is on a graduated scale, and is based
on how much currency is actually produced. A panel was put together by
the National Academy of Sciences and their report led to the new twenty-dollar
bill. A lot of money has been spent on public education in the last five
years, but people don't seem to pay attention. The new currency has held
up very well against counterfeiters. The cost to print a one-dollar note
is 3.8 cents, and the cost to mint a one-dollar coin is 12 cents. The cost
of printing a one-hundred-dollar note is 5.82 cents. About 60 percent of U.S.
currency is outside of the United States. The law (1980's Conte Amendment)
requires BEP to buy its paper products from an American company, which
manufactures its products in the United States, and which is owned by
a U.S. citizen. Research is funded on an as-needed basis, which amounts
to about two million dollars a year. Johns Hopkins' Applied Physics Lab
is funded by BEP. BEP is never the first to use new technology, instead
they wait until it is proven to work because of the high volume of currency
in circulation, which would make it difficult to replace. By statute,
BEP only prints for U.S. agencies. They would like to print for foreign
governments and are awaiting legislation to that effect.
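The cost and lifetime figures quoted above invite a rough durability comparison. The sketch below annualizes them; note that the 30-year coin lifetime is an illustrative assumption, not a figure from the briefing:

```python
# Back-of-the-envelope comparison using the figures quoted above:
# printing a one-dollar note costs 3.8 cents and the note lasts about
# 18 months; minting a one-dollar coin costs 12 cents. The coin's
# 30-year lifetime is an assumed value for illustration only.
note_cost_cents, note_life_years = 3.8, 1.5
coin_cost_cents, coin_life_years = 12.0, 30.0  # lifetime assumed

note_cost_per_year = note_cost_cents / note_life_years
coin_cost_per_year = coin_cost_cents / coin_life_years

print(f"Note: {note_cost_per_year:.2f} cents per year in circulation")
print(f"Coin: {coin_cost_per_year:.2f} cents per year in circulation")
```

Under these assumptions the coin's higher up-front minting cost is amortized over a far longer service life, which is one reason durability (e.g., the polymer substrates mentioned above) matters to the Bureau.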
Tina Sung
President and CEO
American Society for Training and Development (ASTD)
(February 7, 2001)
Topic: Quality and the 21st Century Leader
Ms. Tina Sung became the ASTD President and CEO in January 2000. A community
of dedicated professionals, ASTD acts as a catalyst, empowering members
and customers around the world to continuously improve and advance workplace
learning and performance using an unmatched global network of resources
and expertise. Before she took over the leadership of ASTD, Ms. Sung
directed the Federal Quality Consulting Group, a team that provides consulting
services to executive level government leaders, co-founded the Federal
Quality Institute, was a senior examiner for the Malcolm Baldrige National
Quality Award, and was part of former Vice President Albert Gore's National
Partnership for Reinventing Government.
Ms. Sung presented various theories of effective leadership. She spoke
of the need to master content for effective leadership, which is very
much situational. Different quality principles apply to the private and
government sectors because the output timelines differ between
the two. Ms. Sung discussed performance measurement tools such as the
Baldrige performance evaluation system and GPRA for improving quality
and increasing accountability in government, where a customer satisfaction
index was introduced at the macro level in 1993. The Baldrige Performance
Excellence Criteria Framework is built on a systems perspective and includes:
1) leadership, 2) strategic planning, 3) customer and market focus, 4)
information and analysis, 5) human resource focus, 6) process management,
and 7) business results. Ms. Sung noted that high performing organizations
pay attention to these criteria. She discussed the use of a balanced
scorecard by federal agencies as a means of providing the necessary linkage
between results achieved and resource input.
In discussing the trends in 21st Century leadership, Ms. Sung told us
about the theory that aspiration or desperation drives all actions, with
80 to 90 percent of actions driven by desperation (attributed to Peter
Senge). However, she stated that aspirations are desirable for more sustainable
actions. The interrelationship of aspiration, behavior, and circumstances
(ABC model) was discussed. Ms. Sung also discussed the seven trends to
be successful leaders -- (1) be meaningful (boiled down information),
(2) be global and local (tension between regional and international,
create emotional bond that allows people to give back willingly), (3)
be transparent (era of surveillance, seek customer participation), (4)
be diverse, (5) be intergenerational (develop appreciation for different
driving forces), (6) be adaptive and flexible, and (7) be curious. (From
Working Draft, Institute for Alternative Futures)
Frank J. Cilluffo
Deputy Director
Global Organized Crime Program
Center for Strategic and International Studies
(February 14, 2001)
Topic: Cyber Attacks and Privacy Issues
Mr. Frank Cilluffo works for the Center for Strategic and International
Studies (CSIS) based in Washington D.C. He is a Senior Policy Analyst
and Deputy Director of the Global Organized Crime Program. While most
of Mr. Cilluffo's discussion focused on cyber crime, he also drew our
attention to a new report on "bio-terrorism" that CSIS had recently released.
The report highlighted the need to integrate the public health and the
law enforcement communities to effectively identify and respond to bio-terrorism.
Currently, most cyber-crime is really cyber-vandalism as opposed to
organized terrorist attacks. However, what is truly frightening is that
the same tools used in cyber-vandalism can easily be adapted to cyber-terrorism.
Several of the recent attacks were alarmingly terrorist-like. For instance,
a youth disabled the Federal Aviation Administration's air traffic control
computers in a mid-sized city in Massachusetts and another youth interrupted
911 communications in a South Florida community. Mr. Cilluffo said that
an immediate focus is needed on critical infrastructure protection and
vulnerability.
One of the major problems with improving Internet security is that the
Internet was never designed to be secure. It grew out of a government-sponsored,
data sharing environment that was meant to be open. However now that
everyone is using the Internet, Internet security is not just a government
problem and it will take a government/private sector partnership to address
it.
Mr. Cilluffo identified some problems that law enforcement agencies
face when working on Internet crime. First, there are no state or
national boundaries in cyber-space, so it is hard to tell whether an
agency is outside its jurisdiction. Second, the rules of engagement are
just now being drafted.
The United States needs leadership in three main areas to address this
challenge. First, we need leadership in technical developments to improve
security. Second, we need people to step forward and make the partnerships
and technical innovations a reality. Finally, we need new policies to
help guide us in managing the risks associated with cyber-terrorism.
In order to obtain this new level of leadership, we must have an increase
in education and awareness of risk.
To summarize, Mr. Cilluffo identified several steps the government can
take: (1) institute a Cyber 911; (2) create the position of presidential
advisor in the area of technology; (3) concentrate on retention of technical
employees (once these professionals are trained, they typically seek
employment in the private sector); (4) create an information assurance
discipline; (5) protect private companies who work with law enforcement
from Freedom of Information Act regulation; (6) encourage insurance companies
to underwrite coverage for companies; (7) create standards for Internet
security that each government agency must meet; (8) fund architectural
changes in the Internet; and (9) spend money to encourage all these activities.
Site Visit to the Ronald Reagan National Airport
(February 20, 2001)
Mr. Michael Chinn of the Ronald Reagan National Airport Engineering
Group and Airways Facility, and Mr. Robert Laser, Manager for Air Traffic
Control, hosted an excellent briefing and tour for us, ComSci alumni,
family and friends.
Fast facts on Reagan National Airport -- the airport opened in June
1941; the airport has three runways (the longest is 6,869 feet); and
there are 44 gates serving the flying public. The airport estimates that
it served 15.9 million passengers in 2000. That's more than 42,000 passengers
a day!
We divided into groups to tour the control tower, the automated radar
tracking control center, and the computer room. Senior Air Traffic Controller
Russ Adams led one of the groups. During our tour of the air traffic
control tower, he informed us that on average 60 airplanes land and depart
every hour. He stated that on a clear day, controllers have about a three-mile
visual of the planes from the tower. We could not help but notice the
panoramic view of the local area from the tower. We agreed that the tower
is one of the best vantage points from which to view the Washington area.
While in the tower, we viewed one of its newer technological features
-- the alphanumeric radar tagging system. This system allows the air
traffic controllers to call up each incoming and outgoing flight number
from the radar screen in a timely manner for communication purposes.
As we descended the tower, Mr. Adams pointed out a window and alerted
us to an approaching Airbus A320 coming in for landing. With his trained
eye, he noted that the aircraft's angle of descent looked too great for
it to land. We watched as the Airbus began its descent. Then suddenly
when it was about 200 yards from the tarmac, the pilot went to full throttle
and pulled up and out of the landing. Mr. Adams noted that the pilot
would have to fly about 13 miles out and then get back in line to land.
He said that this type of aborted landing is fairly common.
From the tower, we went to the control room of the automated radar tracking
system. Here the air traffic controllers assist aircraft on their approach
and departure from Reagan National Airport. The room is quite dark enabling
controllers to easily view their radar screens. On incoming flights,
the controllers assist the aircraft until they are handed off to the
controllers in the tower. Each of us had the opportunity to sit down
and take a close look at the radarscope. We were informed that the controllers
at Dulles, BWI, and Reagan National work each other's air traffic.
Our last stop was the large computer room that runs the air traffic
control center. We were amazed at how the new and old computer systems
were able to work in harmony to provide safe and efficient flying at
Ronald Reagan National Airport.
Visit to the National Institute of Standards and Technology (NIST)
-- VIP Industry Day
(March 6, 2001)
With ten laboratories (called Operating Units or "OUs"), the Advanced
Technology Program (ATP), the Manufacturing Extension Partnership (MEP)
Program and nearly 3,000 employees, NIST is a lot to see, much less appreciate,
during one three-hour visit.
NIST's story began during the summer of 1899 when Secretary of the Treasury
Lyman J. Gage initiated a campaign within the McKinley Administration
for the creation of a national "standardizing" laboratory within the
Federal Government. Interestingly, in 1899, following Great Britain's
establishment of its National Physical Laboratory, the United States
remained the only great commercial [and rapidly industrializing] Nation
without a national standards laboratory. Late in October 1899, Secretary
Gage appointed Samuel W. Stratton, a 38-year-old professor of Physics
at the University of Chicago, to the nominal post of "Inspector of Standards." Dr.
Stratton's immediate task was to research the requirements for such a
national laboratory and to draft a bill for its establishment suitable
for introduction into Congress. Dr. Stratton did his work well. His report
with the draft bill contained the "overwhelming" endorsements of the
leading organizations of America's scientific and engineering community.
Included among these endorsements were those of the National Academy
of Sciences, the American Philosophical Society, the American Association
for the Advancement of Science, the American Physical Society, the American
Chemical Society, and the American Institute of Electrical Engineers.
Additionally, the Bill was supported without reservation by leading scientists,
every scientific agency in the Federal Government, and leaders from virtually
every field of industry. Congressman James Southard of Ohio, an enthusiastic
champion of the Bill said, "never has a bill come with such a number
of endorsements." The draft legislation was included in a letter from
Gage to Congress on April 18, 1900 and enacted into law on March 3, 1901
with an effective date of July 1 of that year. The laboratory was to
be called the National Bureau of Standards.
It is also interesting to note that the present NIST mission is essentially
unchanged from that proposed by Gage in his April 18, 1900 letter to
Congress:
- "The functions of the bureau shall consist in the custody of the
standards";
- "The comparison of the standards used in scientific investigations,
engineering, manufacturing, commerce, and educational institutions
with the standards . . . recognized by the Government";
- "The construction when necessary of standards, their multiples and
subdivisions";
- "The testing and calibration of standards-measuring apparatus";
- "The solution of problems which arise in connection with standards";
- "The determining of physical constants, and the properties of materials
when such data are of great importance to scientific or manufacturing
interests and are not to be obtained of sufficient accuracy elsewhere."
It is important to understand that by law, NIST was to have [and to
this day, has] no regulating or policing powers. Enforcement of standards
was left to the discretion of the states. In fact, NIST has always recognized
that its role was that of a partner with the private sector and the scientific
and engineering communities. A significant part of this role was to expand
the base of knowledge in measurement science to better enable science
and industry to utilize the standards that they, themselves, find important.
In effect, NIST provides services that correct the private sector's classic
"under-investment" in certain areas of research and development where
the economic returns cannot be wholly appropriated by the firms that
would make such an investment. NIST, by custom, tradition,
and law cannot and will not seek to control or influence the behavior
of the Nation's free market economy. Indeed, any new initiative proposed
by or for NIST is carefully analyzed to assure that it is within the
best interests of the Nation and private industry. NIST's mission is
mandated by Article 1, § 8 of the U.S. Constitution: "The Congress shall
have the power . . . To coin money, regulate the value thereof, and of
foreign coin, and to fix the standard of weights and measures . . ."
Although NIST has no policing or regulatory function, its responsibilities
for standards development, standard instruments, tests, methodologies,
and the determination of physical constants and material properties allow
its scope of research in the physical sciences to be virtually unbounded.
Moreover, NIST is free to cooperate with firms in the private sector,
academic institutions, other government agencies at the state and local
levels both in the United States and abroad, and all scientific and technical
societies and professional organizations.
We were able to see only a very small part of the activities in which
NIST is involved. At the Manufacturing Engineering Lab, we observed the
testing of high-speed computer-controlled machining techniques, and we
saw how this same laboratory was supporting the National Archives in
the development of a preservation system for the Declaration of Independence
and the U.S. Constitution. In another laboratory, we saw techniques for
measuring and calibrating color patterns for cathode ray tubes and flat
panel displays. From these limited activities, it was easy to gain a
sense of NIST's connection to United States industry, to the academic
research community, and to industry and science worldwide. As the world's
leading physical science and engineering laboratory devoted to expanding
the state of knowledge of measurement methodology and standards development,
the impact and importance of its work over the past century is beyond
question. As the markets in which United States industry competes become
ever more global and competition from foreign companies becomes ever
more intense, NIST is in the forefront of assuring that United States
standards continue to be acceptable for products and services. Evidence
for the success of NIST's international activities is found in the fact
that NIST hosts between three and four hundred foreign guest researchers
annually, and the demand for foreign guest researcher positions is intense.
Anita F. Alpern
School of Public Policy
American University
(March 7, 2001)
Topic: The Role of Federal Managers in Science
Ms. Anita Alpern provided insights on the role of federal managers by
sharing her advice garnered from a lifetime of public service. She was
one of the first employees in the Department of Defense (DOD) and continues
to this day as an academic and mentor to future public servants and as
a consultant to the Office of Personnel Management and other federal
agencies.
Advice relevant to everyone -- consider the role of humor in doing your
job. It is essential to be able to laugh at oneself. Mistakes are inevitable
and it is also inevitable that one's mistakes are often more memorable
than one's successes. The old adage about the "lemonade-making opportunities
of life when handed lemons" was illustrated when Ms. Alpern told the
story of being offered a "top step" GS-3 secretarial position after a
reduction-in-force (RIF) relieved her of her GS-14 position. Instead
of accepting the GS-3, she took a slightly lower GS-13 with the Internal
Revenue Service. In so doing, she opened opportunities for herself that
would not have been available at that time to a female civilian in DOD.
While at IRS, she shaped her career to achieve a "super-grade" GS-18
(now Senior Executive Service).
Another sage career tip given was "be politically sophisticated." Ask
yourself "what if" to find the needed information to provide management
with a fresh perspective. Early in Ms. Alpern's career at DOD, she examined
the changes in composition and political perspective to a key DOD congressional
committee given the outcome of various House elections. Gathering these
data and sending them up her chain of command was one factor in her rise
in the ranks at DOD. She exhorted scientists to have a sense of the political
environment, i.e., to be savvy. Political savvy is representing your
interests and organization in good times and bad. NASA was the example
of an agency that has been lionized and demonized in our lifetime. We
were also cautioned not to let the oftentimes "illogical" nature
of politics frustrate the logical, "scientific" nature of most members
of our group.
Being a manager is difficult, but being a federal manager in these times
is particularly onerous. Management problems that have always bedeviled
federal managers, e.g., hurdles to appropriate rewards and punishment,
are now exacerbated by the difficulties of the transition from
the Clinton to the Bush Administration. In transition times, science
suffers because new decision-makers are focused on short-term goals given
the average tenure of a political appointee. Supporting science, particularly
basic research, is difficult because the results are often not seen in
short time frames.
Politics is also important for science, especially basic research, because
politicians want to see results, and the returns on investment in
science may be delayed or tangential to the original goal.
Most managers have never been taught management skills and those selected
as managers may be chosen for their technical expertise and not for their
leadership abilities or even their "people" skills. For the ambitious,
greater management responsibility is the natural progression as there
are few high-level positions where one need not handle budget, planning
and personnel. It is in the personnel area that Ms. Alpern's presentation
dovetailed with that of one of our previous speakers -- Ms. Tina Sung,
President and CEO of the American Society for Training and Development.
Both predict a coming human resources crisis in the federal sector:
the upcoming retirement of many valuable employees, coupled with the
difficulty the government has in hiring talented workers given the
constraints of salary, promotion, and reputation.
Thomas A. Kalil
Former Special Assistant to the President for Economic Policy
(Clinton Administration)
National Economic Council
(March 14, 2001)
Topic: Strategies to Bridge the Global Digital Divide
We met with Mr. Thomas Kalil, who was special assistant to President
Clinton for economic policy. He first explained the three roles of the
National Economic Council (NEC): (1) as "honest broker," (2) as "entrepreneur," and
(3) as "advocate" for the President's priorities. As the honest broker,
the NEC works to reach consensus among the government departments. When
consensus is not possible, NEC presents the competing views in an objective,
politically neutral manner. The role of entrepreneur is undertaken to
develop specific policy proposals and new initiatives to motivate others.
As advocate, the NEC acts to promote the Administration's priorities
internally, before Congress, and before the public-at-large.
In addition to his work with the NEC, Mr. Kalil served as an advisor
and advocate to the Clinton Administration for information and communication
technologies. This work was organized around the following four major
questions:
Question 1. How can the United States maintain leadership in information
technology (IT) research and development (R&D)? Three program areas
were stressed: (1) high performance computing and communications, (2)
next generation Internet -- information technology for the 21st Century,
and (3) the development of nanotechnologies. In the policy arena, legislation
was supported in areas such as telecommunications reform, extension
of R&D tax credits, liberalizing of export controls, spectrum allocation,
and improving the competitive abilities of United States corporations.
Question 2. How can the "digital divide" be narrowed to prevent further
splitting the country between the "technological" haves and have-nots?
How do we ensure that an ever-increasing number of Americans benefit
from the new IT developments? There was an early push to get technology
into schools by wiring schools, training teachers, increasing content
on the Internet, and creating national technology centers.
Question 3. How can the benefits of IT be exploited to address the national
needs? How do we move government at all levels to provide citizens
with information and services online? Issues addressed included enhancing
K through 12 learning, life-long learning, and legal barriers preventing
distance learning. Also studied was IT's application to improving the
quality of life of disabled Americans through the Americans with Disabilities
Act.
Question 4. What are some of the downsides of IT, and how can they be accommodated?
An important focus was IT's impact on the erosion of one's individual
privacy and issues of cyber-security. Specifically, the individual
security issue was concerned with IT's enabling the creation of a profile
of one's activities and the tracking of individual medical and financial
records. Suggestions for regulating such potential intrusions into
private life were also proposed. In other areas there needed to be
a balance between the need for information and the choice of individuals
to either opt-in or opt-out of these systems.
Mr. Kalil then talked about what it took to be an effective staffer
in the White House. It is often with the staffer that politics and policies
meet. He mentioned five major points:
- An effective staffer needs to have an action agenda with priorities
to balance projects to be addressed pro-actively with those that are
more routine.
- A staffer must understand how to raise the visibility of the important
issues. The challenge is to develop strategies to raise awareness of
issues and signal that these are in fact the President's issues. Strategies
discussed included having the President mention the topic in a speech,
having it included in an executive memo, and having the President discuss
the topic from his "bully pulpit."
- A staffer needs to be a "utility infielder." This means developing
the ability to think of the entire life cycle of ideas from initiation
through implementation. Some important questions for a "utility infielder" are
the following: what will be the President's role in a new initiative;
how will this idea be seen on Capitol Hill; and how active will this
idea's constituency be?
- An effective staffer requires a strong network. Champions are needed
at the highest levels of the Administration who will support the staffer's
priorities. Also needed are people who can supply good ideas, with
advice on what should and should not be done. Staffers need to cultivate
a reputation and demonstrate that they will deliver what they promise.
Finally, staffers must actually be able to execute -- to get things
done. Many times this is done through the exchange of favors.
- Finally, within the context of science and technology (S&T) issues,
there is a need to be able to communicate with various audiences. This
cannot be accomplished by using jargon. It must be done by using terms
and analogies that are meaningful to these audiences.
The final topic discussed was S&T issues in general. There needs to
be a balanced research budget between health care and the other scientific
disciplines. Advances being made in health care are dependent on research
in other sciences, especially the physical sciences. Research in these
areas is what is driving the U.S. economy today. Many of the commercially
valuable technologies emerged from these research areas. This research
creates the next generation of scientists and entrepreneurs and should
be supported for its educational as well as its commercial value.
Andrew W. Reynolds
Deputy and Chief of Staff to the Science and Technology Advisor to the
Secretary of State
(April 4, 2001)
Topic: Science and Technology in 21st Century Foreign Policy
Mr. Andrew Reynolds, the Deputy Science and Technology Advisor to the
Secretary of State, received a B.A. in International Relations and Pre-med
at the University of Virginia. This included work at the University of
Copenhagen and the Goethe Institute in Germany. Mr. Reynolds completed
graduate studies in Energy Technology Management at George Washington
University in Washington, D.C. He speaks and reads Spanish, French, and
Italian well. Mr. Reynolds has been a career civil servant since 1975.
Currently, the Department of State (DOS) is in a period of change. The
day Mr. Reynolds spoke to us was also the day DOS received the American
Association for the Advancement of Science (AAAS) Fellows as a part of
DOS's new endeavor to boost global science and technology (S&T) cooperation
efforts. DOS's S&T budgets have been on the decline for the last 20 years.
When personnel cuts were necessary in the 1990's, they came out of the
S&T offices, not the Foreign Service. S&T has depended on the arms control
area for a substantial part of funding and direction.
Mr. Reynolds said S&T has become a greater part of foreign policy and
believes that in the future, the most successful Foreign Service candidates
will have a B.S. in Science, a Masters of Public Administration, and
a taste for Foreign Service duty. For the present, a significant challenge
is to provide an outreach offering an understanding of S&T to Foreign
Service personnel and their bureaus so that informed decisions regarding
S&T policy may be made. Personnel cuts have left DOS short-handed and
deprived of experienced mid-level employees. Secretary Colin Powell is
making reforms to DOS that are intended to remedy these problems.
At the present time, foreign policy is driven more by global than regional
themes. Issues like global warming and intellectual property rights affect
the whole world, not just a region. These issues are multipolar not bipolar
as was often the case in the past. In the 1990's, DOS focused most of its
S&T activities on the environment at the expense of other S&T issues.
Some changes were offered for consideration:
- Develop a strategic plan for 5-10-20 years out.
- Lengthen Foreign Service tours -- currently tours are three years
long. One year is spent learning the local issues, one year looking
for the next tour, and one good working year.
- Shift the commercial interest of DOS to medium and small enterprises
-- the Fortune 500 companies don't need DOS help.
The modernization of DOS has been going on for a few years, yet only
about 25 percent of the 159 Embassies have classified workstations;
the rest remain dependent on the cable (message) channels. Even though
95 percent of the traffic is unclassified, the old guard is resistant
to openness. Clearly many challenges remain for DOS in its efforts to
accommodate the technology of the 21st Century.
Workshop on "Academic IP: Effects of University Patenting and Licensing
on Commercialization and Research;" presented by the Board on Science,
Technology and Economic Policy of the National Academies
(April 17, 2001)
The National Academy of Sciences building, with its beautiful central
hall under the rotunda, was the setting for an all-day workshop on intellectual
property and technology transfer in universities. Leaders from industry
and universities, as well as several prominent academics who study the
relationship between intellectual property and innovation and the economy,
gathered to examine the effects of university patenting and technology
transfer activities on the commercialization of ideas and technology
created in the universities and on university research and the academic
environment.
Dr. Robert Barchi, Provost of the University of Pennsylvania, highlighted
several aspects of the movement of technology from universities to industry
and the public. He cited statistics from an annual survey of universities
conducted by the Association of University Technology Managers (AUTM)
that show the process of moving technology from university laboratories
to the marketplace supported 270,000 jobs, contributed $40 billion to
the economy, and generated $5 billion in tax revenue in FY 1999. In terms
of overall revenue to the university, he said, the income from technology
transfer is a very small fraction. However, as Provost, he finds licensing
and other technology transfer income useful because it is not earmarked
for specific projects. At the University of Pennsylvania, 45 percent
of the money goes to inventors and the inventors' labs, but most of the
rest goes to the department, school and research foundation, and can
be used for new projects and avenues of research that would otherwise
be difficult to fund. Dr. Barchi also spoke of his days as a leading
researcher, when the ability to license monoclonal antibodies saved him
and his colleagues large expenditures of time and money that would have
been used to prepare monoclonals for distribution to other scientists
who asked for them. This was an insight into a value of licensing that
cannot be quantified simply by looking at gross revenue from licenses.
Dr. Barchi also spoke of the downside of university involvement in the
commercial game of patenting and licensing technology. Probably the most
important difficulty is the potential for conflicts of interest between
faculty members, who may test or evaluate or carry out research on products
and technologies, and the companies that make the products, when the
faculty member has a financial interest in the company. The issue of
conflicts of interest, and the need for academic institutions to engender
and retain public trust was also stressed by Dr. Eugene Bauer, the Vice
President for the Medical Center and Dean of the Medical School at Stanford
University. Some examples of how these issues may be addressed in a practical
sense were provided by Ms. Joan Leonard, who has dealt with them extensively
as Vice President and General Counsel of the Howard Hughes Medical Institute
-- a major U.S. employer and funder of academic biomedical researchers.
Academicians, represented by David Mowery of the University of California
at Berkeley; Maryann Feldman of Johns Hopkins University; Donald Siegel
of the University of Nottingham in England; and Marie Thursby of Purdue
University, collectively provided an array of information demonstrating
both the extent of university involvement in patenting and licensing
activities and the changing ways by which universities try to accomplish
the development and commercialization of new technology.
Kathy Ku, the Director of the Office of Technology Licensing at Stanford
University, gave the insights of one very experienced university technology
transfer operation in an excellent talk. She expressed the difficulties
of trying to successfully transfer technology to the commercial side,
and demonstrated how creative and flexible a successful operation must
be. The industry point of view was provided by Donald Felch of UOP, Inc.
and Jim Finnegan from the IP Business group of Lucent Technologies. Mr.
Felch highlighted the interests a company has when negotiating with a
university, and pointed out how those interests often conflict with the
university position. At the same time, he pointed out that there are
many reasons why it is to the benefit of both parties to work together
and take advantage of each other's strengths. Mr. Finnegan raised the
specter of what industry might do to fight back if universities begin
to use intellectual property, particularly patents, to bully companies
into large financial settlements by threatening lawsuits.
Those of us who stayed through the afternoon were treated to a lively
discussion on the ramifications of recent Supreme Court decisions that
exempt states from damages stemming from intellectual property infringement
suits. The discussion pitted Julie Katzman, the Democratic Counsel for
the Senate Judiciary Committee, and Justin Hughes from the United States
Patent and Trademark Office against James F. Shekleton, the General Counsel
for the South Dakota Board of Regents. Mr. Hughes and Ms. Katzman discussed
the imbalance created between states that own and license intellectual
property, and private universities and companies that, because of state
sovereign immunity cannot effectively protect their own intellectual
property from infringement by the states. Ms. Katzman outlined a bill
that has been drafted to try to correct the imbalance. Mr. Shekleton
made a valiant effort to explain the rationale behind state sovereign
immunity, and to defend the Supreme Court decisions and the use of the
sovereign immunity defense by states. All in all, it was a bright, acerbic
and frequently witty exchange that was also very informative.
Anne A. Armstrong
President Virginia's Center for Innovative Technology
(April 18, 2001)
Topic: The Center for Innovative Technology's Successes and Future Outlook
The Center for Innovative Technology (CIT) is a state government organization
of the Commonwealth of Virginia developed to foster the growth of businesses,
especially technology businesses, and broaden Virginia's economic base.
It is headquartered near Dulles Airport in its own uniquely shaped inverted
pyramid office building. It has a $15 million budget and employs
50 people, who are distributed throughout the Commonwealth in several
regional offices. The regional offices are typically located near and
affiliated with major universities including Virginia Tech, the University
of Virginia, George Mason University, and Virginia Commonwealth University,
among others.
The Center serves as an incubator to help "start-up" companies establish
a presence in Virginia. Its function is similar to that of venture capital
firms that provide "seed" money for business initiatives that are likely
to turn profitable. Virginia gains in that companies establishing themselves
in a state are likely to remain in that state. This makes them good corporate
citizens with roots in the communities from which workers are drawn.
The Center serves as a bridge connecting government, academia, the business
community and the taxpaying citizens.
Ms. Armstrong described the Center and its functions and noted the success
it had in creating the "Dulles Corridor" of high tech companies. She
noted that the Dulles Corridor was only one of the several areas throughout
the state that the Center helped establish. She also said that the "biotech" industry
was coming to Virginia through the efforts of CIT. She cited the new
Howard Hughes Medical Facility to be located in the Dulles Corridor and
the Biogenetic Data Center at Virginia Tech. Other areas that the Center
is working to develop include the Virginia Spaceport at NASA's Wallops
Island, Virginia.
Site Visit to Celera Genomics
(April 25, 2001)
Dr. J. Paul Gilman, Director of Policy Planning at Celera Genomics,
provided us with an excellent overview of Celera Genomics. He reviewed
the history, the basics of genome theory, the accomplishments of Celera,
and a description of their business plan.
Celera Genomics Group is engaged principally in the generation, sale
and support of genomic information and enabling data management and analysis
software. The Celera Genomics Group's customers use this information
for commercial applications in the pharmaceutical and life sciences industries
in the specific areas of target identification, drug discovery and drug
development. Celera also provides gene discovery, genotyping and related
genomics services and has recently expanded its business into the emerging
fields of functional genomics, particularly proteomics and personalized
health/medicine.
Celera Genomics was established in May 1998 by the PE Corporation, now
known as Applera Corporation, and J. Craig Venter, Ph.D., a leading genomic
scientist and founder of The Institute for Genomic Research (TIGR). While
at TIGR, Dr. Venter led the first successful effort to sequence an entire
organism's genome, the H. influenzae bacterium. Celera was formed for
the purpose of generating and commercializing genomic information to
accelerate the understanding of biological processes.
Celera announced its first assembly of the human genome on June 26,
2000. The company began the analysis or annotation phase at that time
and published the human genome in Science in February 2001.
Celera's mission is to become the definitive source of genomic and related
medical and agricultural information. Users of Celera's information,
available on a subscription basis to academic and commercial institutions,
will have access to the most comprehensive databases in genomics and
related fields, as well as proprietary software tools for viewing, browsing
and analyzing data in an integrated way that can accelerate the understanding
and use of genomic and related information.
Dr. Gilman took us on a brief tour of the two labs located at the Rockville,
Maryland facility. The Sequencing Lab contains over 300 sequencing machines
able to process and store 100 terabytes of data. Using this facility,
Celera was able to sequence and assemble the human genome in just nine
months. The Data Center is the control room for monitoring 800 Compaq
Alpha processors, which process and store the data provided by the Sequencing
Lab. The company also has an alliance with Oracle for complete database
development and infrastructure for all planned Celera Genomics databases.
Dr. Joanne Yamauchi
Professor Emeritus, School of Communications
American University
(May 2, 2001)
Topic: Nonverbal Communications
A large part of our ComSci experience has been devoted to new information
related to science and technology. Yet much of what we "know" or "feel" is
formed, in large part, by nonverbal communication. Dr. Joanne Yamauchi's
presentation reviewed the basics of nonverbal communication, taking into
consideration such factors as cultural differences, gender differences,
age differences, and socio-economic differences. Over 90 percent of our
messages are nonverbal and it was demonstrated that when a person's verbal
and nonverbal messages are in conflict, people instinctively place greater
trust in the nonverbal component. Some of the "hard" scientists in our
group had difficulty accepting some of the precepts of communications
orthodoxy. Nevertheless, Dr. Yamauchi introduced us to some new vocabulary
in nonverbal communication.
Haptics is the study of touch as a component of communication. The most
common example is the handshake. We dissected our feelings about bad
handshake experiences and practiced our "firm but gentle" techniques.
Kinesics explains the dynamic impact of facial expressions and body movements.
One example of the importance of kinesics to communications is the problems
engendered by e-mail. When deprived of gesture and accompanying clues,
messages, especially humor, may often be misinterpreted. This leads to
muddled communications. Oculesics sounds like a new fitness program; actually,
it is the communicative information provided by the eyes. Eye
contact clearly conveys information in personal interactions. It may
provide even more information. Dr. Yamauchi reported that research asserts
that the direction of a person's gaze when they are pondering a question
may indicate "left-brain" (analytical) or "right-brain" (creative) dominance.
Proxemics, the effect of spatial relations on communication, can affect
one-on-one communication (e.g., different cultures have different zones
of personal space). Beyond that aspect of human communication, architecture
and interior design provide many cues about the importance of an institution
(e.g., the edifice of the Supreme Court), or the power of the inhabitant
(e.g., a corner window office with a massive mahogany desk). The timbre,
tone, and pitch of a person's voice are Vocalics, which also includes
the rate of speech, volume, emotional intensity, and accent. Paralinguistics
includes non-word utterances such as sighs, laughter or moans. The important
nonverbal information conveyed in how individuals and cultures handle
time is called Chronemics. Finally, the roots of the 1980s' "Dress for Success" lie
in Objectics, the artifactual communication provided by objects like clothing,
cars, even the fonts you use!
AAAS Colloquium on Science and Technology Policy
(May 3, 2001)
Topic: Science and Technology in the New Administration
Dr. Mary L. Good, Chair of the Board of Directors for the American Association
for the Advancement of Science (AAAS), welcomed the largest crowd ever
to the 26th Annual AAAS Colloquium on Science and Technology Policy.
The first day of the two-day conference covered major issues of science
and technology policy as viewed from the perspective of the Bush Administration.
A strong focus was on the budget impact of such policies. The day concluded
with the William D. Carey public lecture and reception featuring Dr.
Neal Lane, former director of the White House Office of Science and Technology
Policy.
Lawrence B. Lindsey, Assistant to the President for Economic Policy
and the architect of the President's tax cut policy, gave the opening
lecture on the new Administration's policies regarding science and technology.
He said that the Nation faces a number of challenges including the energy
situation, our need to protect the environment, and the economy. Good
science is the key to defining and addressing these challenges. The new
Administration believes that the Nation lacks an energy strategy, and
that the absence of such a strategy means we will have neither adequate
energy nor a clean environment.
A new national energy strategy should look toward major improvements in
technology for more efficient, cost-effective, clean energy. He asserted
that investing in new technologies that are being developed largely in
private or non-profit sectors will be much more effective than trying to
implement current technologies for clean energy and greenhouse gas control.
He believes that in the short term things will likely get worse before
they get better, but a new energy policy will ultimately lead to our Nation's
enjoying both more energy and a cleaner environment.
Dr. Mark Wrighton of the Washington University in St. Louis was the
first of four panelists to speak on the budgetary and policy implications
for research and development (R&D) in FY 2002. He said that the 21st
Century is clearly the "age of biology" and funding is being allocated
accordingly, except for the plant sciences. However, the federal investment
in R&D needs to be balanced across scholarly interests to advance health,
quality of life, environment, national security, the economy, and education
at all levels. Universities contribute facilities, infrastructure, intellectual
renewal, and have great economic impact, but are not doing enough to
attract talented people. Dr. John Yochelson, President of the Council
on Competitiveness, spoke next about sustaining America's prosperity
through the process of innovation. He said science policy creates anxiety
because the time horizon of science is at odds with the time horizon
of politics, and scientific research is not a part of people's everyday
lives. Dr. Yochelson suggested that a major national initiative is needed
to increase the number of American scientists and engineers, since they
provide the basis for our country's capacity to innovate. The Honorable
Sherwood L. Boehlert, Chair of the House Committee on Science, spoke
about the new Congress. He is disturbed by this year's slashing of research
budgets in several agencies, but predicts that the numbers will get better
next year. Science has a reservoir of good will in the Congress, but
we still have to work hard at justifying our programs in a constrained
spending environment where competition will be fierce. We especially
need to convince the newer and more skeptical members of the value of
federally funded science. Kei Koizumi of the AAAS concluded the panel
with an overview and commentary on the President's budget proposal for
R&D in FY 2002.
After lunch, three separate sessions on the major issues in science
and technology policy were held. The subjects were "The Regulatory Environment
for Science: Human Subjects Protection," "The Growth of Industrial R&D
and the Federal Policy Environment," and "The New Challenges of Defense
R&D." In the session on industrial R&D, we heard from Parry Norling of
Dupont, Robert Buderi of Technology Review, David Mowery of the Haas
School of Business at University of California-Berkeley, and Duncan Moore,
former Office of Science and Technology Policy (OSTP) Associate Director
for Technology. The speakers discussed the increasing and increasingly
embedded role of R&D in industry, and the need to enlist industrial
R&D to work on meeting national goals. There is a trend of increasing
R&D funding by industry in academia. With such increases in funding come
increases in university patenting and licensing activities. Thus, universities
and the private sector may now have competitive as well as cooperative
relationships. Another problem discussed is the apparent decline of engineering
and science degrees being awarded in the United States and the difficulty
of attracting good students to these critical fields of study.
Policy roundtable discussions with agency officials concluded the afternoon
session of the conference. At the National Science Foundation's (NSF)
roundtable was Dr. Joseph Bordogna, NSF's Deputy Director. Dr. Bordogna
spoke about their strong emphasis on people and on the integration of
research and education. Challenged about a disappointing FY 2002 budget,
Dr. Bordogna said he was optimistic about increases in the future, and
pleased that the FY 2002 budget provided funding for all NSF's major
priorities.
The evening session featured former OSTP director Dr. Neal Lane whose
topic, "Talking Turkey: Science, the Economy, and the Community," was
enthusiastically received. Peppering his talk with folksy quotes from
Will Rogers, Aldous Huxley and others, Dr. Lane reminded scientists that
they have a responsibility to make sure they communicate the value of
what they do and why it is central to our national interest. He said
science and technology, although fueling half or more of our economic
growth, are peripheral to the political arena, making it easy for the
funding to get lost in the political fray. If anything could make us
a second-rate country, it would be the failure to provide a well-educated
scientific and technical workforce. Dr. Lane called for scientists to
exhibit more leadership in telling their story, to take a more active
role in ensuring the Nation has the future workforce it needs, and to
turn increasing attention to interdisciplinary solutions to social problems
across the globe; or as Will Rogers said, "If you're riding ahead of
the herd, take a look every now and then to make sure it's still there."
Robert D. Ballard
President
Institute for Exploration
(May 15, 2001)
Topic: Undersea Exploration
Dr. Robert Ballard is best known for finding and exploring the Titanic,
but he has been involved in many undersea explorations over his more
than 30-year career. In his talk, Dr. Ballard discussed a process of
finding "sunken" artifacts by following ancient trade routes that form
what is now a rich undersea "museum." He discussed finding the Titanic
in some detail. Then, he went on to explain that searching for the Titanic
had less to do with the adventure and historical interest of locating
the "infamous" luxury liner than with providing cover to develop techniques
which would help the U.S. Navy find two sunken nuclear submarines --
the Scorpion and the Thresher.
Recently, Dr. Ballard has been working on solving the mystery of the
Black Sea. Two scientists, Ryan and Pitman of Columbia University,
have speculated that the Black Sea was formed when the Mediterranean
Sea "spilled over into an inland fresh water lake creating what is now
the Black Sea," and that this event is the basis
for the Biblical story of Noah's Flood. Dr. Ballard is working with Ryan,
Pitman, and others to look for evidence of earlier shorelines to
determine if catastrophic flooding did occur at a time that would be
consistent with the Noah's Flood story. Dr. Ballard sees considerable
promise in the Black Sea research because of the fact that the Black
Sea is devoid of dissolved oxygen, and therefore lacks underwater organisms
that eat organic materials. Because of that, things such as wooden ships
and other structures and objects, that would normally be degraded quickly,
are preserved as if frozen in arctic ice.
Site Visit to the Conservation Studio of the National Gallery of
Art
(May 23, 2001)
In the first area we visited, a self-portrait of Vince lay frameless
on a worktable; it seemed a bit too informal to be referred to as a Van
Gogh. Behind, in an alcove, also frameless, was a dark, brooding painting
by Rembrandt. Between two worktables, supported on a small easel, was
a Hieronymus Bosch -- not that one -- called "Death and the Miser." This
was clearly not your average workshop.
Mr. Mervin Richard, the Deputy Chief of the Conservation Department
of the National Gallery of Art guided us through several areas of the
Conservation Studios (and labs) at the National Gallery. He and his colleagues
explained the art and science behind the conservation and restoration
of works of art ranging from paintings to prints to sculptures to frames.
Where the original work may be a combination of creation and technique,
the conservation is a combination of technique and science. The tools
and techniques of the conservator range from brushes, sticks, and
chemical solvents to light and electron microscopes, x-rays, infrared
reflectography, and gas-liquid chromatography, to name a few.
The mysteries conservators must solve during the course of their work
are daunting. One example is distinguishing original materials from those
added later by the artist, or perhaps much later by earlier restorers.
Another is finding ways to remove added materials without affecting the
original work. Still another is discovering pentimenti -- bits the artist
painted and then painted over -- or figuring out why certain colors don't
seem to be true to the artist's style even if they are determined to
be part of the original work. Another challenge is identifying what pigments
were used and why.
Identifying added material could be difficult, especially in the case
of the Museum's collection of original Degas wax sculptures. Many of
these were modified prior to casting; some were modified even after casting.
Removal of added materials can be easy when the original painting is
old and there is a large differential between the solubility of the original
paint and later added material. When the original work is more recent,
the artists frequently had played with the formulation of the pigments
used, and the solubility differential between the original paints and
added materials is greatly reduced. Artists used pigments that were cheap,
even when they knew they would fade. What the scientists find has a large
bearing on what the conservator does, from determining what to clean
up and what to use to clean it up, to determining that something is the
result of what the artist did, in which case the conservator leaves
it alone.
Nicholas Godici
Acting Under Secretary of Commerce for Intellectual Property and Acting
Director of the United States Patent and Trademark Office and Commissioner
of Patents, United States Patent and Trademark Office
(May 31, 2001)
An Introduction to the PTO:
For over 200 years, the basic role of the Patent and Trademark Office
(PTO) has remained the same: to promote the progress of science and the
useful arts by securing for limited times to inventors the exclusive
right to their respective discoveries (Article 1, Section 8 of the United
States Constitution). Under this system of protection, American industry
has flourished. New products have been invented, new uses for old ones
discovered, and employment opportunities created for millions of Americans.
The PTO is a non-commercial federal entity -- one of 14 bureaus in the
Department of Commerce (DOC). The office occupies a combined total of
over 1,400,000 square feet, in 18 buildings in Crystal City, Arlington,
Virginia. The office employs about 6,300 full-time equivalent (FTE) staff
to support its major functions, the examination and issuance of patents,
and the examination and registration of trademarks. The PTO became a
performance-based organization (PBO) through legislation enacted by Congress
in 1999.
The PTO has evolved into a unique government agency. Since 1991, under
the Omnibus Budget Reconciliation Act (OBRA) of 1990, the PTO has operated
in much the same way as a private business, providing valued products
and services to its customers in exchange for fees, which are used to
fully fund its operations. The primary services PTO provides include
processing patents and trademarks and disseminating patent and trademark
information.
Through the issuance of patents, the PTO encourages technological advancement
by providing incentives to invent, invest in, and disclose new technology
worldwide. Through the registration of trademarks, it assists businesses
in protecting their investments, promoting goods and services, and safeguarding
consumers against confusion and deception in the marketplace. By disseminating
both patent and trademark information, it promotes an understanding of
intellectual property protection and facilitates the development and
sharing of new technologies worldwide.
The PTO Mission:
The PTO promotes industrial and technological progress in the United
States and strengthens the national economy by: (1) administering the
laws relating to patents and trademarks, (2) advising the Secretary of
Commerce, the President of the United States, and the Administration
on patent, trademark, and copyright protection, and (3) advising the
Secretary of Commerce, the President of the United States, and the Administration
on the trade-related aspects of intellectual property.
Current Issues Facing the PTO:
A high rate of attrition among the biotech, electrical engineering,
e-commerce and computer science disciplines remains a problem at PTO.
To stem the flow of employees to higher paying jobs in the private sector,
the PTO requested and received a special pay rate for its professional
staff in May 2001. The areas of stem cell research and software patenting
continue to be a challenge for the PTO. Prior to 1999, applications were
not subject to the Freedom of Information Act. Under the American Inventors
Protection Act (AIPA), enacted in 1999, many applications are now subject to FOIA and will
have to be published 18 months after the date of filing. The publication
of applications began in March 2001, when the first set of applications
became publishable under the new law. The annual growth for patent applications
is currently at 12 percent and is expected to remain steady for the next
few years. This represents a major challenge to the PTO in terms of space
allocations for storage, and the conversion of paper files to electronic
medium. Several pilot projects are under way to address the problem,
including work at home, and a gradual phasing out of paper files. Electronic
filing of patent applications is now possible, and the future goal is
to have all applications available electronically for all parts of the
examination and processing operation. On the trademark side, electronic
filing has been available for some time, and is strongly encouraged.
Site Visit to Lockheed Martin Global Telecommunications
(June 6, 2001)
We visited the Lockheed Martin Global Telecommunications (LMGT) campus
in Clarksburg, Maryland where representatives gave us an overview of
Lockheed Martin Corporation and how LMGT fits into the corporation. They
also presented technical briefings on LMGT's major projects.
Mr. Michael Onufry, Principal Scientist, LMGT Systems and Technology,
briefed us on the history and the business of the facility. It originally
was part of COMSAT Corporation, chartered by Congress in 1962 to provide
international satellite communications for the United States. In August
2000, COMSAT merged with Lockheed Martin and joined its Global Telecommunications
subsidiary.
Lockheed Martin has four other primary business sectors: Aeronautical
Systems, Space Systems, Systems Integration, and Technology Services.
Overall, Lockheed Martin has 130,000 employees, with more than $25 billion
in sales in FY 1999. LMGT expects to bring in close to $1 billion in
revenue annually.
LMGT concentrates on four key areas:
- Enterprise Solutions, International delivers regional data network
services, information technology outsourcing and e-business solutions
in high growth international markets such as Latin America.
- Enterprise Solutions, Domestic delivers regional data network services,
information technology outsourcing and e-business solutions in the
United States.
- Systems and Technology pursues systems development and integration
projects around the world, while also providing development engineering
support, including applications integration, for Enterprise Solutions.
- Satellite Services provides global connectivity in fixed and mobile
satellite services and in future broadband services. Investments in
INTELSAT, Inmarsat, and other satellite operations are included in
this area.
Mr. Cal Zinner, Major Accounts of the United States and Canada, LMGT
Products, briefed us on LINKWAY™ and LinkStar™ broadband satellite network
terminals.
Mr. Stephen Adelman, Vice President for Business Development, LMGT Systems
and Technology, provided a briefing on Astrolink, a new Ka-band, geostationary
satellite-based broadband end-to-end network for data, video and voice services
that will be available in 2003.
On our tour of the building, which had originally been built to manufacture
satellites, we saw many satellites and antennas. We were shown a device
that has evolved in 30 years or less from a 10-inch-long tunable,
waveguide-like device to a solid-state device smaller than a postage stamp.
We saw flat plate antennas that could receive television programming,
which were the size of a cafeteria food tray. We also saw LMGT's anechoic
chamber where they conduct telephone voice quality tests for the telecommunications
community and company.
John N. Yochelson
President
Council on Competitiveness
(June 13, 2001)
Topic: Federal and Private Sector Problems and Solutions in Dealing
with Economic Competitiveness
President of the Council on Competitiveness since 1995, Dr. John Yochelson
holds a master's degree in Public Administration from the Woodrow Wilson School
of Princeton University and has a background in public service spanning both
the public and private sectors. He served three years in the State Department,
was a research fellow at the Center for International Affairs at Harvard
University and the Brookings Institution, and was a senior vice president
at the Center for Strategic and International Studies before moving to
the Council on Competitiveness.
The Council on Competitiveness, a non-partisan forum, is a small organization
with a $2.3 million annual budget and permanent staff of 15. Members
include corporate executive officers, university presidents, and labor
leaders. Originating as a response to the 1985 Young Commission report
on economic competitiveness, the Council was formed to keep alive the
Commission's agenda of quality, productivity, innovation and the global
marketplace in the minds of government policymakers. The early 1980's
were a time of high inflation, loss of American confidence in our manufacturing
sector, and competition with a seemingly unstoppable Japan. The work
of the Council still seemed imperative in the early years of the Clinton
Administration, while things continued to look grim for the American
economy. By 1995, there was some sense that the objectives had been met
and that the Council might have outlived its usefulness, with the United
States now recognized as the most competitive economy in the world. But
the Council did not fade away and remains quite successful. It emphasizes
constant vigilance to sustain the sources of our competitiveness: science
and technology investment, our capital markets, and United States entrepreneurship.
We must be on top of our game at all times, since things change very
quickly in the global marketplace. All activities of the Council are
ultimately focused on maintaining and improving the living standards
of Americans. The Council has a website at www.compete.org with details
of its activities, including an ongoing briefing series for
congressional staff on technology and innovation.
In discussing his career, Dr. Yochelson mentioned that he finds being "on
the outside" a more productive and satisfying place for public service
than "on the inside" of the Federal Government. He finds it troubling
but generally true that in the government if you take a risk and fail
it can mean the end of your career, but if you take a risk and succeed
you may get a small award or nothing at all. In the private sector the
situation is completely reversed. Successful risk takers become the most
highly valued and highly compensated employees, while failure is looked
upon as an important learning experience. Maintaining a positive environment
for learning and innovation is one of the critical factors that will
keep the United States in global economic leadership.
Class
of 1999-2000
William A. Curtis
Principal Director for Year 2000 (Y2K)
Department of Defense, OASD (C3I)
(October 6, 1999)
Topic: Department of Defense Y2K Preparations
Mr. Curtis spoke about his efforts in instilling leadership and teamwork
within the Department of Defense (DOD) during preparations aimed at avoiding
potential Y2K related automated system failures. He began by explaining
his role as "coach" of an important team and explained how teamwork was
important to initiate and complete actions to find and repair potential
Y2K glitches. He put the magnitude of the DOD Y2K problem in perspective
by making comparisons to other government departments. Mr. Curtis effectively
used graphics to explain Y2K complexity and draw attention to problem
areas, noting that DOD systems consist of over two million computers
and tens of thousands of programs with potential Y2K related problems.
Mr. Curtis approached this as an opportunity to make a change and improve
a process.
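The glitch at the heart of these preparations can be sketched in a few lines. This is an illustrative example, not actual DOD code: legacy systems that stored years as two digits computed date spans incorrectly once dates crossed into 2000, and "windowing" was one common repair technique.

```python
# Illustrative sketch of the Y2K defect (not actual DOD code).
# Legacy systems stored years as two digits, so arithmetic across
# the century boundary produced wrong results.

def age_buggy(birth_yy: int, current_yy: int) -> int:
    """Two-digit year arithmetic: wrong once dates cross into 2000."""
    return current_yy - birth_yy  # e.g., year 00 minus year 65

def age_fixed(birth_yy: int, current_yy: int, pivot: int = 50) -> int:
    """One common remediation, 'windowing': two-digit years are
    mapped onto a 100-year window (here 1950-2049) before subtracting."""
    def to_four_digits(yy: int) -> int:
        return 1900 + yy if yy >= pivot else 2000 + yy
    return to_four_digits(current_yy) - to_four_digits(birth_yy)

print(age_buggy(65, 0))  # -65: the kind of glitch repair teams hunted for
print(age_fixed(65, 0))  # 35: correct after windowing
```

Windowing was cheaper than expanding stored dates to four digits, which is why so much of the DOD effort went into finding, testing, and re-testing the systems where such fixes had been applied.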
DOD was confronted with potential Y2K problems in numerous non-critical
infrastructure and support systems. Mr. Curtis explained that DOD leadership
prioritized the budget to ensure funding was applied to reduce the risk
of having operations degraded due to Y2K system related problems. A plan
of action included identifying a baseline of systems and methodically
reporting actions completed to assess and reduce potential risk. Y2K
data was entered into a central database, from which reports were generated
for management at all levels. Corrective actions
consisted of identifying potential problems, fixing the problems and
performing extensive testing. Reporting was conducted through the chain
of command from within Services, Agencies and other commands through
the DOD organization to the Secretary of Defense. Extensive screening
for problems, corrective actions, and re-testing reinforced DOD Y2K reporting.
The DOD Y2K database was the primary vehicle used to report Y2K
information. Mr. Curtis emphasized the importance of presenting negative
data initially, followed by proposed action plans to correct problem
areas. He pointed out that before key briefings, those affected by the
information were given the opportunity to take corrective action or to
prepare explanations of corrective actions proposed. The DOD Y2K database
was used to report progress to Congress as well as provide a tool for
internal management. One management technique used was to present data
highlighting creditable actions and to publicly acknowledge the most successful
ones, thereby motivating others to compete.
Mr. Curtis demonstrated his effective use of graphics during high-level
briefings and the resultant attention provided to critical areas. Mr.
Curtis also emphasized the importance of funding this very important
program.
Massive testing was a major contributor to success. All levels were
tested, followed by integration testing to
assure total end-to-end support. Testing also included operations without
the use of automated systems in the event something was missed. Mr. Curtis
pointed out that as time permitted, more thought went into identifying
those areas that might have been missed with a view to expand and test
more.
Mr. Curtis' presentation concluded with a discussion about leadership,
namely leadership attributes and challenges. His final words presented
the challenge to make change by going forth and being leaders.
J. Thomas Ratchford
Distinguished Visiting Professor
National Center for Technology and the Law
George Mason University School of Law
(October 20, 1999)
Topic: Science and Technology in the Global Economy
Dr. Ratchford discussed the rapid advances in science and technology
over the last few decades. Scientists are delving ever deeper into the
origins of the universe. Technology (e.g., in transportation, computing
and telecommunications) has become omnipresent and universal. While improved
pharmaceuticals and health care technologies are improving the quality
of life, work in the biological sciences is advancing faster than the
laws and ethics that connect such work to society.
Trade and the global economy are also changing rapidly. Trade disputes
have moved from tariff to non-tariff barriers. Real per capita gross
national product (GNP) has increased fivefold in the United States over
the last 100 years, and other countries are approaching the United States'
figures. Trade is increasing even faster; United States' trade went from
13 to 30 percent of our GNP between 1970 and 1995.
The United States, as well as a number of other countries, is investing
heavily in research and development (R&D). Support comes from both the
public and private sectors, with industry funding an increasingly large
percentage of total R&D in the United States. Attempting to keep up with
rapid advances in science and rapid changes in technology, corporate
R&D investment as a percentage of sales has approximately doubled over
the last 20 years. Today, companies spend the equivalent of about half
of their profits on R&D.
Noting that the United States is the only country with a strongly positive
trade balance for technology, Dr. Ratchford concluded that the United
States' industrial research complex is the best in the world. Following
a rough ten years of downsizing and shifting priorities, the United States'
system is now extremely efficient.
Richard G. Stieglitz
President
RGS Associates, Incorporated
(October 27, 1999)
Topic: Business-Government Partnerships for Information Technology Implementation
(Fact or Fiction?)
Dr. Stieglitz provided an industrial perspective of opportunities to
improve government operations through information technology implementation.
His nuclear engineering background, along with ten years service as a
Naval officer planning nuclear submarine overhauls, and founder/CEO of
RGS Associates, Incorporated, provided him with a wealth of information
technology experience. He promoted group discussion by presenting stimulating
topics.
Dr. Stieglitz opened with an opinion that future e-commerce will facilitate
a successful business-government partnership. He provided background
on his personal information technology experience and that of his company.
In a discussion about change, Dr. Stieglitz expressed concern that too
much focus on technology can overshadow a more fundamental problem of
improving a business process. He emphasized the need to focus on translating
computer and technical solutions to solve business problems. Solving
problems involves change. He explained that change is an emotional experience
that causes fear and resistance. He went on to explain that his RGS staff
is organized in competency areas, encouraged to initiate change, and
trained to address customers' fear of change.
On the subject of managing change, the discussion migrated to the impact
of politics and public opinion on business situations. On one hand, program
management competency is based on presentation of sound business cases,
fostering competition and encouraging government outsourcing to industrial
partners in the interest of effectiveness and efficiency. On the other
hand, congressional guidance in the form of budget earmarks can re-direct
funding that appears to dispel the foregoing business logic. Earmarks
can cause programs (structured on sound business cases) to be unstable
and inefficient. Government/industrial partnerships suffer. Group consensus
was that congressional earmarking of funds for special interests will
not change and must be accepted as reality.
Dr. Stieglitz talked about partnership and trust. His RGS vision is
to develop a trusting environment where government, industry and the
public work together. Focusing on Defense Logistics Agency as an example,
he used a chart to illustrate new paradigms and relationships developing
between government and industry.
Outsourcing was the subject of a model presented by Dr. Stieglitz. He
explained how outsourcing could be a financial win-win situation for
both government and industry. The requirement for training was discussed,
as new technology drives the need for competencies that open new career
paths. Job opportunities are created for those willing to acquire new
technical skills. Dr. Stieglitz observed that in spite of downsizing,
unemployment has never been lower.
Next, Dr. Stieglitz explained the new paradigm whereby government inventory
is reduced while industry delivers directly to government customers.
He showed how efficiency was increased and costs reduced. Forming new,
technology-enabled relationships eliminated non-essential functions.
He cited an example of government-industry e-commerce in which industry
provides fleet support to the U.S. Navy for the 21st century.
Dr. Stieglitz explained how growth is a continuous
process of integrating changes in goals, competencies, processes, products
and technologies.
Dr. Stieglitz closed with four reminders: (1) transition is a bumpy
road -- count on it, plan for it; (2) accurate and timely metrics are
critical for success; (3) designate performance measures of success and
manage the risks; and (4) insist on steady improvement. There was no
debate that a successful relationship can be produced with satisfied
customers and satisfied contractors.
Joan Dudik-Gayoso
Senior Advisor for Science, Technology and Development Policy
Bureau of Oceans and International Environmental and Scientific Affairs
U.S. Department of State
(November 3, 1999)
Topic: Science and Technology, International Development and Global
Issues
Ms. Joan Dudik-Gayoso led a very informative and thought-provoking discussion
on science and technology in international development and global issues.
The lecture began with the discussion of the role of science and technology
to the U.S. national interest. The U.S. national interest could be defined
as assuring the security and the prosperity of the American people. Science
and technology contribute to national security by: (1) supporting national
defense and military readiness; (2) protecting and promoting the health
of the population; and (3) protecting and sustaining the environment
and natural resources of the nation. Science and technology contribute
to national prosperity by strengthening U.S. competitiveness through
increased productivity; generating new knowledge and supporting innovation;
and creating employment. The Council of Economic Advisers estimates that
over the past 50 years, technological innovation has accounted for at least
half of the nation's growth in productivity.
Ms. Dudik-Gayoso elaborated on the global perspective of science through
discussions on the following key points: (1) science knows no borders;
(2) many threats (e.g., to health, environment or nuclear safety) cross
borders relatively easily; (3) "Mega" science projects like the space
station or the Superconducting Super Collider are too expensive for one country
to fund and thus require international science and technology cooperation;
and (4) international scientific cooperation can give us access to research
and researchers in other countries. Intellectual property rights, trade
in genetically modified organisms, and adoption of technology are just
three of the issues that arise. Trade is one of the primary mechanisms
for transferring technology. Trade accounts for an increasingly larger
share of U.S. gross domestic product.
Additional global issues were discussed, including the shift in threats
to national and international security since the end of the Cold War.
There are increased risks to nuclear safety and threats from biological
and chemical weapons. Priorities in U.S. research and development funding
have shifted from defense to health related research. The debt crises
in Latin America during the 1980's and Asian economic crises in 1997
evidenced the impacts of financial and economic globalization. The number
of conventions and treaties on global environmental issues has increased
since the 1992 Rio Conference on Environment and Development. There is
a growing gap between developed and developing countries in women's access
to training and employment in math and science, resulting in global
inequality in women's earnings potential relative to men's.
Ms. Dudik-Gayoso referenced some alarming statistics to stress the significance
of population issues as factors affecting economic development. Of a
world population over 6 billion, 1.3 billion live in poverty. Close to
one billion cannot meet their basic consumption requirements. Seventy
percent of the poor are women. It has been projected that by 2030, the
world's population will double with 95 percent of that growth occurring
in developing countries. Ninety-five percent of the increase in the world's
labor force in the next 25 years will occur in developing countries.
World population pressure imposes significant impacts on agricultural
production and world food supply. Although genetically modified products are
controversial in developed countries, they may provide a means
of increasing food production to feed the growing populations of developing
countries.
Depending on per capita income, established infrastructure, the training
and educational level of the workforce, and attractiveness to foreign
investment, developing countries can be categorized as poor,
middle-income, or high-income developing countries.
The discussion concluded with a comparison of trends in assistance to
developing countries. Although the United States used to be the largest
donor, it has since fallen behind Japan, France and Germany.
Site Visit to the Bureau of Engraving and Printing
(November 17, 1999)
Mr. Thomas A. Ferguson, Director of the Bureau of Engraving and Printing
(BEP), explained the process of printing the currency that is used in
everyday life. The tour gave us a behind-the-scenes look at the new
electronic development of the printing process. Mr. Ferguson told us
that the new five and ten dollar bills were being designed electronically
and that safeguards against counterfeiting are in place. The BEP, a branch
of the Department of the Treasury, is a non-appropriated agency, meaning
it is funded solely by the sale of its products. The BEP earns revenue
from printing postage stamps for the U.S. Postal Service; over half the
stamps sold are printed at the BEP. Other products sold include invitations
for White House events. Internet sales of BEP products are on the rise,
but the major item is the printing of U.S. currency. A question was posed
about the effect of the Government Paperwork Elimination Act on the paper
currency market. He answered that U.S. paper currency would be around
for centuries, since it is the most valued currency in the world. Mr.
Ferguson explained the cost accounting overhead and noted that the cost
of producing a single note is four cents. The annual order
from the U.S. Treasury, the major customer, was
11.3 billion notes for fiscal year 1999 and 9 billion notes for fiscal
year 2000. The Bureau
of Engraving and Printing is the largest employer of blue collar and
minority workers in Washington, D.C.
Kenneth Prewitt
Director
U.S. Census Bureau
Larry J. Patin
Chief, Telecommunications Office
U.S. Census Bureau
(December 1, 1999)
Topic: The 2000 Census
Dr. Prewitt provided an historical perspective of the generation of
the U.S. Census, dating as far back as the Continental Congress, when
every state had a vote and the majority ruled. The vast differences in
state populations caused sharp debates over the allocation of power.
As a result, at the Constitutional Convention, a compromise was reached
wherein House seats were allocated in proportion to population, thus
requiring the performance of a decennial census to begin in 1790 for
the provision of a population count. As the country grew, population
growth and dispersion led to the establishment of states and the resultant
redistribution of power. In 1910, when the House had grown to 435 members,
its size was frozen, and the decision was made to reallocate the seats
prospectively. As population began to flow from the South,
from where many members of Congress were elected, to the manufacturing
North, concern grew in Congress as the impending shift in Congressional
power was foreseen. After sharp debate for a period of ten years, Congress
agreed to a reapportionment formula created by the National Academy of
Sciences, to be used beginning with the 1930 Census.
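The reapportionment formula can be illustrated with the method of equal proportions (the Huntington-Hill method), which the National Academy of Sciences committee recommended and which remains in use today; the state populations below are invented purely for illustration:

```python
import heapq
import math

def apportion(populations, total_seats):
    # Method of equal proportions (Huntington-Hill): each state starts with
    # one seat; each remaining seat goes to the state with the highest
    # priority value  population / sqrt(n * (n + 1)),  where n is the
    # number of seats the state currently holds.
    seats = {state: 1 for state in populations}
    heap = [(-pop / math.sqrt(1 * 2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(total_seats - len(populations)):
        _, state = heapq.heappop(heap)
        seats[state] += 1
        n = seats[state]
        heapq.heappush(heap, (-populations[state] / math.sqrt(n * (n + 1)), state))
    return seats

# Invented example: a 10-seat "House" for three states.
print(apportion({"A": 600, "B": 300, "C": 100}, 10))  # {'A': 6, 'B': 3, 'C': 1}
```

The square-root divisor is what makes the method a compromise: it penalizes a state's claim to its next seat in proportion to the seats it already holds, so large and small states trade off fairly.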
Another basis for debate arose, however, over whether power should be
distributed on the basis of wealth or population and, if wealth were
to be the basis for power distribution, the manner in which wealth would,
in fact, be measured. Another compromise was reached, under which slaves
would represent some measure of wealth and free blacks would be counted
but not as part of the representational basis. As the country went to
war in 1940, an experiment was run to determine a measure of the Census
undercount as every 18-34 year old male registered. While there was an
overall undercount, the experiment showed that the undercount for black
males was proportionately larger. Had the undercount been random, it would
have been inconsequential to individual states; because it was instead differential,
the undercounted states were harmed. The early 1960's
provided the first firm sense that a differential undercount was creating
inequities in federal spending.
The Baker v. Carr U.S. Supreme Court decision addressed the inequities
presented by the states' ability to arbitrarily draw district lines and
disenfranchise racial minorities. As a result, the Census was charged
with dual obligations -- to provide an accurate population count and
to determine data by race and voting age down to the geographic "block" level,
thus assigning a person to a particular location. This additional charge
created a greater burden on the Census Bureau since, for any number of
reasons, a segment of the population does not get counted. Dr. Prewitt
noted some of these reasons, including illegal aliens, zoning law violators,
criminals, those with multiple residences, those without addresses, etc.
The dual obligations of the Census thus resulted in the dual system estimation
methodology, wherein the basic Census is performed and a totally independent
sample survey is conducted.
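The core arithmetic of dual system estimation is a capture-recapture calculation; a minimal sketch with invented block-level numbers (the Census Bureau's actual procedure is far more elaborate, involving record matching, post-stratification, and adjustment for erroneous enumerations):

```python
def dual_system_estimate(census_count, sample_count, matched):
    # Lincoln-Petersen capture-recapture estimator: if the independent
    # sample finds `matched` of its `sample_count` people in the census,
    # the census coverage rate is matched / sample_count, so the true
    # population is estimated as census_count / coverage.
    if matched == 0:
        raise ValueError("estimate undefined with no matched persons")
    return census_count * sample_count / matched

# Invented example: a block where the census counted 900 people and an
# independent survey counted 100, of whom 85 matched census records.
estimate = dual_system_estimate(900, 100, 85)
print(round(estimate))  # 1059 -> an implied undercount of roughly 159 people
```

Because the estimate is computed separately for demographic subgroups, it can reveal exactly the differential undercount Dr. Prewitt described.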
The discussion then shifted to the most recent Decennial Census performed
in 1990. Commerce Secretary Mosbacher made the decision not to adjust
the Census data, stating that the dual system estimation was available
for political manipulation and that the Census Bureau had the capacity
to predesign the Census to achieve a predetermined partisan political
outcome. The 1990 Census proved less accurate as the undercount rate
rose from the 1980 rate. Dr. Prewitt noted that while the Census did
not utilize corrected data presented by the dual system estimation, other
agencies were allowed to use the corrected data. Accordingly, the Department
of Labor used the corrected data to generate its labor statistics.
As a result of the increased undercount rate, President Bush provided
instruction to fix the 1990 problems. The design was litigated and, in
January 1999, the U.S. Supreme Court ordered that sampling could not
be used for apportioning House seats but could be used for all other
purposes. Thus, the Census Bureau will correct its own data after apportionment
and will provide the more accurate data to the States for use in reapportioning
state legislative seats, school board seats, etc. and to the Federal
Government for funding allocations. Dr. Prewitt concluded his discussion
with a description of efforts undertaken to increase the mail back rate
for Census forms.
Mr. Patin then addressed the logistic operation of the Census Bureau.
He described the organizational structure of the Census Bureau as well
as the relative functions of the regionally dispersed Census offices
and the data capture centers. Mr. Patin concluded by describing the use
by the Census Bureau of new technologies such as telephone networks,
imaging, and the Internet to enable faster data collection and dissemination.
Raymond A. Archer
Deputy Director
Defense Logistics Agency
U.S. Department of the Navy
(December 8, 1999)
Topic: The Purpose and Results of Focused Logistics War Games
Rear Admiral Archer, Deputy Director of the Defense Logistics Agency
(DLA) spoke about the Focused Logistics War Games (FLOW). FLOW consists
of seven pillars addressing Commander-in-Chief (CINC) issues and concerns.
He emphasized that these war games have been an annual Navy event at
Newport, Rhode Island for a number of years and that 1999 was the first
year the war games were "joint" (i.e., all four Services participated
in the Navy sponsored exercise, with the understanding that future sponsorship
would be rotated among the Services).
Rear Admiral Archer started by explaining top-level responsibilities
for operations during times of war. He explained the operational responsibilities
of the CINCs, the supporting responsibilities of the Service chiefs,
and the administrative responsibilities of the Chairman of the Joint
Chiefs of Staff. The Services are responsible for equipping, training, and
providing operational forces under the operational command of the CINCs. Having
explained this key relationship, he emphasized the importance of FLOW
as the first joint logistics war game, indicating a shift in focus from
individual to joint force logistic requirements.
After putting FLOW in the context of the Department of Defense (DOD) mission,
he explained FLOW as a Defense planning guidance tool for assessing technology,
joint logistics doctrine and operational capabilities. Rear Admiral Archer
was in charge of the Logistics Management (LM) and Information Systems
(IS) pillar.
FLOW demonstrated the requirement of the CINCs, Services, and Agencies for
logistics capabilities that integrate planning, decision-making, and business
operations. He emphasized communication issues, stressing the importance
of information in a trusted environment with communications on demand.
FLOW confirmed that individual Service efforts were not
in harmony. As examples, he cited degraded operations due in part to
impeded CINC situational awareness, poor supply pipeline visibility, inefficient
use of lift resources, and redundant requirements.
The FLOW assessment of degraded logistics communications support identified
areas that need to be addressed: logistics traffic relegated
to unacceptably low priority, inadequate connectivity to the industrial support
base, inequities in levels of communication across CINC areas of responsibility,
and insufficient connectivity to Internet infrastructure, with limited surge
capability.
In reference to force readiness and sustainability, FLOW issues indicated
that assets prepositioned to support a major theater war were instead
being used for small-scale contingencies, such as humanitarian assistance
and disaster relief. This usage was having a debilitating effect on large-scale
war fighting capability by depleting resources that could not be replaced
without additional funding. Using communications technologies to provide
better information is closely associated with restructuring the way planning
and operations are conducted.
Technologies are making joint logistics planning-tools a reality. There
is much to be done to improve the way logistics can be managed through
the application of better information. Services and DOD Agencies are
dealing with the FLOW issues. DOD needs knowledge-based information systems
to improve alignment of the DOD acquisition process in support of operations.
FLOW issues indicate a need to program for development of knowledge-based
systems.
Q. Todd Dickinson
Under Secretary of Commerce and Director of the United States Patent
and Trademark Office
United States Patent and Trademark Office (USPTO)
(December 15, 1999)
Topic: World Intellectual Property and the Technology of WIPO
Director Todd Dickinson explained that the USPTO is funded by user fees,
and is charged with administering laws relevant to the granting of patents
and registering of trademarks. [Over six million patents have been issued
since the first patent in 1790. Last year, PTO issued 161,000 patents
and registered 104,000 trademarks.]
On behalf of the agency, Director Dickinson advises the Administration
(including the Secretary of Commerce and the President) on issues of
national consequence with respect to patent, trademark, and intellectual
property considerations. Director Dickinson will, from time to time,
propose policy and programmatic changes to the Administration and Congress
designed to keep the USPTO in the forefront of efforts to improve the
development and maintenance of the nation's patent and trademark system.
Director Dickinson places special emphasis on the critical need to accommodate
intellectual property rights and to protect U.S. interests
in this regard.
Director Dickinson explained his quick and early efforts to improve
agency operations by championing customer needs. Patent and trademark
customers can now, through the USPTO Web site, pay maintenance fees by
credit card; replenish deposit account balances by credit card; request
a deposit account statement; and add, change, or delete deposit account
authorized users. During the seminar, Director Dickinson was asked about
electronic-based filing of patents. He answered that "The USPTO continues
adapting its operations to the ease of the Internet, and we believe our
customers appreciate the convenience of USPTO's financial services and
e-commerce initiatives, particularly small businesses and independent
inventors who are more inclined to handle transactions personally."
Director Dickinson explained that in November 1999, the USPTO solicited
comments on a proposal to amend its Rules of Practice to include payment
of any patent or trademark fee by credit card. Previously, the USPTO
had limited credit card transactions to payment of charges connected
with trademark applications filed over the Internet and information products.
Director Dickinson discussed the legislation converting the USPTO to
a performance based organization and the impact of adopting the legislation.
He noted that many revisions have been made to U.S. patent law during
the last half century, and stated, "We are energized by the challenge
of implementing new legal directives, which reflect the nation's shift
away from a mechanistic, industrial-based economy and toward an information-based
economy." Director Dickinson remarked that intellectual property protection
is a core issue on which he has built his career, and noted its importance
to the health of research and business innovation that drives the nation's
information-based economy. Director Dickinson discussed the patentability
of business methods, noting that a "prior use defense" can be used to
protect companies.
Director Dickinson spoke about his efforts to engage labor and management
in productive discussions aimed at smoothing over past difficult relationships,
and noted that he will soon sponsor a labor-management "retreat" to help
build a lasting partnership among stakeholders in the USPTO's future.
Director Dickinson ended his address to us by noting that an agency
such as the USPTO -- where customer service is key to success -- must
stay abreast of rapid changes in information technologies and their application
to problem solving. The need for effective future use of technology is nowhere
more clearly illustrated than by the 25 percent increase in patent application
filings from FY 1998 to FY 1999. "We're moving as fast as we can," he
noted.
Dennis R. Shaw
Chief Information Officer (CIO)
United States Patent and Trademark Office
(December 15, 1999)
Topic: Automated Information Systems
As the CIO, Mr. Shaw is the principal advisor to the Under Secretary
of Commerce and Director for the United States Patent and Trademark Office
on the architectural design and acquisition of supporting automated information
systems, and the underlying information technology infrastructure.
Mr. Shaw began the meeting by reviewing some accomplishments of the
automated systems being designed and developed for the internal and external
customers of the USPTO.
Mr. Shaw described the USPTO's recently announced upgrade of its Revenue
Accounting and Management (RAM) system to allow financial transactions
over the Internet. The upgrade is part of the agency's long-term strategy
to modernize financial management practices and procedures, allowing
increased options for paying required fees and providing improved service
to USPTO customers.
Mr. Shaw explained that the USPTO's CIO division is developing an electronic
library to make some types of patent and trademark information available
to the general public, with applications for interested parties around
the world. As a senior executive for the USPTO, Mr. Shaw has traveled
extensively throughout Japan and Europe, working on behalf of the World
Intellectual Property Organization, as well as setting the electronic standards
for the transfer of business information worldwide. He is in the process
of arranging for approximately two million patents and trademarks to
be available on the Internet by 2001. In addition, he championed the
re-engineering of the Patent Search system. Mr. Shaw pointed out the
increasing ability of examiners to electronically search patent and trademark
reference files, thus allowing the examiners added flexibility and greater
ease when searching files.
With the agency's explosive growth to over 7,000 employees, a site move
is in its future, and Mr. Shaw took a moment to explain the logistics
and impact of the agency's relocation within the Northern Virginia area.
The future of the USPTO is based on information and state-of-the-art
technologies that allow government employees and the public to work together
in a secure and efficient way. Mr. Shaw has demonstrated over the years
that he looks forward to the 21st Century and the progression that is
evolving with modern technologies.
Takao Kuramochi
Science Counselor
Embassy of Japan
(January 12, 2000)
Topic: Japan's Future Science Plan and Major Science Initiatives
Mr. Kuramochi provided a broad overview of Japan's science and technology
(S&T) and highlighted the Japanese government's evolving priorities in
S&T. Japan is in the midst of major changes in its S&T enterprise. While
the nation has been traditionally strong in corporate research and development
(R&D) activities (technology), basic scientific research has been weak.
With the sustained economic slump since the early 1990's, Japanese government
leaders realized that S&T could play a key role in realizing a strong,
competitive economy. The "Science and Technology Basic Law" enacted in
1995 directed the government to promote basic research, improve research
facilities, promote research collaborations and researcher exchanges,
and develop capable and creative researchers. The following year, the
Cabinet approved the "Science and Technology Basic Plan," a five-year
plan outlining policies and programs for implementation of the law. The
plan called for institutional and management reforms of the R&D system,
including: supporting up to 10,000 postdoctoral fellowships per year;
introducing fixed-term (as opposed to permanent) appointments for researchers
at national laboratories and universities; investing in research facilities
and equipment; increasing the numbers of research support personnel;
facilitating private sector use of research results from national laboratories
and universities; and establishing an evaluation system for government-supported
research. Mr. Kuramochi reported that the Basic Plan's target of 17 trillion
yen ($160 billion) in government funding for R&D during the five-year
period 1996-2000 would be reached.
While interim assessments suggest that the Basic Plan has been successful
in a number of areas (e.g., over 10,000 postdoctoral fellowships were
supported in 1999), the government recognizes a continuing need to promote
basic research, improve research facilities, improve the quality of research
through competitive funding, facilitate the transfer of research results
to the private sector, and enhance public understanding of S&T. The Japanese
government is now actively working on the next five-year Science and
Technology Basic Plan. This involves inter-agency coordination and surveys/discussion/interaction
with the research community, the corporate sector, universities, and
the public. The Science and Technology Council is directing the effort.
In addition to the new five-year plan, Japanese S&T will be stimulated
by a new initiative by the Prime Minister called the "Millennium Project." This
initiative is meant to tackle critical social issues by stimulating innovation
in industry. The three focus areas are: (1) preparation for an advanced
information society, (2) coping with an aging society, and (3) establishing
a recycling-oriented society.
Japanese S&T will also be impacted by the "Basic Law for Administrative
Reform of the Central Government" enacted in 1998. That law calls for
a streamlining of administrative structure, including a decrease in the
number of ministries/agencies from 22 to 13 by January 2001 and a 20
percent reduction in the number of government employees over 10 years.
As part of the restructuring, the Ministry of Education, Science, Sports
and Culture will merge with the Science and Technology Agency to become
the Ministry of Education and Science. The Science and Technology Council,
which provides S&T policy coordination, will be strengthened by its move
to the Cabinet Office.
Asked about how Japan and the United States might better work together
in S&T, Mr. Kuramochi said that our countries should target collaboration
in areas of global importance. He gave several examples, including global
climate change, life sciences, space, and earthquake disaster mitigation.
Hyman H. Field
Senior Advisor for Public Understanding of Research
Directorate for Education and Human Resource Development
National Science Foundation
(February 2, 2000)
Topic: The Public Understanding of Science in the United States
Dr. Field began the discussion by defining public understanding
of science and discussing how to reach the public.
Science literacy consists of: (1) a basic level of understanding of
science principles across a range of disciplines, (2) understanding of
the methodology/process of science and how to use it, and (3) understanding
of the applications of science and the impact it has on everyday life.
To illustrate public misconceptions about science, Dr. Field showed a short
video of Harvard graduates answering an astronomy question
and of 9th grade science students explaining the seasons in relation
to the sun, earth, and moon. The video showed how widely science is misunderstood
by the public, even by Harvard graduates.
Science literacy is important because: (1) science and technology are
having an increasing impact on our lives, and the general public needs to
understand the importance and relevance of these changes; (2) most
entry-level jobs of the future will require a level of science literacy;
and (3) an early introduction to science in an interesting and exciting
context is crucial in motivating youth to pursue careers in science,
mathematics, and technology. Dr. Field showed us that a high percentage
of junior high students who express career interest in science plan to
drop math. There is clearly a lack of appreciation of the importance of
science and math.
Dr. Field then explained the Informal Science Education Program in NSF.
The goals of the program are to: (1) improve general science literacy
of children and adults; (2) increase the number of youth excited about
science, mathematics, engineering, and technology; (3) promote stronger
linkages between informal and formal learning communities; (4) encourage
parents and caregivers to provide greater support for and involvement
in children's science, mathematics, engineering, and technology learning
activities; and (5) make informal science education available in large
underserved areas including inner city and rural areas.
Jonathan R. McHale
Director
Telecommunications and Electronic Commerce
Office of the United States Trade Representative (USTR)
(February 9, 2000)
Topic: USTR and the Future of E-Commerce
Mr. McHale provided an overview of the development of electronic networks.
Initially, telecommunication companies profited by leasing access to
banking and other industry sectors that wanted to conduct business electronically.
The Internet married the two groups. Now the Internet, combined with
cheap communications, falling computer prices, and increasing competition,
has led to e-commerce. In many ways, the sale of physical products over
the Internet is analogous to use of mail order catalogs. Using that analogy,
international issues related to tariffs are similar to inter-state issues
related to the collection of sales tax. The situation is more complicated
when information (e.g., music or software) or services (e.g., banking)
are involved.
USTR is working to ensure that negotiated trade frameworks facilitate
e-commerce to the benefit of the United States' economy. This includes
working with the World Trade Organization (WTO) to protect e-commerce
from trade barriers that countries might try to impose in the form of
tariffs. While the WTO has imposed a moratorium on tariffs for information
exchanged electronically, this is an area of intense negotiation. USTR
is pushing for WTO rules to apply to e-commerce, for an extension of
the moratorium on tariffs, and for the use of trade regimes to promote
development.
Countries attempt to block e-commerce in a variety of ways. There are
a number of elements that will be essential for e-commerce to advance
and prosper. These include a right to distribute products, a way to pay
for service, access to the telecommunications infrastructure, and a delivery
infrastructure. A new area of critical importance is electronic authentication
(substituting for a signature). The United States is encouraging market-based
standards for authentication, giving companies the flexibility to develop
systems and standards as needed.
Mr. McHale feels that e-commerce will have huge economic and social
impacts. Attempts to control the advance of e-commerce will ultimately
fail, and the United States is positioning itself to be a global leader
in this rapidly advancing area.
Jonathan B. Sallet
Chief Policy Counsel
MCI WorldCom, Incorporated
(February 16, 2000)
Topic: Technology and Competitiveness Issues Confronting the Telecommunications
Industry
In his capacity at MCI WorldCom, Incorporated, Mr. Jonathan Sallet develops
and coordinates the company's public policy positions, and oversees advocacy
to the Federal Government.
Mr. Sallet began his remarks by discussing the history of MCI WorldCom,
which has evolved from several mergers. In an advocacy role, his office
consists of approximately 50 individuals, mostly economists and policy
people who work with the Federal Communications Commission on enforcement
issues, core telecommunications policy, Internet and data issues, and
wireless issues.
Mr. Sallet discussed the misunderstandings of economic regulations and
the commercial power of data. He explained that data is the main driver
of business today. When determining public policy goals, he emphasized
the importance of grasping which way commercial flows are moving. Other issues
which he discussed with us included: encryption policy issues; Internet
and telephone taxation; telephone access charges; the philosophy of the "dial
around" services; and the government's role in setting standards.
Robert D. Atkinson
Director of the Technology, Innovation, and New Economy Project
Progressive Policy Institute
(March 1, 2000)
Topic: Technology and the New Economy Project
Dr. Atkinson discussed his work educating federal, state, and local
policy makers about what drives the "New Economy," and fostering policies
that promote technological advances, economic innovation, investment,
and entrepreneurship. He described how in the last 15 years a "New Economy" has
emerged in the United States that is defined by a new industrial and
occupational order, increased levels of entrepreneurship and competition,
and a trend toward globalization. The "New Economy" that is emerging
is a knowledge and idea-based economy where the keys to wealth and job
creation are the extent to which ideas, innovation and technology are
embedded in all sectors of the economy.
Dr. Atkinson discussed his recent report ranking the states according
to how well positioned they are to succeed in the "New Economy." In creating
the ranking, Dr. Atkinson looked at each of the 50 states across five
key policy areas: investment in workforce skills, investment in innovation
infrastructure, promoting customer-oriented government, fostering a digital
economy, and fostering civic collaboration. Dr. Atkinson also stressed
the importance of K-12 education to the "New Economy." He also discussed
the role of labor unions and their need to evolve a new role and rationale
in the "New Economy."
Carol Kessler
Senior Coordinator for Nuclear Safety
U.S. Department of State
(March 8, 2000)
Topic: Nuclear Safety Around the World
Ms. Kessler shared with us her concerns about worldwide nuclear safety
issues. She started out clearly explaining the nuclear fuel cycle and
enrichment, leaving us with a sound understanding and appreciation for
the difficulty in assuring nuclear safety. She explained how most other
countries differ from the United States in handling nuclear waste. While
the United States stores nuclear waste, most other countries reprocess
it. She explained the reprocessing techniques. The efficiencies of doing
so were obvious as were the risks. Her explanation of plutonium generation
and control was direct. We quickly came to appreciate the difficulty
of accounting for small amounts of plutonium, setting the stage
for discussions that followed on non-proliferation treaties and strategies
to develop international cooperation on nuclear safety.
Ms. Kessler effectively used charts to explain key points of her presentation,
which focused on the history of nuclear safety treaty development, major
world players in the nuclear arena, and reactor technology and construction.
She also addressed non-proliferation treaties and nuclear safety conventions.
Her first-hand experience in these matters was impressive.
We found Ms. Kessler's experience in working with the Russians to be
particularly interesting. She talked about cultural and value differences
that underscore the difficulty of working on safety issues. She cited as
an example that under the Soviets, electricity was viewed as a free right,
thereby emphasizing productivity at the risk of safety. She explained
that while Russians were excellent at design, they did not share our
values for construction quality control. Western standards for ensuring
safety and quality construction were not a part of the thinking that
went into electricity production during the Soviet era. There is now
a legacy of crumbling and dangerous reactors in post-Soviet bloc economies
that cannot afford to close down or rebuild.
In response to questions, Ms. Kessler explained the Three Mile Island
and Chernobyl accidents. The Three Mile Island crisis was one of pressure
build-up inside the safety container. While the pressure could be released,
an explosion could have resulted from the release had there been a hydrogen
bubble build-up inside the container. The Chernobyl crisis was one of
not having a safety container, thereby not being able to prevent nuclear
contamination from spilling into the environment as controls to slow
down the reactor failed.
Ms. Kessler emphasized the enrichment process and the difficulty of
accounting for and controlling plutonium during reprocessing, leaving
many of us weighing the dilemma of reprocessing's efficiency against
its risks.
Site Visit to the FBI Crime and Forensic DNA Laboratories
(March 21, 2000)
Ms. Angelia Heller hosted an exceptionally informative visit to the
FBI Crime and Forensic DNA Labs at the FBI building in Washington, D.C.
We first visited the Forensic Analysis Section, where Mr. Ron Pauley
explained that latent fingerprints are developed using various techniques,
including lasers, chemicals, and fuming with cyanoacrylate vapor. We
were surprised to learn that lip prints could also be used for identification.
Mr. Robert Winston explained how digital imaging is used to enhance
fingerprints. He pointed out that ridges on fingertips never change,
even as one grows older. He told us about the FBI's 36-million fingerprint
database, which is useful to other federal, state, and local law enforcement
agencies needing fingerprint identification information.
Mr. Jack Lutkewitte provided a briefing on the activities of the Computer
Analysis Response Team (CART). This team was established in 1985 to examine
computer evidence. When the team was initially established, its focus
was the development of a magnetic media program. The team then studied
the impact computers would have in the future and, most recently, is
focused on how evidence relates to cases. The CART serves in a variety
of roles including investigative and prosecutorial roles and provides
trial assistance and field support. The CART is trained in evidence preservation
and serves in a leadership role in the international efforts in this
area. Its present goals are to be more selective in its seizures to enable
quicker data extraction and to develop better tools for reviewing evidence,
including the development of the Automated Computer Examination System
(ACES).
Tours of both the DNA Analysis I Lab and the Mitochondrial DNA Lab were
conducted by Ms. Jennifer Luttman and Mr. Kevin Miller, respectively.
The focus of the DNA Analysis I Lab is serology/nuclear DNA. This lab
performs a presumptive test for blood or semen in a sample, confirms
whether the sample is human, and conducts two types of tests to identify
samples: restriction fragment length polymorphism (RFLP) and polymerase
chain reaction (PCR)-based tests. The DNA Analysis I Lab is staffed
with teams of technicians who perform the testing and examiners who perform
the interpretations of the data and provide testimony regarding the data.
The Mitochondrial DNA Lab, in contrast, focuses on areas outside of the
cell nucleus. An example of a use of mitochondrial DNA analysis would
be in the testing of a hair sample.
Mr. Doug Deedrick provided a briefing on the expertise exhibited within
the Trace Evidence Unit. This unit focuses on hair, fiber, and fabric
evidence recovered from crime scenes. The briefing included a discussion
of a familiar local case and the manner in which evidence was recovered,
identified, and used to convict the alleged perpetrator in the absence
of other evidence more commonly used as the basis for convictions.
Mr. Jim Cadigan provided a briefing within the Firearms and Tool Marks
Unit. Within this unit, ballistics tests are conducted, general tool
marks are identified, and serial number restorations are performed. Mr.
Cadigan conducted a tour of the unit's extensive standard ammunitions
file and gun collection.
Mr. Bob Rooney conducted the tour of the Chemistry Unit. This unit focuses
on chemical evidence such as drugs, bank dyes, paints, polymers, and
toxicology.
Our visit to the FBI concluded with a tour by Mr. Gene O'Donnell of
the Investigative Prosecutive Graphic Unit. This unit is responsible
for creating graphics for trial and exhibits. Examples of products of
this unit are composites, photo enhancements, facial reconstructions,
and wanted flyers.
Jennifer S. Bond
Director
Science and Engineering Indicators Program
National Science Foundation
(March 31, 2000)
Topic: National and International Measures of Science and Engineering
Investments and Results
Ms. Bond provided a general overview of the National Science Foundation
(NSF), which gives grants to support basic research in science, engineering,
and education. Since its establishment in 1950, NSF has been responsible
for tracking statistics related to science and technology. To do this,
NSF works closely with the various technical agencies (including the
United States Patent and Trademark Office), and with counterpart agencies
in other countries. NSF's National Science Board publishes a biennial
report, Science and Engineering Indicators. The report includes
information on pre-college and higher education, the science and engineering
work force, research and development (R&D) investments/expenditures,
academic R&D, industry and technology, and public attitudes toward and
understanding of science. It also highlights trends over time and compares
the efforts of the United States with those of other countries.
The United States funds more R&D than any other country (more than twice
as much as Japan, which is second). However, when expressed as a percent
of gross domestic product, U.S. funding for total R&D is less than Japan,
and U.S. funding for civilian (non-defense) R&D is significantly less
than both Japan and Germany. While U.S. investments in R&D have increased
significantly over the last 30 years, most of the recent increases have
come from the private sector. That is, government funding of R&D went
down in the 1990's in response to budget constraints and efforts to reduce
the federal deficit.
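The distinction between absolute R&D spending and spending as a share of GDP can be made concrete with a quick calculation (the figures below are invented for illustration and are not the NSF data):

```python
# Hypothetical figures: Country A funds more R&D in absolute dollars,
# yet devotes a smaller share of its GDP to it than Country B.
spending = {                      # (R&D, GDP) in billions of dollars
    "Country A": (250.0, 9000.0),
    "Country B": (110.0, 3500.0),
}

for name, (rd, gdp) in spending.items():
    print(f"{name}: ${rd:.0f}B in R&D = {100 * rd / gdp:.2f}% of GDP")
# Country A spends more than twice as much in absolute terms,
# but 2.78% of GDP versus Country B's 3.14%.
```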
It is interesting to note that U.S. patents are increasingly citing
scientific and technical articles, reflecting a close coupling between
technological advances and basic scientific research. Japan receives
more U.S. patents than any other foreign country. Germany is a distant second,
followed by France and Britain. While still small, the number of patents
granted to Korean and Taiwanese entities is increasing rapidly.
While the public is generally supportive of scientific research in the
United States, many people have a poor understanding of science. Educating
the general public about scientific and technological issues is a major
challenge facing our government.
Robert Lewis, Jr.
Deputy Chief
Research and Development
Forest Service
U.S. Department of Agriculture
(April 5, 2000)
Topic: The Role of Science and Research in Forest Service Decision-Making
Dr. Robert Lewis, Jr. began his discussion by relating a personal story
from his own life to illustrate leadership in science.
Dr. Lewis, a native of Mississippi, lived on his father's small farm
when he was young. At that time, he had no appreciation of the value
of natural resources, yet he wanted to be a medical doctor when he grew
up. After he graduated from Jackson State University with a major in
biology, he worked at the Forest Service Southern Experiment Station
in Stoneville, Mississippi as a biological technician. While he did not
know what the job would involve, he could understand the relevance of
research to real-world situations. His interest in research spurred
him to attend Texas A&M University to study plant pathology. While working
on his Ph.D. dissertation, he was faced with results contradicting those
of his faculty members. Dr. Lewis did not compromise his research findings;
however, he did change some of the wording in his report. With this simple
example, he illustrated how to handle the situation of dealing with conflicts
with your boss. Honesty and integrity are very important, especially
in the area of leadership in science.
In Forest Service decision-making, science and research are valued
for their objectivity. This is especially important in dealing with
vocal interests such as environmental activists and the forest industry.
In the case of the Pacific Northwest Forest Plan, science and research
played a major role in shaping the plan. The Forest Service is always
cognizant that its decision-making may be challenged or litigated in court.
The objective publications of researchers and scientists help the National
Forest System make management decisions on public lands. The current
congressional debate over forest roads and roadless areas is another
example of how science and research can provide objective analysis for
the agency's decision-making. Another area is the agency's new planning
rule on managing the National Forest lands. The new rule focuses on ecological,
social and economic sustainability. Scientists and researchers provided
a critical review of the draft rule, assessing whether the goals and
objectives it sets are realistic and achievable for managing the public
forests. Science also plays important traditional roles in areas such
as global climate change, crosscutting themes, and international issues
such as trade in imported wood and tropical forest products. On other issues
such as environmental justice and urban forestry, scientists are also
making important contributions.
25th Anniversary AAAS Colloquium on Science and Technology Policy
(April 11-13, 2000)
As this was the 25th annual American Association for the Advancement
of Science (AAAS) Colloquium on Science and Technology Policy, the theme
was very appropriately, "Science and Technology at the Millennium: Retrospect
and Prospect." The first day included a keynote address by Dr. Neal Lane,
Assistant to the President for Science and Technology; a panel discussion
on the "Budgetary and Policy Context for Research and Development (R&D)
in FY 2001" moderated by Dr. Mary Good, President of AAAS; and a lecture
by Dr. Rita Colwell, Director of the National Science Foundation.
The April 12th morning plenary session focused on the "Social, Economic,
and Political Implications of Information Technologies." A panel discussion
was held on the impacts of e-commerce, privacy issues, implications of
the "digital divide," antitrust issues, intellectual property, and encryption.
Dr. Bill Joy, Co-founder, Chief Scientist, and Corporate Executive Officer
of Sun Microsystems, was the luncheon speaker. The afternoon began with
three concurrent sessions on "Major Issues in Science and Technology
Policy," which included: (1) Technology Transfer and Academic Capitalism;
(2) Do Medical Research and Technology Advances Really Lead to Improved
Health Care; and (3) Globalizing R&D. An afternoon plenary returned to
the theme of the colloquium with a roundtable discussion of "The 2000
+/-: Retrospective and Prospective Views of Science and Technology Policy."
April 13th began with a breakfast speech by former Speaker of the U.S.
House of Representatives, Newt Gingrich, who is now a Senior Fellow at
the American Enterprise Institute. The colloquium's last plenary session
addressed the topic of "Genetic Modification of Foods: The Public's Mistrust
of Science, and Science's Misunderstanding of the Public."
Richard A. Guida
Chair
Federal PKI Steering Committee
U.S. Department of the Treasury
(April 19, 2000)
Topic: Current PKI Initiatives
Our discussion with Mr. Richard Guida was informative and a wake-up
call to all government agencies when it comes to Public Key Infrastructure
(PKI), the framework of digital certificates and cryptographic keys
that every agency is in the process of developing to secure its electronic
systems.
Mr. Guida graduated from the Massachusetts Institute of Technology in
1973 with a Bachelor's Degree in Electrical Engineering (Computer Science)
and a Master's Degree in Nuclear Engineering. Commissioned as a naval
officer, he joined the engineering staff of Admiral Hyman Rickover and
served until 1977 as a reactor engineer responsible for the nuclear propulsion
plants aboard USS ENTERPRISE and several nuclear powered submarines.
In 1977, Mr. Guida became a civilian employee and assumed additional
responsibilities within the Naval Nuclear Propulsion Program, culminating
in 1988 with his selection as the Program's Associate Director for Regulatory
Affairs and appointment in 1989 to the Senior Executive Service. In April
1998, Mr. Guida was appointed as a member of the Government Information
Technology Services Board (Champion for Security) and Chair of the Federal
Public Key Infrastructure Steering Committee. He has continued to function
in these roles since joining the Department of the Treasury in October
1998 as a Senior Technical Advisor to Mr. James Flyzik. In August 1999,
Mr. Guida was also appointed by the Secretary of Commerce to the Computer
Systems Security and Privacy Advisory Board.
At the present time, Mr. Guida is championing the Federal Bridge Certification
Authority (FBCA), which will allow secure agency-to-agency communication.
As Mr. Guida explained, the bridge would eventually allow commercial
private sector companies to communicate under a secure environment along
with other governments. This will occur within a year's time. Mr. Guida
walked us through how public and private key pairs work and how keys
are exchanged during secure encrypted sessions. The slide presentation
showing how the agencies communicate now and how they will communicate
in the near future was an eye opener. Mr. Guida, along with OMB, is also
championing the Government Paperwork Elimination Act.
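The key-exchange idea behind such encrypted sessions can be sketched with a minimal Diffie-Hellman example (a generic illustration with toy-sized parameters, not material from Mr. Guida's presentation; real deployments use much larger primes or elliptic curves inside protocols such as TLS):

```python
# Minimal Diffie-Hellman sketch: two parties derive the same shared
# secret over an open channel without ever transmitting the secret.
import secrets

P = 0xFFFFFFFB     # small public prime modulus (toy-sized)
G = 5              # public generator

# Each party keeps a private value and publishes G^private mod P.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Combining one's own private value with the other's public value
# yields the same number on both sides: G^(a*b) mod P.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret
```

A PKI adds certificates on top of such exchanges so each side can be sure whose public value it is combining with; a bridge certification authority is meant to let certificates issued by one agency be trusted by another.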
Site Visit to the National Institute of Standards and Technology
(April 26, 2000)
After Mr. Raymond Kammer, Director of the National Institute of Standards
and Technology (NIST), provided us with a detailed briefing of NIST,
its mission, functions, and goals, we visited the Fabrication Technology
Division of the Manufacturing Engineering Laboratory. There, Mr. Richard
Rhorer explained to us the Charters of Freedom Project. In collaboration
with the National Archives, NIST is contributing to the re-encasement
of the U.S. "Charters of Freedom" -- the Declaration of Independence,
the U.S. Constitution, and the Bill of Rights. The goal of this project
is to design and fabricate encasements that will protect the documents
from oxygen and air contaminants for over 50 years without maintenance,
while allowing daily viewing by the public. Mr. Rhorer told us that the
first of nine encasements has been delivered and now houses the Transmittal
Page of the U.S. Constitution. During the next phase of the project,
NIST plans to build the second "prototype" of a smaller size, followed
by the completion of the production run of seven more encasements.
At the Building and Fire Research Laboratory, Dr. John Gross of the
Structures Division spoke to us about the Earthquake Hazards Reduction
Program. Earthquake, fire, and wind engineering research is performed
at NIST to reduce economic losses from earthquakes, extreme winds, and
post-disaster fires and to increase public safety through the development
and adoption of next-generation technologies and practices for disaster
mitigation, response, and recovery. Research on measurement, evaluation,
and performance prediction seeks to understand the most probable technical
causes of failures and whether those causes stem from failures to follow
established standards and codes or from inadequacies in the standards
and practices themselves.
A visit to NIST is not complete without a stop at the Materials Science
and Engineering Laboratory where Dr. J. Michael Rowe, Director of the
Center for Neutron Research, briefed us on the varied research being
done each year by more than 1,500 people from industry, universities,
and other government agencies.
Our next stop at NIST was at the Physics Laboratory, where Mr. Bert
Coursey, Division Chief of the Ionizing Radiation Division, explained
quality assurance in radiation measurement for health care. He told us
that the Ionizing Radiation Division develops, maintains and disseminates
the national measurement standards for ionizing radiation. The Division
is also developing standards and calibration facilities for small, sealed
radionuclide-containing seeds, called brachytherapy sources, used in
cancer therapy and now in the treatment of arterial blockages.
Mr. Rex Pelto, Outreach Coordinator for the Advanced Technology Program,
provided our final briefing. Since 1990, the Advanced Technology Program (ATP)
has worked with U.S. industry to advance the Nation's competitiveness
-- and economy -- through the development of powerful new technologies
that underlie a broad spectrum of potential new applications, commercial
products, and services. It is a rigorously competitive program, which
co-funds research by individual companies or joint ventures. The sole
aim is economic growth by developing high-risk, potentially high payoff
enabling technologies at the pre-product development stage. These are
technologies that otherwise would not be pursued at all or in the same
market-critical time frame because of technical risks and other obstacles
that discourage private-sector investment. This is the stage of development
that occurs after basic research has been completed, but before product
development begins.
Site Visit to the Pentagon
(May 9, 2000)
We met with Sergeant Brooks, who gave us an interesting tour of the
building. We were escorted through several of the broad passageways as
Sergeant Brooks explained Pentagon history and highlights. Built in only
16 months in the early 1940s, the Pentagon was designed so that the approximately
22,000 employees could reach any point in the building within about 10
minutes. This was a big improvement for a military whose workers had
been spread over 17 buildings in Washington, D.C. and Virginia.
The original construction cost close to $50 million. Currently, the Pentagon
is undergoing a $1.2 billion renovation targeted for completion by 2006.
In addition to adding a mezzanine level in the basement, the renovation
is needed to upgrade mechanical, electrical and communications infrastructure
to support automated information.
Following our introductory tour, we visited the National Military Command
Center. We came to a greater understanding of the need to upgrade the
Pentagon communication infrastructure as watch team members briefed us
on operations. Watch teams, headed by a general officer, are on duty
around the clock to monitor the world situation. Duty personnel explained
how data is processed and how the National Command Authority (specifically
the President, the Secretary of Defense, and the Chairman of the Joint Chiefs
of Staff) is notified of any serious incident requiring immediate attention.
Following a visit to the situation room, we went to the "Tank" where
the Secretary of Defense meets with the Joint Chiefs of Staff on a regular
basis. We also visited a smaller briefing room developed by General Colin
Powell for holding discussions with select personnel during Desert Shield
and Desert Storm. We also had the opportunity to visit the room where
the hot line between the United States and Russia is maintained. A daily check
is performed to assure communications are working.
Site Visit to the U.S. Naval Observatory
(May 15, 2000)
We met with Mr. Geoff Chester, Office of History and Public Affairs,
who provided an introduction to the Naval Observatory, explaining its
history and the importance of having accurate time for navigation. In
addition to explaining basic requirements for navigation, Mr. Chester
explained the reasons for moving the observatory to its present site.
As we viewed pictures of those who provided leadership for the Observatory,
Mr. Chester pointed out a picture of his grandfather, Admiral Chester.
We visited the library and saw ancient books on display. The library
was designed in the round with natural light and fountains. The space
was well kept. The librarian explained that all books were available
for reference and research. It is a working reference repository in addition
to being a visitor site. Mr. Chester went on to show us the Observatory's
automated support and explained how computers keep accurate time and
how maintenance is planned to provide a reliable backup.
Our visit concluded with a stop at one of the Observatory's powerful
telescopes. The building featured a vertically moveable floor to facilitate
use of the telescope by astronomers. We learned how the telescope's size,
shape, and lens positioning improve visibility. The U.S. Naval Observatory
exemplifies the combination of old and new technologies to achieve navigation
and accurate time.
Evelyn J. Fields
Director, Office of Marine and Aviation Operations, and
Director, NOAA Commissioned Officer Corps
(May 17, 2000)
Topic: Overview of the Marine and Aviation Operations, and the NOAA
Commissioned Officer Corps
Appointed in May 1999, Rear Admiral Evelyn Fields is the first woman
and the first African American to serve in this position. The Office of
Marine and Aviation Operations manages and operates the fleet of NOAA
ships and aircraft, the largest fleet operated by a Federal agency. The
Office is made up of civilians and officers of the NOAA Commissioned
Corps. The Corps officers provide NOAA with operational, management,
and technical skills in support of NOAA programs. The officers are trained
as engineers and scientists in NOAA program disciplines.
Rear Admiral Fields provided an overview of the performance measurement
efforts conducted by NOAA, as well as the elements of the Strategic Plan
developed by NOAA. NOAA's mission focuses on five areas, namely, climate
forecast, the promotion of safe navigation using nautical charting, the
building of sustainable fisheries, the recovery of protected species,
and the sustenance of healthy coasts. As noted by Rear Admiral Fields,
workforce-related issues, particularly with respect to retention and
recruitment, present unique challenges.
NOAA's aircraft fleet of 11 fixed-wing and 3 rotary-wing aircraft cooperatively
performs storm research, including hurricane surveillance, climate and
global change evaluation, air quality research, nautical/aeronautical
charting, marine mammal surveys, and snow melt surveys. NOAA's fleet
of 15 ships cooperatively performs oceanographic and atmospheric research,
global climate change studies, mapping and charting, fisheries research,
marine mammal surveys, and coastal monitoring and research. As fishery
research vessels age and their reliability decreases, NOAA plans to replace
these vessels with new ones that will include desirable features and
comply with present international standards.
Rear Admiral Fields described the Internet at Sea pilot project, which
was successfully established a few years ago. Through this project, real-time
interaction is provided, allowing communication with, among others, school-aged
children.
The NOAA Diving Program includes over 300 divers, the largest such program
of any civilian Federal agency. This program supports the NOAA Diving Center
and contributes to activities such as nautical charting efforts and sanctuary
research.
Rear Admiral Fields' presentation included a discussion of her experience
as a former ComSci Fellow and former President of the ComSci Council.
Site Visit to the National Security Agency
(June 8, 2000)
Ms. Joyce Snyder arranged an extremely informative visit to the National
Security Agency (NSA) for us. Dr. Stuart W. Katzke, Senior Executive
Account Manager for the Department of Commerce, and Chief Scientist,
Information Assurance Solutions Group, hosted our one-day visit. Dr.
Katzke works on evaluating security products. He explained that NSA likes
to partner with other government agencies for a unified direction. NSA
also likes to combine government and commercial interests. NSA maintains
its technological superiority with respect to cryptology for analysis
in order to protect the United States against aggressors and to protect
classified government communications.
Other individuals who briefed us included MSGT Robert Storey, who provided
an NSA/CSS overview and shared NSA's vision and the means by which it
plans to obtain its long-term goals; Mr. Stan Heady, who briefed us on
INFOSEC science and technology trends; Mr. John DeRosa who demonstrated
wireless technology; Mr. Chris Kubic, who demonstrated global grid technology;
and Mr. Jeffrey Dunn of the Biometrics Authentication Division, who demonstrated
personal identifiers including fingerprint, face and voice recognition
using, among other things, algorithms of eye and finger details.
Our day concluded with a tour of the Super Computer Facility. Mr. William
Johnson, Technical Director of the Facility, showed us various powerful
and state-of-the-art computers in use at NSA.
Newt Gingrich
American Enterprise Institute for Public Policy Research
(June 13, 2000)
Topic: Understanding the "Age of Transitions"
Our last seminar of the year brought us together with Mr. Newt Gingrich,
former Speaker of the U.S. House of Representatives.
Mr. Gingrich began his remarks by conveying his belief that we are currently
living through two tremendous patterns of scientific and technological
change, each very powerful in its own right. The first pattern is the
computer and communications revolution; he stated that we are only about
one-fifth of the way through it. The second pattern, which he described
as a combination of the nano world, biology and the information revolution,
is just now beginning to rise. The overlapping period of these two patterns,
which we are just entering, is what he believes is the "Age of Transitions." He
emphasized that "nations that focus their systems of learning, health,
economic growth and national security on these changes will have healthier,
more knowledgeable people in more productive jobs creating greater wealth
and prosperity and living in greater safety through more modern, more
powerful intelligence and defense capabilities" during this "Age of Transitions."
Mr. Gingrich is a strong advocate of increasing funds for research and
development. He indicated that he is in favor of legislation to double
the budget for science research and development over the next five years.
It was clear that he feels we are currently failing to fund many vital
areas of opportunity for future scientific knowledge -- knowledge that
could be an important step toward significant breakthroughs offering
the potential for greater wealth, more jobs, a higher quality of life,
and greater national security.
Mr. Gingrich concluded his remarks by stressing the importance for scientists
to recognize that they have real responsibilities as citizens. "Scientists
must come forward to explain why research matters. No one else is qualified
to make a case for increased funding in science research and science
education. No one else has the understanding and credibility," he stated.
Mr. Gingrich warned that without a continued commitment to funding scientific
research and development, and science education, it is very unlikely
that the United States will maintain the momentum we have created over
the last sixty years.
Class of 1998-1999
Guy S. Gardner
Associate Administrator for Regulation and Certification
Federal Aviation Administration (FAA)
(September 30, 1998)
Topic: Aviation Safety and the Role of Government in Regulating the
Aviation Industry
Mr. Gardner has been the head of the FAA's Regulation and Certification
Office since 1996. He is responsible for certification and safety oversight
of the United States aviation industry, including manufacturers, control
systems, operators, and service providers. Mr. Gardner came to the FAA
in 1995 following a career as a fighter pilot in Southeast Asia, a test
pilot, and a NASA pilot-astronaut. He flew two Shuttle missions, "Atlantis" in
1988 and "Columbia" in 1990. His first assignment in the FAA was as Director
of the FAA Technical Center in Atlantic City, New Jersey.
Mr. Gardner opened his talk with several very important questions: what
is the proper role of government in regulating a private business; and,
who are the customers who must be satisfied by that regulatory activity?
The aviation industry is the most stringently regulated in the United
States, and those regulatory activities must satisfy the flying public.
Often, the statistics of aviation safety are overshadowed by the disproportionate
attention paid to aircraft accidents. The requirement of FAA regulations
is to establish an appropriate level of safety; that is, to reduce risk
while permitting business operations to continue. Mr. Gardner stated
it would be very easy to make flight 100 percent "safe" -- to reduce
its risk to zero -- but that would mean grounding all aircraft. The government
must seek to strike the proper balance between acceptable risk and keeping
the industry going.
To help the public define "acceptable risk," regulatory agencies
need to better articulate what they do. An agency must educate the general
public in how it accomplishes risk management, to convince the "customers" that
the agency is doing its proper job well. Often, there is no "best solution," as
public perceptions are driven by emotionalism. The FAA is currently developing
and collecting global data and tools to perform analytical studies in
order to justify its risk management decisions.
Efforts are underway to meet the future challenges of technology. The
critical problem Mr. Gardner identified is how difficult it is to change
a regulatory structure. This is especially true in aviation, where the
industry is growing rapidly, the number of people flying has increased
dramatically, new technology is coming into play, and new business practices
unfamiliar to the regulators are being implemented. It may take several
years to change a regulation, but changes in practices are occurring
at a much faster rate. For example, equipment is being introduced that
improves how safely a general aviation pilot can operate his aircraft,
but it has not been certified. The users of that equipment cannot afford
the systems that have been certified. How does government make the trade-off
between "safety" and "cost"?
The challenge for the regulatory work force is giving them the tools
and the training to conduct business in an entirely new way -- to oversee
the processes of the industry and ensure the quality of those processes,
rather than rely on expensive post-manufacturing inspections. The government
must continue to work in partnership with industry to identify the correct
processes and the proper amount of oversight to guarantee the continued
safety of the National Airspace System.
Honorable Vernon J. Ehlers
U.S. House of Representatives
(October 6, 1998)
Topic: The National Science Policy Study
As the new millennium approaches, we find that it is imperative to reexamine
many policies and procedures which have been followed for years. The
National Science Policy is under revision at this time. The Speaker of
the House for the 105th Congress, Newt Gingrich, generated the momentum
behind the project. The Speaker charged the House Science Committee with
the development of a new long-range science and technology policy. The
Committee conducted a national science policy study which was led by
the Science Committee Vice Chairman, Representative Vernon Ehlers (R-Michigan).
Congressman Ehlers' extensive background in science and technology provided
the expertise required to accomplish the Speaker's tasks. Presentation
of the study to Congress is scheduled for the end of the 1998 session.
The study focuses on three primary areas: science, which must be allowed
to thrive under the direction of research; the roles of the private sector
and the Federal Government and the relationship between the two as they
relate to the advancement of science, technology and economic issues;
and the reinforcement of the importance of science and technology from
the educational perspective ranging from kindergarten to graduate school.
In addition, the study addresses how to better utilize the resources
that we have within the United States. Another point emphasized within
the report is the need for decisions to be based on research conducted
in the scientific and technological areas of study.
The study's final emphasis is on educating the American public
regarding science and technology issues. Through improved communication,
the public's exposure to these issues can be increased. This will enable
a more thorough understanding of decisions that are made in the science
and technology community.
Jacob Rabinow
Inventor/Independent Consultant
(October 20, 1998)
Topic: The Inventive Process
With a life that witnessed firsthand many of the historical events of
this century, Mr. Rabinow shared not only the stories of his public and
private inventions, but also his view of history.
Born in Russia (in the former capital of Ukraine) in 1910, the son
of a shoe manufacturer, Mr. Rabinow moved with his family to southern
Siberia in 1914. There they lived through the Russian Civil War. When
his father's shoe factory was taken over, the family decided to move
to China by train. In 1921, they decided to move to the United States.
Mr. Rabinow took a series of federal entrance exams and was offered
a variety of jobs including lighthouse keeper. Someone sent him over
to the National Bureau of Standards (NBS), where he met Mr. Stetson, who
did not hand him his application but instead hired him on the spot at
$2,000 a year (1938) as an inventor. Mr. Rabinow spent the remainder
of his federal career at NBS (now the National Institute of Standards
and Technology).
Mr. Rabinow's reputation grew slowly at NBS. He advised the White House
on radios. When it appeared that a war was coming, Mr. Rabinow advised
his superiors that he intended to enlist. They had other plans, and he
was placed on a project which involved rocket testing. During the war,
he had a series of professional enhancements and rose in public service
to division chief.
During this period, Mr. Rabinow realized that his personal style of
invention did not conform to the military laboratory environment, and
for a while he went out on his own. He returned to NBS in 1972, where
he remained until October
1998. He intends to continue to do consulting work for NIST, primarily
in the area of evaluating other inventors' proposals.
Mr. Rabinow commenced his discussion of the inventive process by dispelling
some widely held concepts. First, he believes that really good inventors
do not accomplish their inventions logically. The inventing process is
random. If one has a problem and an idea of a solution, carrying it out
is merely drudge work; anyone can do it.
Sometimes inventions have to wait for the proper marriage of idea and
equipment. For example, the concept of television was first discussed
as early as 1875, but the equipment available was too cumbersome to make
the invention practical, much less economical. Another problem inventors
have is getting encouragement, since one of the attributes of an invention
is to be ahead of its time.
One of Mr. Rabinow's most economically successful inventions was a machine
which can read print. At the time he proposed working on it, his superior
dismissed it because it was cheaper to hire young women to read checks,
addresses on cards, etc. His machine is now at the Smithsonian Institution.
Another invention of Mr. Rabinow's is the magnetic particle clutch.
As he mentioned, the principles were so simple any high school student
could have done it, but none did. After several variations, it is now
used in the manufacture of shock absorbers. Mr. Rabinow passed around
a sample kit for us to try. The magnetic particle clutch is an example of a public
patent. This means that because he developed it on public time, any U.S.
citizen can have access to his patent. However, he is able to hold foreign
patent rights and does so in about 22 countries.
Another dearly held notion, which Mr. Rabinow quashed, is that all patents
result in financial success for their inventors. Not so. He made approximately
$26,000 on this invention.
Some inventions come about because friends ask for a solution to a problem.
One such was his solution to telephone calls which come in when the "owner" is
gone. His friend wanted something which would light up to let him know
he had a phone call while he was out. But it had to be simple and inexpensive.
By remembering a little toy which worked by means of a loose connection
which reacted to vibrations, Mr. Rabinow was able to create a box, with
a loose connection and light. Mr. Rabinow used this example to continue
his discussion of inventors -- many inventions require refinement, sometimes
years of refinements in order to make them marketable.
Some inventors become "successful" because they take ideas others have
given up on and execute them more quickly or easily. Once an inventor proves that
something can be done and has a model, it is a truism that others can
and will invent the same thing. He used the example of the atomic bomb.
Once it was a reality, then other countries could and would develop one.
Time is not necessarily an inventor's curse. He told us of one invention
which took over 40 years to perfect and which, when successful, was rejected
by industry. He tried to invent a lock which could not be opened except
by the authorized party. After explaining how Yale locks and combination
locks can be opened, he demonstrated his lock. Mr. Rabinow perfected
the lock and showed us a model which cost approximately $1,000 to build.
Timing has, however, been ironic in Mr. Rabinow's life. He told us of spending
years trying to interest the watch industry in his solution for a self-regulating
watch. Watch manufacturers were not interested because it would have
required re-tooling their industry from the ground up and would not have
gained them much. Then one night, he agreed to sell his patent to a watch
manufacturer for two cents per copy and a down payment. The following
morning, automobile manufacturers called and wanted the same product.
Mr. Rabinow sent them to the new owner. For 20 years, every car manufactured
in the United States had his self-regulating clock in it.
Mr. Rabinow displayed inventions he has created for fun, or for the
use of his friends. One is a foot rest for short people sitting in theater
chairs; another is vertical venetian curtains. Both of these inventions
he has created in limited copies.
An inventor should sometimes consider his market. Mr. Rabinow invented
a straight-arm record player. It is still made by Sony. However, he targeted
it at the average consumer, and it did not catch on with the average
music listener.
Finally, Mr. Rabinow waxed philosophical about his industry. "Human beings," he
said, "are the low pass filter." People do not like sudden dramatic change.
This is particularly true of larger companies. The more investment they
have in machinery, a process, a market, the more conservative they are
and the less they want to change. Frequently, change comes about because
a smaller company becomes competitive by accepting the invention, and
the larger company has to adopt it to keep from losing the market.
Mr. Rabinow left us with two challenges: the first is inherent in
the inventor's industry. The costs of filing a patent, and then maintaining
it at three, seven, and ten years, are prohibitive for the single inventor.
Nor does the Federal Government have a mechanism for funding proposed
inventions. This means an inventor has no incentive to invent. Yet the
great changes in our culture came from the single inventor, with the
exception of the transistor.
The last challenge: two men are on an island. They have an irregularly
shaped container half full of water. Each wants to drink half. How can
they accomplish it? Mr. Rabinow promises to be back for the ComSci class
of 2004-2005 to tell us the answer.
Donald S. Abelson
Chief Negotiator for Communications and Information
Office of the U.S. Trade Representative (USTR)
(October 27, 1998)
Topic: How Science and Technology Issues Impact Trade
Why should we care about trade? What difference does it make? After
posing these questions, Mr. Abelson launched a fascinating discussion
tracking the evolution of trade agreements and legislation, the overlay
of political concerns, and the unique problems posed by access to the
Internet.
Before World War II, the response to economic strife was to put up tariffs
(duties) to protect United States' sovereignty. Countries subject to
United States' tariffs felt punished. Economists sought ways to improve
the economy and cut tariffs. The result was the trade bill of rights
known as the General Agreement on Tariffs and Trade (GATT). Commencing
in 1948, the GATT formalized rules on conducting trade of goods in an
open and fair way. The basic part of GATT is the concept of most favored
nation (MFN) treatment. Mr. Abelson demonstrated this concept by holding
up a bottle of Evian water. Water imported from a country with MFN status
is treated the same as domestic water.
Although United States' tariffs on imports are generally 3.5 percent,
tariffs on glass can be 38 percent. Textiles and apparel similarly have
high tariffs in order to protect United States' industry. Despite all
the goods, the United States' economy is primarily a service industry
(75 percent), and there were no protections for that in the GATT. The
General Agreement on Trade and Services (GATS) formalized a concept of
countries being open unless they say they are not. For example, in Japan,
providing port services is a closed industry to United States' companies,
but this is not a big issue for us. For the last ten years, the United
States telecommunications industry has been competitive in long-distance
service, but other countries would not permit United States competition.
In 1997, Mr. Abelson, who headed the United States delegation to the
World Trade Organization Telecom negotiations, bargained for a precedent-setting
agreement on basic telecommunications in which 70 countries made commitments
on local, long-distance and international delivered services by wire,
wireless, and satellite technologies.
Mr. Abelson presented some thought-provoking scenarios for our consideration.
Regulations and trade restrictions are often based on policy or science;
but scientists don't always agree. As a matter of policy, if the United
States wants to protect porpoises and prevent their being caught with
tuna, can we implement that policy through trade restrictions? Can the
United States say that if porpoises are caught with tuna, the United
States will block the sale of that tuna? If the United States wants to
block the import of avocados from Mexico because they may have a fruit
fly under the skin, is this a legitimate concern or is this an attempt
to block trade?
The United States is the creator of the Internet, which was started
by seed money from the Department of Defense. Although 80 percent of
Internet use occurs within the United States, Internet use has inspired
some cutting edge trade issues. Last July, the Clinton Administration
set some standards -- the private sector should lead, and government
involvement should be minimal, predictable and non-discriminatory. The
reaction from some foreign countries, where free speech is an alien concept,
is that they will ban Internet access in part or entirely, or limit it
to a select group.
Within the United States, the question of where sales tax is paid when
goods are purchased on the Internet is being discussed. Under the terms
of a new law, there is a three-year moratorium. For three years, no state
will tax in a discriminatory way or a new way, while the Internet tax
issues are being debated. To find out more about the USTR, visit their
website at www.ustr.gov.
Visit to the Federal Bureau of Investigation's Forensic DNA and Crime
Laboratories
(November 3, 1998)
We toured some of the laboratories at the Federal Bureau of Investigation
(FBI) Crime and Forensics Laboratory at the FBI headquarters, Washington,
DC. Included were:
- Computer Analysis Response Team -- provides analysis of computers
and computer files that may be used as evidence
- Two DNA labs, one for nuclear DNA, the other for mitochondrial DNA
- Chemistry and Toxicology Lab
- Trace Evidence Unit
- Latent Print Unit
It was clear that we saw only a part of the laboratory operation. We
were told the complete tour requires two full days. However, we visited
a variety of interesting labs, and we gained some appreciation for what
the Forensics Laboratory does, some of the challenges it faces, and how
it supports the agency's operation.
The Computer Analysis Response Team consists of a headquarters group
and field examiners in most of the FBI's field divisions. They deal with
computer evidence, mostly on disk, tape, or other computer storage devices.
They examine data, help in the investigation, collection, and preservation
of evidence, train agents, and serve as leaders in the international
forensic computer examination community. Most of the work currently involves
investigations of child pornography cases.
We saw the headquarters computer lab, a large room with at least one
of every kind of small computer, both current (e.g., the iMac) and obsolete
(e.g., Commodore 64). In addition, they are developing a suite of the
most common operating systems -- Mac, Windows, various Unix -- with a
single interface that examiners can access through a network from their
desks. The growth of computer storage has affected the team; more cases,
each involving more storage, have resulted in about a 300 percent annual
growth in data examined. Part of the team's research work is to find
ways to automate some of the examination.
The two DNA labs do type matching of DNA to support investigation. The
nuclear DNA lab uses established techniques to compare DNA evidence with
DNA from known sources (so-called "DNA fingerprinting"). If there is
enough material, restriction fragment length polymorphism is used to
produce gel electrophoresis patterns. With less material, polymerase
chain reaction (PCR) amplification is used to produce enough sample for
comparison. The techniques are well-established and well-accepted as
evidence. The mitochondrial DNA lab uses DNA from outside the cell nucleus.
This technique has the advantage that mitochondrial DNA can be obtained
from hair, bone and teeth, and mitochondrial DNA is more resistant to
environmental degradation than is nuclear DNA. The disadvantage is that
the mitochondrial genome is several orders of magnitude less complex
than the nuclear genome and does not identify an individual as accurately
as nuclear DNA analysis does. Furthermore,
only a limited number of DNA probes from the mitochondrial genome have
so far been characterized and used as primers for PCR amplification.
The technique is still somewhat experimental and has been used in about
a dozen FBI cases so far. The labs are well-equipped with multiple automatic
gene sequencers and other equipment.
The Chemistry-Toxicology Laboratory analyzes explosives, consumer products
for evidence of tampering, substances for general chemical identification,
and performs toxicology studies. We toured well-equipped chemistry laboratories with
multiple gas- and liquid-chromatography mass spectrometers, Fourier transform
infrared spectrometers, and other analytical equipment. The lab has an
extensive collection of paint samples, and does frequent identification
of paint, plastic, tape, and cosmetics that are part of evidence. We
saw examples of the latest technique in drug smuggling -- incorporation
of drugs into plastic that is then molded into apparently ordinary plastic
parts and shipped as such. The plastic is later dissolved and the drug
is extracted from the solution.
We had a brief tour of the Trace Evidence Unit, which identifies things
like hair, fiber, glass, building material, and soil from their appearance
and by comparison with known samples.
Finally, we saw the lab where latent fingerprints are developed, using
a variety of techniques, including various direct and fluorescent dyes,
silver nitrate, and cyanoacrylate (superglue) vapor. In addition to hard
and smooth material, fingerprints can be obtained from rough and porous
material, including paper, cloth, and wood.
It was an intensive tour through a diverse set of labs. The labs are
housed on two floors of the FBI headquarters building in space that we
were told was built for offices. Lab utilities and hoods have apparently
been installed with difficulty. The labs seem to be well-equipped, with
up-to-date equipment including automatic gene sequencers, an argon-ion
laser in the fingerprint lab, and the latest mass spectrometers in chemistry.
The building is a maze of windowless corridors with closed doors that
lead to windowless labs, all separated by numerous doors that require
a person to use a key card to move from one area into another. Visitor
entry is down a fortified parking ramp to the basement, through a guardhouse
with metal detectors, through a reception area where identification is
checked again, and through turnstiles that operate only with a badge.
Security is tight.
Some labs are expanding and others are not. The computer lab is trying
to grow rapidly as their work increases and finding it difficult to hire
enough qualified information technology professionals. The DNA labs are
growing and have plenty of qualified applicants. The latent fingerprint
unit has decreased in size and has had to provide new staff with extensive
additional training as the number with prior conventional fingerprint
experience declines. Our guide, who is from the personnel office, said
applicants may be otherwise qualified, but a significant number cannot
pass a background screening that includes lifestyle and drug history.
Susan M. Landon
Petroleum Geologist/Independent Consultant
Thomasson Partner Associates
(November 19, 1998)
Topic: Technology and Its Impact on Minerals and Energy Mineral Fuels
Ms. Susan M. Landon, an independent petroleum geologist, is also a member
of Thomasson Partner Associates of Denver, Colorado. Actively involved
in petroleum exploration and production in the United States, Ms. Landon
is the Chairperson of the Committee on Earth Resources of the National
Research Council.
Using her career-long interest in the Precambrian Mid-continent Rift
system in the north central United States as an example, Ms. Landon traced
the incremental impact of technology as it has affected petroleum geologists
over the past 20 years.
Research, which formerly required the personnel and equipment resources
of a large, well-organized petroleum company, can now be accomplished
through computer interface by a geologist working alone or in partnership
with colleagues in related sciences. Improvements in the ability to create
complex computer graphics of explored (known) geological formations
permit a better extrapolation of the probability of finding oil or natural
gas at the field site. In addition to improving the quantity and quality
of available data, technology has also reduced its cost.
German-American Academic Council (GAAC) Symposium on Intellectual
Property
(December 3, 1998)
The symposium began with a welcome and introduction of the keynote speaker
by Dr. Jack Halpern, Chairman, Kuratorium of the German-American Academic
Council Foundation; Vice President, National Academy of Sciences. He
welcomed symposium participants and emphasized that the purpose of the
symposium was to provide a forum for the exchange of information on national
and international issues, particularly between various sectors of the
German and American communities in the intellectual property arena. The
symposium would include all aspects of intellectual property including
patent, copyright and trade secrets.
Professor Kenneth W. Dam, University of Chicago Law School, presented
the keynote address. Professor Dam spoke about intellectual property
and the academic enterprise. The university environment presents unique
issues in the area of intellectual property. The academic and research
environment of the university is based on data collection and data manipulation.
Intellectual property protection is critical in this setting to provide
protection for innovations in technology and basic research.
The first session began with an opening address by the session moderator,
Dr. Karl-Heinz Hoffmann from the Technical University, Munich, who introduced
the session on information technology and database protection. Dr. Hoffmann
emphasized the importance of legal protection of data, particularly now
that many businesses, public and private research institutions, governments
and other industries rely heavily upon data collection and information
exchange to successfully conduct business. The session speakers were
Professor Jerome H. Reichman of Vanderbilt University Law School who
spoke about "Database Protection at the Crossroads: Recent Developments
and the Long-Term Perspective," and Professor Ulrich Loewenheim of Johann
Wolfgang Goethe University in Frankfurt who spoke about "Information
and Database Protection in the European Union (EU)." Professor Loewenheim
talked about how patent trends in the EU differ from those in the United
States in that the EU model emphasizes fast commercialization and less
dependence on basic or leading edge science than does the United States
model.
The topic of the afternoon session was "Genetic Information and Research
Tools in Molecular Biology/Biotechnology." Dr. Purnell W. Choppin, Howard
Hughes Medical Institute, moderated the session. The introductory speaker
was Professor Joseph Straus of the Max Planck Institute for Foreign and
International Patent, Copyright and Competition Law, who addressed the
question "Intellectual Property Rights in Human Genome Research Results
-- The United States and the European Approach, Common Problems -- Different
Solutions?" This session ended with a panel discussion. The panelists
included: Dr. Francis S. Collins, Director, National Institutes of Health,
National Human Genome Research Institute; Dr. Hans-Georg Landfermann,
Federal Ministry for Justice, Bonn; Dr. Jack L. Tribble, Merck and Company,
Incorporated; and Dr. J. Craig Venter, Celera Genomics Corporation.
Mr. Bruce Lehman, Assistant Secretary and Commissioner of Patents and
Trademarks, U.S. Department of Commerce, gave a special evening lecture.
His topic was "The Changing Character, Use and Protection of Intellectual
Property." He noted that intellectual property protection is more
necessary now than ever before, particularly in this era of globalization.
He cited the fact that the creation and application of knowledge are
now truly globalized and as such, other countries are positioning themselves
to challenge the United States and each other as technological leaders
in various sectors. In order for the United States to maintain its competitiveness
in the area of technology, it must secure intellectual property protection
for its research, and science and technology innovations. The symposium
provided an opportunity for in-depth discussion of national and international
intellectual property issues.
John M. Logsdon
Director, Space Policy Institute
George Washington University's Elliot School of International Affairs
(December 8, 1998)
Topic: The Changing Character of the United States' Space Program
Dr. Logsdon is a professor of Political Science and International Affairs
at George Washington University and is also the Director of the Center
for International Science and Technology Policy there. He holds a B.S.
in Physics and a Ph.D. in Political Science. Dr. Logsdon is author of
The Decision to Go to the Moon: Project Apollo and the National Interest,
and the general editor of the series Exploring the Unknown: Selected
Documents in the Evolution of the United States Civilian Space Program.
Dr. Logsdon opened his talk with a brief overview of the history of
the United States space program from the decision in 1958 to create a
civilian organization to manage United States space interests (rather
than make it a Department of Defense (DOD) function). The National Advisory
Committee for Aeronautics was converted into the National Aeronautics
and Space Administration (NASA) in that year, mostly in response to the
successful Russian launch of the Sputnik satellite in 1957. That is,
the entire initial purpose of a United States space program was propaganda,
not scientific research and exploration. This complexion changed in 1961,
when President Kennedy called for the United States to place a man on
the moon by the end of the decade.
The space program mobilized resources and people on the same scale as
a war, a significant commitment that resulted in Project Apollo, the
successful landing of humans on another celestial body in July, 1969.
However, since the completion of Apollo, the space program has drifted
for the last 20 years.
Dr. Logsdon next discussed the structure of space activity in the United
States. First, a national security program was established for strategic
intelligence satellites separately from the DOD. This activity was managed
similarly to the Central Intelligence Agency's U-2 reconnaissance project,
in a program called "CORONA," which later became the National Reconnaissance
Office (NRO).
Another defense-type space activity was designed around the idea of
placing humans in space for military purposes. This program was a failure,
in general, but did lead to the creation of the communication and navigation
satellite constellation to support the war-fighting effort, a capability
referred to as a "force multiplier." Dr. Logsdon stated that there had
not been any successful development or deployment of weapons to attack
space resources, and there was a tacit agreement between nations to make
space assets non-targets.
Finally, the third space activity is commercial. The government funded
corporations to compete with AT&T in the early years of satellite-relayed
communications (TELSTAR), which transitioned into a very successful commercial
venture. The use of satellite imagery has been predicted to be the next
great commercial enterprise, but it remains to be seen whether there
will be any large-scale market for this.
The overall result of the space program is that the government funds
over 70 percent of global space activity. Only the United States has
a significant national security and military space presence, and only
the United States pays for the cost of humans in space. The United States
spacecraft builders dominate the communication satellite market, but
the ability to launch equipment is still an open competition.
Dr. Logsdon holds two important assumptions:
- Space systems are essential for national security;
- The United States is in position to be the leader of space for the
21st Century, which is what we ought to be aiming for.
The end of the Cold War has seriously undercut the rationale for NASA's
existence, keeping in mind it was set up as a propaganda tool to counter
the Soviets' success in space. Also, the end of the Cold War reduces the requirements
for strategic intelligence, and highlights a mismatch between military
capabilities and military requirements.
The NRO is transitioning to smaller, distributed systems. Its budget
(some $8 to $15 billion a year) is being devoted to establishing a global
information superiority, the ability to provide field commanders with
real-time satellite imagery, anti-terrorism, and nuclear non-proliferation
monitoring.
The DOD is trying to rationalize its vision for military activities
in space to include defense of the various civil and military satellite
constellations and missile defenses using space-based weapons.
NASA has the smallest budget of the major federal agencies, claiming only
seven percent of the federal dollars since 1977. Its objectives are based
on the reduction of launch costs by orders of magnitude and smaller packages
in space. The International Space Station, however, is an old-style program:
it was not reinvented or reorganized along these organizational lines
with the rest of NASA.
Dr. Logsdon stated that the Space Station is based on some fundamental
assumptions:
- There are tangible benefits to having humans in space;
- Humans in space are a public "desire" -- therefore, the Space Station
is a "place to go."
- Space Station must be accomplished and operating smoothly before
manned missions beyond Earth orbit are politically acceptable.
The first two modules have been launched and joined. The next module,
from Russia, is a year behind schedule, and such is its criticality that
it must be launched for the other components to remain in orbit. The
delay means there is a risk that the first parts will fall out of orbit
before the next one is ready. With the Space Station, NASA is "priming
the pump" to generate a scientific utility of humans in space; that is,
changing its mission to be more in line with science and exploration
in the post-Cold War era.
Commercialization of space will soon be an international $100 billion
industry. This includes the derivative markets of launch services, GPS
nav/timing, commercial industrial production in Low Earth Orbit (LEO),
the transmission of power from space for consumers, space tourism, and
extra-terrestrial resources.
Gary Cohn
Staff Reporter
The Baltimore Sun
(December 10, 1998)
Topic: Ship Scrapping in Countries Without Technology
It was the pictures of ships being scrapped in the United States and
in India that really made the three-part Baltimore Sun series command
the reader's attention. The colorful, chaotic photos depicted the dangerous
work of ship scrapping and its consequences. The series received a lot
of attention, and Gary Cohn and his co-author, Will Englund, won the
coveted Pulitzer Prize, even though photographer Perry Thorsvik did
not, since photography is a separate category in the competition. However,
together they won other nationally prestigious awards.
Mr. Cohn's discussion with us was an eagerly awaited event. The home
agency of one of us is the Maritime Administration (MARAD), which oversees
the scrapping of obsolete MARAD vessels. Another one of us participates
in the program from the U.S. Environmental Protection Agency (EPA). Prior
to becoming a ComSci Fellow, she was working with MARAD and the U.S.
Navy to address the environmental hazards of ship scrapping and the issues
surrounding export of obsolete government vessels for scrapping abroad.
It was a chance encounter that sparked the idea for the story and sent
the reporters around the world to the Alang Coast in India. Mr. Englund
spotted pieces of the ex-USS Coral Sea, an aircraft carrier, being scrapped
in Baltimore Harbor not far from his job at the Baltimore Sun. As he
participated in a September 1995 tour of shipwrecks of the Baltimore
Harbor, the guide said, "there's an aircraft carrier over there and the
guy's having some kind of problems." Mr. Englund looked into the scrapping
of the ex-USS Coral Sea at Seawitch Salvage and discovered environmental
and safety problems, which would ultimately lead to the criminal conviction
of the salvager. From there, Mr. Englund, who had teamed up with Mr.
Cohn, secured a listing of Navy ships being scrapped in the United States.
From Wilmington, North Carolina to Brownsville, Texas, the two reporters
found environmental, health and safety problems at each ship scrapping
facility. Cognizant that the most captivating story would be a multi-faceted
one, they set out to tell the story from every perspective.
They became an unrelenting presence at the doorstep of the Hispanic
ship scrappers, following them home with beer and pizza. In Brownsville,
illegal Mexican labor seemed to be the management's key for dismantling
the old ships. Eventually, the workers grew comfortable with the reporters
and told them stories about ripping out asbestos from the vessels with
their bare hands and no protective clothing or masks.
The reporters discovered lawsuits filed across the country by workers
who were injured and families of workers who were killed. The reporters
filed Freedom of Information Act (FOIA) requests to get files, reports
and notes from the Occupational Safety and Health Administration (OSHA)
to pursue their investigation.
The businessmen who ran the ship scrapping businesses spoke openly to
Mr. Cohn and Mr. Englund. The reporters soon discovered that ship scrapping
in the United States operated on the margin. The businessmen cut corners,
especially in the case of environmental and safety regulations, because
they couldn't afford not to. In the United States, there is little profit
in ship breaking, the demolition of a vessel to recover scrap metal.
The money is made by selling the dismantled steel for smelting. Since
the industry operates on the margin, bankruptcy filings were common.
Then the story took a different turn. In what Mr. Cohn called "his big
break," he found out that the Navy, tired of its domestic ship scrapping
woes, was planning to send ships overseas for scrapping. Ship scrapping
overseas, where workers are paid pennies a day and expensive procedures
like asbestos removal and hazardous waste disposal are non-existent,
is very profitable. India is an especially large market for recycled
steel. Where the Navy may receive a high bid of $200,000 to scrap an
obsolete United States aircraft carrier in the United States, bids coming
from ship brokers planning to export vessels to India for scrapping are
closer to $7 million for the same vessel.
Without too much effort, Mr. Cohn convinced his editors to send him
first to London and then to India. In London, he spoke to the ship brokers,
those middlemen who rack up a profit by procuring vessels to be scrapped
in third world countries. Next, accompanied by an Indian reporter, he
met with many Indian ship scrapper managers for tea and soft drinks.
Careful to avoid any representatives of the Indian government, he and the
Indian reporter were free to walk around the Alang ship scrapping operation,
taking pictures as they went. More than 30,000 laborers cut ships apart with
torches and live in squalid shacks. The striking pictures of rag-headed
and barefoot workers toting tons of steel told a powerful story. Accidents
and death abound. "It is better to work and die than to starve and die" is
what the workers told Mr. Cohn. Finally, he went to Indian government
officials, who denied that what he had photographed existed. They showed him
the strict safety regulations requiring hard hats and other safety precautions.
While winning a Pulitzer Prize has its satisfaction, Mr. Cohn said the
most satisfying aspect of the experience was to write a story with impact.
Following publication of the series, MARAD and the Navy voluntarily suspended
their efforts to export vessels for scrapping. Congress held hearings
on government ship scrapping. The EPA held a public hearing. The Department
of Defense convened a high-level interagency panel to address the serious
environmental and worker safety issues. On September 23, 1998, Vice President
Gore signed a memorandum to Defense Secretary Cohen and Transportation
Secretary Slater requesting that both the Navy and MARAD observe a moratorium
on exporting vessels for scrapping through October 1, 1999 to allow for
the panel recommendations to be fully considered and implemented.
To read the series and view the pictures, visit the Baltimore Sun website
at www.sunspot.net/news/special/shipbreakers/.
Joel C. Willemssen
Director, Civil Agencies Information Systems
General Accounting Office
(January 5, 1999)
Topic: The General Accounting Office's Efforts to Monitor and Evaluate
the Year 2000 Compliance Efforts of Executive Agencies
Mr. Joel Willemssen has been tasked with ensuring that Federal Government
agencies identify and address Year 2000 issues. The Year 2000 concern
is that we have hundreds of millions of computers and devices that cannot
correctly represent the year 2000. This means that when the clock strikes
midnight on January 1, 2000, everything from air traffic control to water
systems, heart monitors to nuclear power plants could be affected.
Although federal agencies are Mr. Willemssen's primary focus, his responsibilities
extend to state governments, local governments and key economic sectors.
Water, power and telecommunications are defined as the key economic sectors
and together form the infrastructure of the Nation. Mr. Willemssen
coordinates the efforts of the General Accounting Office (GAO) with other
organizations that are responsible for the Nation's infrastructure, such
as: the local water authorities, the Federal Communications Commission,
and the North American Electric Reliability Council.
In 1997, GAO identified its concerns regarding the Year 2000 issues.
These concerns were published in a summary report and were forwarded
to the Office of Management and Budget (OMB). After reviewing the report,
OMB concurred with the urgency of the Year 2000 problem. This report
assisted Congressman Stephen Horn in identifying the need for a committee
to be formed focusing on Year 2000 issues. The first committee meeting
was held in 1996 with Congressman Horn as the Chairman. Mr. Willemssen
works in conjunction with the Committee to assist agencies in becoming
Year 2000 compliant and has had to testify before congressional committees
on numerous occasions. A quarterly meeting is held on Capitol Hill where
Mr. Willemssen identifies the compliance status of each agency and any
institutions having mission-critical systems, such as the Department
of Defense, the Federal Aviation Administration and the airline industry.
Because of the massive effect they have on the functioning of the country,
these federal and private institutions are exposed to constant scrutiny.
Federal agencies were given a Year 2000 compliance date of March 31,
1999 with the additional nine months in the year to be used to correct
any issues that may not have been identified or addressed.
Organizations must go through four phases when becoming compliant: awareness,
assessment, renovation and validation/implementation. Awareness is the
institution's ability to recognize the processes within the organization
that may have Year 2000 issues related to them. Assessment of an organization
is the inventory of all systems, including hardware and software, and
whether or not the inventory can be affected by Year 2000 issues. The
renovation phase is preparing a plan and solution for each problem area
identified. This phase also incorporates the concerns for dealing with
data exchanges between institutions and which "digit" standard should
be adopted. One standard is the eight-digit standard used by the National
Institute of Standards and Technology. The eight-digit standard uses
a two-digit month, a two-digit day and a four-digit year, but has not
been deemed mandatory for adherence by all agencies. Finally, the validation
and implementation phase deals with testing the proposed solution, implementing
it and being prepared with a contingency plan in the event that there
is some type of failure.
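The data-exchange concern behind the digit standard can be sketched in a few lines of hypothetical code (an illustration, not any agency's actual system): dates stored in the legacy six-digit YYMMDD form compare incorrectly once the century rolls over, while the eight-digit YYYYMMDD form, with its four-digit year, orders correctly.

```python
# A minimal sketch of the Year 2000 date-comparison problem.
# Legacy systems stored dates as six digits (YYMMDD); the NIST eight-digit
# standard uses a four-digit year (YYYYMMDD). Both forms compare
# lexicographically, but only the eight-digit form survives the rollover.

def later_than_2digit(a: str, b: str) -> bool:
    """Compare two YYMMDD date strings lexicographically (legacy scheme)."""
    return a > b

def later_than_8digit(a: str, b: str) -> bool:
    """Compare two YYYYMMDD date strings (four-digit year)."""
    return a > b

dec_1999 = "991215"   # December 15, 1999 in two-digit-year form
jan_2000 = "000105"   # January 5, 2000 -- the year field wraps to "00"

# The legacy comparison wrongly treats 1999 as later than 2000:
assert later_than_2digit(dec_1999, jan_2000)        # the bug

# The eight-digit form orders the same two dates correctly:
assert later_than_8digit("20000105", "19991215")
```

The same lexicographic trick is why the eight-digit standard needed no special comparison logic, only wider date fields, which is part of why testing (the validation/implementation phase) focused on data exchanges between institutions.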
Although most institutions are somewhere in the compliance process,
state and local governments are at the milestone the Federal Government
reached approximately one year ago. During the evaluation process,
the discovery was made that the more decentralized an organization, the
lower the priority for that agency to commit resources to working on
Year 2000 issues. Mr. Willemssen also emphasized the fact that many organizations
are experiencing Year 2000 failures now because their fiscal years do
not follow the calendar year, causing problems with any item requiring
a twelve-month cycle. An example is unemployment compensation. In many
state governments, it is calculated for a one-year period. If the system
is not compliant, it cannot calculate the cycle past January 1, 2000.
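The twelve-month-cycle failure can be illustrated with a small hypothetical calculation (labels and field widths are assumptions for illustration): adding a year to a two-digit year field overflows and wraps to "00", which then compares as earlier than the start of the cycle.

```python
# Hypothetical illustration of the fiscal-cycle failure described above:
# a benefit period computed twelve months ahead breaks when the stored
# year has only two digits.

def cycle_end_2digit(yy: int, mm: int) -> tuple:
    """End of a twelve-month cycle under a two-digit-year scheme."""
    end_yy = yy + 1            # 99 + 1 = 100, which no two-digit field holds
    return end_yy % 100, mm    # truncation yields year "00", which sorts
                               # *before* the start year 99

def cycle_end_4digit(yyyy: int, mm: int) -> tuple:
    """The same computation with a four-digit year; nothing truncates."""
    return yyyy + 1, mm

# A claim filed in March 1999 should run through March 2000:
assert cycle_end_4digit(1999, 3) == (2000, 3)

# The two-digit system instead produces year 00, so any
# "has the cycle ended yet?" comparison misbehaves:
end_yy, _ = cycle_end_2digit(99, 3)
assert end_yy == 0 and end_yy < 99   # the cycle "ends" before it begins
```

This is why such systems were already failing in 1999: any forward-looking twelve-month window crossed January 1, 2000 a year in advance.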
Mr. Willemssen's discussion of the GAO's efforts to monitor, evaluate
and guide federal agencies into the new millennium regarding Year 2000
technology concerns was greatly appreciated. Three GAO publications
distributed at the session emphasized the importance of the issues
addressed: Year 2000 Computing Crisis: Business Continuity and
Contingency Planning; Year 2000 Computing Crisis: An Assessment Guide;
and Year 2000 Computing Crisis: A Testing Guide. We gained an in-depth
analysis of what type of impact Year 2000 can have throughout an organization
and the rippling effect it can cause. A special thank you was also extended
to Mr. Mirko Dolak for accompanying Mr. Willemssen and participating
in the discussion.
Christopher T. Hill
Vice Provost for Research
Professor of Public Policy and Technology
Institute of Public Policy
George Mason University
(January 26, 1999)
Topic: Technology Policy -- The Role of Science and Technology in Promoting
Economic Growth
Dr. Christopher Hill provided insight into the role of science and technology
in promoting economic growth for the United States. Dr. Hill's interest
in technology paralleled the significant contributions of the late Dr.
J. Herbert Holloman, first Assistant Secretary of Commerce for Science
and Technology, who initiated the ComSci Program. Dr. Holloman promoted
technical policy as an independent field of study related to, but not
identical with, science policy, economic policy and defense-related
research and development (R&D).
Traditionally, economic growth for countries (including the United States)
derived from one of four factors: capital, labor, land, or natural resources.
If one or more of these economic generators could be controlled by a
nation, then growth resulted. Capital was more commonly known as gold,
mining, or investment monies; labor as a nation's colonization and slavery
practices; land as fortuitous geographic location -- near a river, which
would allow placement of tariffs on the movement of goods; and natural
resources, a nation's availability of minerals, water, spices or whatever
commodity was highly regarded at the time.
Science, and particularly technology, had no place in these traditional
economic generators until the latter half of the 20th Century. Technology
was viewed with suspicion. For example, shortly after the Depression,
Congress commissioned a study to determine whether or not the advance
of technology led to the Depression because it put numerous people out
of work. "Where technology is" in relation to people, their training,
and understanding of its use, was the mission of the now-defunct Office
of Technology Assessment.
Technology policy (or lack thereof) is also embroiled in the history
of states' rights versus federalization. Since technology involves industry,
and industry is always owned by someone and hence is property, the states
maintain they are the proper party to protect and foster property/technology.
The State Technology Service Program Act provided money to the states
to set up Industrial Extension Programs (similar to those developed for
agriculture). The Manufacturing Extension Partnership (MEP) program provides
funding to centers with matching funds from the states.
The goal of the states is a free and open market with wisdom vested
in each participant. Such a market is unregulated, and it is an ideal. The role
the states foresaw for the Federal Government was to develop programs
that address market failures.
The federal authorities sought a more expansive role, usually basing
their historical activities in this area on the commerce clause of the
Constitution; that is, the role of the Federal Government is to ensure
that no state erects barriers to trade. Since some states' "protectionism"
amounted to barriers, there was a role for the
Federal Government to protect, promote, and foster trade and its component
technology. The U.S. Supreme Court backed up this role in regulating
railroads and barge traffic.
Dr. Hill discussed the Charpie Report, which became a touchstone for
technology policy in the same manner that the Bush Report defined science
in the latter half of the 20th Century. It solidified the premise that
technology's impact on industry could be measured, controlled, analyzed,
and modeled, and therefore was an economic generator. Further, it endorsed
the role of the Federal Government in encouraging industries such as
textiles, glass manufacturing, and housing to use the technological advancements
utilized by the aerospace and chemical industries to increase their output,
minimize waste, and compete on a worldwide basis.
Since the Charpie Report, federal support of R&D has included:
- Legal protection without budget expenditures -- Initially, individual
companies that banded together to undertake research were subject to
individual suits for treble anti-trust damages; in response, the National
Cooperative Research Act was passed. Companies desiring to undertake
cooperative research now advise the Department of Justice and the Federal
Trade Commission of their intent, and they are then subject only to actual,
not treble, damages if their research turns out to be harmful. Hundreds
of companies have registered for this protection, and the Federal
Government did not have to expend funds.
- Support of research at universities -- Federal budget allotments
to universities through grant programs administered by the National
Science Foundation, the National Institutes of Health, etc., not only
produce research, but also foster the resource (scientists for industry)
through training/education of students.
- Creation of universal standards through the National Institute of Standards
and Technology -- In the 1950's, a Massachusetts Institute of Technology
researcher devised a model which stipulated that, after the significant
generators had been accounted for, what was left over must be the effect
of technology on economic growth. This residual accounted for between 35
and 70 percent. Since that time, economic analysts have been ascribing
other factors to economic growth and downsizing the impact of technology.
Dr. Hill indicated that when the Clinton Administration took office in
the early 1990's, technology policy initially looked like it would be
a primary area for legislation with corresponding economic development
programs. Congress voted it down. Since 1993, policy has remained relatively
quiet, while economic growth has advanced through technological advances
such as the Internet.
John T. Preston
President and CEO
Quantum Energy Technologies Corporation
(February 2, 1999)
Topic: Commercialization of Technology
Mr. John T. Preston addressed us on the commercialization of technology.
Mr. Preston is the President and CEO of Quantum Energy Technologies Corporation
and Senior Lecturer at the Massachusetts Institute of Technology (MIT).
Holder of a Master's degree in business administration and founder of over
a dozen technology-based companies, Mr. Preston has served as an expert
witness before both the U.S. Congress and state legislatures, providing
advice and comment on corporate investment.
Commerce and technology come together when an inventor/researcher has
an idea/product which he/she wants to develop/market as a commercial
product. Mr. Preston provided comments and observations on this aspect
of business administration including the following:
Radical innovations never originate with market leaders -- For example,
when Westinghouse brought their alternating current idea to Thomas Edison,
he actively tried to discourage any development of it. Similarly, the
fledgling company which developed LOTUS 1-2-3 originally offered it to
IBM, who did not take advantage of the offer and ultimately paid considerably
more for the product.
Management teams -- A first-rate management team with a second-rate
technology can make the company profitable. But, a second-rate management
team with a first-rate technology will usually have a hard time of it.
One of the advantages the United States has over Europe is that the best
and brightest of the United States want to enter into business and attend
universities like MIT to learn how to start their corporations. Four
or five entrepreneurs working together as a management team increase
the probability of success, particularly if they have complementary
skills.
Patents -- Use them. Mr. Preston has over 1,600 at MIT alone. Investors
are reluctant to invest if property rights are not protected.
Passionate believers -- The example provided was Southwest Airlines
where company employees (also the owners) take an interest in all aspects
of the airline's service. Similarly, employees of LOTUS who had stock
options found it in their own best financial interest to promote the
corporation, being both creative and consciously looking for ways to
improve the profit line and reduce waste.
Investors -- Provide leverage for the company. They often have contacts
which open doors of opportunity for the managers of the company. They
have deep pockets and provide sufficiently long-term capital to allow
for long-term growth. Several studies concerning short-term investment
versus long-term investment indicate that long-term investment eventually
pays back more. SBIR groups tend to undercapitalize. If we, as government
managers, are going to fund something, fund it to succeed, not just survive.
Quality versus speed to market -- Again, management expertise is required.
A product must have sufficient quality to be marketable, but this is
often in conflict with the desire to get to the market as fast as possible.
If a product can reach the market six months before its competitor,
its lifetime earnings will be one-third greater. The danger is in rushing
to market and then finding a flaw which causes the market to no longer
desire the product (a recall).
Flexibility -- The United States tends to take the lead in invention
type products such as movie making and software, while Japan tends to
lead in consumer-related products. This is a reflection of the emphasis
the United States places on research, while Japan improves products already
invented, making them better or more consumer-oriented.
Barriers to innovation -- Can include regulations and laws as well as
tradition. For example, in Japan until recently, one could not receive
credit for learning outside the classroom. While this protected the
university industry, it stymied the desire to advance by studying independently.
In the United States, we foster distance learning with recognized degrees
for knowledge acquired.
Closing the borders to foreign competition spoils the advantage in
the long run -- Japan would not allow foreign car competition for years,
and now its industry cannot compete with the advances of car companies
outside of Japan.
Location of the company -- By clustering complementary industries near
each other, they have a synergistic effect. For example, flower growers
in Holland, the Route 128 corridor near Boston, and Singapore's hard disk
industry all have supporting industries nearby, which complement the major industry.
Mike Causey
Federal Diary Columnist
The Washington Post
John Schwartz
Science and Technology Reporter
The Washington Post
(February 9, 1999)
Visiting the Washington Post isn't just visiting a newspaper. It is
more like visiting a historical site, since the Washington Post is the
paper whose coverage eventually contributed to the resignation of President
Richard Nixon in the wake of the Watergate scandal. The Washington Post
newsroom is memorialized in the movie "All the President's Men."
Mike Causey gave us the grand tour, showing us the glass-enclosed office
where former editor Ben Bradlee made tough decisions concerning the Nixon
stories. We saw Bob Woodward's office as well as where Carl Bernstein
sat. Mostly, we looked out at a sea of empty desks, since 9:30 a.m. is
still early in the morning for the newspaper business. By the time we
left the Washington Post at 1:00 p.m., the staff presence had grown and
the newsroom was in full swing. Like a symphony, each staff member plays
a part, and somehow it all comes together by the deadline to become
a newspaper.
Of the 2,811 full-time employees at the Washington Post, Mike Causey's
name is the most familiar to federal workers. Mr. Causey started writing
the column, "The Federal Diary," in the early 1970's. Since then, we
have learned everything from retirement plans to great opportunities
for federal workers (like the ComSci Program) from Mr. Causey. The subject
matter of his columns impacts us and so we were thrilled that he agreed
to be our host for the day.
Mr. Causey started at the Washington Post as a messenger in the advertising
section. After a break for the Army, he moved to the newsroom and eventually
took over the Federal Diary column from retiring columnist, Jerry Kluttz.
Mr. Causey has experienced the impact of Watergate from a career perspective.
Watergate changed journalism forever, making it a career instead of a
temporary type of job. Prior to 1972, the Washington Post had no retirement
system because journalism wasn't considered a career field. Mr. Causey
sees his column as bringing comfort to the distressed and distress to
the comfortable. As sort of a daily office gossip, he lets people know
what's happening in their agencies without a filter.
When asked to opine on what he sees coming for federal managers, he
quips that the Office of Personnel Management's (OPM) new report says
that only 3.4 percent of the federal population is incompetent, so OPM
wants automatic pay raises tied to performance. He is quick to alert
his devoted readership when he sees the Federal Government used as a
guinea pig for the national work force. While Mr. Causey's column is
always serious, in person we couldn't imagine the Washington Post's humor
columnists being funnier or more entertaining.
At Mr. Causey's invitation, science writer John Schwartz joined us to
share some surprising thoughts. He compared science writing to a sailboat
rather than a train. "Dare to be dull" is his motto. Since April 1993,
Mr. Schwartz has written numerous exciting stories. The key to his work
is exploring the middle ground. The slam dunk conclusions are well known,
but in investigations about linkages of power lines to cancer, and silicone
breast implants to auto-immune disease, he's developed stories along
lines most people haven't thought about. "If you can make the argument
that everyone else is missing, that's news."
"Is it possible to make a safe cigarette?" This was the query posed
by his story in the Sunday magazine a few weeks before we met him. It
was a thought-provoking piece that let the reader contemplate the premise
through an interview and then speculate on the result. If a safe cigarette
could be developed, then what would be the role of government?
Another story we were familiar with was Mr. Schwartz' story about a
recent hand transplant. First he interviewed the surgeon; then he asked, "who
hates this?" That person became the next interviewee, so that his stories
are multifaceted and balanced.
Mr. Schwartz and Mr. Causey joined us for lunch in the Washington Post
cafeteria. A lively discussion ensued. What kind of science sells? That
depends. The Washington Post was compared with a bazaar and writers compared
to merchants peddling their wares. Shop a story around to different editors,
polish it up, emphasize different aspects, and maybe one editor will buy
it and your story gets published. Mr. Schwartz' favorite story? He still
gets a kick recounting his story about "multi-user dungeon," where people
play a role on the Internet. It was a story about a young man who was
so caught up in fantasy-role-playing in one of these multi-user environments
that he dropped out of college. The man, who had moved back home with
his folks, lost track of time and lost so much sleep that he fell asleep
at the wheel and was killed in the resulting car wreck. His father went
online to try to warn his son's friends to "get a life" -- and found
that they had rich lives online that his son was an important part of.
He realized in the end that his son's online community was vibrant, and
valuable.
Founded in 1877, the Washington Post went through many changes before
it was purchased in a public auction on June 1, 1933 by California-based
financier Eugene Meyer for $825,000. Mr. Meyer turned the paper around,
tripling circulation and advertising as well as starting the paper on
the path to winning 1,200 awards for excellence, including 31 Pulitzer
Prizes. Mr. Meyer was succeeded at the Washington Post by Philip L. Graham,
his son-in-law, who had been assistant publisher. In 1963, Katharine
Graham became president of The Washington Post Company following the
death of her husband, Philip Graham. In 1979, Donald Graham became publisher
of the Washington Post, succeeding his mother. In 1991, he became the
chief executive officer and in 1993, he was named chairman of the board
of The Washington Post Company, positions formerly held by his mother.
The technology involved in producing a newspaper has completely changed
from the 1930's to the 1990's. The biggest change occurred in 1980 when
the paper's printing process was converted from the hot type method to
photo-electronic or cold-type composition. The new method used video
display terminals (VDTs) to write and edit stories. Then copy was automatically
photoset in the composing room and pasted down on layout sheets, eliminating
the use of linotype machines and metal chases from which heavy metal
plates were made. The first edition with this new improvement rolled
off the presses on October 6, 1980. On November 12, the Washington Post's
$60 million Springfield, Virginia satellite printing plant was opened,
housing four 10-unit off-set presses, a revolutionary plate-making system
and a computer controlled distribution system. These presses produced
128-page newspapers at approximately 65,000 copies per hour.
In March 1981, the Washington Post changed its platemaking process
for its eight downtown web presses from the stereotype method, which
used lead heated to approximately 600 degrees Fahrenheit to form a 50-pound
printing plate, to a NAPP direct platemaking process. This process produced
a lightweight plastic-coated steel plate ready for the presses and provided
reproduction similar to offset quality printing.
Advances in printing technology continue to be implemented at the Washington
Post. On March 23, 1997, the first of eight new presses arrived from
Mihara, Japan. Weighing 1,000 tons, a single press required approximately
85 trucks to transport it to the Springfield, Virginia plant. The new presses
can print, cut, fold and assemble 96-page papers at 65,000 newspapers
per hour and will allow the Washington Post to print up to 28 of those
pages in color. Since early 1999, four of the new presses have been at
the new Washington Post plant in College Park, Maryland, while four are
located at the older plant in Springfield as part of a $250 million upgrade of
the paper's printing operations.
Other technological aspects of the paper include the early launch
of its website. Since 1996, one could visit the Washington Post at
www.washingtonpost.com. A Delta 2000 master computer, primarily concerned
with interior environmental control, monitors 400 strategic points in
the Washington Post building and records any malfunctions of key machinery,
including the elevators.
P. Pearl O'Rourke
Deputy Director of Science Policy
Office of Science Policy
National Institutes of Health
(February 23, 1999)
Topic: The National Institutes of Health's Science Policy and Current
Science Issues
Dr. O'Rourke proved to be a very animated and effective speaker. The
details and depth of her knowledge of the modern health sciences were
apparent from the ease of her presentation. The first portion of her
discussion centered on her role as Associate Professor of Anesthesia
(Pediatrics) at the University of Washington. She touched briefly on
the discouraging development of managed care plans as a whole, their
detrimental effect on the treatment of patients, and the steady decline
of funds to continue research in emerging fields of medicine. According
to Dr. O'Rourke, her decision to accept the Robert Wood Johnson Health
Policy Fellowship and her subsequent work with Senator Kennedy on the
Labor Committee whetted her appetite for her current position as Deputy
Director of Science Policy at the National Institutes of Health.
During our remaining time with Dr. O'Rourke, she addressed some of the
national concerns, both legal and ethical, with recent developments in
stem cell research and the methods for developing and harvesting pluripotent
cells. Pluripotent cells contain the genetic programming needed to
develop into any of several different types of tissue or cells, such
as liver, heart, or lung tissue. Experimental results
indicate these cells may have marked effects in cancer treatment and
severe burn therapy.
If stem cell research might provide a cure for cancer or AIDS, why is
it such a controversial topic? According to Dr. O'Rourke, a large portion
of this problem is rooted in the methods used to harvest the cells. Currently,
two methods are used to harvest pluripotent cells for the creation of
stem cells. Both techniques involve the destruction of either an embryo
or a first trimester fetus. The largest source of embryos is fertility
clinics, where the standard practice is to fertilize tens of eggs from
clients, choose three to five of the healthiest eggs for implantation
and freeze or destroy the remaining fertilized eggs. As an alternative
to completely destroying the embryos, researchers can harvest totipotent
cells from these embryos before their destruction. The totipotent cells
are then used to create the pluripotent cells described above. A second
source of pluripotent cells is early term fetal abortion. From an early
term abortion, researchers can harvest pluripotent cells directly from
the fetus prior to its destruction. The controversy is not whether or
not stem cell research is promising or beneficial. The controversy hinges
on the ethical and moral implications of destroying an embryo or fetus
to obtain the necessary pluripotent cells. Regardless of the ethical
or moral arguments, current legislation prohibits federal funding for
any research that involves destruction of embryos or fetuses.
Susan M. Gordon
Senior Government Advisor
Enterprise for Information Technology Solutions
Central Intelligence Agency
(February 25, 1999)
Topic: The Role of Technology in the Intelligence Arena
The Central Intelligence Agency (CIA), established by the National Security
Act of 1947, is an independent agency responsible to the President through
the Director of Central Intelligence (DCI), and accountable to the American
people through the intelligence oversight committees of the U.S. Congress.
The CIA's mission is to support the President, the National Security
Council, and all officials who make and execute the United States' national
security policy by:
- providing accurate, comprehensive, and timely foreign intelligence
on national security topics; and
- conducting counterintelligence activities, special activities, and
other functions related to foreign intelligence and national security,
as directed by the President.
To accomplish this mission, the CIA works closely with the other organizations
in the intelligence community to ensure that the intelligence consumer
-- whether Washington policymaker or battlefield commander -- receives
the best intelligence possible. The CIA collects foreign intelligence
information through a variety of clandestine and overt means. The CIA
also engages in research, development, and deployment of high-leverage
technology for intelligence purposes and -- in support of the DCI's role
as the President's principal intelligence advisor -- performs and reports
all-source analysis on the full range of topics that affect national
security. The CIA is organized along functional lines to carry out these
activities and to provide the flexible, responsive support necessary
for its worldwide mission.
Throughout its history, but especially as new global realities have
reordered the national security agenda, the CIA has emphasized adaptability
to meet the needs of intelligence consumers. With the new business trends,
development cycles and rapidly changing technology, the CIA is attempting
to create a new business model to adapt to these changes and Mrs. Susan
Gordon leads this effort. She is creating a non-profit organization whose
charter will be to find, develop, and produce equipment not just for the
CIA and other federal agencies, but for private industry as well. At the
center of this organization is a core cadre of Agency-cleared staffers
who provide the necessary interfaces into the federal agencies and the
required clearances for classified discussions and designs. The key to
this program is creating a self-sufficient Enterprise with the designers
and engineers maintaining their patent and development rights. This allows
each of the engineers to develop projects that might provide profits
from outside the Federal Government and creates a self-sustaining income.
Unlike other governmentally sponsored organizations, the Enterprise can
solicit funds and tasks from outside the Federal Government. The final
choice of projects and direction is provided by a board of directors
that Mrs. Gordon assembled from leading firms in the technology sector
of private industry. One of the board's first responsibilities is to
identify an appropriate Chief Executive Officer (CEO) to lead this organization.
Once the CEO is identified, the board will become more of an advisory
committee for future development.
In conjunction with the establishment of the "Enterprise," a mirror
image of the organization will be created within the Agency to provide
classified development and guidance. This internal group will also be
responsible for monitoring and controlling the production procurements
of the Enterprise equipment. It is Mrs. Gordon's belief that the Agency
fully understands the problems it is facing and can direct such a group
to address these concerns. Furthermore, by establishing this independent
organization, Mrs. Gordon expects that private companies as well as other
government agencies will utilize this resource and reduce the amount
of support that a single entity might have to bear for this new effort.
Mrs. Gordon's presentation was a prime example of an agency's effort
to deal with the technological advances in the research and development
environment.
Randall R. Rader
Judge
U.S. Court of Appeals for the Federal Circuit
(March 16, 1999)
Topic: The Federal Court's View on Intellectual Property Protection
of Software and Databases
Judge Randall Rader began by telling us about a Supreme Court case decided
in 1972: Gottschalk v. Benson. In Gottschalk, the Supreme Court was faced
with the issue of patent eligibility of mathematical algorithms. The
Supreme Court ruled that mathematical algorithms were not subject matter
eligible for patent protection. The Supreme Court's view, however, began
to change around 1981 when it decided the case of Diamond v. Diehr.
By 1992, the Court of Appeals for the Federal Circuit was persuaded
that a computer program for analyzing electrocardiograms fell within
the scope of patent eligible subject matter when it issued its In re
Arrhythmia opinion. The Federal Circuit's view on the patent eligibility
of computer software and mathematical algorithms continued to evolve
and expand with its In re Alappat decision in 1994 and its State Street
Bank decision in 1998.
In discussing the history and evolution of the Federal Court's view
on the patentability of computer software, Judge Rader explained to us
the value of the patent system and the disadvantages of the alternatives
such as trade secrets and copyrights. He also took time to explain to
us some of the fundamental concepts in patent law including "the test
of obviousness." Moreover, he provided us with insights on the economic
aspects and relationship between research and development investment
by industries and patent policy and practice. Although the subject matter
of the seminar was highly specialized and technical, Judge Rader was
able to bring it down to a level that all of us could appreciate.
Visit to the Washington Metropolitan Area Transit Authority METRO
Operations Control Center and Brentwood Rail Maintenance Facility
(March 23, 1999)
We had the opportunity to visit the Washington Metro system and its
largest repair facility on the District of Columbia Metro railway system.
Mr. Randy Howe from the Metro Corporate Relations Office organized a
very exciting and informative day. During the morning session, we discussed
the history, background and numerous impediments that the Washington
Metropolitan Area Transit Authority (WMATA) has experienced since it
was created in 1967. Particular attention was given to the governing
board and the unique regional regulations the transportation system must
follow. Most people envision a railway system as a single entity, with
a central person or company controlling its growth, maintenance, and
development. Because it sits at the junction of Virginia, Maryland, and
the District of Columbia, however, the railway system is governed by a
unique group of individuals. In an effort to reduce the overhead
and increase efficiency, a governing board was created to plan, develop,
build, finance and oversee the regional transportation system in the
National Capital Area. This 12-member board comprises two principal
members and two alternate members from each of the two states and the
District of Columbia. As the metropolitan area expands and the transportation
system grows, the board has eased the numerous restrictions on the system.
Most recently, WMATA replaced the complicated bus fare requirements with
a single unified price. Prior to this change, each district, city, and
state levied taxes and tariffs at a consumer level on each ticket, which
created multiple pricing levels and fees depending on the specific route
a consumer traveled, even though the destination was the same in all
cases. With the new system, a single fee will allow a traveler to reach
any destination within that zone regardless of the path traveled.
We then had the opportunity to visit the control center for the Metro
Rail system. This area is set up to allow real-time monitoring of the
progress of every railway car on the entire system. All of the cars,
switches and relay systems can be monitored and operated remotely from
this central location. Since the recent failure of some strategic relay
systems, WMATA has directed that all railway cars must be operated in "manual" mode,
rather than automatic. In manual mode, each train is driven by an
engineer on board, with direction and guidance from the central
office. To increase the safety of this operation, WMATA has mandated
that each desk have at least two individuals monitoring the trains at
all times. Manual operation has dramatically slowed the schedule for
the railway, as the automatic train systems travel at higher average
speeds and require less space between trains to operate safely. Once
the repairs are completed, the Metro operations will return to the automatic
mode.
Later, we were able to visit the WMATA's large repair location on the
Red Line, which is used to inspect and conduct major repairs to the railway
cars. Mr. Lem Proctor, the General Superintendent of the Office of Car
Maintenance, oversees all of the repair facilities and is based at this
site. He provided
us with a personally guided tour of the facilities and an overview of
the repair and preparation cycles. There are currently 764 cars in service,
all of which were manufactured by Rohr Corporation and Breda
Costruzioni Ferroviarie. An additional 110 cars are on order from a joint
venture between AAI Corporation of Baltimore County, Maryland and CAF
of Madrid, Spain. The repair and replacement procedures are convoluted
and difficult, since the only manufacturers of industrial grade rail
cars are located in Europe and the cars cost hundreds of thousands of
dollars to replace. The Federal Government places restrictions on expensive
foreign procurements. With the addition of an American partner, Breda's quality
cars are an excellent choice for replacing the older units. Each car
weighs approximately 60 tons, is 75 feet long and 10 feet wide, and can
travel at a top speed of 59 miles per hour.
Finally, we were briefed by the Director, Office of Customer Service,
Mrs. Karen Lamb. She provided a detailed briefing relating advances in
technology to consumer products WMATA has been developing. WMATA has
established an interactive website which allows travelers to purchase
SMARTLINK cards online. These permanent fare cards allow travelers to establish
an account to debit tickets, parking fees and bus fares. The balances
can be increased either online or at any ticket stand. WMATA is expecting
to enhance this system to provide custom travel information and customer
preference information to its users. Mrs. Lamb described a situation "in
the not so distant future," in which a traveler will use these types
of fare cards to get morning commuting information and travel updates
each morning. The hope is to provide the consumer with all the available
information for them to minimize their travel time and maximize their
enjoyment of the metro system.
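The stored-value account behind these fare cards can be sketched as follows. The class name, amounts, and methods are purely illustrative; they are not WMATA's actual system, only a minimal model of a balance that is debited for rail, bus, and parking charges and topped up online or at a ticket stand.

```python
# Illustrative sketch of a stored-value fare card account like the
# SMARTLINK cards described above: one balance is debited for rail
# fares, parking fees, and bus fares, and can be increased either
# online or at any ticket stand. All names and amounts are hypothetical.

class FareCard:
    def __init__(self, balance: float = 0.0):
        self.balance = round(balance, 2)

    def add_value(self, amount: float) -> None:
        """Top up the card (online or at a ticket stand)."""
        if amount <= 0:
            raise ValueError("top-up must be positive")
        self.balance = round(self.balance + amount, 2)

    def debit(self, amount: float) -> float:
        """Deduct a rail, bus, or parking charge; return the new balance."""
        if amount > self.balance:
            raise ValueError("insufficient balance")
        self.balance = round(self.balance - amount, 2)
        return self.balance


card = FareCard(20.00)
card.debit(2.25)     # hypothetical rail fare
card.debit(1.00)     # hypothetical bus fare
print(card.balance)  # 16.75
```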
Although the planning and budget stage of the Washington Railway system
was started in 1952, the actual construction of WMATA began in December
1969. The original design included 83 miles of track and 65 stations,
all of which were to be completed by 1980 at a cost of $500 million.
During the intervening years, the miles of track, number of stations
and, of course, the total cost of the rapid rail system increased. To
date, $30 billion has been expended to construct 89 miles of
track and 74 stations. The ridership is on the rise and average daily
trips taken on buses and rail cars are about 750,000. Upon the railway's
final completion, WMATA anticipates 103 miles of metro railways, 83 railway
stations, 12,000 bus stops, 1,000 bus shelters and 322 bus routes. WMATA
has certainly come a long way since the railway's opening day ceremony
on March 27, 1976 and its initial 4.2 miles of Metro's Phase I route
from Rhode Island Avenue to Farragut North.
Maxine F. Singer
President
Carnegie Institution of Washington
(March 30, 1999)
Topic: Current Science Policy and Ethical Aspects of Genetic Manipulation
Dr. Singer began the seminar by giving us a brief history of the efforts
of early researchers in genetics and genetic manipulation. Dr. Singer
remarked that there are currently two big public policy issues: stem
cell research and human genetics, and genetically modified (GM) food
products. However, due to time constraints, she would only be able
to discuss in-depth one of the two issues. We all voted to hear more
about agricultural biotechnology and GM crops. According to Dr. Singer,
GM plants not only provide pest or herbicide resistant crops, but also
contribute to phytoremediation. Some plants have also been engineered
to produce drugs, vaccines, and plastics. Dr. Singer remarked that, as
the world's largest producer of GM crops (50 million acres in the United
States were planted with GM crops last summer), the United States believes
its approach to GM crops to be both scientifically and economically sensible.
The Europeans, on the other hand, are strongly opposed to the planting
and/or importing of GM crops. The controversial issues of food safety
and biodiversity will undoubtedly have significant impact on the global
trade of GM crops.
Visit to Starpower
(April 6, 1999)
Starpower is the first provider in the Washington Metropolitan area
to offer Internet access, local and long-distance phone, and cable television
as a bundle package. It is a joint venture between RCN Corporation, an
existing full-service telecommunications company, and Pepco Communications,
L.L.C., a separate affiliate of Potomac Electric Power Company (PEPCO).
Starpower was born in August 1997 with an initial start-up cost of $25
million and an agreement for both PEPCO and RCN to invest $150 million
each into this venture over the following three years to lay thousands
of miles of fiber-optic cable throughout the Washington Metropolitan
area. One fiber-optic strand has the capacity to carry up to three million
simultaneous telephone calls along with a million faxes, computer or
Internet connections. Prior to this venture, PEPCO was in a tight spot,
with its 100-year-old monopoly over the provision of electricity and
gas to 680,000 households in the Nation's capital about to end. According
to PEPCO, they had invested $7 million in a joint venture with Metricom
to build a wireless modem network in the District of Columbia. After
three years, the network had a business user base only in the thousands.
By creating Starpower, the two companies expect to develop a 6,000 mile
fiber-optic network offering phone, Internet and cable TV services throughout
the Washington/Baltimore Metropolitan area. Since opening in late March
1998, Starpower has signed up 25,000 customers for Internet, local or
long-distance phone service, or all three. The purchase of the Internet
service provider (ISP) Erols by RCN earlier this year provided 180,000
Internet customers to Starpower's base.
Although the Telecommunications Act of 1996 opened the floodgates to
the utilities market, to date the unique political and social factors
in the Washington Metropolitan area have been quite difficult to overcome.
Each county, city and district must provide an agreement to allow Starpower
to enter their region. In 1998, Montgomery County, Maryland officials
signed an agreement to allow Starpower to compete for subscribers in
the southern and eastern parts of the county. This was the first time
since cable was introduced to the county in 1983 that Cable TV Montgomery,
the local provider, had to go head-to-head with a competitor. Choice
in cable television service could be available for 200,000 Montgomery
County, Maryland homes by year's end. On the other hand, stringent requirements
of Fairfax County, Virginia have impeded development in this area for
over a year and will continue to hold development off for at least another.
The key to Starpower's mission is to successfully deliver an all-inclusive
communication package to the consumer. By combining multiple services
into a single contract, Starpower reduces the cost per product accordingly.
Any two or more of these Starpower products can be combined to reach
higher discount levels. Customers can choose up to four services and
enjoy incremental savings with each service added to their Starpower
account. Selecting both long-distance and local phone service from Starpower
provides customers with additional usage discounts and reduced monthly
expenses. For example, customers who choose Starpower's premier cable
television service along with Starpower's local phone service and long-distance
phone service will receive up to $4.00 off their monthly cable television
subscription rate. Starpower strives to bring the highest quality Internet,
telephone and cable services to its customers. In the District of Columbia,
Starpower is putting its 96-channel basic cable plus high-speed Internet
access and local and long-distance phone services up against District
Cablevision's 55-channel expanded basic cable. Its hope is to unseat
the incumbent cable company with better products, service and support.
On top of that, Starpower's basic cable TV prices are lower and its help
lines are staffed by "real" people, not machines.
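The incremental-discount model described above can be sketched in a few lines of code. This is only an illustration: apart from the $4.00 cable credit quoted in the presentation, all prices, service names, and discount rules below are hypothetical.

```python
# Illustrative sketch of Starpower-style bundle pricing: each additional
# service on an account can earn a discount. All monthly list prices are
# hypothetical; the $4.00 cable credit for customers taking cable plus
# local and long-distance phone service is the figure quoted to us.

MONTHLY_PRICE = {            # hypothetical list prices, in dollars
    "cable": 30.00,
    "local_phone": 20.00,
    "long_distance": 15.00,
    "internet": 20.00,
}

CABLE_BUNDLE_CREDIT = 4.00   # off cable with both phone services


def monthly_bill(services: set) -> float:
    """Total monthly charge for a set of services, applying the cable
    credit when cable, local, and long-distance are all selected."""
    total = sum(MONTHLY_PRICE[s] for s in services)
    if {"cable", "local_phone", "long_distance"} <= services:
        total -= CABLE_BUNDLE_CREDIT
    return round(total, 2)


print(monthly_bill({"cable", "local_phone", "long_distance"}))  # 61.0
```

With these assumed prices, the three-service bundle costs $61.00 instead of $65.00, and adding a fourth service keeps the credit while adding that service's price.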
Our time spent with Mr. Anthony Peduto, General Manager, and his excellent
staff at Starpower was very informative. The key issues of the day were
the political and social impediments of utility monopolies that have
been established over the past years. It is certain that the Telecommunications
Act of 1996 will enable the consumer to realize the greatest savings
and benefits possible. Companies like Starpower are the future for our
Telecommunications Age and their continued success is important to our
economy.
Cable choice will be available in parts of Maryland including Bethesda,
Silver Spring, Aspen Hill, Derwood and Colesville, and the following
municipalities: Chevy Chase Village, Town of Chevy Chase, Village of
Chevy Chase Section III and V, Martin's Addition, Chevy Chase View, Garrett
Park, Glen Echo, Kensington, Rockville, Somerset, Takoma Park, Village
of North Chevy Chase and Washington Grove.
John M. Starrels
Senior Public Affairs Officer
International Monetary Fund
(April 13, 1999)
Topic: The International Monetary Fund's Role in Today's Globalized
World
Beginning his career at the International Monetary Fund (IMF) as a consultant,
Mr. John Starrels currently holds the position of Senior Public Affairs
Officer for the organization. In addition, Mr. Starrels maintains responsibility
for congressional liaison duties, a position which he has previously
held. His emphasis during our discussion was the composition of the IMF
and the role it plays in the international community.
The IMF is often believed to be a bank. It is also mistakenly believed
to be associated with the World Bank. The IMF operates independently
of other entities and is a financial lending institution exclusively
for its members. The funds used for lending come from dues which are
paid by members. The current membership consists of 182 countries. Each
country selects one governor to serve on the Board of Governors, which
conducts the business for the IMF. An alternate governor for each country
is also selected and serves on the Board of Governors. Members of the
Board of Governors are normally finance ministers or central bank presidents.
They are required to address issues which ensure that currency flow between
nations remains constant. The Board of Governors also reviews policies
affecting the world economy related to monetary issues. The Board of
Governors meets annually. If a member has a concern or issue which needs
to be addressed before the next meeting, they can express their views
through their representative on the Executive Board.
The Executive Board has 24 executive directors and represents all members
of the Board of Governors. Five members -- France, Germany, Japan, the
United Kingdom and the United States -- hold permanent seats on the Fund's
Executive Board. The remaining 19 chairs are shared by the other member
countries. Each country's contribution determines its status on the
Executive Board: the largest contributors hold permanent chairs, and the
size of a country's contribution determines how much influence and voting
power it has. These Executive Directors vote
on monetary issues, primarily loan approvals, and normally there is a
consensus vote. When consensus cannot be reached, the weight of each
member's vote is determined by the percentage that the member contributes
to the fund. As the largest contributor and strongest member
of the IMF, the United States provides more than 18 percent of the monies
provided to the IMF.
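The quota-weighted voting mechanism Mr. Starrels described can be illustrated with a short sketch. The quota figures below are hypothetical, chosen only so that the United States' share comes out above 18 percent as stated; they are not actual IMF quotas.

```python
# Illustrative sketch of quota-weighted voting: when consensus fails,
# each member's vote is weighted by its share of total contributions.
# All quota figures below are hypothetical.

quotas = {
    "United States": 37_000,   # largest contributor (over 18% of total)
    "Japan": 13_300,
    "Germany": 13_000,
    "France": 10_700,
    "United Kingdom": 10_700,
    "All others": 115_300,
}

TOTAL = sum(quotas.values())


def vote_weight(member: str) -> float:
    """A member's voting weight as a fraction of total quotas."""
    return quotas[member] / TOTAL


def weighted_tally(votes: dict) -> float:
    """Fraction of total voting weight cast in favor of a proposal."""
    return sum(vote_weight(m) for m, in_favor in votes.items() if in_favor)


print(f"U.S. voting weight: {vote_weight('United States'):.1%}")  # 18.5%
```

Under this scheme a member's influence scales directly with its quota, which is why the five largest contributors also hold the permanent Executive Board chairs.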
The IMF is headquartered in Washington, DC, and approximately three out
of every four of its employees come from outside the United States. Not surprisingly,
the office environment reflects a rich cultural diversity. The languages
spoken by the staff members are primarily English, Spanish, and French.
There are three primary functions performed by the IMF: technical assistance,
policy advice and lending. Through technical assistance, the IMF assists
countries in managing their economies more efficiently. Developing countries
which identify the need for free trade are the primary focus. Once the
request from the country has been approved, an IMF team goes to the country
to help set up the process requested. Technical assistance is provided
in areas such as initiating social security systems, controlling inflation,
preparing budgets and offering social benefits. Another functional area
of the IMF is policy advice. Advisors also provide assistance in reviewing
the impact of the decisions made and the strategies formed as they relate
to neighboring countries. This guidance is normally used in two instances:
when a small country with an open economy chooses to use the IMF as a
reference point, and when the terms of the
policy advice are incorporated into the loan agreement. Approximately
one-third of the members borrow money at any given time. The third
functional area is lending. When a member country requests
funding for a program, it recognizes that lending is conditional based
on whether or not the program can be supported by the IMF. The country
must also support the program and be willing to adhere to the terms
established. Funding used to support the programs
is normally short term. The IMF only puts conditions on the money lent;
it does not set policies for the borrowing country. Included in the discussion
was a brief history of the IMF, why it was established and its interaction
with organizations which have influence over international policy.
Mortimer L. Downey
Deputy Secretary
U.S. Department of Transportation (DOT)
(April 20, 1999)
Topic: Science and Technology in the Department of Transportation
Mr. Downey is the Deputy Secretary of the Department of Transportation,
the "umbrella" organization responsible for the Federal Highway
Administration, the Federal Aviation
Administration (FAA), the Maritime Administration, the Federal Railroad
Administration, the Coast Guard and others. Prior to accepting his appointment
to this position, Mr. Downey was responsible for the major transportation
rebuilding project in New York. He also served in DOT during the Carter
presidency.
Mr. Downey opened his seminar with a brief history of DOT. The Department
is 32 years old, formed by combining several formerly independent agencies.
While there are still centrifugal forces acting within the agencies,
one of the ways DOT keeps things together is through cross-cutting, particularly
in terms of budget, policy, and legislation.
Another way of keeping DOT together is through its science and technology
assets. Research labs, such as the FAA's Technical Center and the Volpe
National Transportation Systems Center, are world-recognized facilities
working on multi-modal transportation research problems.
In the early 1970's, transportation research was undertaken in various
areas, for example, the supersonic transport (SST), tracked air-cushion
vehicles (300 mph trains), and the Morgantown PeopleMover (an automated
transit system in Morgantown, West Virginia).
For many years following, the only significant research done in the
transportation field was in support of the Department of Defense. Recent
actions have reinvigorated transportation research. The formation of
an interagency research and technology council and the creation of the
National Science and Technology Council's transportation committees have
developed a framework for links to the National Aeronautics and Space
Administration (NASA), the Environmental Protection Agency, the Department
of Defense (DOD), the Department of Energy, and so on to concentrate
upon inter-modal transportation issues.
Mr. Downey detailed some specific research plans that were driving investments
in transportation: the joint FAA-NASA-DOD Aviation Safety Program aimed
at reducing the accident rate by 80 percent by 2007; the development
of intelligent vehicles and intelligent transport infrastructures ("smart
highways"); and next generation global air transport technologies.
Mr. Downey discussed aircraft noise as an international policy issue.
Research money is being devoted to making air transport not only safer
and cheaper, but also quieter. The European Union (EU) has strong views
on aircraft noise mitigation strategies, and they have just adopted new
regulations that go far beyond the phased approach laid out by the International
Civil Aviation Organization (ICAO). These new restrictions would seriously
hamper aircraft built in the United States, since the ICAO process governs
United States' manufacturers. However, the EU's procedures do not restrict
Airbus Industries, the EU aircraft manufacturer. There are obviously
political entanglements to this issue.
Another topic discussed was the modernization of the National Airspace
System (NAS). The objective is to allow capacity to keep up with demand
and to integrate multiple systems into a more reliable whole. An ambitious
automation project undertaken in the mid-1980's, which was terminated
after major contract difficulties, was reorganized into its component
functions. Some of those functions are moving ahead nicely; others are
stalled indefinitely.
Mr. Downey discussed new initiatives for automotive safety, such as
collision alert warnings for cars. He also handled some tough questions
from us on the automobile airbag issue. Mr. Downey said that science and
technology must be reconciled with policy, and that a manager must often
balance the use of technology with the politics surrounding the issue.
Visit to INTELSAT
(April 27, 1999)
INTELSAT, the International Telecommunications Satellite Organization,
is an international treaty organization that acts as a commercial cooperative
to provide wholesale satellite telecommunications for use around the
world. Founded in 1964, it pioneered commercial satellite communication
starting with the Early Bird satellite in 1965, and it currently operates
19 modern geostationary satellites, with five more on order.
There are 143 countries whose governments are parties to the treaty,
one signatory in each of these countries (traditionally the Post, Telephone,
and Telegraph organization, but in the United States, COMSAT), and a
28-member representative Board of Governors that handles operational
matters.
Our hosts for the morning were Ms. Allison Barr and Mr. Eric Lamm who
are from the Corporate Communications Department of INTELSAT. They showed
us a brief video and slide presentation, and then we saw some of the
building's operating areas and had a chance to discuss the mission and
operation of INTELSAT at considerable length.
We saw one-half size models of the various series of satellites used
by INTELSAT and discussed launch and operations. Satellite launch is
done by a variety of companies and countries, including Boeing and Lockheed-Martin
(United States), Ariane (France), China (an unsuccessful launch), and
newcomers such as Ukraine. Preflight design and testing are done by INTELSAT,
and launch control of the satellite is done from a control room in the
headquarters building. We looked from a balcony over the room, which
was not in use since a launch was not in progress. There are one to five
launches a year.
Technically the satellite transponders use a variety of microwave bands
for uplink and downlink of signals. Some transponders are leased to dedicated
services, and some services are provided as needed. Satellites currently
have a design life of ten years, limited by the supply of positioning
thruster fuel, but with conservative operation they may last for 15 years.
All are solar powered. We were told the spacecraft service reliability
is better than 99.99 percent. Ground stations have antennas from 1 to
30 meters in diameter, depending on use. Most ground stations draw on
a variety of signals from the local area and link them to and from a
satellite, but we were told that in some remote areas of Africa, there
are telephone booths with a dedicated one-meter satellite antenna for
direct uplink.
The majority of the services are provided to corporate and government
customers through the signatory organizations such as COMSAT (United
States) or British Telecom. In some countries, the signatory organizations
also permit others to use the services directly. In addition to telephone
and fax services, the satellites carry data from banking, airline, news,
and government organizations. Video signals come from news and broadcast
services and regional networks. Two of the smaller but clearly important
functions of INTELSAT are to provide backup services for international
cable communications in case of cable failure and to provide lifeline
communication for a surprisingly large number of countries (60) for which
INTELSAT is the only means of operating an international telecommunications
service.
INTELSAT operates financially as a cooperative, in which signatory countries
provide capital, the company sells wholesale satellite capacity for voice/data
or video, and profits are returned to the user countries in proportion
to their use. Operating revenues were over one billion U.S. dollars
in 1998. In addition to satellite capacity, INTELSAT provides technical
assistance, training, and a small amount of financing to member countries
for the ground facilities that the countries themselves must provide
to use the system.
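The cooperative arithmetic described above amounts to a pro-rata split of profits by usage. A minimal sketch follows; the signatory names and usage figures are hypothetical illustrations, not actual INTELSAT data.

```python
# Sketch of the cooperative model described above: profits are returned
# to signatory countries in proportion to their use of the system.
# All names and figures here are hypothetical illustrations.

def distribute_profit(profit, usage_by_signatory):
    """Split profit pro rata by each signatory's share of total usage."""
    total_use = sum(usage_by_signatory.values())
    return {name: profit * use / total_use
            for name, use in usage_by_signatory.items()}

# Hypothetical usage shares in arbitrary traffic units
usage = {"COMSAT (US)": 400, "British Telecom": 250, "Other signatories": 350}
returns = distribute_profit(100_000_000, usage)  # $100M profit to distribute
print(returns["COMSAT (US)"])  # 40000000.0 -- 40% of use, 40% of the return
```

In practice the capital contributions and distributions followed investment shares negotiated under the treaty, but the proportional principle is the same.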
Both the current structure and future plans for governance are interesting.
Currently, the organization is fundamentally governed by an international
treaty to which all parties subscribe. In addition, each country is represented
by an organization, the signatory, that is designated as the main user
of the satellite system in that country. The smaller Board of Governors
is carefully weighted among large users (e.g., the United States) and
regions of the world. In addition to these bodies, there is a staff of
about 900 responsible for managing the construction and operation of
the satellites and for marketing the services.
Originally, INTELSAT had a monopoly on communications satellites, and
such satellites revolutionized international telecommunication. However,
there is now considerable competition, both from other satellite systems and
from systems based on high-capacity optical fiber cable. In addition, the
telecommunications industry has grown from a regulated utility to a fast-moving
high-technology industry with many competing companies. Thus, INTELSAT
is in the process of converting itself into a private corporation, but
it faces problems inherent in its structure, from conditions that may
be placed on it by the U.S. Congress (or potentially by other governments),
to the difficulty of planning with such a diverse set of organizations
involved in its current structure, to the need to maintain its public
service role, to the question of how it can function in a telecommunications
market that is changing so rapidly. We were told that conversion plans
will be presented to the Assembly of Parties (the treaty countries) in
the fall of 1999, a final plan will be developed by spring 2000, and
then the conversion will be implemented by the end of 2000.
During our visit, we discussed both the technical operations of a major
organization in global telecommunications and the management and governance
issues of a large multinational treaty organization that is in the
process of converting itself into a large multinational private corporation.
Visit to WJLA-TV Channel 7
(April 27, 1999)
The host for our visit to the studio and administrative offices of WJLA-TV
Channel 7 (an affiliate of ABC) was Ms. Angela Mathis, the Senior Producer
and Community Relations Coordinator of the station.
We toured the station control rooms and newsroom, with its weather studio
and chroma-key screen. Mr. Alan Siskin, the Station Engineering Supervisor,
accompanied us and answered questions about technical and operational
details. Local news and weather broadcasts originate here. Reports from
the field come in on microwave links, and some common places such as
the MCI Center (sports events) and the Washington National Cathedral
(Christmas programs and other events) have installed optical fiber links
directly to the studio. The broadcast signal travels by fiber to the
transmitter and antenna on a tower shared with Channel 9. The control
room is in the middle of a conversion from analog equipment (tapes, slider
controls) to digital controls (computer-controlled devices, digital memory,
knobs that are actually inputs to the control computer). Digital controls
allow more flexibility and faster access to program material but require
some learning to use.
We also met with Mr. Mark Olingy, the Director of Engineering and Operations,
who talked about the technical operation of the station and its conversion
to digital broadcasting. In November 1998, WJLA began broadcasting a
digital television signal on Channel 39, simulcasting the Channel 7 program
24 hours a day. Digital broadcasting is required in major markets now,
with the rest of the country to follow in 2003, and the analog signal
is to be turned off by 2006 if 85 percent of users have digital receivers
by then. The advantages of digital television are better picture resolution,
sound as high quality as from a CD, and much flexibility in the transmitted
signal. There is the promise of integrating digital television with computers.
For example, it is possible to transmit as many as 18 different display
formats, and data may be broadcast along with the television signal.
Disadvantages include expensive receivers (currently $5,000-$10,000,
but expected to cost considerably less as the market grows) and a perceived
lack of demand in the viewing audience. Mr. Olingy told us that content
drives the television industry, and the advantages of digital television
may not be worth the cost for most content. This is an area of rapid technical
development and considerable controversy.
Finally, we toured the editing, art, and marketing departments, and
we saw a demonstration of video editing that is now being done in the
Marketing and Promotions Department. With a fast general-purpose computer,
a large disk array, and a specialized video card and monitor, Mr. Bill
Dion showed us how a person in marketing can create and edit video images
for use on the air. Previously, a marketing person developed the concepts
and handed the construction and editing work to a dedicated
video editor. This shift has changed the nature of the jobs in marketing
and has led to reductions in the station's work force.
During this visit, we saw the inside of an operation that is publicly
prominent, depends heavily on technology, and is undergoing rapid technical
and organizational changes.
Visit to the National Weather Service Forecast Office
(May 4, 1999)
We visited the National Weather Service (NWS) Forecast Office in Sterling,
Virginia, near Dulles International Airport. Our host was Mr. Steven
Zubrick, the Science and Operations Officer for the office. He talked
with us about the organization and operation of the office, and we went
on a tour of the laboratory where the forecasters work.
The NWS mission is to provide weather information and forecasts to protect
life and property. In doing so, the NWS relies on a group of meteorologists
and an impressive array of technology. The Sterling office is similar
to many others around the country. Each office has a Meteorologist-in-Charge,
a Science and Operations Officer, and a Warning and Coordination Meteorologist.
A group of 8 to 12 meteorologists staffs the office around the clock,
7 days a week. Each works five shifts a week, with two forecasters present
at any time. During the day there is a hydrologist as well, and in addition
there are technical and administrative support staff. Data and forecasts
are provided to the media (radio, television, and newspapers); to local,
state, and Federal Government agencies; and to the public.
The Sterling office has responsibility for forecasting within a 500-km
radius of the office; the size is set by the range of the
relatively new Doppler weather radar. This area includes northern and
central Virginia, Maryland except for the far western part, the District
of Columbia, and eastern West Virginia. The radar uses a 27-foot dish
antenna mounted in a round cover on top of a tower near the office, and
this feature visually dominates the site. The radar detects both the
location and type of precipitation as well as the wind velocity, and
it has enabled the NWS to improve both forecasts and warnings of severe
weather. In addition, field offices, including Sterling, have just received
new advanced computer workstations, although much work is still done on
older machines, including personal computers and Unix workstations.
Mr. Zubrick talked with us about severe weather, including the severe
tornadoes that had hit parts of Oklahoma and Kansas the previous day,
the January 14-15, 1999 ice storm in the Washington, DC area, and the
March 9, 1999 snowstorm that had dumped an unexpectedly large amount
of snow on the region. (We were in the U.S. Capitol that day and remember
well the snow and the long trip home.) We also talked about progress
and what would be needed for further improvements in forecast accuracy.
Forecast accuracy has clearly improved over all forecast time periods
due to more data, more accurate data, and better computer models of the
atmosphere. The major limiting factor in the future appears to be data
for input to atmospheric models that are used as the basis for forecasts.
In addition to data from the NWS, Mr. Zubrick told us about using the
Global Positioning System to obtain column moisture
density in the atmosphere, and about the use of reports from commercial
airline flights, which record temperature and winds along the flight
path.
After the discussion, we went on a tour of the computer laboratory where
data are analyzed and forecasts are made. There is a great deal of computer
equipment of several vintages, and Mr. Zubrick showed us data from both
the Oklahoma tornadoes and also from a small tornado that had appeared
briefly in Spotsylvania County, Virginia on April 9, 1999. He used the
latter example to show us the use of Doppler wind velocity data to look
at the shear and rotation in the wind field that are the signature of a tornado.
Finally, he showed us the use of Convective Available Potential Energy
(CAPE) from model atmospheric soundings to predict the likelihood of
severe thunderstorms and possible tornadoes.
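As a rough illustration of how CAPE summarizes a sounding, the following sketch integrates parcel buoyancy over height. The sounding values are invented for illustration and are not taken from the cases discussed above.

```python
# Illustrative CAPE (Convective Available Potential Energy) calculation:
# integrate parcel buoyancy g*(Tparcel - Tenv)/Tenv over the column,
# counting only layers where the parcel is warmer than its environment.
# The sounding below is hypothetical, not from an actual case.
G = 9.81  # gravitational acceleration, m/s^2

def cape(heights_m, t_parcel_k, t_env_k):
    """Trapezoid-rule integral of positive buoyancy; returns J/kg."""
    total = 0.0
    for i in range(len(heights_m) - 1):
        b0 = max(G * (t_parcel_k[i] - t_env_k[i]) / t_env_k[i], 0.0)
        b1 = max(G * (t_parcel_k[i + 1] - t_env_k[i + 1]) / t_env_k[i + 1], 0.0)
        total += 0.5 * (b0 + b1) * (heights_m[i + 1] - heights_m[i])
    return total

# Hypothetical sounding: parcel 2 K warmer than its environment up to 10 km
z = [0, 2000, 4000, 6000, 8000, 10000]          # height, m
t_env = [288, 275, 262, 249, 236, 223]          # environment temperature, K
t_parcel = [290, 277, 264, 251, 238, 225]       # lifted-parcel temperature, K
print(round(cape(z, t_parcel, t_env)))  # 772 J/kg for this profile
```

Forecasters compare values like this against empirical thresholds: a few hundred J/kg suggests modest instability, while several thousand can support severe thunderstorms.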
The Sterling Forecast Office has a website, www.nws.noaa.gov/er/lwx/,
through which a large amount of weather data, forecasts, and model calculations
are available. Providing weather data, which changes constantly, is an
excellent use of the World Wide Web, and the NWS website provides timely
and accurate data and information.
Visit to the Federal Aviation Administration Washington Air Route
Traffic Control Center
(May 4, 1999)
The Federal Aviation Administration's Washington Air Route Traffic Control
Center (ARTCC) is responsible for providing air traffic control services
for the mid-Atlantic states. Its area of geographic responsibility extends
from central New Jersey to the Georgia-South Carolina border and from
Indiana to 200 miles offshore. Its airspace contains several major airport
terminals: Philadelphia, Ronald Reagan Washington National, Dulles International,
Baltimore-Washington International, Raleigh-Durham, and Charlotte, North
Carolina. The facility is divided into areas of specialization, which
are further divided into sectors. Controllers generally train and qualify
for a single area and are teamed by area.
Mr. Tom McCarthy, an air traffic control specialist, led us through
the major functional divisions of the facility. We started at the Traffic
Management/Central Weather Service Unit sections and received an orientation
on flow metering. One of the displays was set up to show the stream of
arrivals into the Newark, New Jersey airport, so we could visualize the
requirement for metering into this busy hub. While we were there, Traffic
Management and specialists from the Air Traffic Control Systems Command
Center in Herndon, Virginia were setting up delay programs and new flow
restrictions for Newark in response to significant weather
activity on the route.
From here, we went to the new control area under construction. New displays
and communications equipment are being installed. The controllers should
move over sometime in mid-2000. Our next stop was the training room,
where Mr. McCarthy showed us the computer-based instruction workstations
and explained that a center controller spends about two years in
this phase before being sent to the control room floor. Recurrent
and refresher training is also done here.
While the old control area is in the process of asbestos removal, one
of the teams has been relocated to the emergency control room in the
basement. During midnight operations, when traffic levels slow considerably,
the entire facility is run from here. Our stop here included a briefing
on the controller display, functional input devices, and some of the
display symbology used. Each of us was given a handset and permitted
to plug in with a controller to listen to the radio and interphone
exchanges with the pilots and other controllers.
Visit to the National Cancer Institute, National Institutes of Health
(May 11, 1999)
We met with several researchers at the National Cancer Institute (NCI)
to learn about the latest developments in cancer research. Dr. Richard
D. Klausner, Director, NCI, gave us an overview of the ongoing programs
and recent developments at NCI. Of the $3 billion annual NCI budget,
approximately 85 percent of the funds are distributed for extramural
research. Ongoing major projects at NCI include the Surveillance, Epidemiology,
and End Results (SEER) program, which comprises more than 9,000 physicians
and a network of clinical trials; and the Cancer Genome Anatomy Project (CGAP), which
is aimed at discovering genes involved in cancer in a systematic manner.
One of the latest developments in enhancing the diagnosis of cancer is
the "gene-chip." We were also impressed by the rapid progress in potential
cancer drug discovery made in recent years. According to Dr. Klausner,
the number of new drugs entering clinical trials rose from about 40 per
year in the late 1980's to 450 in the year 1999.
We also met with Dr. Kenneth Buetow, Chief of the Laboratory of Population
Genetics. Dr. Buetow provided us with additional information on CGAP,
which is a gene-annotation initiative. By employing advances in information
technology, CGAP aims at mapping cancer susceptibility genes in the human
genome and linking cancer phenotypes to genotypes.
Finally, we met with Drs. Lance Liotta and Elise Kohn from the Laboratory
of Pathology. Their main research goal is to develop techniques that
will allow early cancer detection and treatments. They also briefed us
on an anti-angiogenesis therapy recently developed in their laboratory
that is currently undergoing Phase II clinical trials.
F. Henry Habicht II
Chief Executive Officer
Global Environment and Technology Foundation (GETF)
(June 8, 1999)
Topic: Technology and Sustainable Development
Sustainable Development -- it's the newest concept in environmental
protection. A sustainable America means crossing traditional boundaries
and seeing economic, environmental, and social issues as part of the
same fabric. The concept includes smart growth in urban, suburban, and
rural areas as well as efficient resource use. The goal of sustainable
development goes hand-in-hand with improving quality of life and exercising
good stewardship.
When the President's Council on Sustainable Development asked GETF to
co-sponsor the May 1999 National Town Meeting for a Sustainable America,
our speaker, Mr. Habicht, was the ideal choice to lead efforts in the
design, organization, and implementation of the historic meeting. As a
former Senior Vice President of Safety-Kleen Corporation and as the
former Deputy Administrator for the U.S. Environmental Protection Agency,
Mr. Habicht has experienced the challenge of environmental protection
from the perspective of both government and private industry. In his
new role with GETF, Mr. Habicht is dedicated to building an infrastructure
for sustainable development by utilizing approaches in three different
areas: using information technology to facilitate the spread of best
practices and best ideas; promoting new technologies and practices to
improve the quality and environmental impact of what we produce and consume;
and helping key institutions and people that have not historically worked
together to forge lasting partnerships for sustainable development.
The Town Meeting, which was held in Detroit, Michigan, brought together
leaders from business, government, communities, and non-governmental
organizations. Over 3,500 people within the city and 60,000 others via
satellite and on the Internet participated in a meeting that showcased
instances of a healthy environment and a thriving economy as goals in
concert, not conflict. The meeting program was a comprehensive look at
current environmental issues and programs, an aggregate of different
pockets of innovation ranging from urban brownfields to sustainable agriculture.
The next step to building onto the success of the first meeting is to
take the website, www.sustainable-usa.org, and build it into the Sustainable
USA Network. This website is a public-private partnership featuring summary
information about the town meeting, the latest news and information on
sustainability, best practices, and resource links.
Regional initiatives are starting to blossom as community foundations
begin to proliferate. Some examples of this are "Envision Utah," "Sustainable
Denver," and "Sustainable Pittsburgh." National initiatives combining
transportation, energy, and agriculture are also possible. Mr. Habicht
referenced a National Aeronautics and Space Administration and Department
of Transportation collaboration to use remote sensing to map out future
transportation systems.
Mr. Habicht highlighted technology as a vehicle for sustainable development.
He defined technology broadly as the application of new management practices
or the improved application of old ones. GETF encourages new ideas through
projects that actually test and apply these new ideas. Environmental
technologies advance sustainable development by reducing risk, enhancing
cost-effectiveness and creating products and processes that are environmentally
friendly. Agriculture/bio-based renewable products are examples. Mr.
Habicht said that revolutionary alternatives will soon be available to
replace conventional lubricants and building materials.
There are several challenges in the development of new technology. Mr.
Habicht listed four: regulations/tax and other policies; the marketplace,
domestic and foreign; availability of capital; and research and development
(R&D) budgeting. The ensuing discussion of the R&D budget echoed a familiar
note: the R&D budget has no constituency, so it will always be a target
for budget cuts unless there is a mission to direct R&D. The first mission
to the moon is an example of that.
To get more information on a wide range of GETF projects, including
EnviroJobs, a program providing occupational skills training associated
with environmental management, check out their website. GETF is developing
distance learning and on-site training materials and is utilizing the
Internet and extranet for continuing education. The website is www.getf.org.
To access the global network of environmental technology, go to www.gnet.org.
For general environmental developments, tune into www.earthvision.net.
Visit to the National Institute of Standards and Technology
(June 15, 1999)
We visited the National Institute of Standards and Technology (NIST)
in Gaithersburg, Maryland. We heard two introductory talks about the
wide range of work done at NIST, and visited individual laboratories
in four of the seven NIST Measurement and Standards Laboratories. Our
host for the day was Jan Hauber from the Public and Business Affairs
Division.
Mr. Marc Stanley, Associate Director for Policy and Operations of the
Advanced Technology Program (ATP), spoke about the program and its activities.
It is a competitive program that co-funds research by individual companies
or joint ventures. The aim is to encourage economic growth in the United
States by developing high-risk, potentially high-payoff technologies
after basic research has been completed but before product development
begins. By sharing the cost of development projects, the ATP augments
industry's ability to pursue promising technologies and accelerate their
development. Projects often involve several companies as well as universities
and government laboratories. Funding for the program has remained constant
at about $200 million in the past few years. Economic impact analysis
demonstrates that completed ATP projects stimulate many
times this amount in product sales.
Mr. Ray Kammer, Director of NIST, spoke to us about the mission of NIST:
technology, measurements, and standards. The organization has about 3,300
employees including a large number of technical professionals. In addition
to the ATP, NIST has three other areas of responsibility: Manufacturing
Extension Partnerships, which provide technical assistance to small businesses
around the country, much like the agricultural extension agents; the
Baldrige National Quality Program for acknowledging high-quality operations
in business, education, and healthcare; and the Measurement and Standards
Laboratories. The laboratories' missions are based on the U.S. Constitution's
requirement for a trusted system of weights and measures, and they develop
basic measurement techniques, standard reference materials, and calibration
services.
Our tour began with the Structures Division, where the strength of concrete
was being measured. Techniques were being developed for increasing the
strength of existing reinforced concrete beams, and research was being
conducted in collaboration with other agencies to develop structures
that would be more resistant to earthquake damage.
We then visited the Semiconductor Electronics Division. Transistors
and integrated circuits are increasingly incorporated into consumer products
because of their rapidly increasing capabilities and rapidly decreasing
cost. The division is developing measurement techniques that can be used
by industry to characterize semiconductor devices as they become increasingly
smaller and more complex. We saw a demonstration of one of the techniques
being used, scanning probe microscopy.
Our next visit was to the laser trapping and cooling laboratory in the
Physics Division. There, scientists are developing ways of using atoms
at very low temperatures to create assemblages of cold atoms, atomic
lasers, and atomic clocks, the accuracy of which is as much as 1,000
times better than existing standard clocks. Since NIST maintains the
standard for time, they are always striving for the highest accuracy
of clocks. Dr. William Phillips, who met with us, won the 1997 Nobel
Prize in Physics for his work in developing methods to cool and trap
atoms with laser light.
Finally, we visited the NIST Research Reactor and the Center for Neutron
Research. The reactor operates at 20 megawatts and produces a large flux
of neutrons, subatomic particles that are used to investigate the structure
of matter. We were told about applications ranging from basic research
on the structure of solids, to the investigation of why airplane turbine
blades failed unexpectedly, to the dating of artworks and archeological
artifacts. Much of the work here and in all the laboratories is done
in close cooperation with industry.
Class of 1997-1998
David Y. Peyton
Director
Technology Policy
National Association of Manufacturers
(October 1, 1997)
Topic: Issues in Technology Policy -- A Modern Manufacturing Perspective
Mr. Peyton began our weekly seminar series by providing an overview
and brief history of the National Association of Manufacturers (NAM).
He stated that NAM does not attempt to write technical standards. Instead,
NAM strives to increase economic growth by lobbying the U.S. Congress
and federal agencies -- not state governments -- on behalf of U.S. manufacturers.
Mr. Peyton stated that NAM's 14,000 members represent virtually every
industry from all states, and range in size from large, multibillion
dollar corporations to small, family-owned businesses. He emphasized
that NAM represents more than 80 percent of American industry and, for
more than 100 years, has worked to promote manufacturers' interests.
NAM is affiliated with hundreds of trade associations, as well as state
manufacturing and employer associations. NAM advocates policies that
encourage faster U.S. economic growth; that create jobs and increase
incomes, thus improving our quality of life and global competitiveness;
and that reduce the government's deficit.
Mr. Peyton focused his presentation on several issues, most of which
are NAM priorities. He addressed improving protection of intellectual
property rights, especially patent reform; making the research and development
tax credit permanent; expanding exports of U.S.-produced goods and international
trade; next-generation manufacturing; and computer security, including
domestic law enforcement.
In closing, Mr. Peyton stressed the importance of government and industry
cooperation and mutual problem-solving. Working together will create
a resurgence in national prosperity and in the standards of living for
all Americans.
Allan R. Hoffman
Acting Deputy Assistant Secretary for Utility Technologies
Office of Energy Efficiency and Renewable Energy
U.S. Department of Energy
(October 8, 1997 and December 3, 1997)
Topic: Renewable Energy Technology Overview
Dr. Hoffman started with a discussion of the converging trends that
will shape our energy debate in the 21st Century. The U.S. alone consumes
one-fifth of the world's annual energy production. Electrical energy
demand in developing countries is expected to grow by five million megawatts
in the next 30 years. Energy shortages will become more acute in developing
countries. Dr. Hoffman does not consider fossil fuels or nuclear energy
to be the solution to this problem, because there is increasing environmental
awareness of the consequences of using fossil fuels and there are social
issues related to the use of nuclear power. There are energy security
risks and uncertainties with respect to our oil reserves.
With these trends in the background, Dr. Hoffman discussed the availability
of new technology as an alternative to the current conventional energy sources
to satisfy the increasing energy demand. The energy technology that he
feels provides the greatest promise is that of the renewable energy sources:
photovoltaics (PV), solar thermal, wind, biomass, and geothermal. For
each of these energy sources, he showed cost trends and capacity predictions.
He discussed development of asphalt roof shingles that are composed of
linked PV cells, so that the roof provides electrical power to the dwelling.
He described shoebox-sized PV packages that bring electricity for communications,
lighting, or refrigeration to remote huts and villages.
Dr. Hoffman discussed five strategies for sustainable development. We
must increase the efficiency of our energy usage; develop a balanced
energy resource portfolio; invest in science and technology advances;
reinvent environmental protection; and engage in the international market.
He ended the seminar with a review of the recent accomplishments and
challenges in each of the areas of the renewable forms of energy.
Dr. Hoffman returned, at our request, for a second lecture. His enthusiasm
for the renewable resources was unabated from his first visit. In addition
to an overview of the renewable energy technologies, Dr. Hoffman discussed
energy storage systems which are very important as intermittent energy
sources such as wind or sun are developed.
On this second visit, he focused on some very practical economic issues
-- the costs of maintaining or increasing the U.S. and the world energy
supply. Costs were broadly defined. Environmental destruction is a cost.
Deteriorating health is a cost. Military operations are costs. We incur
these costs and pay them with our tax dollar, but we don't allocate them
to the cost of driving our car or running our air conditioner.
The renewable energy sources are not yet competitive with fossil fuels
using the current cost allocation system. If we allocated all the costs,
the renewable sources would be more competitive.
Dr. Hoffman stated that we will have to sell some new ideas to make
the renewable fuels work. He gave an example of Brazilian villages where
batteries are the source of electricity and kerosene is the fuel. Both
are polluting items that must be carried into the village, and both could
be replaced by PV roofs on the huts. Such PV roofs
would have to be made available on time payments so that villagers who
are able to pay for batteries and kerosene could make the up-front investment
in the new roofs.
A 40 percent increase in worldwide demand for power is expected; some
of the new installed power will be renewable sources. The U.S. needs
to master the renewable energy technologies in order to be assured a
part of this business. Dr. Hoffman emphasized, however, that we must
not repeat the mistake of the 1980's when we put tax incentives on investment,
not on results; any incentives for this development must be for actual
energy production.
Michael M. Harmon
Public Administration Department
George Washington University
(October 22, 1997)
Topic: Administrative Responsibility
According to Dr. Harmon, the American debate about responsible government
has remained largely unaltered for more than half a century. Dr. Harmon
believes that it is the points of agreement that present the most formidable
barriers, both practical and moral, to realizing responsible government.
The principal belief is that responsible action is the same as morally or
legally correct action.
Dr. Harmon believes that the struggle for and against responsibility
plays out both consciously and unconsciously in our inner lives, in intimate
relations with others, and in social institutions that enable and regulate
public conduct. The struggle may be expressed in a variety of vocabularies:
spiritual (sin and redemption); psychological (guilt, shame, and individuation);
moral (blame and obligation); and institutional (accountability and,
especially recently, empowerment).
Of these vocabularies, the rationalist one has been the least successful in
enabling an understanding of the paradoxical character of responsibility.
Dr. Harmon argued that the chief reason for the failure of rationalist
discourse arises from its misplaced insistence on dividing categorically
issues of public conduct from those of private life and collective obligation
from personal development.
Dr. Harmon challenged the essential point of agreement in the rationalist
discourse by arguing that the idea of responsibility connotes multiple
and conflicting meanings that render it inherently paradoxical. Because
of responsibility's paradoxical nature, the rationalist equation of responsibility
with correctness is necessarily flawed in a most fundamental way. Only
through an appreciation of responsibility's paradoxical nature can the
idea of personal responsibility be reinstated to its rightful place in
the moral discourse on government.
Craig I. Fields
Chairman
Defense Science Board
(October 29, 1997)
Topic: Private/Public Partnerships for Science and Technology
Dr. Craig Fields is currently the Chairman of the Defense Science Board
and is a member of the Board of Directors of a number of technology related
corporations. His distinguished career also includes being a chairman
and CEO of the Microelectronics and Computer Technology Corporation,
and the Director of Defense Advanced Research Projects Agency (DARPA).
Dr. Fields shared with us a unique perspective of managing science and
technology programs in both the government and private sector.
Dr. Fields went into details of the Defense Science Board (DSB), its
history, procedures and purpose. He also discussed the Internet and the
Next Generation Internet. He concluded with some comments on the differences
and similarities of being a civil servant and having a corporate career.
The DSB was established forty years ago to provide non-defense science
and technology input to the defense strategy. The chairman is selected
by the Secretary of Defense and also reports to him. The DSB is comprised
of technical people from industry, academia, former civil servants, and
retired military personnel. The DSB conducts three meetings a year, each
meeting lasting about two days. They review the Defense research and
development investment strategy. All reports are public record and are
available to Congress. The current areas of interest to the DSB are the
vulnerability of our Information Systems, conventional threats versus
terrorist threats, coalition warfare, and environmental security issues.
These areas pose the biggest risk to the military of today.
Dr. Fields then proceeded to discuss the Internet. As the former Director
of DARPA, he was heavily involved in the development of the original
internet, called the ARPANET. DARPA funded and developed the internet for
the purpose of connecting the research and development community that
included military labs, universities and defense industries. The original
undertaking was to get three separate networks or nodes to communicate.
Standards and interfaces needed to be established to allow this
to happen. Once they were set, the number of
nodes could then be increased. Almost all of the internet was funded
by the Department of Defense. The potential for daily commerce is enormous
and currently untapped. The Next Generation Internet needs to be funded
through private investment.
Donald E. Kash
Hazel Chair of Public Policy
The Institute of Public Policy
George Mason University
(November 6, 1997)
Topic: Technology Trends
Dr. Kash's opening question, "What technology policy will produce a
trade surplus?", provided a focus for his presentation of the impact
of technology trends on trade.
Dr. Kash argued that the increasing trade deficit necessitates an analysis
of the changing composition of international trade. This analysis shows
a shift in trade from commodities to manufactured products.
Manufactured goods and processes were then sorted into four categories
by complexity. The trend in trade value has been from lesser to greater
complexity. But complexity -- that which cannot be understood by one person --
requires accumulated learning and distributed responsibility. These characteristics
are in conflict with the American tradition of linear leadership and
accountability.
Because of his analysis and because the areas in which the U.S. is competitive
-- aerospace and defense, health care and medicine, and agriculture and
chemicals -- are all areas in which the government has had a technology
policy, Dr. Kash concluded that the U.S. should have a technology policy
that encourages distributed authority.
Visit to COMSAT Laboratories, Clarksburg, Maryland
(November 18, 1997)
The COMSAT Corporation consists of three parts: Satellite Services,
International, and the Laboratories. COMSAT Laboratories is the research
portion of the COMSAT Corporation, founded in 1967, and consisting of
a 120,000 square foot facility employing about 200 scientists and engineers.
This highly educated staff is 17 percent Ph.D.-level and 70 percent Master's-level
personnel.
Mr. Eric Timmons, Director of Marketing and Product Management, briefed
us and provided a tour of the facility. The customers include both U.S.
and foreign governments and companies. The areas of research and services
provided at the Laboratory include: network technology, communications
technology, RF and satellite technologies, system modeling and system
engineering. The Laboratory had revenue of $35 million in 1997, reflecting
increased pressure to make the Laboratory revenue positive; in previous
years, research had been funded simply as research.
We received a tour of the facility, seeing some of the work in networking,
antenna designs, and battery research.
Visit to the U.S. Naval Observatory, Washington, D.C.
(December 10, 1997)
Our visit to the United States Naval Observatory began with a presentation
by Dr. Kenneth Johnson (Scientific Director) on the history and mission
of the Observatory and how the mission has evolved with advances in technology.
The Observatory was founded in 1830, shortly after the development of
the chronometer as a navigation tool. The original mission of the Observatory
was to maintain the Navy's maps and navigational equipment, but the mission
soon evolved to require reliable celestial observations for navigation.
During the late 1800's, the Observatory carried out many significant
scientific studies in the field of astronomy, and its 26-inch refractor
telescope was the largest in the world at that time. However, the principal
focus of the Observatory was and is navigation.
Today, the Observatory is at the forefront of technology in the field
of precise timing and astrometry, a branch of astronomy in which the
positions and motions of the sun, moon, planets, and extremely distant
objects, such as galaxies and quasars, are precisely determined. The
cesium beam and hydrogen maser clocks maintained by the Observatory
are constant to within one nanosecond per day and provide the source
data for the U.S. Master Clock as well as the satellite-based Global
Positioning System. To produce the most accurate celestial references
for navigation, the Observatory uses radio telescopes to observe extremely
distant quasars to determine their positions and compute their future
positions.
Mr. Geoff Chester (Office of History and Public Affairs) arranged for
us to visit two of the telescopes maintained at the Observatory grounds,
the Observatory's library, and the U.S. Master Clock. His tour was filled
with many interesting anecdotes about the Observatory.
Visit to NASA Goddard Space Flight Center, Greenbelt, Maryland
(December 17, 1997)
We were provided with a very interesting tour of NASA's Goddard Space
Flight Center at our last seminar in 1997. Mr. Roland Van Allen of Goddard's
Visitor Center gave a brief presentation on Goddard's history and provided
some fascinating details to the Visitor Center displays of NASA's early
years of space exploration. Goddard Space Flight Center is named for
Dr. Robert Goddard, the early pioneer of rocket research. Its mission
is to expand knowledge of the earth, the solar system and the universe
through observation in space. Goddard's main facilities are situated
on a campus of about 1,200 acres in Greenbelt, Maryland, and it currently
employs approximately 11,000 persons at various facilities.
Mr. Van Allen then led us on a short tour of the campus, which included
an opportunity to experience firsthand Goddard's diverse mission of science
and technology. We visited a large centrifuge, which allows scientists
to test the strength and stability of various spacecraft components when
subjected to the force of gravity during launch. We also visited the
Spacecraft Systems Development and Integration Facility, which contains
a huge clean room where components for future space shuttle missions
are being constructed and assembled before being transported by containerized
barge to the Kennedy Space Center. This building also contained components of other
satellite systems, which were being tested for conditions in space.
No visit to Goddard would be complete without a visit to the NASA Communications
Center in Buildings 3/14. We were greeted by Julio Marius who presented
some wonderful details about the Hubble Space Telescope (HST) Project.
In 1993, Goddard was instrumental in managing the successful first servicing
mission of the HST. This mission corrected the vision of the telescope's
optical components and required five days of astronaut spacewalks to
complete the corrections. A highly successful second HST servicing mission
in 1997 added a Near Infrared Camera and Multi-Object Spectrometer (NICMOS)
and the Space Telescope Imaging Spectrograph (STIS). Finally, astronauts
added several thermal insulation blankets to HST to protect areas where
the existing insulation was showing its age.
Mr. Marius then introduced us to Mr. Jack Leibee, the Mission Operations
Manager, who gave us a brief overview of NASA's Next Generation Space
Telescope Project, which will not be launched until 2007. Its mission
will be to study the universe from an orbit one million miles from Earth. Mr.
Leibee answered many detailed questions about the differences between
HST and the Next Generation Space Telescope (NGST). NGST will use an eight
meter telescope, compared with HST's 2.4 meter mirror. HST weighed over
25,000 pounds, but the NGST will need to weigh less than 6,000 pounds.
One of the challenges for NASA in building the NGST is a design that
accommodates this larger telescope yet yields a lighter overall payload.
This will result in a less expensive mission
for NASA. The goal for NGST is to spend only one-fourth of the cost associated
with HST. Another challenge is in using commercially available technologies
within the design. Since planning and design of a mission begin many
years before launch, these technologies become obsolete during the lifetime
of the mission. NASA is conducting technology studies now to determine
the correct design of the telescope with testing to be conducted in the
2003-2004 time frame.
Verne L. Lynn
Director
Defense Advanced Research Projects Agency
(January 6, 1998)
Topic: Advanced Research Projects -- Defense Conversion and How to Stimulate
Technology Development
Mr. Larry Lynn was appointed Director of the Defense Advanced Research
Projects Agency (DARPA) in 1995. Prior to 1995, he served as Deputy Under
Secretary of Defense for Advanced Technology. He has been a member of
the Defense Science Board and served as the Deputy Director of DARPA.
His career includes being part of the faculty and a member of the Massachusetts
Institute of Technology Lincoln Laboratory Steering Committee, as well
as serving two years active duty in the U.S. Navy.
Mr. Lynn provided a top level presentation of his perspective of DARPA
and its activities on advanced research projects. DARPA was chartered
in 1958 in response to the Soviet Sputnik launch. Once the National Aeronautics
and Space Administration was given the space mission, DARPA was tasked
to create advanced military capability. DARPA's accomplishments include:
Saturn space launch vehicle, Centaur rocket engine, M-16 rifle, ground
radar, Unmanned Aerial Vehicle (UAV), satellite imaging, original internet,
high energy lasers, Javelin missile, stealth technology, and other military
hardware.
DARPA's philosophy is to invest in high payoff research and be willing
to accept the possibility of not succeeding due to high risk. This philosophy
is complementary to the usual military research and development. Important
characteristics of the organization are that it is flat and remains small
and flexible. He gave a quick overview of the budget, with breakdowns for
basic research, applied research, and advanced technology development. The
investment approach used the 20-year vision for military capability and
tried to quickly exploit new inventions, ideas and concepts to achieve
that vision. He proceeded to discuss the current military priorities
and the current technology priorities. He also discussed the efforts
on Biological Warfare Defense. DARPA has most recently fielded advanced
technology programs in Bosnia and during Desert Storm. Finally, he described a "typical" project
for DARPA. It would be funded between $10-$40 million over a four-year
period. There would be a single DARPA program manager, executing multiple
contracts that include a teaming of industry, universities and Federal/Department
of Defense laboratories. In all cases, the emphasis is on small teams
of the highest quality people.
Visit to WETA Broadcasting Facility, Arlington, Virginia
(January 14, 1998)
Mr. Lew Zager, Director of Technical Services at WETA, was our host
and guide through the Broadcasting Facility, which is located in Arlington,
Virginia.
Beginning in the Fall of 1998, Digital Television (DTV) will be transmitted
for the first time in the top 20 markets in the U.S. WETA is one of only
three stations in the U.S. that is already transmitting a DTV signal.
This new technology is the greatest change in television since the introduction
of color television. The benefits of DTV include incredible resolution
and a change of the aspect ratio to that of theater -- a wider picture.
The tour of the WETA broadcasting facility included the production facilities
and the studio. We were able to walk on the set where "The News Hour
with Jim Lehrer" originates. The highlight of the tour was the opportunity
to view a demonstration of DTV. WETA's first-ever DTV production, "Impressionists
on the Seine", was shown both on a large screen and on a DTV receiver.
The producer, Jackson Frost, participated in the viewing and discussed
the various technical and artistic challenges of filming the exhibit
of impressionist masterpieces with views of the Seine River, which was
shown at the Phillips Collection in 1997.
William Y. Brown
Science Advisor to the Secretary
Department of the Interior
(January 28, 1998)
Topic: Science Agenda for the Department of the Interior
William Brown was appointed Science Advisor to Secretary of the Interior
Bruce Babbitt on April 13, 1997. In that capacity, he is advising the
Secretary on diverse issues concerning science and policy at the Department.
Mr. Brown discussed the problem of noxious weeds, which are spreading
across millions of acres. The issue is gaining national recognition.
Both the U.S. Department of Agriculture Forest Service, and the Bureau
of Land Management have named weed control among the top priorities of
their agencies. The objectives are to come up with a long-range plan
for the weed management area, map the weed infestations and enter the
information into a computer database, and educate the public about the
scourge of weeds.
He discussed Interior's role in the national conference on the oceans
later this year that would bring together federal agencies, members of
Congress and environmentalists. The gathering will highlight the ways
marine resources are used and prompt discussion on methods of promoting
healthy, sustainable oceans.
Mr. Brown discussed Habitat Conservation Plans (HCPs), which are legally
binding agreements under which landowners adopt certain conservation
measures in exchange for permission from the Federal Government to develop
property, even if some endangered species and habitat are destroyed in
the process. HCPs are potentially "powerful tools" to protect endangered
species, but provisions in the plans for long-term biological monitoring,
if they exist at all, are weak.
Mr. Brown discussed a number of previous positions that he has held
in the public and private sector.
Guy S. Gardner
Associate Administrator for Regulation and Certification
Federal Aviation Administration (FAA)
(February 4, 1998)
Topic: Aviation Safety
Mr. Gardner is serving as the head of the FAA's Regulation and Certification
complex. He is principally responsible for the certification and safety
oversight of some 7,300 U.S. commercial airlines and air operators. His
distinguished career includes serving over 20 years in the U.S. Air Force
as a pilot, flying 177 combat missions in 1972 throughout Southeast Asia.
He was selected by the National Aeronautics and Space Administration
(NASA) to be a pilot astronaut and subsequently flew on two separate
space shuttle missions -- Orbiter Atlantis in 1988 and piloting the Orbiter
Columbia in 1990.
Mr. Gardner described the FAA organization, explaining that the agency
has two main divisions, the Regulation and Certification Division and
the Technical Acquisition Center. The Technical Acquisition Center is
responsible for all research and development of the avionics that support
aircraft. He went into detail of how the Regulation and Certification
Division works. He described the relationship the FAA has with the air
carriers, explaining that the agency has formed partnerships with them.
The FAA did not want to be a policeman of the carriers
but a contributing partner helping in developing the processes for collecting
data that will prevent future air accidents. The data is voluntarily
collected and consists of mistakes made in all aspects of the operation
and maintenance of the aircraft. The reason this is so important is because
most aircraft accidents are caused by a long string of minor errors that
culminate in a major or catastrophic failure. The flying public is only
concerned with the terms safe or unsafe, when in actuality the safety
of flight is a matter of risk analysis. Statistically, airplanes are safer
than automobiles. Every day, when you get in your car you
perform a risk analysis to drive or not to drive. Individually, you believe
you control the automobile and feel quite safe in driving, as opposed
to an airline where you are a passenger with no control over the situation.
This is the root of the concern with safety.
We discussed international regulation and the cooperation with other
foreign government agencies. The European Union and the U.S. don't always
agree, but generally the rest of the world follows the U.S. standards
because they want to be able to fly into the U.S. or its territories.
Mr. Gardner concluded by talking about the cooperation of the National
Transportation Safety Board when accidents do occur. It is very important
for all involved to figure out the cause of any accidents to help eliminate
the problems in any aircraft or situation and increase flight safety,
which is the FAA mission.
Bruce W. McConnell
Chief, Information Policy and Technology Branch
Office of Information and Regulatory Affairs
Office of Management and Budget
(February 17, 1998)
Topic: Economic Interests Versus Civic Interests on the Internet
As Chief of Information Policy and Technology at the Office of Information
and Regulatory Affairs, U.S. Office of Management and Budget, Bruce McConnell
explained that if something in a Federal Agency is information related,
they "do it". They are currently addressing three major issues: the Year
2000 problem, encryption, and the government reform act.
The Year 2000 problem stems from the fact that computer software was
often written with only two digits for the year. When we reach the year
2000, dates could show up with a year of "00", which would not be recognized
as legitimate or which might be used in calculations and produce inconsistent
results. The problem is particularly significant in financial software
where amounts of money are calculated in terms of years. Mr. McConnell's
office is summarizing the status of all government agencies relative
to this problem. The summary will be forwarded to Congress.
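The two-digit ambiguity described above, and one common remediation, can be sketched in a few lines of Python. The pivot value and the loan example are illustrative assumptions, not details from the talk:

```python
# Naive storage kept only the last two digits of the year, so a loan
# running from 1999 into 2000 appears to run backwards in time.
start_yy, end_yy = 99, 0
naive_elapsed = end_yy - start_yy        # -99 years: nonsense for interest

# One common remediation: expand two-digit years through a pivot window,
# mapping 00-99 onto the hundred-year span beginning at the pivot.
def expand_year(yy: int, pivot: int = 1950) -> int:
    year = (pivot - pivot % 100) + yy
    return year + 100 if year < pivot else year

assert expand_year(99) == 1999
assert expand_year(0) == 2000
fixed_elapsed = expand_year(0) - expand_year(99)   # 1 year, as intended
```

The windowing fix only postpones the problem (here, until 2049), which is why full four-digit remediation was the preferred long-term approach.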
Encryption is the process of using mathematical calculations to code
information so that it can only be decoded by individuals or software
that have the key or numeric value used in the original calculation.
Encryption is heavily used in financial transactions to protect amounts
of money and identities of the parties in the transaction. It can also
be used to protect any information, from casual E-mail to state secrets.
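The symmetric scheme described -- one shared key that both encodes and decodes -- can be illustrated with a deliberately simplified XOR cipher in Python. This is a toy for exposition only; real financial systems use vetted algorithms, not this scheme:

```python
import hashlib
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte against a repeating key stream; because XOR is its own
    # inverse, applying the same key a second time restores the plaintext.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

# Both parties derive the same key material from a shared secret.
key = hashlib.sha256(b"shared secret").digest()

plaintext = b"transfer $500 to account 12345"
ciphertext = xor_cipher(plaintext, key)

assert ciphertext != plaintext                    # unreadable without the key
assert xor_cipher(ciphertext, key) == plaintext   # key holder can decode
```

The key-escrow debate Mr. McConnell described is precisely about who else, beyond the two parties, should be able to hold a copy of `key`.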
The controversy that Mr. McConnell is dealing with is whether law enforcement
should be able to access (have escrowed keys for) any and all encrypted
material. Law enforcement agencies would like to be able to "wire tap" the
Internet in order to collect evidence for law enforcement. The controversy
develops because encryption software is available in the public domain
and criminals are not likely to use encryption for which law enforcement
holds the keys, if they have an alternative. Because some form of the
encryption software is already available, U.S. encryption software companies
are frustrated that they are currently limited in the level of protection
they are allowed to sell in the international market.
The Information Technology Management Reform Act of 1996 is the third major
issue under Mr. McConnell's oversight. This effort is to improve information
management across government. Chief Information Officers have been mandated
for every agency. A main focus of their work is to convince agencies
that projects should be broken into smaller "chunks", which can be accomplished
in six months.
Other issues of interest include "paperwork reduction", such as rules
limiting the government's tendency to take surveys. Privacy was the last
issue discussed. With the increased power and flexibility of computing,
agencies are able to compare their databases about individuals. For instance,
an agency tracking individuals that have defaulted on school loans may
want to match their database against a database that contains current
addresses, perhaps the Internal Revenue Service's database. Mr. McConnell
pointed out that even if the databases match, an investigation must still
be made to determine if the individual has been properly identified.
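The matching scenario above -- a key match that still requires human verification -- might be sketched as follows. Every record, name, and number here is invented for illustration:

```python
# Hypothetical agency records; all SSNs and names are fabricated.
loan_defaulters = [
    {"ssn": "123-45-6789", "name": "John Smith"},
    {"ssn": "987-65-4321", "name": "Ann Jones"},
]
address_db = {
    "123-45-6789": {"name": "Jon Smyth", "address": "12 Oak St"},
}

# A match on the key field alone is not proof of identity, so each hit
# is flagged for the follow-up investigation Mr. McConnell described.
candidates = []
for person in loan_defaulters:
    record = address_db.get(person["ssn"])
    if record:
        candidates.append({
            "ssn": person["ssn"],
            "address": record["address"],
            "needs_verification": person["name"] != record["name"],
        })
```

Note that the single hit is flagged because the names disagree even though the key fields match, which is exactly the situation an investigator must resolve before acting.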
The New Hires Database, mandated by the Welfare Reform Act of 1996,
contains names, addresses, and social security numbers for all individuals
who have taken a new job. The New Hires Database was set up to help locate
parents who are not paying child support; however, it contains information
on all individuals who take a new job. Its usefulness to other agencies
looking for individuals is clear. Since there is no limit to the time
an individual's information will remain in the database, most people
will eventually be in the database. The accumulation of this information
and the ability to match it with other databases is considered by many
to be a threat to individual privacy.
Ajit Sapre
Technology Consultant
Mobil Corporation
(February 25, 1998)
Topic: Mobil's Energy Technology Advances and Research Efforts
Dr. Sapre spoke to us about Mobil's energy technology advances and research
efforts. Mobil is the third largest independent oil company in the world,
with 1996 revenues equal to seven percent of the $1,121 billion world market.
Mobil breaks down its businesses into three groups: Upstream, Downstream,
and Petrochemicals. Upstream business involves the search for, discovery
of, and production of oil and natural gas. Downstream business is the refining
and marketing of gas, diesel and jet fuels, lubricants and heavy products.
Petrochemical business consists of products like plastics, films, polyesters
and fibers.
Future drivers for the energy industry are: resources, emission reductions,
climate change, and technology. The world uses 25 billion barrels of
oil a year. There are known reserves of two trillion barrels. The world
is not going to run out of oil any time soon. What will really change
the business is the need to deal with its effects on the environment:
the industry needs to get "greener".
Dr. Sapre expressed a few basic points. Society will use fossil-based
hydrocarbons into the 21st Century. More specificity will be given to
refinery products, making refineries more like chemical plants. Refinery
conversion will require more science, and more chemical engineers.
Dr. Sapre described the technology direction that Mobil is undertaking
and how it will match his expectation of the 21st Century.
Jeffrey Kopchik
Senior Policy Analyst
Office of Policy Development
Federal Deposit Insurance Corporation
Cynthia Bonnette
Examination Specialist and Chairperson, Banking Technology Task Force
Federal Deposit Insurance Corporation
(March 4, 1998)
Topic: Electronic Banking Technology and Policy
The Federal Deposit Insurance Corporation (FDIC) is an independent agency
with a board of directors appointed by the President. The FDIC is federally
chartered, but receives no federally appropriated funds. The banks that
the FDIC regulates/insures are assessed, based on their holdings, to
support the FDIC.
The FDIC insures all U.S. banks. In addition, it supervises community
banks, which don't fall under the Federal Reserve, and regulates state
chartered banks. With the rapidly changing technologies, the FDIC finds
the banks really appreciate the FDIC's examinations. The discussion and
questions help the banks evaluate their position relative to the new
technologies.
Electronic banking grew out of the use of personal financial software
on home computers. Once people had all their financial information easily
accessible, the next step was to automate the financial transactions
and to be able to do it from home. Most services are now possible, but
there is little actual use so far. Only a few customers per bank actually
use the e-banking services.
Most banks are preparing to handle the electronic transactions for fear
of losing their current customers to a bank that offers the services
on-line. Surveys show that 35 percent of all banks and 83 percent of
large ($4 billion) banks will have on-line banking this year. Banks expect
to save money with on-line banking because the cost per transaction will
be very low. Electronic banking costs only $0.01, while full service
in the bank is $1.07 and by telephone is $0.54 per transaction.
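The savings banks anticipate follow directly from the per-transaction costs quoted above; the annual transaction volume below is an assumed figure, purely for illustration:

```python
# Per-transaction costs quoted by the FDIC speakers.
costs = {"teller": 1.07, "telephone": 0.54, "electronic": 0.01}

# Assumed annual volume for one bank (illustrative only).
transactions = 100_000

savings_vs_teller = {
    channel: (costs["teller"] - cost) * transactions
    for channel, cost in costs.items() if channel != "teller"
}
# Shifting 100,000 teller transactions on-line would save about $106,000
# a year; shifting them to telephone banking, about $53,000.
```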
The FDIC has run an internal e-banking task force since November of
1995. Issues that the task force feels must be resolved in order to facilitate
the use of e-banking are security, speed, authentication, privacy, disclosure,
and law enforcement jurisdiction.
The many ways of handling money for electronic banking include: stored
value cards, digital cash, electronic checks, and smart cards.
Joan Dudik-Gayoso
Associate Assistant Administrator for Science, Technology and Communications
U.S. Agency for International Development
Economic Development Institute
The World Bank
(March 11, 1998)
Topic: Science and Technology in International Development
Ms. Dudik-Gayoso, who has spent her whole career working on development
issues in underdeveloped countries, introduced her talk by commenting
that "Science and Technology" is a rubric that really means information
and communication technology. Other areas that might naively be considered
to be included in science and technology already have organizational
homes in traditional programs, such as health and agriculture, and are
not generally so included. Ms. Dudik-Gayoso then indicated the three
divisions of her talk. They are the state of the world, a picture of
the development community, and the role of science and technology in
development.
The state of the countries of the world can be measured in several ways.
The most common measure is Gross National Product per capita. There
are also quality-of-life measurements -- the literacy rate, the life
expectancy at birth, the availability of clean water, the per capita
calorie consumption, etc. By most measures, many of the most underdeveloped
countries are improving. Most of the exceptions are located in sub-Saharan
Africa and southeast Asia. However, several current international developments
bode ill for continuing improvement. These include the impact of price
increases of petroleum-based fertilizers on the agriculture of developing
countries and the increasing population and wealth of China. Future development
of poor countries will also be handicapped by the lessening advantage
of cheap labor. Trained, not cheap, labor is what is needed. Ms. Dudik-Gayoso
also argued for increased emphasis on traditional technologies, such
as the use of dung for fertilizer.
According to Ms. Dudik-Gayoso, the philosophy of the development community
-- the groups that help developing countries -- has changed significantly
since its early days in the Marshall Plan. The change has run from simply
providing money under the Marshall Plan, to encouraging industrialization,
to supporting human needs, to assisting development. These changes
in philosophy parallel changes in policy towards our own poor. These
policies have shifted from the support of human needs (welfare or consumption)
to development (education and health care). The question of what is the
responsibility of Government and what is the responsibility of individuals
has been raised in both communities. Ms. Dudik-Gayoso concluded this
section of her talk by contrasting the U.S. and other development communities.
The U.S. and European development communities differ in that the U.S.
generally provides aid in support of U.S. national interest, whereas
the European community feels a greater moral or social commitment. Japan
is more likely to provide aid in support of its economy.
Finally, Ms. Dudik-Gayoso discussed the role of science and technology
in development. She thinks that changes in information technology hold
great potential for changing the world through increases in the speed
and availability of information. Availability of information makes governments
more responsible and accountable and reduces the power of intermediaries.
Other science and technology issues, such as property rights and intellectual
property rights, still remain to be addressed. Ms. Dudik-Gayoso concluded
that a better understanding of how people adjust to the new is needed.
Sally E. Howe
Associate Director
National Coordination Office for Computing, Information, and Communications
(April 1, 1998)
Topic: Infrastructure for Future Computing and Communications
Dr. Sally Howe is the Associate Director of the National Coordination
Office for Computing, Information, and Communications (CIC). To meet
the challenges of a radically new and technologically demanding century,
Dr. Howe said that CIC programs are investing in long-term research and
development (R&D) to advance computing, information, and communications
in the U.S.
One of the nine committees of the National Science and Technology Council,
the Committee on Computing, Information, and Communications -- through
its CIC R&D Subcommittee -- coordinates R&D programs conducted by twelve
federal departments and agencies in cooperation with U.S. academia and
industry. These R&D programs are organized into five Program Component
Areas: HECC -- High End Computing and Computation; LSN -- Large Scale
Networking, including the Next Generation Internet Initiative; HCS --
High Confidence Systems; HuCS -- Human Centered Systems; ETHR -- Education,
Training, and Human Resources.
Dr. Howe indicated that HECC R&D investments provide the foundation
for 21st Century U.S. leadership in high end computing. This R&D focuses
on advances in hardware and software, and in algorithms for modeling
and simulation needed for computation -- and information -- intensive
science and engineering applications. HECC research also explores advanced
concepts in quantum, biological, and optical computing.
LSN R&D, including the new Next Generation Internet (NGI) initiative,
focuses on developing the landmark networking technologies and applications
that will keep the U.S. in the forefront of the information revolution.
Key research areas include technologies and services that enable advances
in wireless, optical, mobile, and wireline network communications; networking
software that enables information to be disseminated to individuals,
multicast to groups or broadcast to an entire network; software for efficient
development and execution of scalable distributed applications; software
components for distributed applications; and research infrastructure
support testbeds.
Dr. Howe noted that the NGI initiative is the primary focus of LSN R&D.
Announced by President Clinton and Vice President Gore on October 10,
1996, the NGI initiative will create a foundation for the more powerful
and versatile networks of the 21st Century. Based upon strong R&D programs
across CIC agencies, NGI will foster partnerships among academia, industry,
and Government that will keep the U.S. at the cutting-edge of information
and communications technologies. It will accelerate the introduction
of new networking services for our businesses, schools, and homes.
HCS R&D focuses on the technologies necessary to achieve high levels
of security, protection, availability, reliability, and restorability
of information services. Systems that employ these technologies will
be resistant to component failure and malicious manipulation and will
respond to damage or perceived threat by adaptation or reconfiguration.
Dr. Howe said that the goal of HuCS R&D is increased accessibility and
usability of computing systems and communications networks. HuCS technologies
are needed to assure that today's rapid advances in computing, information,
and communications continue to be readily accessible to federal agencies
and to all U.S. citizens no matter where they might live and work.
ETHR R&D supports research to advance education and training technologies.
The complex and technically challenging applications flowing from leading
edge HECC and LSN R&D make it increasingly important for today's students
and professionals to update their education and training on an ongoing
basis in order to exploit the latest technological advances.
Allan Whittaker
Director
Electronic Systems and Software
Lockheed Martin
(April 8, 1998)
Topic: Applying Advanced Military Technology to Commercial Products
Mr. Allan Whittaker, Director of Electronic Systems and Software for
Lockheed Martin Aeronautical Systems, provided an interesting overview
of Lockheed Martin's Aeronautical programs. The Electronic Systems and
Software (ES&S) Directorate comprises around 900 individuals who support
the Marietta, Georgia plant's primary business lines including the F-22,
the C-130J, and Maritime Systems. Lockheed Martin is currently a $27
billion a year corporation whose holdings include the original Wright
Brothers' factory, and it has been a successful manufacturer of advanced
military and commercial aircraft for many years.
The ES&S's primary missions are research and development, engineering,
production, and continued support of airlift, tactical, and other aircraft.
The ES&S maintains a laboratory environment recognized throughout the
corporation and the industry for excellence in electronic systems and
development and integration. The Associate Technology Laboratory serves
as the environment for intelligent systems development. Mr. Whittaker
described ES&S's Associate Technology, a network of cooperating expert
systems that provide critical decision-making information to operators
of aircraft. The Pilot Associate was the application of this technology
developed under a Defense Advanced Research Projects Agency program to
provide an "electronic backseater" in a single seat aircraft. Lockheed
then teamed with United Airlines to apply the Pilot Associate technology
to an advanced air transportation program. This ES&S laboratory is also
developing advanced transportation management software and real-time
in-flight information systems for both military and commercial applications.
In another application, ES&S designed a single-mode fiber-optic bus
based on the telephony model to increase data throughput to one quadrillion
bits per second. The application was based on the need to reduce wiring
and increase data throughput for an aircraft such as the P-3 where the
communications infrastructure adds weight and cost to the aircraft design.
This high-speed bus combines voice, video, and data in multiple wavelengths
over a single fiber, allowing simultaneous transmission of digital and
analog data. The technology employed a commercial off-the-shelf solution,
and Lockheed Martin applied for a basic patent on it in 1997.
Mr. Whittaker gave us a brief glimpse of the many research and technology
programs that his Directorate is pursuing. We were assured that ES&S
Directorate will continue to develop advanced electronic systems for
military and commercial applications in the future.
Kevin Doxey
Office of the Deputy Under Secretary for Environmental Security
Department of Defense
(April 22, 1998)
Topic: Environmental Concerns Including Clean-Up and Recycling Which
Might Lead Into Sustainable Development
Mr. Kevin Doxey, of the Office of the Deputy Under Secretary of Defense
for Environmental Security, presented an interesting and informative
overview of the Department of Defense (DOD) environmental technology
policy. The DOD is actively involved in promoting the use of innovative
technologies that can substantially reduce costs and increase the effectiveness
of environmental programs. The investment in environmental technology
research and products contributes to the DOD's overall mission readiness,
but also provides alternatives to operation and maintenance practices
that can reduce costs and the risk from future liabilities. Some of the
technology challenges for the DOD include compliance with the Clean Air
Act, unexploded ordnance, tribal relationships with Native Americans
for clean-up of contaminated areas, pollution prevention, conservation,
and domestic terrorism.
The DOD components are responsible for identification of their unique
environmental technology requirements. Similar environmental needs are
grouped by communities of interest and prioritized against the DOD's
environmental goals and objectives. The highest priority objectives are
then used to guide science and technology investment decisions. In FY
1997, the highest priority objectives included clean-up, compliance,
and pollution prevention. The principles of the technology investment
program include satisfying multiple needs simultaneously, cost-effectiveness,
improvement of quality and risk reduction, and increased stakeholder
participation, that is, a demonstrated relevance to operational users,
regulatory officials, industry partners, and the public.
The DOD is using a three-phase approach to promote environmental technology
insertion: 1) establishing focus areas based upon degree of impact on
the DOD, technology maturity and the likelihood of success; 2) promoting
federal, state and industry partnerships; and 3) technology transfer
through forums and use of the Internet.
T. James Glauthier
Associate Director
Natural Resources, Energy and Science
Office of Management and Budget
(May 6, 1998)
Topic: Developing the President's Science Budget
T. J. Glauthier is Associate Director of the Office of Management and
Budget with responsibility for natural resources, energy, and science.
He discussed the development of the President's science budget, paying
particular attention to the Research Fund for America.
Mr. Glauthier said that science and technology are the principal agents
of change and progress, with over half of the Nation's economic productivity
in the last 50 years attributable to technological innovation and the
science that supported it. Appropriately enough, the private sector makes
many investments in technology development. The Federal Government, however,
also has a role to play -- particularly when risks are too great or the
return to companies is too small.
He noted that the Federal Government supports areas of science at the
cutting edge, through the National Aeronautics and Space Administration,
the National Science Foundation, and the Department of Energy science
programs.
Mr. Glauthier stated that these agencies have a tradition of funding
high-quality research and contributing to the Nation's cadre of skilled
scientists and engineers. Another important Federal role is to construct
and operate major scientific facilities and capital assets for multiple
users. These include telescopes, satellites, oceanographic ships, and
particle accelerators. Many of today's fast-paced advances in medicine
and other fields rely on these facilities.
He indicated that the budget proposes $18.5 billion to conduct these
activities. The Government also seeks to stimulate private investment
in these activities through over $2 billion a year in tax credits and
other preferences for research and development (R&D).
The Federal Government has played an important role in spurring and
sustaining this scientific and technological advance. Among other feats,
Government-sponsored R&D has put Americans on the moon, explored the
oceans, harnessed the atom, devised more effective treatments for cancers,
found the remains of lost civilizations, tracked weather patterns and
earthquake faults, and discovered the chemistry of life. No other country
in history can match America's record of achievement in science and technology.
The centerpiece of the Administration's continuing commitment to research
is the proposed Research Fund for America, from which many of the research
dollars will now flow.
Mr. Glauthier said that the budget for the Research Fund for America
-- reflecting the President's commitment to ensuring long-term stability
and growth for non-defense research programs -- will support a wide range
of Federal science and technology activities. The budget proposes $31
billion for the Fund, representing an eight percent increase for these
programs over the 1998 level.
William G. Wells, Jr.
Consultant to the Director
Office of Science and Technology Policy
Executive Office of the President
(May 20, 1998)
Topic: Science, Technology, and National Policy-Making -- The Clinton
Administration's Science and Technology Agenda
Dr. Wells briefly discussed many topics -- his professional history,
a brief history of governmental science and technology (S&T) policy in
the United States, a political analysis of the forthcoming election,
the organizational structure of the Office of Science and Technology
Policy, and the S&T strategy, goals, and guiding principles of the Clinton
Administration. He pointed out that the U.S. has had, with a few aberrations,
an official S&T policy for a very long time. During the Cold War, the
need for such a policy was easier to defend than now. Now the main challenges
are economic and our main competitors are allies.
Dr. Wells discussed the intraparty conflicts concerning S&T policies
in all political parties. The real political issue is how to maintain
fiscal discretion but also maintain social consciousness. The Clinton
Administration's main agenda is to make the government "look" like America.
Dr. Wells is proud of his contribution towards placing more women and
minorities into positions which will influence S&T policy and implementation.
The Clinton Administration's S&T agenda is to promote S&T in partnership
with the private sector towards its goals of healthy citizens, economic
growth, and enhanced security.
Dr. Wells was able to share with us some personal stories about how
he has been involved, through several administrations, with the formulation
and implementation of the S&T budgets. A key to success is being able
to work with the Congress and present ideas in ways that Congress will
understand the benefits to the country. His unique insights and observations
were very interesting. At the conclusion, Dr. Wells individually signed
copies of his book, Working With Congress, A Practical Guide for Scientists
and Engineers.
David Rejeski
Executive Director, Environmental Technology Task Force
Council on Environmental Quality
(May 29, 1998)
Topic: Future Challenges and Directions for Environmental Quality
David Rejeski works in the Environmental Protection Agency's Office
of Policy, Planning and Evaluation. He is presently agency representative
to the White House Council on Environmental Quality. Mr. Rejeski began
by using three environmental stories to illustrate the current challenges
in environmental policy.
A known neurotoxin, lead was the target of regulatory clampdowns in
the U.S. that significantly reduced its dissipative uses in gasoline
and paints in the 1970's. A dramatic decrease in blood lead levels in
the general population followed. The dominant use remains in lead-acid
batteries and a significant amount of that demand is now met through
recycling.
Health risks have now been shifted offshore to countries where battery
reclamation and secondary processing is done. These facilities often
do not meet high environmental or occupational health standards. The
system boundaries are now global, and the policy debate has moved into
the complex and often arcane world of international trade, treaties,
and law.
Arsenic flows followed a very different path than did lead. Its use
in the U.S. has remained relatively constant for the past 30 years at
an average of about 20,000 metric tons annually. Use has, however, migrated
from agricultural applications into pressure-treated wood. The five billion board
feet of pressure-treated wood produced each year contain chromated copper
arsenate. This accounts for 90 percent of the worldwide arsenic demand
and makes the U.S. the world's largest consumer of arsenic.
The domestic production of arsenic in the U.S. ended completely in 1985.
U.S. demand is met entirely through imports mostly from China, the world's
largest producer, and Chile, a growing provider. We now have a toxic
substance trapped in wood products with a life span of 25-30 years, used
in virtually every new home in America, with no recycling or recovery
strategy in place to deal with the end-game.
The silver story began in San Francisco, where the Regional Water Quality
Control Board discovered high levels of silver in the water, sediments,
and tissues of fish and marine mammals in the bay. A mass balance study
done by researchers at the University of California at Los Angeles found
that one-half of the silver flowing through the U.S. economy was being
mobilized by dentists' offices and photo labs and essentially going down
the drain in fixer solutions.
The surprise was not only the discovery of a dissipative flow that had
been missed, but that the service sector, long considered the environmentally
benign partner in economic growth, was mobilizing the flows. There are
98,000 dentist offices in the U.S. and thousands of photographic facilities.
The traditional regulatory system is not set up to deal with such high
numbers of scattered small generators.
Mr. Rejeski went on to talk about the need for environmental regulations
for the service sector of the economy. Whether current environmental law
can regulate the service sector effectively remains an open question. Of the many
think tank exercises undertaken over the past few years to explore the
future of environmental policy, only one, the Next Generation Project
at Yale, has paid any attention to the most fundamental transformation
of the U.S. and global economy to occur since the Industrial Revolution.
Mr. Rejeski compared this to the thousands of studies and dozens of
policy initiatives per year focused on the manufacturing sector. Ironically,
rather than address the diverse and less tangible environmental impacts
of this change, many policy makers point to the transition to a service
economy as an automatic environmental benefit, presaging a reduction
in pollution as industry moves away from smokestack activities.
Visit to the National Security Agency, Fort Meade, Maryland
(June 3, 1998)
Our host, Stephen Barnett, welcomed us to the National Security Agency
(NSA) and accompanied us throughout the day. Our visit included briefings
from Information Systems Security Organization (ISSO) personnel, tours
and demonstrations in the Information Systems Security (INFOSEC) Demo
Room, the Research and Engineering Biometrics Lab, the Supercomputer
facility, and the National Cryptologic Museum.
The first presentation by Louis Giles gave us an overview of the NSA
Information Assurance Strategy including the National Information Assurance
Partnership (NIAP) with the National Institute of Standards and Technology
(NIST). NSA and NIST are partnering to promote the development of
security-enhanced information technology products for use across the full spectrum
of information technology applications. NIAP relies on accredited commercial
labs to perform security testing using the international Common Criteria
as the standard methodology. NSA wants to shorten the time needed for
security evaluations of commercial off-the-shelf products for lower assurance
applications while fostering an active, product rich security testing
industry. The result will mean a wider choice of products for the Department
of Defense and all consumers and will allow evaluations to be used as
a basis for product comparison.
Richard Sprague followed with an excellent overview of the NSA and its
corporate vision of "Information Superiority for America -- One Team,
One Mission". He explained that information superiority is the capability
to collect, process, and disseminate an uninterrupted flow of information
while exploiting or denying an adversary's ability to do the same. NSA
provides products and services to protect the security of U.S. signals
and information systems and provides foreign intelligence information
derived from those of our adversaries.
Stan Heady followed with the Research and Engineering Science and Technology
Trends Forecast. In the past, wide band access, bandwidth supply and
demand, and intelligent networks and switches were thought of as separate
technologies. Advances in communications such as ATM, SONET, and SDH
have caused the computer and telecommunications industry division to
blur. Additionally, the rapid advances in the wireless market, software
radios and smart antennas, and a renewed interest in high frequency communications
by third world countries, have contributed to the mix of several competing
technologies. Issues in network security, cryptography and privacy have
illustrated the limitations and lack of investment in security technology.
Some key enabling technologies for the near term include microelectronics,
photoelectronics, high-end computing, software-defined networks, object
technologies, data visualization, mobile code, and survivable secure
wireless networks. The forecast for 2010 includes issues such as processing
over the network, ubiquitous access, a 100-fold increase in current bandwidth
needs, quantum information processing, and high assurance hardware and
software systems.
Chip Harry then presented a very concise and timely Threat and Vulnerability
briefing in which he used many current news stories as real examples
of the threat to our country's critical information infrastructures from
hackers, disgruntled employees, extremist groups and even nation states.
We then toured the INFOSEC Technology Demo Room and received several
enlightening demonstrations. David Phister presented examples of NSA's
end-to-end secure voice, data, and digital wireless technologies. Deborah
Wojtowycz demonstrated special packaging technologies used to tamper-protect
some of NSA's critical products. William Sabia demonstrated the
FORTEZZA crypto card used for secure e-mail and other electronic transactions.
After lunch, we toured the Biometrics laboratory and were able to participate
in several demonstrations of biometric systems used to authenticate users
in computer systems. Robert Rahikka explained the differences in quality
and performance of the latest fingerprinting authentication devices and
the application for this technology within NSA. Major John Colombi demonstrated
a voice recognition system used for authentication. J. David Murley demonstrated
a fascinating face recognition system used for authenticating an individual.
There were many interesting questions from the fellows and before we
realized it, we were out of time.
William Johnson guided us on a whirlwind tour of the Tordella Supercomputer
facility. The building is a technological accomplishment in itself but
also contains some of NSA's most interesting and powerful computers.
We saw many supercomputers used for different types of general and cryptanalytic
processing including some of the latest state-of-the-art systems from
Cray. We even got to experience "Fluorinert", which is used as a heat
exchange medium in some of the Crays. It is heavier than water, does
not conduct electricity, and dries instantly on the skin. We were all
impressed with Bill Johnson's recall of hardware processing statistics
for literally dozens of computers and special purpose devices.
We then toured the National Cryptologic Museum and had a fascinating
visit with the museum's Assistant Curator, John Hultstrand. Mr. Hultstrand
guided us to some of the better-known exhibits and highlighted the historical
significance of the museum's extensive cryptographic artifacts. The Museum
was opened to the public in December 1993 and contains a wonderful rare
book collection, the German Enigma machine, Japanese World War II cryptographic
machines, an American Civil War exhibit, the Black Chamber, VENONA, and
the great seal from the American Embassy in Moscow to name a few. All
of us agreed that one could spend the better part of the day in the Museum
experiencing the world of cryptography. It was the end of an exhausting,
but enlightening visit.
Visit to Nasdaq Stock Market, Inc. and NASD Regulations, Inc., Rockville,
Maryland
(June 8, 1998)
Nasdaq Stock Market, Inc., a subsidiary of the NASD (National Association
of Securities Dealers), develops, maintains, and operates systems, services,
and products for the NASD securities markets to benefit and protect investors.
NASD Regulations, Inc. was created in 1996 as part of a restructuring
of the NASD to separate the regulation of the broker/dealer professional
from the operations of the stock market.
The Nasdaq Stock Market is an electronic securities market that uses
computer and telecommunication technologies to execute transactions
with real-time trade reporting and automated market surveillance. The
MarketWatch department, created in 1996, conducts surveillance over the
marketplace. It is divided into two sections, TradeWatch and StockWatch.
We were shown the ways in which TradeWatch monitors all trades to assure
that an orderly market is maintained. StockWatch monitors market
activity and news to assure all information is disseminated in a timely
manner to the public.
Of the many functions of NASD Regulations, Inc., we were exposed to
the ways in which their advanced detection system programs retrieve all
market data collected by MarketWatch to discover patterns of possible
anti-competitive and/or harassment behavior. We left the Nasdaq with
a greater understanding of the marketplace and a greater feeling of safety
in our own financial security.
Visit to the National Institute of Standards and Technology, Gaithersburg,
Maryland
(June 11, 1998)
Raymond G. Kammer, Director of the National Institute of Standards and
Technology (NIST) started our tour with an overview of NIST. An agency
of the U.S. Department of Commerce's Technology Administration, NIST's
primary mission is to promote U.S. economic growth by working with industry
to develop and apply technology, measurements, and standards. It carries
out this mission through a portfolio of four major programs:
The Measurement and Standards Laboratories provide technical leadership
for the Nation's technology infrastructure needed by U.S. industry to
continually improve its products and services. The Advanced Technology
Program (ATP) provides cost-shared awards to industry for development
of high-risk, enabling technologies with broad economic potential. The
Manufacturing Extension Partnership (MEP) works with a network of local
centers to offer technical and business assistance to smaller manufacturers.
The Malcolm Baldrige National Quality Award recognizes continuous improvements
in quality management by U.S. manufacturers and service companies.
Overviews of the goals and processes of the ATP and MEP programs were
given and the chemistry, physics and manufacturing engineering laboratories
gave the following presentations.
Work in the Biotechnology Division of the Chemical Science and Technology
Laboratory is focused on assuring that the accuracy and precision of
DNA testing in the U.S. is based on well-qualified DNA standards produced
and certified at NIST. These studies have resulted in the issuance of
two Standard Reference Materials that are currently used in most of the
major commercial DNA testing laboratories, as well as at least a quarter
of all forensic laboratories.
The Ionizing Radiation Division, the Physics Laboratory, develops, maintains,
and disseminates through calibrations the national measurement standards
for ionizing radiation. The Division recently established new calibration
facilities for mammography x-rays. All radiation exposure measurements
in more than 11,000 mammography x-ray clinics in the U.S. will be traceable,
through a network of secondary calibration laboratories, to NIST standards.
The Division is also developing standards and calibration facilities
for small, sealed radionuclide-containing seeds called brachytherapy
sources, used in cancer therapy and now in the treatment of arterial
blockages. Both of these therapies depend on NIST-developed methods to
very accurately measure radiation from the seeds.
The National Advanced Manufacturing Testbed (NAMT) in the Manufacturing
Engineering Laboratory has designed a distributed testbed built on a
high-speed computing and communications infrastructure. Industry, NIST,
and academia are working together to solve measurement standards issues
and to demonstrate how machines, software and people can improve productivity
and foster innovation at all levels of a manufacturing enterprise.
A visit to the NIST museum concluded our tour.