Capital Science 2008

Capital Science 2008 took place on March 29-30, 2008 at the National Science Foundation.  Abstracts are preceded by a Table of Contents, which is divided into two sections: the highlights of the Conference followed by an alphabetical listing of the participating Affiliates. Entries in the Table of Contents are linked to the Abstracts of their respective Affiliates.  For details of these excellent presentations, please contact the authors directly. 


Keynote Address –
6:00PM Saturday, March 29 – Hilton Hotel
“The Future of U.S. Innovation: Fate or Fatality?” – National Science Foundation Director Arden Bement
Luncheon Talk – Noon Saturday,
March 29 – Hilton Hotel
“Symmetry: From Human Perception to the
Laws of Nature”
– Dr. Mario Livio, Senior Astrophysicist and Head of the Office of Public Outreach, Space Telescope Science Institute.
Luncheon Talk –
Noon Sunday, March 30 – Hilton Hotel
“Science and Size” – Dr. Maxine Singer, recently retired (2002) as President of the Carnegie Institution and Scientist Emeritus at the National Cancer Institute.
(No abstract)
Plenary Session – 11:00AM
Saturday, March 29 – Room 375
“NSF Office of Polar Programs”
– Dr. Kelly Falkner, Program Director, Antarctic Science Division, NSF Office of Polar Programs
Plenary Session –
4:00PM Saturday, March 29 – Room 1235
“Tissue Ownership: Ethical, Legal, and Policy Considerations” – Panel led by Dr.
William Gardner, Executive Director, American Registry of Pathology, including Robin Stombler, Auburn Health Strategies, Major Cathie With, Legal Counsel, Armed Forces Institute of Pathology, and Colonel Glenn Sandberg, Chief, Scientific Laboratory, Armed Forces Institute of Pathology.
AAAS Reception and Discussion –
4:00PM Sunday March 30 – Room 1235
“Science and Engineering in the Courtroom: Ethics and the Expert Witness” – Judge Barbara Jacobs Rothstein, U.S. District Judge for the Western District of Washington and Director of the Federal Judicial Center and Mark S. Frankel, Ph.D., Director of the Scientific Freedom, Responsibility and Law Program, American Association for the Advancement of Science

                                Abstracts by WAS Affiliate

Panel of local TV
broadcasters Weather and You – A Town Hall Meeting
American Society of Plant Biologists/Botanical Society of
Washington/Virginia Native Plant Society/Maryland Native Plant Society
1. Greg Zell A Case Study: The Challenge of Protecting
Natural Resources in an Urban Environment

2. Jerry Dieruf Gypchek Pesticide
Application for Gypsy Moth Caterpillar Suppression in the City of Alexandria, Virginia

3. Alain
Touwaide, PhD, and Alice Tangerini Botanical Illustration: Past and Present

4. Marion Lobstein
The Flora of Virginia Project: A Perspective of 401 Years of Exploration of
Virginia Botanical Diversity

5. Rod Simmons A Survey of Native Oaks and Their Hybrids in the
Greater Washington, D.C. Area

6. Kimberly L. Hunter, PhD, and Richard B. Hunter Plants,
Polyploidy, and Undergraduate Research at Salisbury University
7. Mark Holland, PhD, Richard A. Henson School of Science and Technology,
Salisbury University PPFM bacteria: Plant symbionts with applications in agriculture

8. Emily N. Burnett,
Kimberly Hunter and Richard Hunter Genetic Diversity of Liquidambar
styraciflua in Cusuco National Park, Honduras

9. Katharine Spencer, Emily N. Burnett, Mary
E. Cockey, Kimberly Hunter, Richard Hunter, and Katherine Miller
Nordihydroguaiaretic Acid (NDGA) Localization and Quantification in Three
Ploidy Levels of Larrea tridentata
American Society for
Technical Innovation
1. Thomas Meylan, PhD Can the features of
elective virtual communities be used to create effective virtual workforces?

2. Gene Allen Simulation-supported Decision Making
3. Geoffrey P. Malafsky A New Technology for Unifying Knowledge and Semantics to Harness the Fusion of People-Process-Technology
4. F. D. Witherspoon
High Velocity Dense Plasma Jets for Fusion Applications

5. D. A. Tidman and F. D. Witherspoon Slingatron – A Hypervelocity Mechanical Mass Accelerator

6. James Jordan and James Powell Maglev Transport – A Necessity in the Age of
No Oil

7. John Bosma
Tech Futures 2008-2030

8. Richard Smith Micro- and Nanotechnologies for Near Term
Medical Diagnostics

9. Jim Burke Scientific Leaders and the Workforce of 2015
10. Limor Schafman Distributed Business:
How IPv6 Will Change Business and Government Operations
Anthropological Society of Washington
1. Martin C. Solano, PhD Sex Differences in Skeletal Trauma Among the 19th Century Working Class.
2. Ryan W. Higgins Limb Proportion Inheritance and Ancestry Determination from Fetal
Crural and Brachial Indices

3. Marilyn R. London Complete Fusion of the Mandible to the Cranium During Childhood in an Eskimo from Southwestern

4. David R. Hunt, PhD and Deborah Hull-Walski, MS All That Remains – Multidisciplinary
Study of a Mid-19th Century Iron Coffin and Identification of the Individual

5. Lynn Snyder, PhD Faunal and Human Remains from a 2nd century BCE Well in the Athenian
Agora; Evidence of Animal Sacrifice and Infanticide in Late Hellenistic

6. J. Christopher Dudar, PhD. Archaeological Discovery of a Previously Undocumented Case of an Anencephalic Infant from a 19th Century Upper Canadian Cemetery
7. Matthew W. Tocheri, PhD Concerning the Evidence for “Hobbits”: An Overview of
Homo floresiensis
Association for Women in Science – DC Metro
Managing Your Career in Science
Gretchen Schieber
Useful Skills For an Industrial Career or Everyday Life.
Association for
Computing Machinery – DC Chapter
Bill Spees, PhD Lightweight Java State
Biological Society of Washington W. Ronald Heyer Can long-established, narrow-niche scientific
societies such as the Biological Society of Washington survive the digital
Chemical Society of Washington
Jesse Gallun and Jennifer Young Green Chemistry and the ACS Green Chemistry Institute
Institute of Electrical and Electronics
Engineers (IEEE), DC and Northern Virginia Sections
1. Invited Talk by Frederika Darema Dynamic Data Driven Applications Systems: a Transformative Paradigm
2. Ronald L. Ticker The US National
Laboratory on the International Space Station

3. Gerard Christman Sr. In the Aftermath
of the Indian Ocean Basin Tsunami: An Information Sharing Pilot Program in
Support of Humanitarian Assistance / Disaster Relief

4. Tim Weil Securing Wireless Access for
Vehicular Environments (WAVE): A Case Study of the Department of Transportation
VII Project

5. Haik Biglari Past, Present and Future of Safety-Critical Real-time Embedded
Software Development

6. Ashwin Swaminathan Digital Detective for Electronic Imaging

7. X. Zhu, Y. Yang, Q. Li,
D. E. Ioannou, J. S. Suehle, and C. A. Richter High Performance Silicon
Nanowire Field Effect Transistor and Application to Non-Volatile Memory

8. Boris Veytsman, Leila Akhmadeyeva, Fernando Morales, Grant Hogg, Tetsuo Ashizawa, Patricia
Cuenca, Gerardo del Valle, Roberto Brian, Mauricio Sittenfeld, Alison Wilcox,
Douglas E. Wilcox, and Darren G. Monckton Microsatellite Expansion: The
Search for Underlying Pattern

9. Hojin Kee, Newton Petersen, Jacob Kornerup, Shuvra S. Bhattacharyya Synthesis of FPGA-Based FFT Implementations
10. Raj Madhavan, Stephen Balakirsky, and Chris Scrapper, Intelligent Systems
Division An Open-Source Virtual Manufacturing Automation Competition

11. Kiki Ikossi Antimonides for High Efficiency Solar Cells
12. Brian Borak, Dan Feng, John Kucia, and Dan Vlacich What it Takes to Design and Build a Successful Solar Home.
Institute of Industrial Engineers, National Capital
Chapter/Washington Chapter of the Institute for Operations Research and
Management Sciences
1. Douglas A. Samuelson Modeling Attention Management in Organizational

2. H. Ric Blacksten and Joseph C. Chang Fermi model estimation of illegal
immigration deterrence as a function of apprehension probability

3. Steven Wilcox GOSSIP: A Computational Model of Team-Based Intelligence

4. Pete Hull What Faith-Based Organizations Can Teach Us about Disaster Response:
Post-Katrina Lessons Learned

5. Douglas A. Samuelson Agent-Based
Simulation of Mass Egress from Public Facilities and Communities

6. Donald E. Crone Postal Automation and the Flats Sequencing System
7. Michael E. McCartney Project Management Shared Network Reporting System for Tracking Capital Investment Projects
8. Charles L. Hochstein Acquisition Cost Optimization Through Supply Chain Management
9. Joseph J. Scheibeler After Cost Review Process for Capital Investments
Marian Koshland Science Museum of the
National Academy of Sciences
Erika Shugart Presenting Current Science: Lessons from the Marian Koshland Science Museum
National Capital Section/Optical Society of
America & IEEE/LEOS
1. Invited Talk by Dr. Michael Haney Photonic Integrated Circuits: Ready for Prime Time?

2. Pavlo Molchanov,
Vincent M. Contarino, Olha Asmolova Gated Optical Sensors

3. Dr. Spilios Riyopoulos Slow light propagation across coupled micro-laser

4. Alexander Efros
Multi-Exciton Generation by a Single Photon in Nanocrystals

5. Jeffrey O. White
Continuous ‘system level’ scale for laser gain media

6. Emily Schultheis Machine Vision
Assessment of Tomatoes of Unknown Diameter

7. Dr. H. John Wood, NASA Goddard Space Flight Center Hubble Discoveries
8. Invited talk by Dr. Ken Stewart Mesh Filters for Infrared Astronomy
9. Geary Schwimmer, Tom Wilkerson, Jed Hancock, Jason Swasey, Adam Shelley, Bruce
Gentry and Cathy Marx Holographic Scanning UV Telescope for the Tropospheric
Wind Lidar Technology Experiment

10. Peter Blake, Joseph Connelly, Babak Saif,
Bente Eegholm, Perry Greenfield, and Warren Hack Spatially Phase-Shifted DSPI for Measurement of Large Structures

11. Dr. Joseph Howard Optical Modeling
Activities for NASA’s James Webb Space Telescope: Overview and Introduction of
Matlab based toolkits used to interface with optical design software

12. Dr. Raymond Ohl Recent
developments in the alignment and test plans for the James Webb Space Telescope
Integrated Science Instrument Module

13. Bert A. Pasquale and Ross M. Henry
Functional Testing of Hubble Relative Navigation Sensor Flight Cameras

14. Dr. J.C. (Chuck) Strickland Fast Optical Processor for Laser Comm
15. Athanasios N. Chryssis, Geunmin Ryu and Mario Dagenais High Resolution Incoherent Optical Frequency Domain Reflectometry
16. Christopher Stanford, Mario Dagenais, Juhee Park, Philip DeShong An Etched FBG Sensor: Modeling Bio-attachment and Improving Sensitivity
National Capital Society of American Foresters
I. Dialogue Among Natural Resource Societies in the National
Capital Area

Panel includes Chris Farley
Land Use, Forests and Agriculture in a Post-Kyoto Climate Change Agreement:
Prospects and implications for natural resources

2. Nicolas W. R. Lapointe American
Fisheries Society Virginia Tech Non-indigenous species introductions –
benefit or threat?
3. David L. Trauger The Role of Environmental
Societies and Conservation Organizations
II. Society of American Foresters Science Exhibition and
National Institute of Standards and Technology – Physics
1. Jeff Cessna
New Paradigms in Diagnostic and Therapeutic Nuclear Medicine, New

2. Larry
Hudson, Steve Seltzer, Paul Bergstrom, Fred Bateman, and Frank Cerra Ionizing
Radiation Division, Physics Laboratory, National Institute of Standards and
Technology Standards for X-Ray and Gamma-Ray Security Screening

3. Svetlana Nour,
Matthew Mille, Kenneth Inn, Douglas W. Fletcher Population Radiation
Measurement – the Monte Carlo option

4. Daniel S. Hussey Neutron Imaging: The key to understanding
water management in hydrogen fuel cells
1. Martin Ogle Birds of Prey of Virginia
2. Keith Tomlinson, Manager Meadowlark
Botanical Gardens A Floristic Natural History of the Greater Washington DC
Region in the Potomac River Basin
Philosophical Society of Washington
1. Joe Coates
Homeland Insecurity

2. Kenneth Haapala Economics 21: America’s Post Industrial
Potomac Chapter of the Human Factors and
Ergonomics Society
1. Gerald P. Krueger, Ph.D., of Krueger Ergonomics Consultants
Effects of Health, Wellness and Fitness on Commercial Driver Safety: A
Review of the Issues

2. Ronald R. Knipling, Ph.D., of Virginia Tech Transportation
Institute What Does Instrumented Vehicle Research Tell Us About Crash Risk
and Causation?

3. Christopher A. Monk, Ph.D., David M. Cades, Stephen M. Jones, Nicole E. Werner, and Deborah A. Boehm-Davis
Knowing When to Switch Tasks: Effectiveness of Internal versus External
Science and Engineering Apprentice Program, George
Washington University
1. Kelly Colas and Virginia Heppner, James Madison High School Mentored by:
Charlotte Lanteri and Jacob Johnson, Walter Reed Army Institute of Research,
SS, MD Assessment of 96- and 384-Well Malaria SYBR Green I-Based Fluorescence
Assay for Use in In Vitro Malaria Drug Screening

2. John Russo Jr., St. Vincent of Pallotti High School Mentored
by: Heather O’Brien, and Dr. Marc Litz, ARL, MD Pulse Power

3. Muneer
Zuhurudeen, Eleanor Roosevelt High School Mentored By: Dr. Mostafiz Chowdhury,
ARL-WMRD, Adelphi, MD A Study of the Scaling Relationships between
Full-Scale and Sub-Scale Vehicle Ballistic Shock
Washington Society for the History of Medicine
Health and Disease in American Public Education Movies, 1930s-1950s

A presentation of public health movies from the collections of
the U.S. National Library of Medicine. Selected and Presented by David Cantor
for The Washington Society for the History of Medicine.


ABSTRACTS (organized by Affiliate)



Science and Engineering in the Courtroom: Ethics and the Expert Witness

Speakers: Judge Barbara Jacobs Rothstein, U.S. District Judge for
the Western District of Washington and Director of the Federal Judicial Center
and Mark S. Frankel, Ph.D., Director of the Scientific Freedom, Responsibility
and Law Program, American Association for the Advancement of Science.

In addressing an AAAS Annual Meeting, Supreme Court Justice Stephen Breyer
observed that the law “increasingly requires access to sound science because
society is becoming more dependent for its well-being on scientifically complex
technology.” A critical issue facing judges is how to distinguish between
scientific evidence that should be admitted into a legal dispute and that which
is unacceptable because of its poor scientific foundation. This session will
discuss factors that judges rely on to make those decisions, and the role of
the expert in presenting scientific and technical information in legal
proceedings. When scientists or engineers engage the legal system as experts,
they are subject to norms and practices not always familiar to them. This
raises questions about how they can act responsibly in that setting. The
session will identify ethical issues that have confronted experts recruited to
participate in litigation, and the extent to which long-standing professional
norms provide useful guidance.


Panel of local TV broadcasters, including Bob
Ryan and Joe Witte, as well as Steve Zubrick, the Science and Operations
Officer (SOO) at the Washington/Baltimore National Weather Service Forecast
Office and Jason Samenow, Chief Meteorologist of the Washington Post’s Capital
Weather Gang
WEATHER AND YOU – A Town Hall Meeting
Weather is perhaps one of the most ubiquitous subjects of informal
conversation. It affects just about everyone every day, shaping decisions that
range from whether to set out with an umbrella to preparing for an impending
snowstorm, severe thunderstorms, or flooding rains. This session is about the
nature, issues, and problems of observing and predicting the weather. The
panel members will provide a brief overview of a relevant and topical subject.
The remainder of the session, and the most important aspect of the Town Hall,
will be an open question, answer, and discussion period.


Greg Zell, Natural Resource Specialist,
Arlington County, VA
A Case Study: The Challenge of Protecting Natural
Resources in an Urban Environment
Arlington County is in the process of completing a comprehensive
Natural Heritage Resource Inventory (NHRI) of natural lands and public open
spaces within a highly urbanized corridor. The County is approximately 40%
impervious and is considered “built out” from a development standpoint. The
project has collected data relating to local flora, fauna, geology, hydrology,
and has documented significant remaining resources. The next step is to develop
a County-wide Natural Resource Management Plan. Part I of the presentation will
be an overview of the techniques used to collect natural resource data and will
share some of the interesting results to date. Part II of the presentation will
be a Roundtable discussion, where participants will be asked to provide
insight, experience, and suggestions on what elements should be included in the
development of a Natural Resource Management Plan and Policy.
Jerry Dieruf, Arborist/Gypsy Moth
Coordinator, City of Alexandria Department of Recreation, Parks, and Cultural
Activities, Park Planning Division
Gypchek Pesticide Application for
Gypsy Moth Caterpillar Suppression in the City of Alexandria, Virginia
Populations of Gypsy Moth Caterpillars surged throughout the
Washington, D.C. area in 2007, causing widespread damage to native oak species
and oak dominated forests. In Alexandria, as throughout much of the region, oak
species comprise the dominant vegetation of our forests. This presentation will
include the reason for the application, the application coverage, follow-up
survey of gypsy moth caterpillars, other variables affecting the gypsy moth
population, results of the suppression program, and others’ experiences with
Gypchek in 2007.
Alain Touwaide, PhD, and Alice Tangerini,
Smithsonian Institution, National Museum of Natural History, Department of Botany
Botanical Illustration: Past and Present
In the first part of this presentation, Alain Touwaide will examine
the creation of ancient botanical illustration in classical antiquity,
particularly the question of the supposed schematic nature of such
representations in ancient Greek manuscripts, as well as the transformation of
botanical illustrations from manuscripts to printed books during the 15th and
16th centuries. In the second part, Alice Tangerini will discuss the changes in
the way contemporary botanical illustrations reproduce plants and communicate
the knowledge contained in such representations to a worldwide audience. She
will devote special attention to the methods of reproducing illustrations, as
exemplified by the transformation from woodcuts and etchings to digital images
consulted on the Internet.
Marion Lobstein, Associate Professor of
Biology, NVCC
The Flora of Virginia Project: A Perspective of 401 Years
of Exploration of Virginia Botanical Diversity
Virginia has the greatest diversity of plant species for its
surface area of all of the 50 states in the U.S. In this presentation, Marion
Lobstein will discuss reasons for this diversity and botanical exploration in
Virginia since the founding of Jamestown in 1607. In 1737, the Colony of
Virginia had the first flora of any of the original thirteen colonies, the
Flora Virginica by John Clayton, but the Commonwealth of Virginia has not had
a modern flora since that time. This presentation will cover the progress
being made to produce a modern Flora of Virginia in the next three years.
Rod Simmons, Plant
Ecologist, City of Alexandria Department of Recreation, Parks, and Cultural
Activities, Park Planning Division
A Survey of Native Oaks and Their
Hybrids in the Greater Washington, D.C. Area
The Washington-metro region contains a wide diversity of native oak
species and natural oak hybrids. Live material will be used in this hands-on
workshop, which will cover all of the native oak species throughout the region
and the known natural hybrids. Species range and distribution, habitats, and
rarity will also be discussed.
Kimberly L. Hunter PhD, and
Richard B. Hunter, Department of Biological Sciences, Salisbury University,
Salisbury, MD
Plants, Polyploidy, and Undergraduate Research at
Salisbury University
Sunday 2:00PM
Room 390
The goal of the program was to increase undergraduate botanical
research, and to find innovative methods for recruiting students. In most
departments on college campuses, there are independent study/research courses
in which faculty members mentor 1-6 students working on projects that the
faculty member designs. We have increased the number of students to 20, and have
students work on common projects as a team. Each project involves most of the
following: field collection of plant samples, modern genetic analysis,
intensive literature research, grant writing, lab work, data analysis, and
presentation of the research. There were biweekly meetings with each group and
the faculty mentor. Progress of the projects depended upon the effort of each
group. This method was evaluated in three ways: 1) number of presentations, 2)
students continuing beyond the first semester, and 3) monitoring students going
on to graduate or professional school.
Mark Holland, PhD, Head of
the Department of Biological Sciences, Richard A. Henson School of Science and
Technology, Salisbury University, Salisbury, MD
PPFM bacteria: Plant
symbionts with applications in agriculture
Sunday 2:30PM
Room 390
Pink-pigmented facultatively methylotrophic (PPFM) bacteria in the
genus Methylobacterium are ubiquitously distributed on plant surfaces. Although
once thought to be insignificant or accidental visitors on plants, we have
demonstrated that their metabolic activities have a positive effect on plant
growth and development and that their relationship with plants is a true
symbiosis. We have also developed strategies for exploiting the symbiosis to
the benefit of agriculture.
Emily N. Burnett, Kimberly
Hunter and Richard Hunter, Department of Biological Sciences, Salisbury
University, Salisbury, MD
Genetic Diversity of Liquidambar styraciflua
in Cusuco National Park, Honduras
Sunday 3:00PM
Room 390
Liquidambar, the sweet gum tree, has a classic disjunct distribution
in western Asia, eastern North America, and Central America. In collaboration
with Operation Wallacea, which operates biological conservation expeditions in
regions of the world with high biodiversity, Liquidambar styraciflua was
gathered from five sites in the cloud forest of Cusuco National Park, Honduras.
The goal of this research was to assess the genetic diversity of this species
in a remote field location. Qiagen DNeasy extraction kits were used
along with puReTaq Ready-To-Go PCR beads to amplify the DNA. Through ISSR
(inter-simple sequence repeats) the genetic variability was mapped using
primers 825, 840 and 855. These repeats recognized by the primers occurred in
individuals in varying numbers. The differences in the numbers of repeats were
visualized as bands on a gel which were separated according to molecular
weight. A Cambrex Flash Gel was used to analyze the DNA indicating variability
among individuals. Analyzing DNA in field situations is possible due to the
stability of the PCR bead and Cambrex Flash Gel at room temperature. The mean
gene diversity ranged from 0.23 to 0.36 while the percent polymorphic loci
ranged from 61% to 90%. The importance of this research was to aid in the
conservation research taking place in Cusuco National Park. Our research
provided a stepping stone into genetic research for Operation Wallacea as well
as an accurate representation of the genetic diversity of Liquidambar
styraciflua in Cusuco National Park, Honduras.
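The summary statistics quoted above (mean gene diversity and percent polymorphic loci) can be illustrated with a short sketch. The band matrix below is invented, and for simplicity the band frequency is treated directly as an allele frequency in Nei's gene diversity formula; real dominant-marker (ISSR) studies apply a correction for this.

```python
# Illustrative sketch with hypothetical data: mean gene diversity
# (H = 1 - p^2 - q^2 per locus, Nei 1973) and percent polymorphic loci
# from a binary ISSR band-presence matrix (rows = individuals, cols = loci).

def gene_diversity_stats(matrix):
    n = len(matrix)          # number of individuals
    n_loci = len(matrix[0])  # number of scored bands
    diversities = []
    polymorphic = 0
    for locus in range(n_loci):
        p = sum(row[locus] for row in matrix) / n  # band frequency at this locus
        h = 1 - (p ** 2 + (1 - p) ** 2)            # expected heterozygosity
        diversities.append(h)
        if 0 < p < 1:                              # locus shows variation
            polymorphic += 1
    mean_h = sum(diversities) / n_loci
    pct_poly = 100.0 * polymorphic / n_loci
    return mean_h, pct_poly

# Hypothetical band scores for five individuals at four ISSR loci
bands = [
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
]
mean_h, pct_poly = gene_diversity_stats(bands)
print(f"mean gene diversity = {mean_h:.2f}, polymorphic loci = {pct_poly:.0f}%")
```

With this toy matrix the fourth locus is fixed (band present in every individual), so it contributes zero diversity and does not count as polymorphic.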
Katharine Spencer, Emily N.
Burnett, Mary E. Cockey, Kimberly Hunter, Richard Hunter, and Katherine Miller,
Department of Biological Sciences, Salisbury University, Salisbury, MD

Nordihydroguaiaretic Acid (NDGA) Localization and Quantification in Three
Ploidy Levels of Larrea tridentata
Sunday 3:30PM
Room 390
Larrea is one of the dominant perennials in the deserts of North
and South America. North American Larrea tridentata has three ploidy levels in
three distinct regions: Chihuahuan Desert – diploid; Sonoran Desert –
tetraploid; Mojave Desert – hexaploid. Nordihydroguaiaretic acid (NDGA) is
one of the most widely investigated phytochemicals within this genus. NDGA is a
tetrahydroxy lignan found in Larrea leaves at a concentration of 5-10% dry
weight. It is a powerful antioxidant that inhibits cancer, microbes, fungi, and
viruses. Larrea tridentata ploidy levels were examined to determine a
relationship between higher ploidy levels and concentrations of NDGA. Methanol
was used to extract NDGA from 20mg (2 leaves) of plant tissue, which was
quantified using high performance liquid chromatography (HPLC). Plant tissue
from the three different ploidy levels was analyzed and results suggest NDGA
concentrations are variable across the landscape. However, a correlation does
exist between NDGA concentration and time of year collected. There is an
increase in NDGA concentration during August. August is the hottest and driest
month in these regions. Secondly, subcellular fractionation was used to
determine the unknown location of NDGA in Larrea leaf cells. A density gradient
was formed in Percoll and gradient layers were tested for NDGA concentration. A
high concentration of NDGA was found in Larrea chloroplasts.
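The quoted figures above imply how much NDGA one extraction should yield; a quick arithmetic check, assuming the 20 mg sample is dry tissue:

```python
# Expected NDGA mass in a 20 mg leaf-tissue sample, given the abstract's
# quoted concentration of 5-10% dry weight.
sample_mg = 20.0
low, high = 0.05 * sample_mg, 0.10 * sample_mg
print(f"expected NDGA per extraction: {low:.1f}-{high:.1f} mg")
```

That is roughly 1-2 mg of NDGA per two-leaf sample, comfortably above typical HPLC detection limits.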


Thomas Meylan, PhD EvolvingSUCCESS
Can the features of elective virtual communities be used to create effective
virtual workforces?
Saturday 9:00AM
Room 370
Virtual communities are springing up as quickly as people can
identify enough points of mutual identification to justify the establishment of
a virtual meeting space. The communities being considered in this presentation
are built around “elective” membership. People WANT to build software products
on SourceForge. People WANT to play simple games together on Horse Isle. People
WANT to formulate different methods of interaction on Second Life. The question
is: Can virtual community models based on elective membership be transferred to
workforce deployment needs? Three factors considered in this presentation are:
1) the intellectual price of entry into an elective community and how it
relates to the financial costs of setting up an online, virtual workforce; 2)
metrics of pleasure and fulfillment in elective communities, and how they
relate to productivity and job satisfaction in virtual-but-definitely-real
task-driven workforces; 3) structures, rules, customs and conformity
enforcement procedures in elective communities that can transfer to virtual
workforce environments.
Gene Allen, MSC.Software Corporation
Simulation-supported Decision Making
Saturday 9:25AM
Room 370
Engineering provides a Knowledge Base for decision making. An
Engineering Knowledge Base is the culmination of education, training, and
experience that provides insight and understanding of how things work or don’t
work. A program’s Engineering Knowledge Base consists of the knowledge and
expertise of all the personnel involved over the lifecycle of a program with
all accompanying documentation. The majority of an Engineering Knowledge Base
is learned from experience in testing and operations. However, learning from
prototype testing and operational accidents/problems is costly, time
consuming, and risky. In the past, this has been an accepted cost of adopting
new technologies, as it has been the only way we learn about what we do not
know. The unanticipated and often non-intuitive results of new technologies are
often realized in operations, and sometimes only after decades. This
uncertainty is the result of combinations of factors or characteristics, all of
which have natural ranges of variability. This variability and uncertainty has
historically been taken into account through the use of safety factors, based
on experience. The advances and availability of compute capability can be used
as a substitute for the experience-based safety factors used in design. Virtual
data can be generated by running multiple physics-based analyses of a
parameterized computer model, varying parameters across their natural ranges
with each run. This process provides an accurate simulation of reality. Results
are a cloud of points with each point being an accurate result of that specific
combination of variables. The simulation process includes as many variables as
possible. A simulation consists of 100 analysis runs, sampling all variables
using advanced Monte Carlo sampling methods; 100 analysis runs provide a
simulation resolution equivalent to that of the inputs. This process
minimizes the need for making initial assumptions, which are often a source of
problems as people most often do not know what they do not know at the time of
making their assumptions. Different correlation methods are used to filter the
number of variables in the simulation result to those individual variables, or
groups of variables, that are most significant. This provides information that
can be used to understand what can happen. Additionally, automatic outlier
detection can be used to quickly identify those combinations of variables that
generate anomalies. The combination of 1) correlation information and 2) the
knowledge gained through understanding outliers provides accurate input to the
Engineering Knowledge Base that can be used for Decision Making. Simulation,
using today’s readily available compute capability, is being used to learn and
gain otherwise unavailable knowledge for making decisions.
Geoffrey P. Malafsky, TECHi2
A New Technology for Unifying
Knowledge and Semantics to Harness the Fusion of People-Process-Technology
Saturday 9:50AM
Room 370
The information universe continues to expand from what can be
considered its own big bang of the advent of the openly available Internet
while producing only a few regions of viable systems. The potential for
widespread knowledge sharing and automated processes based on human concepts
remains an abstract goal except for these few systems. Some of these are
focused on the positive inclinations of humanity, such as global knowledge
sharing among like-minded professionals and hobbyists, others on the
entertainment interests of groups, and finally others on the more banal human
tendencies. Yet, in each case processing and managing these information clouds
into useful environments has been extremely costly and replete with low-quality
products. Data quality is the foundation of using data for effective management
and operations. Data in and of itself is not the goal of an enterprise data
management system nor business process. Rather, it is the use of the data to
support decision-making and operations that is the goal. Yet, data quality is
inextricably tied to business processes, governance regulations, and technical
standards. This inherent integration across people, process, and technology has
impeded, and in many cases, derailed attempts to ensure that data is accurate,
trustworthy, secure, interoperable and shareable. Even more important to the
operation and difficult to achieve is maintaining data quality in continuous
operations. Data is frequently stored and used in widely different
technologies, formats, and varying levels of quality. Attempts at using the
previous generation of data technologies (e.g. relational databases, data
warehouses, business intelligence tools, client-server architectures, data
brokers, metadata management, case based reasoning, natural language
processing, expert systems, object modeling) have failed to solve this
challenge. The real solution of semantically integrated data explicitly tied to
governance defined rules remains unsolved. The data must be authoritative and
correct at the element level in both organizational and technical terms. We
have created a new technology that meets the architectural requirements of
modern standards based modular components like a Services Oriented Architecture
(SOA), and also ties together human knowledge to machine processing rules and
semantics to provide a light-weight, adaptive, highly functional solution to
harnessing enterprise-scale data that is authoritative. This technology is
poised to yield a dramatic increase in organizational capabilities from the
combination of the knowledge held by people with the rules required by
processes to the intelligent automation enabled by semantic data.
F. D. Witherspoon, HyperV Technologies Corp
High Velocity Dense Plasma Jets for Fusion Applications*
Saturday 10:15AM
Room 370
High velocity dense plasma jets are under development at HyperV
Technologies Corp. for a variety of fusion applications. The initial motivation
for this line of research was Magneto-Inertial Fusion using high density, high
velocity plasma jets as standoff drivers to implode a magnetized plasma target.
Additional applications include fusion reactor fueling, injection of angular
momentum into centrifugally confined plasmas, high energy density plasmas
(HEDP), plasma thrusters and others. The near term technical goal is to
accelerate plasma slugs of density greater than 10^17 per cc and total mass
above 200 micrograms to velocities above 200 km/s. The approach utilizes
symmetrical injection of very high density plasma into a coaxial
electromagnetic accelerator having a tailored cross-section geometry designed
to prevent formation of the blow-by instability. The injected plasma is
generated by an array of radially oriented ablative capillary discharges
arranged uniformly around the circumference of an angled annular injection
region of the accelerator, or by a similar array of small non-ablative parallel
plate minirailguns now under development. We describe computer modeling and
experiments to develop these plasma jets, including descriptions of an
injection experiment currently underway at the University of Maryland, and an
early test of jet merging using 64 capillary injectors to form an imploding
ring of dense plasma.
*Work supported by the U.S. DOE Office of Fusion
Energy Sciences
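As a rough sanity check on the scale involved, the kinetic energy implied by the stated near-term goal (slugs of at least 200 micrograms at 200 km/s) works out to about 4 kJ per slug. The sketch below, with hypothetical names not taken from HyperV's work, simply evaluates (1/2)mv² from the abstract's figures:

```java
// Illustrative arithmetic only: kinetic energy of a plasma slug at the
// abstract's stated near-term goal (>= 200 micrograms, >= 200 km/s).
public class JetEnergy {
    // Classical kinetic energy E = (1/2) m v^2, in SI units.
    public static double kineticEnergyJoules(double massKg, double speedMs) {
        return 0.5 * massKg * speedMs * speedMs;
    }

    public static void main(String[] args) {
        // 200 micrograms = 2e-7 kg; 200 km/s = 2e5 m/s -> about 4 kJ
        System.out.println(kineticEnergyJoules(2e-7, 2e5) + " J");
    }
}
```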
D. A. Tidman* and F. D. Witherspoon**
Slingatron – A Hypervelocity Mechanical Mass Accelerator
Saturday 10:40AM
Room 370
The slingatron(1) is a mechanical mass accelerator that is
dynamically similar to the ancient sling, but unlike the classical sling it
appears capable of accelerating projectiles of large mass to extremely high
velocity, possibly to above 10 km/sec. In this machine a projectile slides on
its self-generated gas bearing in a steel accelerator tube that guides the
projectile along a curved path that is typically circular or spiral. The
projectile accelerates when the tube is moved with an inward component along
the radius of curvature of the tube at the projectile location, i.e., along the
direction of the centripetal force acting on the projectile so that work is
done on the projectile. This motion of the tube can be implemented by mounting
the entire curved tube on distributed swing arms that propel the tube around a
gyration circle of relatively small radius without changing its orientation,
i.e., the structure gyrates but does not spin. A projectile accelerating in a
gyrating sling tube initially slides on its thin outer Teflon film. This film
wears off, and evaporation of a polycarbonate layer (or energetic plastic) then
provides a low friction gas film in the range from ~ 1 to many km/sec. Heat
from this hot bearing gas does not have sufficient time to diffuse far into a
projectile during its acceleration. For a phase-locked projectile accelerating
in a spiral the acceleration time is the number of turns times the gyration
period, e.g., 0.05 seconds for a 3-turn spiral with a gyration frequency of 60
cps. A theoretical model and experimental data also show that the projectile
gas bearing is thicker than asperity heights on the steel track, so that
sling-tube damage is avoided. For large projectiles, the bearing gas film is
thicker than for small projectiles so that its viscous drag per cm2 on the
track and the sliding friction coefficient are smaller. This occurs because the
“residence time” of gas evaporated into the bearing of a large projectile is
longer than for a small projectile. The slingatron mechanics is similar to
rolling a ball bearing around in a circular frying pan (or sliding an ice cube
in a cooled pan) in a horizontal plane and gyrating the pan around a small
circle, except that the slingatron gyration speed is much higher, e.g., ~ 150
m/sec for a projectile velocity gain of ~ 1 km/sec per turn, and the projectile
slides at high velocity and low friction along the curved path on its
self-generated gas film. We will discuss the dynamics, mechanics, and some
preliminary experimental results for this hypervelocity mass accelerator.
*ALCorp, 6801 Benjamin Street, McLean, VA 22101, 703-790-0620, **HyperV, 13935 Willard Road, Chantilly, VA 20151,
703-378-4882. (1) “Slingatron, A Mechanical
Hypervelocity Mass Accelerator”, D. A. Tidman, Aardvark Global Publishing,
2007. A book available at
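The timing figure quoted in the abstract follows directly from the phase-locked condition: acceleration time is the number of spiral turns divided by the gyration frequency. A minimal check, with hypothetical names:

```java
// Illustrative check of the abstract's timing example: a phase-locked
// projectile in an N-turn spiral takes N gyration periods to traverse it.
public class SlingatronTiming {
    public static double accelerationTimeSeconds(int turns, double gyrationHz) {
        return turns / gyrationHz; // N turns x one gyration period (1/f) each
    }

    public static void main(String[] args) {
        // 3-turn spiral at a gyration frequency of 60 cps -> 0.05 s,
        // matching the figure stated in the abstract
        System.out.println(accelerationTimeSeconds(3, 60.0) + " s");
    }
}
```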
James Jordan and James Powell, Maglev-2000,
Maglev Transport – A Necessity in the Age of No Oil
Saturday 2:00PM
Room 370
World oil production will peak in the next few years and then
steadily decline, causing prices to rapidly escalate far beyond the present
$100 per barrel. To maintain affordable transport of people and goods in the
oncoming Age of No Oil, a crucial requirement for the U.S. and World economy,
new modes of transport that can operate without oil must be quickly developed
and implemented on a large scale. The three new options proposed to date –
biofuels, hydrogen, and coal-to-liquids – do not appear to be practical
solutions. Biofuels are very limited in supply capability, and will cause major
increases in food prices and World hunger, and seriously degrade long-term soil
fertility. Hydrogen fuel requires an enormous increase in expensive electrical
generation capacity, much greater than now exists. Coal to liquid fuel greatly
increases greenhouse gas emissions and drastically accelerates global warming.
Electrically powered transport can meet U.S. and World future transport needs
in an affordable, energy efficient, and environmentally acceptable manner.
Maglev transport will play a major role in electric transport. The advanced
Maglev-2000 system which can transport passengers, highway trucks, freight, and
passenger autos at speeds of 300 mph with much less energy usage and at much
lower cost than present transport systems, is described. Implemented as a
25,000 mile National Maglev Network, using the rights-of-way along the
Interstate Highways and existing railroad trackage, it would serve all major
U.S. metropolitan regions in a seamless high-speed web. The first phase of the
National Maglev Network, transcontinental East-West and North-South routes,
could be operating by 2019. The Network would be financed by private
investment, with a payback time of 5 years, using revenues generated by
transporting long-distance highway trucks.
John Bosma, NRAC, Arlington, VA
Tech Futures 2008-2030
Saturday 2:25PM
Room 370
The key elements of tech trends influencing global security,
economics and business through 2030 are discussed, covering the following nine
major topics: 1) ‘Bio-fication’ and MEMS-ification of ‘primary industry’
processes: hydrocarbon energy mining, renewables (solar), farming and forestry,
metals mining and metallurgy, metals/radionuclides cleanup, extremophile
industrial biologies; 2) Conventional-oil exhaustion (50% drop by 2030?) – vs.
massive North American hydrocarbon stocks for liquefaction; can US/Canada
become ‘new Persian Gulf’? 3) Step-jump productivity/ROI improvements in ‘big
iron’ industrial activities (fast intermodal shipping, solid free-form
manufacturing…) will outrank IT whoopee; 4) ‘Big iron’ ops in extreme
environments (deep offshore oil, Arctic oil and minerals, heavy lift (4000
tons) across tundra/wetlands, in-situ coal mining; 5) 4-5 orders-of-magnitude
reduction in size-weight-cost of future military ops (including terrorist ops)
from current state-of-the-art – e.g. 25-lb SUAVs (small unmanned air vehicles)
crossed Atlantic unrefueled in Dec 1998, mass production cost <$15,000; 6)
Personalized medicine (self-diagnosis via electronic/sensor-textile underwear,
noninvasive wearable ultrasound, cheap personal medical imagers, global
Web-enabled ‘on call’ medical expertise w/ ‘smart agent’ software assist,
telemedicine; 7) Rapid development of poor-country Internet, telephony/WiMAX,
women-owned startups, communications-based rural take-off – but medical MUST be
solved; 8) Dirt-cheap space launch via all-mechanical Slingatrons – enables cheap missile defense, cheap distributed-aperture solar power satellites; 9) Downsides/’bad news’: high
probability of 1-2 regional nuclear wars (Pakistan, Iran) and/or terror nukes
in US, Europe (7-12 cities) – w/ TBD aftermaths but assuredly strong
anti-nuclear, anti-terror tech, ‘defensive emphasis’/BMD drives.
Richard Smith, MS, President, Flexible Medical
Systems, LLC
Micro- and Nanotechnologies for Near Term Medical
Saturday 2:50PM
Room 370
Micro- and nanotechnologies are allowing university and lab
scientists to create medical diagnostic capabilities never before seen. These
new capabilities can facilitate the realization of telemedicine and make
universal health care affordable. While these capabilities were once thought to
be far in the future, commercial products are in the human trials stage right
now. The CEO of a local medical diagnostics company will discuss the science,
the commercialization process (including traversing the “valley of death”) and
the remarkable near-term possibilities.
Jim Burke Manager Futures, Forecasting, and
Change Management Northrop Grumman
Scientific Leaders and the Workforce
of 2015
Saturday 3:15PM
Room 370
This talk emphasizes that 2015 is not that far off, but that a lot
can happen in eight years, especially in the workforce. The pace of change
calls for attention to the trends that are driving change within technical
organizations and the rest of the world. Some trends are micro-trends (e.g.,
the growth of self-storage businesses in the US), while others are macro-trends
(e.g., population growth or decline). The three buckets of trends that will
affect most scientific organizations in the next eight years are:

  • People
  • Processes
  • Technology

People
People across the US and around the world,
especially young people coming into the workforce, share growing demands to be
connected and share social experiences. The Internet is fueling and serving
these demands, with websites like YouTube, MySpace, Facebook, and Flickr. These
sites are being visited by millions of people, many of whom are in the 18-35
year-old range. The young workforce of 2015 will come pre-wired, with high
expectations for workplace technologies that allow them to work off site and
while traveling.
Processes
The leaders of the workforce of 2015 can
expect to use a broad range of management skills and techniques that recognize
the multiple intelligences of the workforce, including those identified by
Howard Gardner in Five Minds for the Future. A leader’s orientation to the
future, along with honesty, will be the qualities most admired by the
workforce. These qualities and recognition of workforce needs will be
especially important when considering the trends regarding talent in the
workplace, with half the senior leaders leaving the government and major
corporations in the next five years, and a projected US shortfall of some ten
million workers by 2010. The processes that will affect organizations in the
coming years in some cases are extensions of trends already firmly in place,
like sustainability, micro-agriculture, and the changing role of food (genetic
manipulation, nutraceuticals, biofuel, and custom pets). Others are beginning
to emerge and have the potential to alter the workforce focus in the
2000-teens.
Technology
The range of technology that will affect high
tech operations in 2015 is broad and deep. One trend that technology will
increasingly be called on to resolve is information overload. With information
doubling every 12 to 18 months, the challenge for science workers, like most
technical employees, is to find the right information at the right time. The
talk will address some solutions for the dilemma.

Limor Schafman, Keystone Tech Group, Fairfax,
Virginia
Distributed Business: How IPv6 Will Change Business and
Government Operations
Saturday 3:40PM
Room 370
Central command and control is a concept of the past. We are
seeing many technological developments and the human response to these new
products and services that indicate a mind and behavioral shift taking place at
a fundamental, root level. IPv6, the new Internet protocol, is more than just an
address structure. It is a mindset. Its impact on the social, mobile and
connected world will alter how businesses and government agencies are
structured, how employees work, how opportunity is identified and acted upon,
and how value is brought to an organization. This presentation will discuss
what IPv6 and other technologies mean to business operations, reasons for these
changes and what the implications and opportunities for business and government
can be going forward.


Recent Research in Human Skeletal
Room 330
Martin C. Solano, PhD,
Contractor, Repatriation Osteology Laboratory, Department of Anthropology,
Smithsonian Institution
Sex differences in skeletal trauma among the
19th century working class.
Skeletal trauma is analyzed in individuals from an almshouse
cemetery skeletal sample from Albany, NY. The cemetery served as a potter’s
field for almshouse decedents and unclaimed bodies from Albany County from 1826
to 1926. Marked differences in the patterns of fractures were observed with
respect to age and sex, reflecting occupational hazards and interpersonal
violence.
Ryan W. Higgins, George
Washington University, Department of Anthropology (graduate student)

Limb Proportion Inheritance and Ancestry Determination from Fetal Crural and
Brachial Indices
Relative distal limb length is found to correlate with climate in
modern human populations. The question remains whether this trait is determined
by adaptation or epigenetic influences. Experiments on laboratory animals
support the hypothesis that cold temperature influences limb development by
reducing growth plate kinetics and/or vascular supply. Reduction in vascular
supply would theoretically have a more pronounced effect on the smaller limb
segments. Furthermore, differences in nutritional uptake between generations
(i.e., secular trends) may affect distal limb segments more than proximal
segments. Together these findings suggest intergenerational plasticity in (1)
distal limb segments and ultimately (2) brachial and crural indices.
Conversely, if upper and lower limb segment proportions are genetic traits
shaped over generations by natural selection and affected little by ambient
temperature and nutritional uptake during development, then limb segment
proportions may aid forensic scientists in determining ancestry from the
postcranial remains of immigrant populations adapted to different ancestral
climates. The present study seeks to use (1) a natural experiment, the
migration of Africans and Europeans to North America, to examine the role of
genetic and epigenetic influences on human limb proportions, and (2)
discriminant function analysis to assess the forensic value of limb proportion
data for determining ancestry in adult and fetal African Americans and European
Americans.
Marilyn R. London, MA,
Lecturer, Department of Anthropology, University of Maryland and Erica B.
Jones, MA, Department of Anthropology, Smithsonian Institution
Fusion of the mandible to the cranium during childhood in an Eskimo from
southwestern Alaska
A rare case of fusion between the mandible and the cranium is seen
in an individual from a cemetery in southwestern Alaska. Although the fusion
appears to have occurred in early childhood, the remains are those of an adult
female, aged 30 to 45 years at death. The effects of the fusion on the life of
the individual must have been significant. The mouth could not be opened,
although some movement prevented atrophy of the mandible. The food passage was
narrow and her food may have been softened or liquefied. Speech may have been
somewhat difficult. However, there are indications throughout the skeleton of
osteoarthritis, and both tibiae exhibit squatting facets. This suggests that
the individual lived an active life and performed routine activities. The
etiology of the fusion is discussed.
David R. Hunt, PhD and Deborah
Hull-Walski, MS, Collections and Archives Program, Department of Anthropology,
National Museum of Natural History, Smithsonian Institution
“All That
Remains” – Multidisciplinary study of a mid-19th Century Iron Coffin and
Identification of the Individual Within
In April 2005, a cast iron coffin was discovered during
construction in the Columbia Heights neighborhood of Washington, DC. A
multidisciplinary study of the coffin and the contents was done to investigate
preservation of bodies in iron coffins, the historical funerary significance of
this type of burial, and, if possible, the identity of the individual inside.
The biological profile of the individual was determined, by CT scanning and
autopsy, to be an approximately 15-year-old male of European ancestry. He died
of lung infection with probable complications due to a heart valve disorder.
Samples were taken and analyses performed for DNA, isotopes, and various
pathogens that might be present in the coffin and body. The analysis of the
clothing and coffin manufacture established the date of death at approximately
1851-2. After two years of extensive historical and genealogical research, the
possible identity of the boy in the coffin was narrowed to three individuals.
Smithsonian anthropologists obtained DNA from living relatives of each lineage
and an absolute match to the boy was made. William Taylor White, from
Accomack County, Virginia, was attending Columbia College when he died in January 1852.
Many of his descendants still live in Virginia's Eastern Shore area. “Thus is
cut off, in the morning of his days, one in whom many hopes were
centred—and who had the fairest prospects of happiness and usefulness in
life.”—Excerpt from White’s obituary, published Feb. 8, 1852, in the
Religious Herald (Richmond, Va.).
Lynn Snyder, PhD, Science
Director, Azoria Project, Crete; Researcher/Contractor, Department of
Anthropology, National Museum of Natural History, Smithsonian Institution

Faunal and human remains from a 2nd century BCE well in the Athenian Agora;
evidence of animal sacrifice and infanticide in Late Hellenistic
The human and animal bones recovered from Well G5:3 in the Athenian
Agora received little notice when they were first discovered in 1937/38, beyond
a short note that the well contained human remains and “over eighty-five dogs”.
In 1945, J. Lawrence Angel published a short description of the human remains,
noting the presence of “about 175 infants”, an adult male and an 11-year-old
child; he posited that the infants represented either deaths by exposure, or
victims of disease and/or starvation. In 1996, a thorough re-examination of the
skeletal materials from the well was begun, leading to the identification of
the remains of 450 human infants, plus more than 150 dogs. Restudy of these
remains indicates that the human infants were placed in this isolated location,
away from the more urban and public areas of the Agora, with some care, and may
represent still births and newborns who failed to thrive. References in ancient
sources on childbirth indicate that infants were not accepted as full members
of society until several weeks or months after birth, and thus not afforded
full burial rites; they also suggest that the dogs may have been sacrificed in
purification rites associated with female fertility or childbirth.
J. Christopher Dudar, PhD.
Repatriation Office, Smithsonian Institution, National Museum of Natural History
Archaeological discovery of a previously undocumented case of
an anencephalic infant from a 19th Century Upper Canadian cemetery
Anencephaly is defined as the absence of normal brain development
due to a severe neural tube defect, and is among the leading causes of
perinatal death in the developed world. While the incidence of anencephaly
ranges from 1 to 5 per 1000 births, there is only one published example of
anencephaly from the archaeological record, an Egyptian mummy described by
Étienne and Isidore Geoffroy Saint-Hilaire in the early 19th century.
Compelling evidence will be presented for the diagnosis of only the second case
of anencephaly in the paleopathological literature, an archaeologically
recovered 8.5 to 9.5 lunar month gestation fetus from a pioneer Upper Canadian
cemetery.
Matthew W. Tocheri, PhD,
Human Origins Program, Department of Anthropology, National Museum of Natural
History, Smithsonian Institution
Concerning the Evidence for
“Hobbits”: An Overview of Homo floresiensis
The hominin skeletal remains discovered four and a half years ago
at Liang Bua cave on Flores, Indonesia have intrigued scientists and the
general public alike. The initial analyses suggested that the remains belonged
to a previously unknown species of hominin, prompting the discoverers to name a
new species, Homo floresiensis, and to nickname them the “hobbits” of
human evolution. Some researchers have rejected the new species claim, arguing
that the remains more likely represent modern humans with skeletal pathologies
or growth disorders. However, the physical evidence in favor of the new species
hypothesis continues to expand as more researchers conduct detailed comparative
analyses of different aspects of H. floresiensis anatomy. As such, most experts
of human evolutionary anatomy recognize the Flores hominin remains, which
currently date between approximately 95 and 12 thousand years ago, as a
legitimate taxon. In this presentation, I provide an overview of H.
floresiensis anatomy in comparison to that of modern humans (normal and
abnormal), other fossil hominin species, and extant great apes. I also discuss
how this evidence applies to current and future debates about the paleobiology
of H. floresiensis—a debate which is no longer centered on whether the
“hobbits” are pathological modern humans or a distinct hominin
species, but rather on the particular details of their evolutionary history and
functional morphology.


Managing Your Career in Science
This session will begin with an overview of current
data on gender equity in science and the impact that workplace culture,
climate, and policies have on the recruitment, retention, and success of women
working in scientific disciplines. In this context, speakers from both academe
and industry will address the skills necessary for success in building
scientific careers beyond the bench with particular emphasis on the different
expectations and challenges faced by women as they advance to
managerial/professorial and executive/department chairperson roles. Speakers
will discuss how to identify and use opportunities both to hone and expand the
needed skill sets. The session will also highlight ways to successfully
navigate career transitions, focusing on issues such as setting career goals,
deciding when to make a career change, and identifying opportunities suited to
individuals’ strengths and interests. The session will conclude with a
moderated panel discussion, offering participants and panelists an additional
opportunity to share insights and exchange ideas.
Room 380
Natalia Melcer-Program Officer, National
Academy of Sciences
Welcome and Introduction
Room 380
Ruth Fassinger-Professor, Department of
Counseling and Personnel Services, University of Maryland, College Park
Surveying the Landscape for Women in Science
Room 380
Gretchen Schieber-Vice President, Product
Development, Adlyfe, Inc. and Rachelle Heller-Associate Dean for Academic
Affairs, The George Washington University
Skills for Succeeding
in Science
Room 380
Alicia M. Rodriguez-Certified Executive
Leadership Coach, Sophia Associates and Leanne Posko-Managing Director,
Community Partnerships, Constellation Energy
Navigating Career Transitions
Room 380
Jennifer A. Hobin-Science Policy Analyst,
Federation of American Societies for Experimental Biology

Moderated Panel Discussion
Room 380


Bill Spees, PhD, Forensic
Software Engineer, Division of Electrical and Software Engineering, Center for
Devices and Radiological Health, Office of Science and Technology, Food and Drug
Administration, and Practitioner Faculty, University of Phoenix
Lightweight Java State Machine
Sunday 2:00PM
Room 365
Classical state machines offer clarity, efficiency, and precision.
Recent technology has provided opportunities to update and improve state
machines, but the usual updates have clouded the simple workings of state
machines with burdensome object-oriented decoration. We will revisit the state
machine and discover how it can appropriately enhance an object-oriented
system. We will explore a Java program for a basic board game and extend it
with house rules.
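The talk's own code is not reproduced here, but a minimal enum-based Java sketch (all names hypothetical) shows the kind of lightweight state machine the abstract describes, free of heavyweight object-oriented decoration:

```java
// Minimal sketch of a classical state machine for a two-player board game's
// turn cycle. Illustrative only; names do not come from the talk.
public class TurnMachine {
    public enum State { PLAYER_ONE_TURN, PLAYER_TWO_TURN, GAME_OVER }
    public enum Event { MOVE_MADE, GAME_ENDED }

    private State state = State.PLAYER_ONE_TURN;

    public State current() { return state; }

    // Classical transition function: (current state, event) -> next state.
    public State fire(Event e) {
        if (e == Event.GAME_ENDED) {
            state = State.GAME_OVER;
        } else if (state == State.PLAYER_ONE_TURN) {
            state = State.PLAYER_TWO_TURN;
        } else if (state == State.PLAYER_TWO_TURN) {
            state = State.PLAYER_ONE_TURN;
        } // GAME_OVER is terminal: further events are ignored
        return state;
    }
}
```

A house rule (say, an extra turn after a capture) then becomes one added event and one added transition rather than a new class hierarchy.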


W. Ronald Heyer President,
Biological Society of Washington
Can long-established, narrow-niche
scientific societies such as the Biological Society of Washington survive the
digital age?
Saturday 2:00PM
Room 365
The Biological Society of Washington was formed in 1880 primarily
as a forum for the Washington based biologists to meet and discuss current
biological research, with publication of those discussions and other submitted
articles to the Society’s journal, the Proceedings of the Biological Society of
Washington. A few years after formation, there was a general trend toward
emphasizing the journal over the meetings and moving from all aspects of
biology to the related topics of systematics and taxonomy. This trend
culminated in the late 1950s, with the stated purpose of the Society to be:
“For the furtherance of taxonomic study of organisms and for diffusion of
biological knowledge among interested persons.” This purpose served the Society
well through the 1990s. Perhaps associated with the decline in support for
systematics and taxonomy by the US academic community, the membership of the
Society has been in slow decline since 1993. This decline, combined with
competition from the new (2001) journal Zootaxa (an electronically produced and
distributed journal dedicated to animal taxonomy), together with younger
scientists preferring pdf files of publications over hard copy, sounded an
alarm to the elected officers and Councilors of the Society. Deliberations
resulted in undertaking major changes in the management and delivery system of
the Society’s publications together with activities to garner new members and
institutional subscribers. The actions taken are recent and it is too early to
assess whether they will be successful or hasten the demise of the Society.

SOCIETY OF WASHINGTON (see American Society of Plant


Jesse Gallun and Jennifer Young, ACS Green
Chemistry Institute, American Chemical Society
Green Chemistry and the
ACS Green Chemistry Institute
Sunday 11:00AM
Room 310
Green chemistry finds sustainable solutions through innovative
technologies, preventing pollution through the reduction or elimination
of the use and generation of hazardous substances. The American Chemical
Society Green Chemistry Institute (ACS GCI) has a mission to advance the
implementation of green chemistry and engineering principles into all aspects
of the chemical enterprise. To achieve this mission, ACS GCI works in several
strategic areas: research, education, industrial implementation, international
collaboration, communication and outreach, and policy advocacy. A brief
overview about ACS GCI and its activities will be shared, as well as
opportunities open to you, such as grants, awards, conferences, teaching
materials, and more. Since the green chemistry movement began in the early
1990s, many real-world examples of green chemistry have emerged, such as new
adhesives that mimic nature while eliminating hazardous chemicals like VOCs,
redesigned pathways to important pharmaceuticals that significantly reduce the
use of hazardous chemicals and generation of waste, and other consumer products
that feature greener components and production. A number of these examples will
be presented, with an interactive discussion of factors that may lead industry
to utilize new green technologies or products.


Invited Talk by Frederika Darema, NSF
Dynamic Data Driven Applications Systems
Saturday 9:00AM
Room 310
This talk will discuss the Dynamic Data Driven Applications Systems
(DDDAS) concept, driving novel directions in applications and in measurements,
as well as in computer sciences and cyber-infrastructure. DDDAS entails the
ability to dynamically incorporate additional data into an executing
application (these data can be archival or collected on-line) and, in reverse,
the ability of the application to dynamically steer the
measurement process. The dynamic environments of concern here encompass dynamic
integration of real-time data acquisition with compute- and data-intensive
systems. Enabling DDDAS requires advances in the application modeling methods
and interfaces, in algorithms tolerant to perturbations of dynamic data
injection and steering, in systems software, and in infrastructure support.
Research and development of such technologies requires synergistic
multidisciplinary collaboration in the applications, algorithms, software
systems, and measurements systems areas, and involving researchers in basic
sciences, engineering, and computer sciences. Such capabilities offer the
promise of augmenting the analysis and prediction capabilities of application
simulations and the effectiveness of measurement systems, with a potential
major impact in many science and engineering application areas. The concept has
been characterized as revolutionary and examples of areas of DDDAS impact
include computer and communication systems, information science and
technologies, physical, chemical, biological, medical and health systems,
environmental (hazard prediction, prevention, mitigation, response), and
manufacturing, transportation and critical infrastructure systems. The talk
will address technology advances enabled and driven by the DDDAS concept, as well
as challenges and opportunities, motivating the discussion with application
examples from ongoing research efforts.
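The feedback loop at the heart of the concept can be caricatured in a few lines: measurements are dynamically injected into a running computation, and the updated state in turn steers where the next measurement is taken. A toy sketch (the update rule and all names are invented for illustration, not part of DDDAS itself):

```java
// Toy caricature of a DDDAS-style loop: dynamically injected measurements
// update a running estimate, which in turn steers the next measurement.
public class DddasLoop {
    private double estimate = 0.0;          // state of the executing application
    private double nextProbeLocation = 0.0; // steering fed back to the sensor

    // Incorporate one dynamically injected measurement (archival or on-line).
    public void assimilate(double measurement) {
        estimate = 0.5 * estimate + 0.5 * measurement; // invented update rule
        nextProbeLocation = estimate; // steer the sensor toward the estimate
    }

    public double estimate() { return estimate; }
    public double nextProbeLocation() { return nextProbeLocation; }
}
```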
Ronald L. Ticker National Aeronautics and
Space Administration
The US National Laboratory on the International
Space Station
Saturday 9:20AM
Room 310
The International Space Station (ISS) is rapidly approaching the
long-awaited completion of assembly in 2010. All US core elements have been
integrated and tested on-orbit, and the attention of NASA has turned to
deployment of the European, Japanese, and Russian laboratories. Section 507 of
the NASA Authorization Act of 2005 designated the US segment of the ISS as a
“national laboratory”, opening up use to other US Government agencies, US
private firms and US academic institutions. This paper summarizes strategy and
plans for implementation of the ISS National Laboratory as well as applicable
research and support facilities. The original 1984 vision of a robust,
multi-mission space station serving as a platform for the advancement of US
science, technology and commerce will soon be achieved.
Gerard Christman, Sr. Systems Engineer & PM,
OSD Technical Services, Femme Comp Inc.
In the
Aftermath of the Indian Ocean Basin Tsunami: An Information Sharing Pilot
Program in Support of Humanitarian Assistance / Disaster Relief
Saturday 9:40 AM
Room 310
On December 26th, 2004, the world was shaken by a magnitude 9.0
earthquake with its epicenter in the Indian Ocean Basin, which produced a
tsunami wave extending around the world. Most of its devastating effects
were felt in Sri Lanka, India and Thailand. The US Department of Defense (DoD)
characterizes such an event as a Humanitarian Assistance Disaster Relief
Operation (HADR). With humanity impacted on such a scale, the key to saving
lives and providing relief is the ability to triage, assess, and determine
accurate situational awareness. Situational awareness can lead to aid and
services being directed for maximum benefit. This paper will discuss activities
and describe how the DoD is approaching the way forward to share information
with non-governmental organizations (NGOs), other governmental organizations
(OGOs), private voluntary organizations (PVOs), host-nation civil authorities,
agencies across the US Government interagency, and international
organizations (IOs). Through outreach, the DoD has developed a dialog and a plan
to create a Community of Interest (COI) to map out the business processes and
the information to be shared, in order to enable non-DoD entities to marry up
their capabilities with emerging requirements around the theater of operations.
Tim Weil – Associate (CISSP/CISA) Booz |
Allen | Hamilton
Securing Wireless Access for Vehicular Environments
(WAVE): A Case Study of the Department of Transportation VII Project
Saturday 10:00AM
Room 310
The Department of Transportation Vehicular Infrastructure
Integration (DOT VII) program has paved the way for the Intelligent
Transportation Systems of tomorrow. VII envisions a future in which intelligent
vehicles routinely communicate with each other and the transportation
infrastructure in real time. Booz Allen Hamilton has led the Systems
Integrator’s role for building a model DOT VII network based on the deployment
of network and software infrastructure using a newly published set of IEEE
standards. The VII technical architecture is based on IEEE 1609 Wireless Access
for Vehicular Environments (WAVE) standards which define an architecture and a
complementary set of services that enable secure vehicle-to-vehicle and
vehicle-to-infrastructure wireless communication. The IEEE WAVE family of
standards (1609) provides the foundation for a broad range of applications in the
transportation environment, including vehicle safety, public safety,
communication fleet management, automated tolling, enhanced navigation, traffic
management and other operations. The recently published WAVE Networking
standard (IEEE 1609.3) provides an Intelligent Transportation Systems framework
from which a Proof of Concept VII Service Oriented Architecture, WAVE
Networking Stack, and Real-Time VII Messaging have been implemented. This
VII model also includes a detailed description of the Publish / Subscribe MQ
Architecture developed to support collection/parsing of vehicle probe data and
the scheduling/delivery of standard SAE J2735 messages to vehicles in a limited
connectivity environment. The suite of WAVE protocols provides application
services and Dedicated Short Range Communication (DSRC) communication channels,
allowing secure messaging and application services between wireless roadside
access points and vehicle radio transceiver units. This wireless security
technology, IEEE 1609.2, WAVE Security Services for Applications and Management
Messages, presents the VII program with Identity and Access Management
challenges. An examination of the working model will demonstrate the use of
Mobile PKI to manage VII actors, messaging and applications using DSRC/WAVE
communication services. The discussion will conclude with an overview of how
the Communications Industry is positioned to take advantage of the IEEE 1609
standards for Application Services, IPv6 Networking and Multi-Channel Radio.
Haik Biglari – Fairchild Controls
Past, Present and Future of Safety-Critical Real-time
Embedded Software Development
Saturday 10:20 AM
Room 310
Safety-Critical systems are those systems whose failure could
result in loss of life, cause significant property damage or cause damage to
the environment. These complex systems tend to have sufficient kinetic or
potential energy to become uncontrollable and thus pose a hazardous
condition. The system controller must therefore be designed to
guarantee system stability in all of the system’s operational modes and,
when a fatal fault occurs, to shut the system down safely. This paper will
present the evolution of software development for these
systems, current certification issues, the gap that exists between systems
engineering and software engineering disciplines, software reuse, use of
productivity tools and the future of safety-critical real-time embedded
software development.
Ashwin Swaminathan (University of Maryland,
College Park)
Digital Detective for Electronic Imaging
Saturday 10:40 AM
Room 310
Electronic imaging has experienced tremendous growth in recent
decades, and digital images including those taken by digital cameras have been
used in a growing number of applications. With such increasing popularity and
the availability of low-cost image editing software, the integrity of digital
image content can no longer be taken for granted. Rapid technology development
has also led to a number of new problems related to protecting intellectual
property rights, handling patent infringements, authenticating acquisition
source, and identifying content manipulations. In this presentation, we
consider the problem of image acquisition forensics and introduce a fusion of a
set of signal processing features to identify the source of digital images. We
show that traces of the in-device processing operations such as color
interpolation along with the noise characteristics of devices’ image
acquisition process jointly serve as good forensic features to help accurately
trace the history of the input image back to its production process and
differentiate between images produced by cameras, cell phone cameras, scanners,
and computer graphics. Through analysis and extensive experimental studies, we
demonstrate the effectiveness of the proposed framework for image acquisition
forensics. (Joint work with Prof. Min Wu, Prof. K.J. Ray Liu, Dr.
Hongmei Gou, and Ms. Christine E. McKay.)
X. Zhu 1, 2, Y. Yang 1, Q. Li 1, D. E.
Ioannou 1, J. S. Suehle 2 and C. A. Richter 2
High Performance Silicon
Nanowire Field Effect Transistor and Application to Non-Volatile Memory
1. ECE Department, George Mason University, Fairfax, VA 22030, USA 2. CND Group,
Semiconductor Electronics Division, NIST, Gaithersburg, MD 20899, USA
Saturday 2:00PM
Room 310
We report the fabrication and characterization of double-gated
silicon nanowire field effect transistors (SiNWFET) with excellent
current-voltage characteristics, low subthreshold slope (~85 mV/dec) and high
on/off current ratio (~10^6). The silicon nanowire devices were fabricated by
using a self-aligned technique with standard photolithographic alignment and
metal lift-off processes, ensuring large-scale integration of high-performance
nanowire devices. We have studied the effect of device structure and forming
gas rapid thermal annealing on the nanowire transistor’s electrical properties.
We attribute the excellent current-voltage characteristics displayed by our
devices to the low interface state densities achieved by the above fabrication
process. We also report non-volatile memory cells (NVM) based on these
nanowires. The SiNWs are integrated into memory devices by using a
self-alignment technique. The top gate dielectric, which surrounds most of the
nanowire, consists of three stacked layers: blocking SiO2, charge-storing layer
HfO2 and thin tunneling oxide. Prior to the SiNW growth a thermal SiO2 was
grown on a p-type silicon wafer by dry oxidation to form the bottom-gate oxide
of these dual-gated structures. The diameter of the SiNW is ~ 20 nm and the
gate length ranges from 2 µm to 8 µm. When these devices are
electrically characterized, a large threshold voltage shift is observed under
voltage sweep of either the top or the bottom gate. The top gate control is
superior to that of the bottom gate control as demonstrated by the large memory
windows and large on/off current ratios (~10^7) observed in these devices.
Boris Veytsman(1), Leila Akhmadeyeva(2),
Fernando Morales(3,4), Grant Hogg(3), Tetsuo Ashizawa(5), Patricia Cuenca(4)
Gerardo del Valle(4), Roberto Brian(4), Mauricio Sittenfeld(4), Alison
Wilcox(3), Douglas E. Wilcox(3) and Darren G. Monckton(3)

Microsatellite Expansion: The Search for Underlying Pattern
1. George Mason
University, MS 5A2, Fairfax, VA 22030, USA 2. Bashkir State Medical University,
3 Lenina Str., Ufa, 450077, Russia 3. University of Glasgow, Glasgow G11 6NU,
UK 4. Universidad de Costa Rica, San Jose, Costa Rica 5. The University of
Texas Medical Branch, Galveston, TX 77555-0539, USA
Saturday 2:20PM
Room 310
Microsatellite expansion is the cause of a number of severe
diseases such as Fragile X, Huntington disease, Myotonic Dystrophy and others. An
interesting common feature of the expansion in these cases is the instability of
the mutation: once expanded, the number of repeats continues to change
in the patient’s cells. An understanding of the mechanism of microsatellite
expansion will help predict the individual course of the disease and
plan medical care. Recently (J. Theor. Biol., v. 242,
401–408, 2006) we proposed a mathematical model to describe the mechanism of
microsatellite expansion and the resulting distribution of repeat lengths in
the patient’s DNA. Here we compare the theoretical predictions with the data on
the repeat lengths of a wide group of patients having Myotonic Dystrophy I (DyM
I). We find that the theoretical predictions agree fairly well with the
clinical data. The distribution of repeat lengths is close to the one
predicted by the mathematical model. We used the clinical data to estimate the
theoretical parameters: the rate of increase of the number of repeats and the
rate of widening the distribution. We find that while these parameters have
large individual variations, the average values give reasonable predictions for
the development of mutations. These values can be used to estimate the initial
mutation (the number of repeats in the progenitor allele) and to predict the
development of the disease.
Hojin Kee(1), Newton Petersen(2), Jacob
Kornerup(2), Shuvra S. Bhattacharyya(1)
Synthesis of FPGA-Based FFT
(1) Department of Electrical and Computer Engineering, and
Institute for Advanced Computer Studies, University of Maryland, College Park,
20742, USA. (2) National Instruments Corporation, Austin, 78759, USA. {hjkee,
ssb}, {newton.petersen, jacob.kornerup}
Saturday 2:40PM
Room 310
In this paper, we propose a systematic approach for synthesizing
field-programmable gate array (FPGA) implementations of fast Fourier transform
(FFT) computations. We also demonstrate these methods in the dataflow-based
programming environment of LabVIEW FPGA, and through our experiments, we show
efficiency levels that are comparable to, and in some cases better than,
commercially-available intellectual property cores for the FFT. Our approach
considers both cost (in terms of FPGA resource requirements) and performance
(in terms of throughput), and optimizes for both of these dimensions based on
user-specified requirements. By appropriately combining complementary forms of
loop unrolling, we systematically achieve cost-optimized FFT implementations in
terms of FPGA slices or block RAMs in FPGA, subject to the given throughput
constraints. Furthermore, our approach provides the advantages of being able to
optimize implementations based on arbitrary, user-specified performance levels
with general formulations of FFT loop unrolling trade-offs, which can be
retargeted to different kinds of FPGA devices.
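The abstract does not publish its optimization algorithm, but the cost/throughput trade-off it describes can be illustrated with a toy sketch: choose the smallest unroll factor (hence the cheapest implementation, since resource cost grows with unrolling) that still meets a user-specified throughput constraint. The timing model and every number below are illustrative assumptions, not the authors' method:

```python
def choose_unroll_factor(n_stages, butterflies_per_stage, clock_hz,
                         required_ffts_per_s, max_unroll):
    """Return the smallest unroll factor (number of parallel butterfly units)
    meeting the throughput constraint. Under this toy model, cost grows with
    the unroll factor, so the smallest feasible factor is also the cheapest."""
    for u in range(1, max_unroll + 1):
        cycles_per_fft = n_stages * butterflies_per_stage / u  # toy timing model
        if clock_hz / cycles_per_fft >= required_ffts_per_s:
            return u
    return None  # infeasible within the allowed unrolling range

# Illustrative: a 1024-point radix-2 FFT = 10 stages x 512 butterflies,
# on a 100 MHz fabric clock, targeting 50,000 FFTs per second.
u = choose_unroll_factor(10, 512, 100e6, 50_000, max_unroll=512)
print(u)
```

A real synthesis flow would use measured per-stage latencies and slice/block-RAM cost models rather than this single-parameter approximation.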
Raj Madhavan, Stephen Balakirsky and Chris
Scrapper, Intelligent Systems Division, NIST
An Open-Source Virtual
Manufacturing Automation Competition
Saturday 3:00PM
Room 310
Automated Guided Vehicles (AGVs) represent an integral component of
today’s manufacturing processes. They are widely used on factory floors for
intra-factory transport of goods between conveyors and assembly sections, parts
and frame movements, and truck-trailer loading/unloading. Automating these
systems to operate in unstructured environments presents an exciting area of
current research in robotics and automation. Unfortunately, the traditional
entry barrier into this research area is quite high. Researchers need an
extensive physical environment, robotic hardware, and knowledge in research
areas ranging from mobility and mapping to behavior generation and scheduling.
An accepted approach to lowering this entry barrier is through the use of
simulation systems and open source software. This talk will present an overview
of research and collaboration being undertaken by the National Institute of
Standards and Technology (NIST) with a grant received under the IEEE Robotics
and Automation Society’s New Initiatives Competition. It is our belief that
competitions are an effective means of stimulating interest and participation
among students by providing exciting technological problems to tackle. Under
this effort, faculty members and their interested students from six
universities in the Greater Washington Area (Washington D.C., Northern Virginia
and Baltimore) were introduced to this time-critical research area through the
creation of a factory automation regional competition and tutorial. Since all
code used in these competitions is open source, participants are able to learn
from their competitors and self-sustain their research in their areas of
expertise. This talk will also outline the performance metrics that were used
to judge the competition. The competition arenas and metrics used for scoring
were specifically designed to create a “level” playing field for the various
research disciplines. The specific metrics, the way in which the competition
was run, and the future directions of the competition will be discussed in
detail. Defects that were noted in the metrics will also be outlined.
Kiki Ikossi, I-Cube Inc.
and George Mason University
Antimonides for High Efficiency Solar Cells
Saturday 3:20 PM
Room 310
Solar cells generate electricity utilizing the photovoltaic
effect. The energy band gap of the semiconductor materials comprising the solar
cells determines which part of the solar spectrum is utilized for electrical
generation. Currently the record high efficiency solar cells are multijunction
solar cells based on germanium and combined with different compound
semiconductor cells. Cost considerations however limit these high efficiency
cells to space applications. In this work we examine the possibility of using
new materials in a novel way that uses a variable energy bandgap and spatial
dimensions to convert into electricity most of the solar spectrum. Realistic
modeling with position-dependent material parameters, degeneracy effects,
bandgap narrowing effects, and surface and interface recombination is used to
design the optimum parameters for the novel solar cell. A simple graded
antimonide based monolithic solar cell with a 37.4% efficiency current matched
at a current density of 23 A/cm2 under AM1.5 global illumination is
demonstrated. The results show that efficiencies as high as the ultra high
efficiency space solar cells are possible and are promising for development of
low cost high efficiency solar cells for terrestrial applications.
Brian Borak, engineering team student leader
for the DC electrical systems on the 2007 University of Maryland Solar
Decathlon team; Dan Feng, a recent graduate of the University of Maryland;
John Kucia, one of the project managers on the 2007 University of Maryland
Solar Decathlon team; and Dan Vlacich, a Senior Consultant at Booz Allen
Hamilton, Inc., and a mentor to the 2007 University of Maryland Solar Decathlon team
What it Takes to Design and Build a Successful Solar Home
Saturday 3:40PM
Room 310
The Department of Energy Solar Decathlon is an international
competition in which student teams build fully functional, 100% solar-powered
homes. The students then compete in a week-long competition on the National
Mall in Washington, D.C. to determine who has the best solar home. Homes are
judged on a variety of criteria, ranging from the amount of excess electricity
produced, to heating and cooling comfort, to how far an electric car charged
from the house can be driven. The University of Maryland has competed in
all three Solar Decathlon events to date (2002, 2005, 2007), placing
2nd in the world (and 1st in the United States) last October. This presentation
will discuss the competition, what it takes to design and build a successful
solar home, the students’ experiences, and plans for future competitions.


Computational Modeling of Decision-Making
Chair / Organizer: Douglas A. Samuelson, Serco
Douglas A. Samuelson, Serco Modeling
Attention Management in Organizational Decision-Making
Saturday 9:00AM
Room 120
Consider how to improve organizational decision-making by
streamlining the process of seeking and allocating the attention of top
decision-makers. These decision-makers try to optimize the value they receive
by allocating their attention, taking uncertainty into account. Establishing a
“bidding” process for attention-seeking improves efficiency and reduces
problems. Now consider agent-based models of teams of workers. Workers have
skills and various numbers of units of work they can accomplish, per skill
area, per time period. The version of the model in which problems arrive and
drift through the organization’s space randomly until they encounter a team
that can solve them appears to approximate – and explain – the behavior of the
Cohen, March and Olsen Garbage Can Model. Other, more hierarchical versions are
likely to deadlock, overwhelming the managers and unnecessarily idling many of
the workers, in a manner that fits intuition for certain large, tightly
controlled bureaucracies. Explicitly modeling the attention required by
managers and supervisors to assign problems and monitor progress adds another
level of complexity and realism. This approach promises a rich variety of
interesting results.
H. Ric Blacksten and Joseph C. Chang,
Homeland Security Institute
Fermi model estimation of illegal
immigration deterrence as a function of apprehension probability
Saturday 9:40AM
Room 120
While U.S. leaders and legislators demand that our Southwestern
borders be secured and controlled to stop illegal immigration, operators and
researchers express reservations as to how easily that can be achieved.
Recidivism statistics and surveys suggest that once an alien decides to cross
into the USA, he or she will persist until successful. Does this mean that
deterrence is hopeless? We present a “Fermi” framework, implemented in Excel,
to explore this question. Using educated estimates of economic variables, we
project the reduction in economic immigrant demand, i.e., deterrence, as a
function of probability of apprehension.
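The Excel framework itself is not reproduced in the abstract, but a Fermi-style deterrence calculation of this kind might take roughly the following shape. Every quantity below (the cost per attempt, the expected gain, the persistence model) is an illustrative assumption, not a figure from the study:

```python
def expected_attempts(p_apprehend):
    """If each crossing attempt succeeds with probability (1 - p), a crosser
    who persists until successful needs 1 / (1 - p) attempts on average
    (the mean of a geometric distribution)."""
    return 1.0 / (1.0 - p_apprehend)

def crossing_is_worthwhile(p_apprehend, cost_per_attempt, expected_gain):
    """Crude deterrence criterion: attempt to cross only if the expected
    economic gain exceeds the expected total cost of repeated attempts."""
    return expected_gain > expected_attempts(p_apprehend) * cost_per_attempt

# Illustrative numbers only (not from the study): $3,000 per attempt in fees
# and travel, $30,000 expected net gain from a successful crossing.
for p in (0.5, 0.9, 0.95, 0.99):
    print(p, crossing_is_worthwhile(p, 3_000, 30_000))
```

The Fermi point is that deterrence does not require stopping every attempt: raising the apprehension probability raises the expected number and cost of attempts until, at some threshold, crossing is no longer economically rational.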
Steven Wilcox, Serco GOSSIP: A
Computational Model of Team-Based Intelligence Gathering
Saturday 10:20AM
Room 120
The Goal-Oriented Sales-Specific Information Processing (GOSSIP)
simulation model is a prototype for computationally modeling task complexity
and the effect of team communication on the performance of intelligence
gathering and exploitation tasks such as selling insurance or finding
terrorists. In GOSSIP, Kauffman’s NK model of environmental complexity meets
the Garbage Can model (Cohen, March & Olsen, 1972) and the phenomenon of
diffusion along social networks, thus allowing one to use the power of
simulation modeling for performing organizational design and analyzing impacts
on search performance for elusive targets. In lieu of employing social network
analysis measures in regression models of organizational effectiveness data,
GOSSIP models the information passing process and the complexity of the task
directly, thus pointing the way to enhanced clarity in quantitative modeling
and analysis.
Emergency Preparedness and Disaster Response
Chair / Organizer: Douglas A. Samuelson, Serco
Pete Hull, Homeland Security Institute and
Skills That Serve, Inc.
What Faith-Based Organizations Can Teach Us
about Disaster Response: Post-Katrina Lessons Learned
Saturday 2:00PM
Room 120
Faith-based organizations (FBOs) and non-governmental
organizations (NGOs) stepped in to fill the gaps when the geographic scales,
intensities, and durations of Hurricanes Katrina and Rita overwhelmed the
existing disaster response resources. FBOs and NGOs undertook a surprisingly
large, varied, and demanding set of activities with extraordinary
effectiveness. They provided shelter, food, medical services, hygiene services,
mental health and spiritual care, physical reconstruction, logistics management
and services, transportation, children’s services, and case management. The
FBOs’ and NGOs’ successes in providing these services are a stark contrast to
the many chronicled deficiencies and failures of government during the
catastrophic 2005 hurricane season. We will discuss these organizations’
successes and glean lessons that may make the nation better prepared for future
disasters.
Douglas A. Samuelson, Serco
Agent-Based Simulation of Mass Egress from Public Facilities and
Saturday 3:00PM
Room 120
We review computer simulation models of selected attack scenarios
on civilian targets and of the effects of possible counter-measures. In
particular, these models focus on representing mass egress from large
facilities, following one or more detonations, to evaluate some proposed ways
to facilitate evacuation and reduce casualties. This review focuses on models
developed by Homeland Security Institute (HSI) and Redfish Group (Santa Fe, New
Mexico) to analyze two venues: a sports stadium and a subway station.
Innovations include an order-of-magnitude increase, relative to previous
models, of the number of people represented (70,000 in the stadium), and new
computational portrayals of crowd movement and explosions. These approaches
appear to conform especially well to real events, according to their
developers’ experiments and comparisons. We also discuss, more briefly, recent
models of wide-area evacuations in response to wildfires and nuclear terrorism.
We conclude that the development and analysis completed to date, while far from
exhaustive, suffice to demonstrate the utility of models such as these for
evaluating proposed countermeasures, for indicating policy and technology
issues that should be analyzed further, and for response planning. We also
address the unusual problems such models pose for validation and evaluation.
Papers of the Institute of Industrial
Engineers / Program Chair: Joseph Scheibeler
Donald E. Crone, Program
Director for the Flats Sequencing System (FSS), Headquarters Engineering, U.S.
Postal Service
Postal Automation and the Flats Sequencing System
Sunday 10:00AM
Room 120
Over the past 25 years automation has revolutionized the US Postal
Service. This Session will explore the many technologies used by the US Postal
Service in its day to day operations. The first half of the session will focus
on key technologies such as bar code sorting, Delivery Point Sequencing (DPS),
Optical Character Recognition (OCR), and package sorting systems used by the US
Postal Service to process mail. The second half of the session will provide an
in-depth review of the Flats Sequencing System (FSS), the Postal Service’s
latest advancement in flat mail sorting technology. The Flats Sequencing System
advances flat mail processing by sorting flat mail in the order that postal
carriers walk their route. This significantly improves the efficiency of flat
mail processing and allows postal carriers more time to serve customers. FSS is
designed to automatically sequence flat mail at a rate of approximately 16,500
pieces per hour, and is capable of sorting and sequencing up to 75,000 pieces
of flat mail in one sequencing session. The machine is designed to sequence
280,500 pieces to more than 125,000 delivery addresses over a typical 17-hour
daily operating window.
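The throughput figures quoted above are internally consistent, as a one-line check confirms:

```python
rate_per_hour = 16_500   # FSS sequencing rate, pieces per hour (from the abstract)
window_hours = 17        # typical daily operating window, hours
daily_capacity = rate_per_hour * window_hours
print(daily_capacity)    # 280500 pieces, matching the stated daily figure
```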
Michael E. McCartney,
Program Performance Specialist, Capital and Program Evaluation at U.S. Postal
Service Headquarters Finance
Project Management Shared Network Reporting
System for Tracking Capital Investment Projects
Sunday 11:00AM
Room 120
US Postal Service (USPS) Project Managers submit quarterly status
reports on investment projects for consolidation by USPS Headquarters Finance
using a shared network Access-based reporting system. Finance edits the Project
Managers’ submittals, incorporating individual project financial data from the
corporate database to create a Quarterly Investment Highlights Report for the
USPS Board of Governors, the Postmaster General, the Chief Financial Officer
and other Postal executives comprising the Capital Investment Committee.
Project progress was previously reported by Project Managers in individual
Word files for each project, followed by tedious copying and pasting in
Headquarters Finance to create the consolidated report; it is now entered into a
shared network by Project Managers using an Access database and uploaded to
updated and edited information from the previous Quarter is downloaded to the
respective Project Managers who have access to their assigned projects for
updating and the report consolidation process for the Quarterly report is
repeated with the updated information. Project Managers now add only the
updated or changed information instead of modifying the entire project report
and Finance consolidates only the new information into each project’s report
records. The enhanced reporting process is transparent to the complete
consolidated Quarterly Investment Highlights Report. Required approval of the
Project Managers’ input by Project Management executives is built into the
system whereby each executive reviews only the projects of the executive’s
assigned Project Managers.
Charles L. Hochstein,
Purchasing and Supply Management Specialist, Commodity Management Center (CMC)
Mail Transport Equipment (MTE) and Spares at U.S. Postal Service Headquarters
Supply Management
Acquisition Cost Optimization Through Supply Chain
Sunday 2:00PM
Room 120
The Postal Service drives down its material and services
acquisition costs by applying optimization techniques and expressive bidding
methodologies to drive supply chain efficiencies. Advanced computer modeling,
web-based tools, and strategic sourcing methodologies are used
to achieve these results. The Postal Service’s accomplishments in these areas
were recently recognized with the Technology Innovation Award from the November
2007 Chief Purchasing Officer’s Summit. The Postal Service will present the
framework it uses to achieve these accomplishments and a case study demonstrating
its success.
Joseph J. Scheibeler,
Program Performance Specialist, Capital and Program Evaluation at U.S. Postal
Service Headquarters Finance
After Cost Review Process for Capital Investment Projects
Sunday 3:00PM
Room 120
This talk focuses on the methodology underlying the US Postal
Service’s (USPS) capital investment justification and review process. In the
preparation of both Decision Analysis Reports (DARs) to facilitate investment
decision-making of approval authorities and in subsequent cost study reviews,
cash flows – of project investments and incremental costs and benefits – are
projected over a ten-year benefit period to determine the return on investment
(ROI). The cash flows are discounted at the appropriate interest rate (based on
project risk and cost of capital) to calculate a project’s net present value
(NPV). “After cost studies” for capital investment projects exceeding $25
million are undertaken as directed by the USPS Board of Governors (BOG) to
evaluate the results of BOG-approved projects after at least one full year of
normal continuous operation. Utilizing discounted cash flow methodology, a
given project’s “actual” ROI and NPV of its after cost analysis are compared
with the DAR’s. The information contained in the DARs and the after cost study
reports includes subject matter experts’ estimates, vendors’ estimates and
prices, and data from USPS’s extensive corporate database, which includes
operation work hours and volumes processed, and actual project capital and
expense payments. Information is also obtained by direct contact with field
operations personnel, as well as the use of data collection, modeling and
analysis techniques such as sensitivity analysis and productivity analysis, to
estimate future costs and benefits. Inflation factors are applied to the
estimates for the projection of anticipated results over a ten-year
post-investment time horizon.
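The discounted cash flow methodology described above reduces to a standard NPV calculation; a minimal sketch follows, with an invented cash flow profile and discount rate purely for illustration (the USPS’s actual figures are not given in the abstract):

```python
def npv(rate, cash_flows):
    """Net present value: cash_flows[0] occurs now (typically the negative
    initial investment); cash_flows[t] occurs at the end of year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative only: a $25M investment now, $5M net annual benefit over the
# ten-year benefit period, discounted at an assumed 8% rate.
flows = [-25_000_000] + [5_000_000] * 10
value = npv(0.08, flows)
print(round(value))  # a positive NPV means the project clears the hurdle rate
```

The "after cost" comparison described in the abstract then amounts to recomputing this quantity with actual, rather than projected, cash flows and comparing it against the DAR’s original NPV.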


Erika Shugart
Presenting Current Science: Lessons from the Marian Koshland Science Museum
Sunday 10:00AM
Room 110
Global warming, vaccination, and forensic DNA evidence are all
topics that have been in the headlines. What approaches can be used to help the
public understand these complex issues? The Marian Koshland Science Museum
has presented exhibits on topics such as climate change, infectious disease, and DNA
technology since it opened in 2004. Dr. Erika Shugart, deputy director of the
Koshland Museum, will show examples of a variety of approaches for making
complex science accessible, share what she is learning about museum visitors,
and explore some of the lessons learned and how they can be applied beyond the
museum environment.

NATIVE PLANT SOCIETY (see American Society of Plant


Invited Talk by Dr. Michael
Haney, DARPA & the Univ. of Delaware
Photonic Integrated Circuits:
Ready for Prime Time?
Saturday 9:00AM
Room 330
In recent years, Photonic Integrated Circuits (PICs) – in which
information is encoded and manipulated as optical (rather than electrical)
signals – have begun to emerge. Optical fiber networks are a natural
application domain for PICs as there is an inherent match between the signal
generation, detection, and multiplexing capabilities of emerging PICs and the
optical signals being transported over the network. However, as with
electronics over the last 4 decades, the transition from discrete to integrated
circuits is opening up new opportunities for information processing based on
photonics. There is increasing demand for integrated photonic sensing and
processing solutions in chemical, biological, medical, communications, and
remote sensing applications. Moreover, the advancement of PICs is being
accelerated by leveraging the lithographic design and fabrication tools that
have already been developed for the electronics IC industry. This talk explores
the status of PIC R&D and highlights the key similarities and differences
between electronic and photonic IC technologies that will ultimately influence
the transition of PICs into real applications.
Pavlo Molchanov(1),
Vincent M. Contarino(2), Olha Asmolova(1)
Gated Optical Sensors
(1) Aerospace Mass Properties Analysis Inc., North Wales, PA, USA (2) US Naval Air
Systems Command, Patuxent River, MD, USA
Saturday 9:30AM
Room 330
Gated optical sensors can be divided into two groups: single-anode gated
optical sensors and gated imaging optical sensors (multianode optical sensors,
CCD cameras and TV cameras). Gated optical sensors
are usually used for receiving pulsed laser signals from predetermined distances
in various LIDAR, LIVAR and laser-radar systems. An optimally adjusted gate allows
the excision of noise signals reflected from objects before and after the area
of observation. Gating of optical sensors in a confined area of observation
protects the optical sensor and decreases noise background. The area of
observation and application of gated optical sensors may be different depending
on the gating pulse width. For example, for a 1-10 ns pulse width, an area of
observation confined to 0.3-3 meters can be used for underwater measurements,
target detection and imaging [1-4]. For a 1 picosecond pulse width, with an area of
observation confined to 0.3 millimeters, this technology can be used for
baggage and human body screening. Femtosecond gating pulses can be used for
biomedical tomography with micrometer resolution. Results of investigating the
nanosecond gated photodetectors for underwater target detection and underwater
imaging are presented in this paper. 20 nanosecond gating pulses have been used
for FireLidar, where 1574 nm laser light was pulsed through hydrocarbon smoke
generated by wood. FireLidar is being developed for use in search and rescue in
smoke and flame environments [5].
1. M. Bristov, “Suppression of afterpulsing in photomultipliers by gating the photocathode,” Applied Optics, vol. 41, no. 24, pp. 4975-4987, 2002.
2. V. M. Contarino, P. A. Molchanov, O. V. Asmolova, “Large Area Intensified Photodiodes for Ocean Optics Applications,” Remote Sensing of the Atmosphere, Ocean, Environment, and Space, Honolulu, Hawaii, USA, 8-12 November 2004.
3. P. Molchanov, V. Contarino, B. Concannon, O. Asmolova, “Investigation and Design of a Wide Dynamic Range Gating Photosensor Module Based on the Hamamatsu Photomultiplier Tube R7400U with Output Signal Compression for LIDAR-RADAR Applications,” Int. Semiconductor Device Research Symposium, Bethesda, Maryland, Dec. 7-9, 2005.
4. P. Molchanov, V. Contarino, B. Concannon, O. Asmolova, “Nanosecond Gated Optical Sensors for Ocean Optics Applications,” SAS 2006, Sensors Applications Symposium, Houston, Texas, Feb. 7-9, 2006.
5. E. T. Dressler, R. I. Bilmers, E. J. Bilmers, M. E. Ludwig, “FireLidar development: light scattering from wood smoke, experiments, and theory at 1574 nm,” SPIE Optics and Photonics Conf., San Diego, CA, Aug. 13-17, 2006.
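The pulse-width-to-range figures quoted in the abstract (1-10 ns giving 0.3-3 m, 1 ps giving 0.3 mm) follow directly from the speed of light. The sketch below is an illustration of that relation, not the authors' code; it computes the one-way extent c·τ swept during a gate of width τ together with the conventional round-trip lidar depth resolution c·τ/2.

```python
# Sketch (illustration only, not the authors' code): relating gate pulse
# width to the spatial extent of the gated observation window.
C = 299_792_458.0  # speed of light in vacuum, m/s

def gate_window(pulse_width_s):
    """One-way spatial extent c * tau swept during a gate of width tau."""
    return C * pulse_width_s

def range_resolution(pulse_width_s):
    """Conventional round-trip (lidar) depth resolution c * tau / 2."""
    return C * pulse_width_s / 2.0

# Pulse widths quoted in the abstract: 1-10 ns, 1 ps, and femtoseconds.
for tau in (1e-9, 10e-9, 1e-12, 1e-15):
    print(f"tau = {tau:.0e} s: window ~ {gate_window(tau):.1e} m, "
          f"lidar resolution ~ {range_resolution(tau):.1e} m")
```

A 1 ns gate thus corresponds to a window of about 0.3 m, matching the underwater figures in the abstract.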
Dr. Spilios
Riyopoulos, SAIC
Slow light propagation across coupled micro-laser cavities
Saturday 9:50AM
Room 330
Closely packed micro-laser cavities, such as VCSEL arrays, can
interact through their evanescent fringe-fields, so that radiation confined in
one cavity causes stimulated emission and carrier depletion [1] in neighboring
cavities. Active coupling differs from optical interference in “passive”
photonic lattices, including photonic crystals and CROWs (coupled-resonator
optical waveguides), as here the radiation in one cavity modulates the complex
gain of nearby cavities. Such nonlinear cross-cavity interactions endow active
photonic lattices with a rich, multifaceted behavior. Since each cavity
possesses a characteristic (slow
compared to optical) oscillation frequency, a phase-locked array behaves as a
coupled oscillator lattice. Theory and numerical simulations demonstrate that
driving one of the cavities generates periodic variations in amplitude and
phase around the steady-state values, propagating over the entire array. The
dispersion for such lattice waves has been derived for small amplitude
variations around the steady-state [2,3]. Stable low frequency (GHz) optical
waves exist in a wide region of coupling strengths and line-width factor
values. For parameters near the stability boundary the decay constant
approaches zero and these waves propagate over long range. (Beyond that
stability boundary the lattice breaks into self-excited oscillations and
steady-state does not exist in the first place.) Typical propagation speeds ~
km/s show a 5 orders of magnitude reduction below the vacuum light speed c.
While the group velocity decreases with the coupling strength, the coupling
cannot go to zero if coherence is to be maintained. Because such waves involve
oscillations in the coupled fermion-boson gas (carrier and photon densities)
near the material sound speed, they are characterized [2] as photonic sound.
[1] S. Riyopoulos, “Coherent phase locking, collective oscillations and stability in coupled VCSEL arrays,” Phys. Rev. A 66, 53820 (2002).
[2] S. Riyopoulos, “Photonic Sound Excitation in Active Photonic Lattices,” Optics Express 12, 3190-3195 (2004).
[3] S. Riyopoulos, “Slow light waves at sonic propagation speed in active photonic lattices,” Phys. Rev. A 75, 013801 (2007).
Alexander Efros, NRL
Multi-Exciton Generation by a Single Photon in Nanocrystals
Saturday 10:10AM
Room 330
Very efficient multi-exciton generation has been recently observed
in nanocrystals where an optically excited electron-hole pair with an energy
greater than the effective bandgap produces one or more additional
electron-hole pairs [1,2]. We present a theory of multiple exciton generation
in nanocrystals [3]. We have shown that very efficient and fast exciton
generation in nanocrystals is caused by the breakdown of the single-electron
approximation for carriers with kinetic energy above the effective energy gap.
The concept allows us to define the condition for dominant two-exciton
generation by a single photon: the thermalization rate of a single exciton,
initiated by light, should be lower than both the two-exciton thermalization
rate and the rate of Coulomb coupling between single- and two-exciton states.
We have also explained why the threshold of highly efficient multiple exciton
generation in PbSe nanocrystals begins at a photon energy close to three times
the effective energy gap of the nanocrystals.
1. R. Schaller and V. Klimov, Phys. Rev. Lett. 92, 186601 (2004).
2. R. J. Ellingson, M. C. Beard, P. Yu, O. I. Micic, A. J. Nozik, A. Shabaev, and Al. L. Efros, Nano Letters 5, 865 (2005).
3. A. Shabaev, Al. L. Efros, and A. J. Nozik, Nano Letters 6, 2856 (2006).
Jeffrey O. White, ARL
Continuous ‘system level’ scale for laser gain media
Saturday 10:45AM
Room 330
A “system level” scale is proposed, based on the probability of occupation of
the absorbing and emitting pump and laser levels. Its numerical value
coincides with conventional usage of the terms 2-, 3-, and 4-level system. The
physical significance is that on one side of a threshold value the laser beam
gains photons at the expense of the pump beam in steady state, while on the
other side the opposite occurs. The proposed definition is general enough to
apply to semiconductor lasers, but is particularly useful for comparing
systems with discrete levels that are pumped with a narrow-band source in
near-resonance with the laser wavelength. When pumping Er3+ at 1470 nm and
lasing at 1645 nm, the value varies smoothly from 4 to 2 as the temperature
increases, so this pair of transitions is effectively a 3.3-level system at
300 K. When pumping at 1532 nm, a maximum value is reached at ~200 K.
Emily Schultheis,
Student, Glenelg H.S., Howard County, MD (Science Fair Winner in Optics)

Machine Vision Assessment of Distance to Tomatoes of Unknown Diameter
Saturday 11:10AM
Room 330
The objective of these experiments was to determine the distance to
an object of unknown size using binocular digital vision in an algorithm that
may be automated for use by a robot. The eventual goal is to develop a robot
that can autonomously harvest ripe tomatoes or similar delicate agricultural
products. In a series of three studies, analyses of binocular digital images
were performed on a desktop computer running LabVIEW 8.0 with Vision
Assistant 8.0. The distance from the center of a target’s ROI (region of
interest) to each camera’s viewing axis was used to calculate the
camera-to-target distance via the principle of proportional lengths within
similar triangles. The calculation utilized a reference scale of known
dimensions at a known distance from the cameras. The results demonstrated
that the distance to a target of unknown size may be calculated from digital
images made from parallel binocular views of the target within a limited
camera-to-target range.
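The proportional-lengths calculation described above can be sketched as follows. This is a minimal illustration of the similar-triangles principle, not the student's LabVIEW/Vision Assistant code, and all pixel counts, baseline, and reference-scale numbers below are hypothetical.

```python
# Illustrative sketch of binocular ranging by similar triangles. Two
# parallel cameras separated by a baseline B each measure the pixel offset
# of the target's ROI center from their own viewing axis; the disparity
# between those offsets gives the distance.

def focal_length_px(ref_width_px, ref_width_m, ref_distance_m):
    """Calibrate the effective focal length in pixels from a reference
    scale of known width imaged at a known distance: f = px * Z / width."""
    return ref_width_px * ref_distance_m / ref_width_m

def stereo_distance(f_px, baseline_m, x_left_px, x_right_px):
    """Camera-to-target distance Z = f * B / disparity (similar triangles)."""
    disparity = x_left_px - x_right_px
    return f_px * baseline_m / disparity

# A 10 cm reference object spanning 200 px at 1 m gives f = 2000 px.
f = focal_length_px(ref_width_px=200, ref_width_m=0.10, ref_distance_m=1.0)
# ROI-center offsets of +150 px and +30 px give a 120 px disparity.
print(stereo_distance(f, baseline_m=0.12, x_left_px=150, x_right_px=30))
```

With these example numbers the 0.12 m baseline yields a 2.0 m camera-to-target distance, without any prior knowledge of the target's size.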
Dr. H. John Wood, NASA Goddard
Space Flight Center
Hubble Discoveries
Saturday 11:30AM
Room 330
Orbiting high above the turbulence of the earth’s atmosphere, the
Hubble Space Telescope (HST) is providing breathtaking views of astronomical
objects never before seen in such detail. This medium-size telescope can reach
the faintest galaxies ever seen by mankind. Some of these galaxies are as
young as 2 billion years old in a 14-billion-year-old universe. Up until recently,
cosmologists assumed that all of the laws of physics and astronomy applied back
then as they do today. Now, using the discovery that certain supernovae can be
used as “standard candles”, astronomers have found that the universe is
expanding faster today than it was back then: the universe is accelerating in
its expansion. In recent years, a major breakthrough has been made in the field
of cosmology. Using HST and ground-based images, astronomers and physicists
have discovered a new means for measuring the distances to faint galaxies.
Using the light from exploding stars called supernovae of type Ia, observers
can measure distances by comparing their known intrinsic brightness to their
apparent brightnesses. When compared with distances derived from Doppler shifts
in their spectra, supernovae distances of faint young galaxies show that the
universe was expanding more slowly than it does today. The simplest hypothesis
is that because of the Big Bang, the mean density of the universe is decreasing
rapidly with time while the cosmological constant (also known as dark energy)
is there, unchanging, throughout space. Today, dark energy has command of the
expansion of the universe.
Invited talk by Dr. Ken
Stewart, NRL
Mesh Filters for Infrared Astronomy
Saturday 1:30PM
Room 330
Metal meshes or screens have been used as infrared filters since at
least the 1960s. Different metal/dielectric structures can be used to make
lowpass, highpass, and bandpass filters as well as dichroic beamsplitters and
polarizers. Recent advances in numerical simulation and nanotechnology
fabrication techniques permit design and construction of structures with
improved optical performance. New simulation codes and faster computers permit
exploration of novel, more complex three-dimensional structures and can largely
eliminate a costly trial-and-error approach to filter design. The new computer
codes can predict filter and beamsplitter transmittance and reflectance at any
angle of incidence in focused as well as collimated beams. Nanotechnology
fabrication facilities can produce nearly ideal structures which eliminate
fabrication errors as a source of disagreement between theory and experiment,
and extend the technology to shorter wavelengths than previously possible. I
will describe our project at the Naval Research Laboratory’s Nanoscience
Institute and Columbia University to design, fabricate, and test metal mesh
optical components for infrared astronomy.
Geary Schwimmer1, Tom
Wilkerson2, Jed Hancock2, Jason Swasey2, Adam Shelley2, Bruce Gentry3 and Cathy
Holographic Scanning UV Telescope for the Tropospheric Wind Lidar
Technology Experiment
1 Science and Engineering Services, Inc., Columbia,
MD 21046 2 Space Dynamics Lab, Utah State University, Logan, Utah 84322-4405 3
NASA Goddard Space Flight Center
Saturday 2:00PM
Room 330
The first ever Ultraviolet Telescope based on a transmission
Holographic Optical Element (HOE) as the primary optic has recently been
completed as part of the Tropospheric Wind Lidar Technology Experiment
(TWiLiTE). The HOE is mounted in a ring bearing and rotates about its center
normal to effect a 45-degree conical scan while the remainder of the telescope
remains stationary. Preceded by two other similar telescopes at visible and near
IR wavelengths, this is the first UV version of an HOE based telescope. The
telescope and the TWiLiTE instrument are designed for use in the uncontrolled
environment of the NASA WB-57 bomb bay. TWiLiTE will use the HOE telescope with
a pulsed UV laser to profile tropospheric winds from the WB-57 cruise altitude
of 15-18 km down to the surface. The HOE is 40 cm in diameter, 1 cm thick, and
has a 1-m focal length and a 45-degree off-normal field of view. The telescope is
a twice-folded coaxial design, with a flat secondary and a convex tertiary
mirror that directs the light through a hole in the secondary. A 2-cm diameter
laser beam is transmitted coaxially through a 2-mirror periscope mounted in a
hole in the center of the HOE.
Peter Blake1, Joseph
Connelly1, Babak Saif2, Bente Eegholm2, Perry Greenfield2, and Warren Hack2

Spatially Phase-Shifted DSPI for Measurement of Large Structures
1 NASA Goddard Space Flight Center 2 Space Telescope Science Institute
Saturday 2:20PM
Room 330
Successful development of large, lightweight, deployable, cryogenic
metering structures for infra-red space optics requires verification of
structural deformations to nanometer level accuracy in representative test
articles at cryogenic temperature. We review a Spatially Phase-Shifted Digital
Speckle Pattern Interferometer (SPS-DSPI), which was designed for the testing
of optical structures and related technology development structures for the
James Webb Space Telescope (JWST). This instrument is capable of measuring
dynamic deformations of the surfaces of large structures to nanometer accuracy
at cryogenic temperatures. Verification of the instrument was performed using a
single-crystal silicon gauge, which has four islands of different heights that
change in a predictable manner as a function of temperature.
Dr. Joseph Howard, NASA
Goddard Space Flight Center
Optical Modeling Activities for NASA’s James
Webb Space Telescope: Overview and Introduction of Matlab based toolkits used
to interface with optical design software
Saturday 2:40PM
Room 330
The work here introduces some of the math software tools used to
perform the optical modeling activities for NASA’s James Webb Space Telescope
project. NASA has recently approved these in-house tools for public release as
open source, so this presentation also serves as a quick tutorial on their use.
The tools are collections of functions written in Matlab, which interface with
optical design software (CodeV, OSLO, and Zemax) using either COM or DDE
communication protocol. The functions are discussed, and examples are presented.
Dr. Raymond Ohl, NASA Goddard
Space Flight Center
Recent developments in the alignment and test plans
for the James Webb Space Telescope Integrated Science Instrument Module

Saturday 3:00PM
Room 330
The James Webb Space Telescope (JWST) is a 6.6m diameter,
segmented, deployable telescope for cryogenic IR space astronomy (~40 K). The
JWST Observatory architecture includes the Optical Telescope Element (OTE) and
the Integrated Science Instrument Module (ISIM) element that contains four
science instruments (SI) including a Guider. The SIs and Guider are mounted to
a composite metering structure with outer dimensions of 2.1×2.2×1.9m. The SI
and Guider units are integrated to the ISIM structure and optically tested at
NASA/Goddard Space Flight Center as an instrument suite using an OTE SIMulator
(OSIM). OSIM is a high-fidelity, cryogenic JWST telescope simulator that
features a ~1.5m diameter powered mirror. The SIs are aligned to the
structure’s coordinate system under ambient, clean room conditions using laser
tracker and theodolite metrology. Temperature-induced mechanical SI alignment
and structural changes are measured using a photogrammetric measurement system
at ambient and cryogenic temperatures. OSIM is aligned to the ISIM mechanical
coordinate system at the cryogenic operating temperature via internal
mechanisms and feedback from alignment sensors in six degrees of freedom. SI
performance, including focus, pupil shear and wavefront error, is evaluated at
the operating temperature using OSIM. We present an updated plan for the
assembly and ambient and cryogenic optical alignment, test and verification of
the ISIM element.
Bert A. Pasquale and Ross
M. Henry, NASA Goddard Space Flight Center
Functional Testing of Hubble
Relative Navigation Sensor Flight Cameras
Saturday 3:40PM
Room 330
The Hubble Space Telescope (HST) Servicing Mission 4 will run a
live test of the Relative Navigation Sensor (RNS) system for autonomous
rendezvous and capture. The cameras will be mounted on a yoke-type device – the
Multi-Use Lightweight Equipment (MULE) carrier in the Space Shuttle cargo bay.
Using three fixed-pointing cameras with various fields of view and fixed focus
distances, the RNS system will be used to determine the relative spatial
orientation of HST fixtures to the shuttle’s grappling arm. RNS will run in
real time with an onboard processor, but the system also stores all the images
and data for post-mission processing. If this on-orbit test is successful,
future missions may rely on autonomous RNS systems as the primary method to
safely dock with HST or other spacecraft. This presentation will discuss
details associated with testing of the RNS cameras and supporting avionics,
including both individual camera laboratory evaluation and full-system tests
using reduced scale and full-sized models.
Dr. J.C. (Chuck)
Strickland, NASA Goddard Space Flight Center
Fast Optical Processor for
Laser Comm
Saturday 4:00PM
Room 330
Real-time compensation of atmospheric turbulence is an important
unsolved problem limiting the effectiveness of a Laser Communications
(Lasercomm) system. A special purpose optical processor is proposed to
compensate these effects using a DSP (Digital Signal Processor) in combination
with a WFS (Wavefront Sensing) camera and Fast Steering Mirror (FSM). If
successful, the technology facilitates an end-to-end demo of the GSFC Lasercomm
system. The processor will be built and tested at GSFC and then implemented at
the GSFC Optical Test Site.
Athanasios N. Chryssis,
Geunmin Ryu and Mario Dagenais, University of Maryland, College Park, MD

High Resolution Incoherent Optical Frequency Domain Reflectometry
Saturday 4:20PM
Room 330
In this paper we present the performance characteristics of a
tabletop incoherent optical frequency domain reflectometer used for fault
detection in a fiber optic network. The system is based on measuring the
beating of a linearly chirped signal and its delayed reflection after it has
propagated down a fiber network. The beat frequency is proportional to the
delay of the signal and thus the fault distance from the transmitter. In our
experiment we use a chirped microwave signal, ramped from 0 to 1 GHz in
5 seconds, to modulate a laser. We were able to obtain a 10 cm fault resolution.
The system was tested on fibers as long as 25 km. Experimental plots were
confirmed by numerical simulations. Simultaneous multiple faults detection was
also realized on a single fiber branch as well as on a fiber network. Narrow
bandwidth detection favors this scheme over standard time domain techniques. A
high optical dynamic range of over 70 dB was obtained. Another advantage of the
frequency domain method is that there is no intrinsic deadtime for acquiring
the reflected signal, allowing continuous monitoring of the network. This
results in a lower peak laser power requirement for this technique, opening the
possibility of providing built-in test capability using the existing network.
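The beat-frequency-to-distance relation underlying the measurement can be sketched as follows. This is an illustration of the principle, not the authors' apparatus; the fiber group index is an assumed value, while the chirp parameters match the abstract (0 to 1 GHz in 5 seconds).

```python
# Sketch of the incoherent OFDR principle: the beat between the chirped
# modulation and its delayed echo encodes the round-trip delay, hence the
# fault distance along the fiber.
C = 299_792_458.0   # speed of light in vacuum, m/s
N_GROUP = 1.468     # assumed group index of standard single-mode fiber

def fault_distance(beat_hz, chirp_span_hz=1e9, chirp_time_s=5.0):
    """One-way distance to a reflection producing the given beat frequency."""
    chirp_rate = chirp_span_hz / chirp_time_s   # Hz swept per second
    delay = beat_hz / chirp_rate                # round-trip delay, s
    return C * delay / (2.0 * N_GROUP)          # one-way distance, m

def range_resolution(chirp_span_hz=1e9):
    """Two-point resolution set by the swept bandwidth B: c / (2 * n * B)."""
    return C / (2.0 * N_GROUP * chirp_span_hz)

print(range_resolution())  # roughly 0.10 m, matching the 10 cm quoted
```

Note that the resolution depends only on the swept bandwidth, not on the chirp duration, which is why a slow 5 s ramp still resolves 10 cm.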
Christopher Stanford,
Mario Dagenais, Juhee Park, Philip DeShong, University of Maryland, College
Park, MD
An Etched FBG Sensor: Modeling Bio-attachment and Improving Sensitivity
Saturday 4:40PM
Room 330
Fiber Bragg gratings continue to be used in chemical and biological
detection because they offer high sensitivity to bulk refractive index changes.
The minimum index resolution of our etched FBG sensor is 1×10^-6 RIU. We
devised a multilayer model and simulation to show how adsorbed materials shift
the effective refractive index around the grating sensor and thus shift the
reflection spectrum. Applying this method to characterize our silanization attachment
(essential to protein adsorption on silica surfaces), results show that the
etched FBG sensor can detect adsorbed monolayers. We recently fabricated a
fiber Fabry Perot by fusion-splicing two pre-fabricated FBGs which created a
20cm cavity with a finesse of ~30. In order to increase the cavity’s finesse we
need more control over cavity length, FBG reflectivity, and intracavity loss.
The effective cavity length may be altered by etching the fiber in the region
of the cavity thereby allowing adsorbed molecules to shift the FFP spectral
resonances. By fabricating the FBGs in-house we may control the spectral
properties (e.g. resonance linewidths) and enhance the sensitivity of our
etched-fiber chemical sensor. Interrogating the short-cavity fiber Fabry Perot
sensor with a tunable laser will allow us to further reduce the minimum index
resolution by more than an order of magnitude relative to our current set-up
(i.e. EDFA and Advantest spectrum analyzer). By directly modulating the laser
with a slowly ramped voltage, we can improve our wavelength resolution to a
value around 0.01 pm and correspondingly decrease the index resolution to lower
than 1×10^-8 RIU.
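The mapping from wavelength resolution to index resolution quoted above can be sketched as follows. The sensitivity value used is an assumption chosen to be consistent with the abstract's figures (0.01 pm corresponding to about 1×10^-8 RIU), not a measured parameter of the authors' sensor.

```python
# Sketch of the wavelength-to-index resolution mapping for an etched FBG.
# The Bragg condition lambda_B = 2 * n_eff * Lambda means a change in the
# surrounding index that shifts n_eff moves the resonance proportionally.

def index_resolution(dlambda_pm, sensitivity_pm_per_riu):
    """Smallest resolvable index change: wavelength resolution / sensitivity."""
    return dlambda_pm / sensitivity_pm_per_riu

S = 1.0e6  # hypothetical sensitivity, pm of Bragg shift per RIU
print(index_resolution(0.01, S))  # 0.01 pm resolution -> 1e-08 RIU
```

Improving the interrogation (tunable laser instead of an EDFA plus spectrum analyzer) sharpens dlambda, and the index resolution improves in direct proportion.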


Dialogue Among
Natural Resource Societies in the National Capital Area
Saturday 9:00AM
Room 375
This session will include science leaders from a number of
professional societies that share a focus on natural resource issues.
Representatives will be asked to participate in a roundtable discussion on the
impacts and future challenges among a number of trends, such as: global
warming; bio-energy development; the threat of invasive species; and growing
populations in the United States. Panel members include:
Chris Farley, “Land Use, Forests and Agriculture in a Post-Kyoto Climate Change Agreement: Prospects and Implications for Natural Resources Management”
Nicolas W. R. Lapointe, “Non-indigenous Species Introductions – Benefit or Threat?”
David L. Trauger, “The Role of Environmental Societies and Conservation”
Society of American
Foresters Science Exhibition and Showcase
Saturday 2:00PM
Room 375
This session will include posters, journals and resource exhibits
of SAF and NCSAF information for networking with other affiliated societies and
the general public. NCSAF members and leaders will be on hand to discuss the
science mission and roles of the Society, as well as network with other
WAS-affiliated science groups.


Inside a Closed Box: Ionizing
Radiation in Imaging
Chair: Lisa Karam, Deputy Chief Ionizing Radiation
Division, Physics Laboratory, NIST
Room 330
Jeff Cessna.
Ionizing Radiation Division, Physics Laboratory, National Institute of
Standards and Technology
New Paradigms in Diagnostic and Therapeutic
Nuclear Medicine, New Standards
2:00 PM
Medical procedures using unsealed radioactive sources
require the benefit of the procedure to be weighed against the possible risks
to the patient due to radiation exposure. The prescribed dosages for
radiopharmaceuticals, both diagnostic and therapeutic, are generally determined
using “rules of thumb” or canonical values based on patient weight or surface
area. Current research suggests that this method of prescribing dosages for
some procedures may result in overdosing in certain patient populations, most
notably pediatric and geriatric, and can lead to inadequate doses being
delivered to obese patients, requiring that the procedure be repeated. In
either case, the result is unnecessary radiation exposure. A new paradigm is
currently being promoted that seeks to optimize the dosage that a patient
receives by using patient-specific information to predict the correct dosage.
While this represents a major advance in safety and effectiveness in nuclear
medicine, it places greater demands on the accuracy and consistency of the data
used to develop the treatment plan. Perhaps the most limiting factor in the
application of this technique is the quantitation of the Positron Emission
Tomography (PET) or Single Photon Emission Computed Tomography (SPECT) images
that provide the radioactivity uptake data that form the input for the
dosimetry calculations. Additionally, the current lack of suitable standards
makes it difficult to reliably compare imaging data from different scanners and
even between scans of the same patient with the same scanner, a comparison
necessary to track disease response to treatment. A new primary standard and
secondary standards are currently being developed that will allow PET scanners
to be calibrated for activity in absolute terms and will also provide a way to
check and renormalize scanner performance between scans.
Larry Hudson,
Steve Seltzer, Paul Bergstrom, Fred Bateman, and Frank Cerra Ionizing Radiation
Division, Physics Laboratory, National Institute of Standards and Technology
Standards for X-Ray and Gamma-Ray Security Screening Systems
Since the 1920s, the National Bureau of Standards
(NBS), now the National Institute of Standards and Technology (NIST), has been a
world leader in promoting accurate and meaningful measurements, methods, and
measurement services for ionizing radiation and radioactivity. Among other
things, the institute develops, maintains, and disseminates the national
standards for ionizing radiation and radioactivity thereby providing credible
and absolute measurement traceability for the nation’s medical, industrial,
environmental, defense, homeland-security, energy, and radiation-protection
communities. This experience and infrastructure, which includes fundamental
research and radiation-transport modeling, enabled NIST to respond to rapidly
emerging homeland-security needs in the area of x-ray and gamma-ray security
screening. In particular, with funding from the Department of Homeland Security
(DHS), and in alliance with the American National Standards Institute (ANSI),
we report on efforts to develop a suite of national voluntary consensus
standards encompassing all the nation’s security systems that screen
using x-rays or gamma-rays. These include screening of carried items at
checkpoints, airline checked baggage, trucks, cargo containers, human subjects,
and abandoned objects suspected of containing bulk explosives. These
documentary standards focus primarily on imaging quality and radiation safety,
and each specifies test artifacts, test methods, and in some cases required
minimum performance levels. All modalities are treated: transmission and
backscatter geometries as well as computed tomography (CT). The goal is to
provide tools that, for the first time, give governmental users and industrial
partners uniform methods to compare technical aspects related to performance
and safety, inform procurement decisions, and stimulate and quantify future
technological improvements.
Svetlana Nour1,2, Matthew Mille1,3, Kenneth Inn1, Douglas W. Fletcher4
1Ionizing Radiation Division, Physics Laboratory, National
Institute of Standards and Technology 2University of
Maryland, 3Rensselaer Polytechnic Institute,
4National Naval Medical Center, Bethesda MD

Population Radiation Measurement – the Monte Carlo option
In the event of a radioactive accident or incident, one
of the biggest tasks is to estimate the radiation internal dose received by
people to determine the appropriate emergency response needed. As part of these
radiation dose evaluations, accurate evaluation of contaminated people
requires the use of measurement efficiencies based on the geometry of the
radiation detectors and of the human body. This implies that a prohibitively
large number of calibration human body standards (phantoms) would be needed. A
more flexible alternative approach would be to use Monte Carlo computations of
the measurement efficiencies that have been validated against a set of standard
radionuclide phantoms. The scope of the project is to create standard human
body phantoms, to validate their estimated measurement efficiencies from Monte
Carlo computations, and to develop tools to expand the range of body shape and
sizes for Monte Carlo use for individual radioactive victims or patients. This
project begins with a Bottle Manikin Absorption (BOMAB) phantom spiked with
Ga-67 as a standard geometry. The radioactive BOMAB is measured at a number of
distances from an HPGe detector, and the experimental efficiency for our gamma
spectrometry system is determined. The same set of experiments is then modeled
using the Monte Carlo N-Particle Transport Code (MCNP). Each of the plastic
bottles that comprise the BOMAB phantom was individually CT scanned at the
National Naval Medical Center. Using the Monte Carlo software Scan2MCNP (White
Rock Science), the resulting tomograms underwent a process called segmentation
in which materials of interest are assigned to appropriate regions of the
medical images according to their density. Measurement efficiencies were
estimated for the 5 photon energies of Ga-67 with the greatest intensity.
Agreement between the computationally determined and experimentally measured
efficiencies has been achieved to within a few percent, and all within the
estimated uncertainties. With further optimization of the input file, it is
expected that results will improve, and we will be able to move on to more
complicated geometries such as the anthropomorphic phantom, and ultimately to
CT-scanned human individuals.
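The validation strategy above, comparing Monte Carlo efficiencies against a known geometry, can be illustrated with a toy example. This is not MCNP: it estimates only the geometric (solid-angle) efficiency of an on-axis disk detector viewing an isotropic point source, with hypothetical dimensions, and checks the estimate against the exact solid-angle formula.

```python
# Toy Monte Carlo illustration: estimate the geometric detection
# efficiency of an on-axis circular detector face of radius r at distance
# d from an isotropic point source, then compare with the exact result.
import math
import random

def mc_geometric_efficiency(r, d, n=200_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        cos_t = rng.uniform(-1.0, 1.0)  # isotropic: cos(theta) uniform
        if cos_t <= 0.0:
            continue  # photon emitted away from the detector
        # Ray crosses the detector plane within radius r iff d*tan(theta) <= r.
        if d * math.sqrt(1.0 - cos_t**2) / cos_t <= r:
            hits += 1
    return hits / n

def analytic_efficiency(r, d):
    """Exact solid-angle fraction for an on-axis disk: (1 - d/sqrt(d^2+r^2))/2."""
    return 0.5 * (1.0 - d / math.hypot(d, r))

est = mc_geometric_efficiency(r=0.03, d=0.25)
print(est, analytic_efficiency(0.03, 0.25))  # the two agree statistically
```

A real calculation like the one in the abstract must also transport photons through the phantom and detector materials, which is what MCNP adds over this purely geometric toy.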
Daniel S. Hussey Ionizing
Radiation Division, Physics Laboratory, National Institute of Standards and Technology
Neutron Imaging: The key to understanding water management
in hydrogen fuel cells
Since proton exchange membrane fuel cells (PEMFCs)
have high fuel efficiency and emit only water as a byproduct, they are an
attractive alternative to the internal combustion engine. Water management in
PEMFCs critically impacts fuel cell performance, durability, and materials of
construction. Neutron radiography has been the only method able to measure, in
situ, the trace amount of water produced and stored within standard,
commercially viable PEMFCs. This talk will provide an overview of the PEMFC
research performed at the NIST neutron imaging facility, ranging from the
fundamental water transport in the membrane to the impacts of water on a fuel
cell engine.


Fifty years after the International Geophysical Year, scientists
around the world are now engaged in a coordinated program of International
Polar Year (IPY) Research, Education and Outreach. Guided by the U.S. National
Academies of Science “A Vision for the International Polar Year 2007-2009”
report, U.S. scientists are joining in research partnerships with colleagues in
30 nations to improve our understanding of the world’s great ice sheets in
Antarctica and Greenland, to study how ecosystems and human institutions
respond to climate change, and to advance our understanding of climate change
and how it will evolve at a fundamental level. The National Science Foundation
is the lead U.S. agency for the IPY and has worked closely with NASA, NOAA and
many other U.S. agencies to build a common legacy. This keynote session will
showcase several forefront research projects that exemplify the goals of U.S.
and international IPY efforts.
Saturday 11:00AM
Room 375
1. Dr. Kelly Falkner, Program Director Antarctic
Science Division NSF Office of Polar Programs
Saturday 11:00AM
Room 375


Improved scientific understanding of genetic
mechanisms, coupled with recent dramatic advances in technical capabilities,
has put within our grasp the molecular fingerprints and “recipes” of all
tissues, including those harboring disease. These genetic messages may remain
intact in preserved tissues for long periods of time, e.g., centuries.
Currently, many millions of tissue specimens reside in hospital, clinic and
research laboratories throughout the world. Deciphering the genetic messages in
these tissues introduces questions of access, ownership, commercial
capabilities, etc. Ethical, legal and sociological answers will influence the
ultimate utility of such tissues and everyone has a potential stake in how
these questions are answered. This session will put these general questions
into perspective and will tease out individual cases.
Saturday 4:00PM
Room 1235
Dr. William Gardner, Executive Director,
American Registry of Pathology, Introduction
Robin Stombler, President, Auburn Health Strategies, LLC
Giving Yourself Away – A Patient’s Guide to Specimens

Col. Glenn Sandberg, AFIP Tissue Repository
Major Catharine M. With, Tissue Ownership: Legal Considerations


Martin Ogle, Chief
Naturalist, NVRPA
Birds of Prey of Virginia
Sunday 10:00AM
Room 365
This presentation will cover identification and ecology of birds of
prey regularly found in the state of Virginia. Species of hawks, falcons,
eagles, owls and also vultures will be discussed. Ecological information will
include life histories, migration patterns, behavior, sexual dimorphism, and
how these birds fit into the living system. Approximately 15 species of raptors
nest in Virginia, and a number of others regularly migrate through or to the
state. Many of these species have been adversely affected in the past by DDT
and other pesticides and habitat loss continues to be of concern for some
species. Many birds of prey are relatively easy to find and distinguish while
others are rare or secretive. All are excellent “windows” through which to
understand the natural order of Planet Earth. Places to view birds of prey and
techniques/hints for finding them will also be discussed.
Keith Tomlinson, Manager
Meadowlark Botanical Gardens
A Floristic Natural History of the Greater
Washington DC Region in the Potomac River Basin
Sunday 11:00AM
Room 365
Forests of the greater Washington DC Region have evolved over time
on the eastern margin of the ancestral North American continent as part of the
Potomac River Basin. Plant migration and geomorphic processes are considered as
integral to modern distribution. Components of both ancient tropical and
temperate forests exist in woody taxa of the Washington region today. This
paper reviews the composition and distribution of these ancient floras and the
resulting contemporary forest diversity.


Joe Coates, Joseph F.
Coates Consulting Futurist Inc. Washington, DC
Saturday 9:00AM
Room 320
Public surveys show that our fellow citizens do not feel
significantly more secure after 9/11 and the government’s homeland security
responses. Part of the shortfall is due to the creation of an enormous new
organization, which will be slow to come together organically, if it ever does.
Far more significant in the shortfall is the absence of a clearly expressed
theory of terrorism, and consequently of what it might or might not be up to. A
theory of terrorism will be presented, along with its implications for public
and private action.
Kenneth Haapala, President, Philosophical
Society of Washington
ECONOMICS 21: America’s Post Industrial Economy
The 20th Century was one of remarkable transformation for the
American economy. Although it always remained a strong trading nation, in the
early 20th Century America changed from a primarily agrarian, rural economy
to a primarily industrial, urban economy. By mid-Century the United States was
the world’s leading industrial power. America is now changing from an industrial
economy into a post-industrial economy – or, more poetically stated, from a
“perspiration economy” to an “inspiration economy.” Using established sources,
the speaker will trace the important components of these remarkable
transformations. He will emphasize certain characteristics that may surprise
many and suggest what may happen as the transformation to a post-industrial
economy continues.


A Mini-Symposium on “Human
Factors and Driver Safety”
10:00 AM
Room 330
Gerald P. Krueger, Ph.D., Krueger Ergonomics
Effects of Health, Wellness and Fitness on Commercial Driver
Safety: A Review of the Issues
Sunday 10:00AM
Room 320
This presentation: (1) identifies the most common health and
fitness concerns for commercial truck and bus drivers; (2) addresses the
relationship of the most important health risks as they affect on-the-road
driver safety; (3) provides a snapshot of the most promising occupational
health and wellness programs that can be applied to alleviate many driver
health concerns; (4) highlights case studies exemplifying successes by
employers who have already successfully addressed important driver health and
safety issues; and (5) offers suggestions on how government transportation
oversight agencies can positively impact highway safety by setting the proper
tone and by pointing the way to template programs that offer vast improvements,
producing win-wins for transportation industries, highway safety advocates,
and the driving public.
Ronald R. Knipling, Ph.D.,
Virginia Tech Transportation Institute
What Does Instrumented Vehicle
Research Tell Us About Crash Risk and Causation?
Sunday 10:30AM
Room 320
“Naturalistic” driving studies involve instrumenting the vehicles
of volunteer car and truck drivers with video cameras and other sensors to
provide dynamic data on crashes, near-crashes, and other incidents. Drivers
quickly revert to their normal driving habits, which often include many driving
errors and misbehaviors. “Instant replays” of driver behavior and traffic
interactions can be viewed and analyzed. The method also permits other types of
safety analysis that are not possible based on conventional crash investigation
and reconstruction. Naturalistic driving easily provides the missing ingredient
in most driving safety research: customized exposure data. This presentation
discusses and demonstrates three research applications of naturalistic driving:
1. Review and analysis of safety-critical events (a la “instant replays”). 2.
Baseline-event comparisons to identify high-risk driving situations. 3. Driver
exposure-risk comparisons to identify high-risk drivers.
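The baseline-event comparison described above is commonly summarized as an odds ratio: how much more often an exposure appears in safety-critical events than in randomly sampled baseline epochs. The sketch below is illustrative only; the function name and all counts are hypothetical, not data from the research program.

```python
# Illustrative baseline-event comparison (hypothetical counts, not study data):
# how over-represented is an exposure among safety-critical events compared
# with randomly sampled baseline driving epochs?

def odds_ratio(event_exposed, event_unexposed, base_exposed, base_unexposed):
    """Odds ratio of exposure in events vs. baseline; >1 means elevated risk."""
    return (event_exposed / event_unexposed) / (base_exposed / base_unexposed)

# Hypothetical tallies from video review: the exposure of interest appeared
# in 30 of 100 safety-critical events but only 10 of 100 baseline epochs.
risk = odds_ratio(30, 70, 10, 90)   # roughly 3.9: exposure is over-represented
```

The same calculation supports the third application listed above: computing a per-driver odds ratio identifies high-risk drivers rather than high-risk situations.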
Christopher A. Monk, George Mason University
Driver Interrupted: The Costs of Shifting Attention While
Sunday 11:00AM
Room 320
To best understand how people manage multiple tasks, and the costs
of shifting attention between tasks, it is critical to understand how people
resume suspended task goals after interruptions. Several studies have explored
the characteristics of interruptions that make them most disruptive to resuming
the interrupted task; however, there is little task resumption data directly
connected to the driving context. There is ample evidence that people are
attempting to optimize their time by talking on the phone, checking email, etc.
while driving. This situation is potentially dangerous if the costs associated
with shifting attention interfere with required reactions to roadway situations
(e.g., reacting to unexpected object in the road). In this study, a desktop
driving simulator was used to investigate how drivers react to an unexpected
lane drift during an interruption (i.e., when attention was off the road).
Results showed that driver reaction to the lane drift was affected by the
presence of a cognitive task during the interruption. The implications of these
findings for understanding the costs associated with drivers shifting their
attention between the road and in-vehicle tasks will be discussed, as well as
future research plans with this new paradigm.
David M. Cades1, 2, Stephen M. Jones1, Nicole
E. Werner1, Deborah A. Boehm-Davis1
Knowing When to Switch Tasks:
Effectiveness of Internal versus External Cues
1George Mason University,
2Corresponding Author
Sunday 11:30AM
Room 320
It is now commonplace in both our personal and professional
environments to be performing multiple concurrent tasks. Sometimes people
switch between tasks under their own volition (e.g., reach a good stopping
point, finish the task), but other times they are forced to switch by some
external cue (e.g., phone call, knock at the door). While research on task
switching has shown that there is a time cost every time tasks are switched, it
is unclear if there are any performance differences when these switches are
forced as opposed to when a person chooses to switch tasks. The current
research used a category naming paradigm in which participants had to generate
words from four categories. In one condition they could switch between the
categories whenever they wanted and in another condition they could only switch
when cued by the computer. In both conditions there was a time cost associated
with switching categories that was longer than the time between words within a
category. No differences, however, were found between these conditions in the
total number of words generated or the time to switch categories. These results
suggest that people were able to perform multiple concurrent tasks with equal
proficiency regardless of whether they switched between them on their own or
were explicitly told when to switch.


Kelly Colas, James Madison High School; Virginia Heppner, James Madison High School
Mentored by: Charlotte Lanteri, Walter Reed Army Institute of Research, SS, MD;
Jacob Johnson, Walter Reed Army Institute of Research, SS, MD
Assessment of 96- and 384-Well
Malaria SYBR Green I-Based Fluorescence Assay for Use in In Vitro Malaria Drug
Saturday 9:00AM-9:40AM
Room 365
New methods for identifying drug candidates and monitoring drug
resistance trends are required for the devastating tropical disease, malaria.
Malaria parasites, Plasmodium falciparum, are adapted for in vitro growth. In
vitro malaria drug assays are used for the screening of new drug candidates and
surveying the resistance of malaria within a region. An ideal in vitro assay is
time efficient, inexpensive, accurate, and reproducible. One such assay that
fulfills these important criteria is the SYBR Green I assay, developed by
Johnson et al. (2007). The SYBR Green fluorescent dye binds to parasite DNA,
which allows for the measurement of malaria growth. The Malaria SYBR Green
fluorescence (MSF) assay is used to screen compounds for anti-malarial
activity. This fluorescence-based assay is also useful for identifying
drug-resistant populations of parasites from clinical samples. This SYBR Green assay
is efficient, inexpensive, and has proven to be both accurate and reproducible.
In this study, we first verified the efficacy of the Johnson SYBR Green I Assay
by quantifying the 50% inhibitory concentration (IC50) value associated with
various standard antimalarial drugs required for inhibiting P. falciparum
culture growth. The P. falciparum strain D6, a known chloroquine sensitive
strain, and the W2 strain, a chloroquine and multi-drug resistant strain, were
tested in the assay. We then miniaturized the 96-well assay to a 384-well format.
Adapting the assay to a 384-well format makes the screen more time-efficient,
because more tests can be run at the same time, and less expensive, because
roughly the same amount of material yields a greater number of results. A
volume-to-volume scale-down from the 96-well format was used to create the
384-well assay. P. falciparum D6 cells were applied to the 384-well malaria
SYBR Green I (MSF) assay. Several factors were taken into consideration when
miniaturizing the assay and analyzing the data: time, edge effect, transparent
versus black plates, and
background. We compared results from 72- and 96-hour incubation periods to
examine the most effective condition for running these plates. Black and
transparent plates were tested because it was anticipated that the black plates
would yield more effective results than the transparent plates, since
fluorescent dyes (such as SYBR Green) are more likely to yield stronger signals
with the black background. Fluorescence readings of the outermost wells of the
plate sometimes are weakened based on the plate reader’s capabilities, referred
to as the “edge effect.” The Z’ score is an effective method for assessing the
robustness of a biological assay. The results from the Z’ indicate whether this
test is reproducible in a high throughput screening (HTS) capacity. Thus, all
of the assay variables were assessed using a Z’ calculation. In conclusion, the
384 well MSF assay appears to be a reliable HTS for the discovery of novel
anti-malarial drug candidates in a cost and time efficient manner.
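The Z'-factor used above to assess assay robustness has a standard definition: Z' = 1 − 3(σp + σn)/|μp − μn|, computed from positive- and negative-control wells; values above about 0.5 are conventionally taken to indicate a robust high-throughput assay. A minimal sketch follows, using hypothetical fluorescence readings rather than the study's data.

```python
# Illustrative Z'-factor calculation (hypothetical readings, not study data).
# Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|
from statistics import mean, stdev

def z_prime(pos_controls, neg_controls):
    """Z'-factor of a plate; > ~0.5 suggests a robust HTS assay."""
    separation = abs(mean(pos_controls) - mean(neg_controls))
    spread = 3 * (stdev(pos_controls) + stdev(neg_controls))
    return 1 - spread / separation

# Hypothetical fluorescence readings (arbitrary units):
untreated = [5200, 5100, 5350, 5000, 5250]   # full parasite growth
inhibited = [400, 420, 380, 410, 390]        # drug-treated wells
z = z_prime(inhibited, untreated)            # well above 0.5: robust assay
```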
John Russo Jr., St. Vincent of Pallotti High
School Mentored by: Heather O’Brien and Dr. Marc Litz, ARL, MD
Power Applications
Saturday 9:40AM-10:00AM
Room 365
A numerical simulation of a millisecond pulse width transmission
line was modeled in PSpice. The numerical results were compared to a
transmission line built using six capacitors, each of about 42 µF. The
numerical and measured results compared well. This transmission line was used
to evaluate a single silicon-carbide (SiC) Gate Turn-Off thyristor (GTO)
high-current pulsed power switch. The results to date indicate that these new
SiC devices can switch a 1 ms, 350 A pulse, charged to 620 V, without damage.
Further evaluation on this new test-bed will be pursued to identify the limits
of these switches.
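A lumped-element pulse-forming network of the kind described, capacitors and inductors discharging into a matched load, can also be simulated numerically without PSpice. The sketch below is illustrative only: the component values, the six-section ladder topology, and the semi-implicit integration scheme are assumptions for the sketch, not the authors' model.

```python
import math

# Illustrative pulse-forming-network (PFN) discharge simulation.
# Six pre-charged LC sections discharge into a matched resistive load,
# producing an approximately flat millisecond-scale pulse of ~V0/2.

def pfn_discharge(n=6, C=10e-6, L=6.9e-4, V0=600.0, dt=1e-7, t_end=2e-3):
    """Return (times, load_voltages) for an n-section PFN discharge."""
    R = math.sqrt(L / C)          # matched load resistance
    Vc = [V0] * n                 # shunt-capacitor voltages
    IL = [0.0] * n                # series-inductor currents
    times, v_load = [], []
    for k in range(int(t_end / dt)):
        # semi-implicit Euler: update inductor currents, then node voltages
        for i in range(n):
            v_next = R * IL[n - 1] if i == n - 1 else Vc[i + 1]
            IL[i] += (Vc[i] - v_next) * dt / L
        for i in range(n):
            i_in = IL[i - 1] if i > 0 else 0.0
            Vc[i] += (i_in - IL[i]) * dt / C
        times.append(k * dt)
        v_load.append(R * IL[n - 1])
    return times, v_load

# A matched PFN delivers roughly V0/2 to the load for about 2*n*sqrt(L*C)
# seconds (~1 ms with these assumed values).
times, v_load = pfn_discharge()
```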
Muneer Zuhurudeen, Eleanor Roosevelt High
School Mentored By: Dr. Mostafiz Chowdhury, ARL-WMRD, Adelphi, MD
Study of the Scaling Relationships between Full-Scale and Sub-Scale Vehicle
Ballistic Shock Response
Saturday 9:40AM-10:00AM
Room 365
When testing the potency of armor made to protect vehicles from
bomb blasts, ammunition rounds, other dangers of war, and even more
importantly, the soldiers inside, it can become very costly to perform tests on
full-sized prototypes. An alternative is to conduct tests on sub-scale models
because they are less costly to manufacture and easier to handle. However,
determining whether sub-scale models will accurately predict the responses of
full-scale prototypes seems to cause uncertainty. The efficient solution to
this problem is to use finite-element model simulators, such as the program
ABAQUS, to recreate real-life situations in order to test the robustness of the
armor. This report is an analysis of the scaling methods used to design these
simulations, and a test of their validity and effectiveness when predicting
full-scale response during ballistic tests on armor panels. During
experimentation, a full-scale model and a sub-scale model of the right-side
panel of the SAC-11 vehicle were created with a replica scaling ratio of 1:3
and then tested using ABAQUS. Another case was created with full-scale
thickness and threat, but with sub-scale geometry. When creating a life-like
model in a computer simulator, it is important to address environmental
conditions such as physical, material, loading, and boundary conditions. A
force can be introduced in the ABAQUS code by applying amounts of pressure over
a period of time to simulate ballistic impact. Material properties such as the
modulus of elasticity, Poisson’s ratio, and density were also included in the
ABAQUS code. Boundary conditions were also applied to the test panels to
simulate forces on the model, such as weight, that make the
simulation realistic. The data collected in ABAQUS were
imported into MATLAB in order to compute each model’s Shock Response Spectra
(SRS), or a plot of its maximum acceleration responses versus its frequency.
The comparison plots of the SRS data showed that the replica scaling model
(1:3) was an accurate representation of full-scale response at low and high
threat velocities.
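A Shock Response Spectrum of the kind computed in MATLAB above can be sketched as follows: for each candidate natural frequency, a damped single-degree-of-freedom oscillator is driven by the base acceleration history, and its peak absolute acceleration is recorded. This Python sketch is illustrative, not the study's code; the 5% damping ratio and the half-sine input shock are assumptions.

```python
import math

def shock_response_spectrum(accel, dt, freqs, damping=0.05):
    """Peak absolute-acceleration response of a damped single-degree-of-
    freedom oscillator, evaluated at each candidate natural frequency."""
    peaks = []
    for fn in freqs:
        wn = 2.0 * math.pi * fn
        x = v = 0.0           # relative displacement and velocity
        peak = 0.0
        for a_base in accel:
            # relative motion: x'' + 2*z*wn*x' + wn^2*x = -a_base
            a_rel = -a_base - 2.0 * damping * wn * v - wn * wn * x
            v += a_rel * dt   # semi-implicit Euler keeps the oscillator stable
            x += v * dt
            peak = max(peak, abs(a_base + a_rel))  # absolute acceleration
        peaks.append(peak)
    return peaks

# Hypothetical half-sine base shock: 100 m/s^2 peak, 1 ms duration,
# followed by a quiet ring-down interval.
dt = 1e-5
pulse = [100.0 * math.sin(math.pi * i * dt / 1e-3) for i in range(100)]
signal = pulse + [0.0] * 2000
srs = shock_response_spectrum(signal, dt, [50.0, 500.0, 5000.0])
```

Plotting the peaks against frequency gives the SRS curve; scaling laws are then checked by comparing the full-scale and sub-scale curves.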

NATIVE PLANT SOCIETY (see American Society of Plant

Institute for Industrial Engineers)


Health and Disease
in American Public Education Movies, 1930s-1950s

A presentation of public health movies from the collections of
the U.S. National Library of Medicine. Selected and Presented by David Cantor
for The Washington Society for the History of Medicine.
Saturday 9:00AM
Room 380
This film presentation provides a selection
of rarely seen public health movies released between 1938 and 1957. The
presentation includes movies about cancer, tuberculosis, and ‘quackery’ aimed
at a variety of audiences, and produced by an assortment of private,
philanthropic, professional and governmental organizations. Together, they
emphasize the importance to disease control of early detection and treatment;
of seeking care from a recognized physician; and of avoiding ‘quack’ healers
and home remedies. They encourage the public to learn medically-approved danger
signals of disease; to go for regular medical examinations from a recognized
physician; and to involve themselves in campaigns of medical education and
outreach. Thus, they are as much about the marketing of medicine as they are
about the education of the public. As such, they provide a window onto how
orthodox American medical agencies sought to promote their own authority,
expertise and cultural legitimacy in the twentieth century.
Introductory Messages

Advertisements & Announcements

Public Health Messages from the American Dental Association

Main Program
Man Alive, 1952 (11:35 minutes) – American Cancer Society
Let My People Live, 1938 (13:20 minutes) – National Tuberculosis Association
Fraud Fighters, 1949 (15:50 minutes) – RKO Pathe, Inc.
Men of Medicine, 1938 (16:55 minutes) – American Medical Association / March of Time
The Man on the Other Side of the Desk, 1957 (12:30 minutes) – American Cancer Society