
Capital Science 2010

VIDEO RECORDINGS

“Saper Vedere: 50 Years of Insights through Laser”,  Dr. Arden Bement, Director of the National Science Foundation.

“Medicine: Where We Are and Where We Need to Be”,  Jay Sanders, MD.

 

“Chaos Sensitivity to Initial Data”,  Dr. James Yorke, Professor of Mathematics at the University of Maryland.

 

“Growing up with Science at PBS”,  Donelle Blubaugh, Director of Education at PBS.

ABSTRACTS

The Abstracts are preceded by a Table of Contents, which is divided into two sections: the highlights of the Conference followed by an alphabetical listing of the participating Affiliates. Entries in the Table of Contents are linked to the Abstracts of their respective Affiliates.  For details about these presentations, please contact the authors directly.  

TABLE OF CONTENTS

Saturday Dinner Talk – Keynote Address
“Saper Vedere: 50 Years of Insights through Laser” – Dr. Arden L. Bement, Jr., Director, National Science Foundation

Saturday Luncheon Talk
“Medicine: Where We Are and Where We Need to Be” – Jay Sanders, MD

Sunday Luncheon Talk
“Chaos Sensitivity to Initial Data” – Dr. Jim Yorke

Plenary Session
“Growing up with Science at PBS” – Donelle Blubaugh, Director, Pre-K12 Education, PBS (Abstract)

Plenary Session
“Science Debate 2008 – Where Are We Now?” – Mary Woolley, President, Research!America
AAAS – Center for Curriculum Materials in Science
Heather Malcomson, Editor, SB&F, and Maria Sosa, Senior Project Director, AAAS
Excellence in Children’s Science Books
ACM – DC Chapter (formerly the Association for Computing Machinery)
1. Li Chen, Associate Professor, Department of Computer Science and Information Technology, University of the District of Columbia
A Digital-Discrete Method for Smooth-Continuous Data Reconstruction

2. Bill Spees, PhD, Forensic Software Engineer, FDA Center for Devices and Radiological Health, Division of Electrical and Software Engineering; Practitioner Faculty, University of Phoenix Online, and Don C. Berndt, Mapletech Productions, LLC
Introducing State Machines in Everyday Devices
American Society of Cybernetics
1. Stuart Umpleby, Professor, Department of Management and Director of the Research Program in Social and Organizational Learning in the School of Business, The George Washington University
From Complexity to Reflexivity (underlying logics used in science)

2. Ia Natsvlishvili, Associate Professor at Tbilisi State University, Tbilisi, Georgia, and Visiting Scholar at The George Washington University, Washington, D.C.
Georgia’s Actions to Become Integrated into the International Community: A Test of Social Science Knowledge

3. Kent Myers, SAIC and Office of the Director of National Intelligence
The Reflexive Practitioner

4. Lowell F. Christy Jr., Ph.D., Chairman, Cultural Strategies Institute
Notes from the Field: Applied Reflexive Systems Thinking and Wicked Problems

5. Joseph M. Firestone, Ph.D., Managing Director, Knowledge Management Consortium International
The Relevance of Reflexivity
American Society for Technical Innovation
1. Thomas Meylan, PhD, Digital Clones, Inc.
An Algebra for Determining the Value Produced or Consumed by Executive Behavior

2. Gene Allen, Decision Incite
Opportunities and Challenges in Technology Commercialization

3. Geoffrey P. Malafsky, Phasic Systems, Inc.
Data Manufactory — NextGen Unified Data Management

4. F. D. Witherspoon, HyperV Technologies Corp.
Status of Plasma Guns for the Plasma Liner Experiment (PLX)

5. Robin Stombler, Auburn Health Strategies, LLC
Where’s My Nobel Prize and Other Public Relations Faux Pas

6. Richard F. Hubbard, Plasma Physics Division, Naval Research Laboratory
An Overview of Research at the Naval Research Laboratory’s Plasma Physics Division

7. John Bosma, ArcXeon, LLC
Multifunctional Electronic Textiles for Accelerating Poor-Nation Development: Mass-Producible High ‘Technology-Churn’ Platforms for Disruptive, Communications-Driven Development

8. Bob Kolodney
Clearing Fog and Smog – A Potential Solution
Institute of Electrical and Electronics Engineers (IEEE), DC and Northern Virginia Sections
1. Steve Tracton, Consulting Meteorologist
Do Solar Storms Threaten Life as We Know It?

2. Gerard Christman
In Support of Complex Humanitarian Emergencies: A Model for Net-Centric, Federated Information Sharing Amongst US Interagency, Non-Governmental and International Organizations

3. Nick Tran, CEO and Founder, Oceanoco Inc.
UV Emissions from Sonoluminescing Microbubbles

4. Tim Weil, Director, IEEE DC Section
Preserving Our Section History with the IEEE Global History Network (GHN) (excerpted from the Nov. 2009 IEEE History Journal)

5. Haik Biglari, Zareh Soghomonian, and Zaven Kalayjian
Past, Present and Future of the Electrical Power Grid

6. Robert Noteboom, Raj Madhavan, Gil Blankenship, and Ted Knight
Autonomous Robot Speedway Competition

7. James C. Tilton, NASA Goddard Space Flight Center
Image Segmentation Analysis for NASA Earth Science Applications

8. Paul Cotae, University of the District of Columbia
Work in Progress: Teaching Wireless Sensor Network Communication through Laboratory Experiments

9. Kiki Ikossi, DTRA, R&D-CB
Modeling and Measurement of Contact Parameters in Nanostructures

10. Barry Tilton, P.E., Chair, IEEE Northern VA Section
Survey of Implications of New Geospatial Standards and Enhanced Observables to Next-Generation Geospatial Information Systems
Institute of Industrial Engineers, National Capital Chapter / Washington Chapter of the Institute for Operations Research and Management Sciences
1. Anand Subramanian(1), Ram R. Bishu(2), Jeffrey E. Fernandez(1), and Deepak Subramanian(3)
Does Lean-Six Sigma (LSS) Effort Help Predict the Quality of the Product and Increase Profitability? A Healthcare Industry Study

(1) JFAssociates, Inc., Vienna, VA; (2) Department of Industrial Engineering, University of Nebraska-Lincoln, Lincoln, NE; (3) Accenture India, Chennai, India

2. Anil R. Kumar, Brandy Ware, and Jeffrey Fernandez, JFAssociates, Inc.
Virtual Office Ergonomics Evaluation: A Cost-Effective Green Office Implementation Method

3. Charles D. Burdick, Lockheed Martin Inc.
Overview of the DoD Analytic Agenda

4. Charles D. Burdick, Lockheed Martin Inc.
Addressing Hybrid Warfare in a Campaign Environment

5. Steven Wilcox
Simulation Modeling as a Paradigm for Quantitative Sociological Research

6. Neal F. Schmeidler, OMNI Engineering & Technology, Inc.
Staffing Model Development

7. Nastaran Coleman and Ellis Feldman, Federal Aviation Administration
Estimating Conflict Detection Counts in Air Traffic

8. Russell R. Vane III
Modeling an Adaptive Competitor
John W. Kluge Center of the Library of Congress
Mary Lou Reker, Special Assistant to the Director, Office of Scholarly Programs, Library of Congress
Where Scholars Gather
Marian Koshland Science Museum of the National Academy of Sciences
Nancy Huddleston and Ian Kraucunas, National Research Council; Jes Koepfler or Joe Heimlich, Institute for Learning Innovation; Sapna Batish, Marian Koshland Science Museum of the National Academy of Sciences
Communicating Climate Change
National Capital Section / Optical Society of America & IEEE Photonics Society
1. George Simonis
Opening Remarks on the Early Developments of the Laser

2. Ron Driggers, Naval Research Laboratory
Overview of Optical Sciences and Laser-related R&D at NRL

3. Mike Krainak, NASA Goddard Space Flight Center
NASA: Lasers in Space

4. John Degnan, Sigma Space Corp.
3-D Imaging Lidar

5. John Wood, NASA Goddard Space Flight Center
Measurements of the Polar Ice

6. Ward Trussell, Night Vision Laboratory
Development of Compact Lasers for Army Applications at NVESD

7. Gary Wood, United States Army Research Laboratory
DOD High-Energy Solid State Lasers and Selected Laser-Related Efforts at ARL


8. Grace Metcalfe, United States Army Research Laboratory
Generation & Utilization of Optically-Generated THz Radiation

9. Ron Waynant, Food and Drug Administration
Light Therapy and Free Radical Production: Their Role in Cell Function and Disease

10. Ilko Ilev, Food and Drug Administration
Advanced Multifunctional Sensing and Imaging Approaches in Biophotonics and Nanobiophotonics
National Institute of Standards and Technology / University of Maryland Joint Quantum Institute
Panel Presentation: Climate Change and its Mitigation: The Role of Measurement
Presenters:

1. Gerald (Jerry) Fraser, Chief, NIST Optical Technology Division (examples of satellite calibration work)

2. Yoshi Ohno, Group Leader, Optical Technology Division (examples of solid-state lighting work)

3. Hunter Fanney, Chief, NIST Building Environment Division (examples of solar panel work)

Panel Presentation: The Second Quantum Revolution: Putting Weirdness to Work

1. Steven L. Rolston, Professor of Physics, University of Maryland; Co-Director, Joint Quantum Institute
Planck to the Present

2. Luis A. Orozco, Professor of Physics, University of Maryland; Co-Director, JQI Physics Frontier Center
The Quantum Frontier Today

3. Carl J. Williams, Chief, Atomic Physics Division, National Institute of Standards and Technology; Co-Director, Joint Quantum Institute (JQI); Adjunct Professor, University of Maryland
Applications for Tomorrow
Philosophical Society of Washington
1. Eugenie V. Mielczarek, Professor Emeritus, Department of Physics, George Mason University
The Nexus of Physics and Biology

2. Kenneth Haapala, Executive Vice President, Science and Environmental Policy Project (SEPP)
Nature Rules the Climate: The Physical Evidence

3. Larry Millstein, Practicing Biotechnology Patent Law, Millen, White, Zelano & Branigan, PC
Sequencing Single DNA Molecule and NexGen Genomics

4. Major Catherine M. With, Legal Counsel, The Armed Forces Institute of Pathology
The Legal Landscape of Personalized Medicine
Potomac Chapter of the Human Factors and Ergonomics Society
1. William A. Schaudt(1), Darrell S. Bowman(1), Joseph Bocanegra(1), Richard J. Hanowski(1), and Chris Flanigan(2)
Enhanced Rear Signaling (ERS) for Heavy Trucks: Mitigating Rear-End Crashes Using Visual Warning Signals
(1) Virginia Tech Transportation Institute, Blacksburg, VA 24061; (2) U.S. Department of Transportation, Federal Motor Carrier Safety Administration, Washington, D.C. 20590

2. Gerald P. Krueger, Ph.D., Krueger Ergonomics Consultants, Alexandria, VA
Effects of Medications, Other Drugs, and Nutritional Supplements on Driving Alertness and Performance

3. Nicole E. Werner, David M. Cades, Deborah A. Boehm-Davis, Matthew S. Peterson, Sahar J. Alothman, and Xiaoxue Zhang, George Mason University
Where was I and what was I doing? Individual differences in resuming after an interruption and implications for real-world distractions

4. Erik Nelson, David G. Kidd, and David M. Cades, George Mason University
The effect of repeated exposures to simulated driving on ratings of simulator sickness
Potomac Overlook Regional Park Authority (a property of the Northern Virginia Regional Park Authority)
Martin Ogle, Chief Naturalist
Viewing Nature through the “Lens” of Energy
Salisbury University, Washington Academy of Sciences Student Chapter
1. Chuck Davis
Effects of nitrogen availability on lipid production in Neochloris oleoabundans
Faculty Advisor: Dr. Mark Holland, Department of Biology, Salisbury University

2. Katie Pflaum and Justin McGrath
Use of Bdellovibrio bacteriovorus to control infection in Caenorhabditis elegans
Faculty Advisor: Dr. Elizabeth Emmert, Department of Biology, Salisbury University

3. Christina M. Martin
More than meets the ear: male position relative to foam nests influences female mate choice in the túngara frog, Physalaemus pustulosus
Faculty Advisor: Ryan Taylor, Department of Biology, Salisbury University

4. Denise L. Tweedale, Hannah Greene, and Lauren Kopishke
Analysis of the Maryland Residential Housing Sales Data, 1995–2005
Faculty Advisors: Dr. Mara Chen, Dr. Barbara Wainwright, Dr. Veera Holdai, Departments of Geography and Geosciences & Math and Computer Sciences, Salisbury University

5. Rebecca L. Flatley and Frederick D. Bauer
The Risk and Vulnerability Impact Assessment of Sea Level Rise for Wicomico County, Maryland
Faculty Advisors: Michael S. Scott, PhD, and X. Mara Chen, PhD, Department of Geography and Geosciences, Salisbury University

6. Kayla Pennerman
Cuscuta transmission and secondary infection of Fusarium wilt
Faculty Advisor: Dr. Sam Geleta, Department of Biology, Salisbury University

7. Nicole S. Massarelli
The Mathematics Behind Anamorphic Art
Faculty Advisor: Dr. Don Spickler, Department of Mathematics and Computer Science, Salisbury University

8. Sabrina E. Kunciw
Temperature-induced changes in the expression of enzymes involved in membrane restructuring in Coho salmon cells
Faculty Advisor: Dr. E. Eugene Williams, Department of Biological Sciences, Salisbury University

9. Jordan Estes and Shelby Smith
Nordihydroguiaretic Acid in the Polyploids of Larrea tridentata: Effects of Temperature and Developmental Stage
Faculty Advisor: Dr. Kimberly Hunter, Department of Biology, Salisbury University

10. Catherine M. Walsh
The Dynamics of Finite Cellular Automata with Null Boundary Conditions
Faculty Advisor: Dr. Michael J. Bardzell, Department of Mathematics and Computer Science, Salisbury University

11. Robert Figliozzi
Synthesis of Butyrylcholinesterase Inhibiting Nordebromoflustramine B
Faculty Advisor: Dr. Miguel Mitchell, Department of Chemistry, Henson School of Science and Technology, Salisbury University

12. Steven Sanders, Christine Davis and Brett Spangler
Genetic Variability in Five Species of Tree Ferns Collected from Cusuco National Park (Honduras)
Faculty Advisor: Dr. Kimberly Hunter, Department of Biology, Salisbury University
Science and Engineering Apprentice Program, George Washington University
1. Anh Dao, Thomas Jefferson High School for Science and Technology
Mentored by: Dr. Ramchandra S. Naik, Walter Reed Army Institute of Research
Quantification of Paraoxonase Activity in Animal Serum to Study Nerve Toxicity

2. Nader Salass, Washington International School, and Ahmad Yassin, Lincoln Memorial University
Mentored by: CPT Jacob Johnson and Dr. Geoffrey Dow, Walter Reed Army Institute of Research
Hit-to-Lead Evaluation of Antihistamines for Use in Treatment of Malaria
Washington Society for the History of Medicine
1. Stephen Greenberg
Do Come Play: A Demonstration of Online Resources in the History of Medicine from NLM

2. Alain Touwaide
Digitizing Renaissance Herbals – The PLANT Program

3. Christie Moffatt and Susan Speaker
The Profiles in Science Project at the National Library of Medicine

4. Elizabeth Fee
A Rapid Romp through the History of the World Health Organization

5. Patricia Tuohy, Head, Exhibition Program
Medicine Ways: creating an exhibition about Native peoples’ concepts of health and illness

6. Jiwon Kim, Exhibition Program, History of Medicine Division, National Library of Medicine, Bethesda, MD
History, Literature and Science: Engaging Educators and Students in Harry Potter’s World

7. Paul Theerman
The History of Tropical Medicine, as seen in the Images and Archives Collections of the National Library of Medicine

 

ABSTRACTS AND PRESENTATIONS

 

AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE

Heather Malcomson and Maria Sosa
Excellence in Children’s Science Books
The AAAS/Subaru SB&F Prizes for Excellence in Science Books
celebrate outstanding science writing and illustration for children and young
adults. The prizes are part of SB&F’s initiative to encourage more reading,
writing and publishing of quality science books. SB&F, the review journal
of AAAS, has been reviewing science books for children and young adults for
over 45 years. The SB&F Prizes began in 2005 by looking back on decades of
outstanding science books and honoring five authors and one illustrator for
their significant and lasting contribution to children’s and young adult
science literature and illustration. Beginning in 2006, the prizes began
honoring recently published, individual science books. The prizes emphasize the
importance of good science books and encourage children and young adults to
turn to science books, not only for information, but for enjoyment too. Learn
more about the prizes and how you can help promote the mission of encouraging
excellence in the publishing of children’s science books.

 

ACM – DC CHAPTER
(formerly the Association for Computing Machinery)

Li Chen, Associate Professor, Department of Computer Science and Information Technology, University of the District of Columbia
A Digital-Discrete Method for Smooth-Continuous Data Reconstruction
A systematic digital-discrete method is designed for obtaining continuous functions with smoothness to a certain order (C^n) from sample data. The method is based on gradually varied functions and the classical finite difference method. It has been applied to real groundwater data, and the results have validated the method. The method is independent of existing popular approaches such as the cubic spline method and the finite element method, and it also differs from classical discrete methods, which usually use triangulations. The new digital-discrete method has considerable advantages for many real-data applications. It can potentially be used to obtain smooth functions such as polynomials through their derivatives f^{(k)}, as well as solutions of partial differential equations such as the harmonic and other important equations.
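For flavor only, here is a minimal sketch of the classical finite-difference ingredient this abstract mentions: reconstructing a 1-D profile from sparse samples by relaxing the discrete Laplacian. This is ordinary harmonic interpolation, not Chen's gradually-varied-function method, whose details the abstract does not give.

```python
import numpy as np

def reconstruct(n, samples, iterations=5000):
    """Fill an n-point 1-D grid from known samples {index: value} by
    repeatedly averaging neighbors (discrete Laplace relaxation).
    Interior points converge to values whose second difference is zero."""
    f = np.zeros(n)
    for i, v in samples.items():
        f[i] = v
    for _ in range(iterations):
        g = f.copy()
        g[1:-1] = 0.5 * (f[:-2] + f[2:])  # finite-difference averaging
        for i, v in samples.items():       # re-impose the known samples
            g[i] = v
        f = g
    return f

# Three samples on an 11-point grid; the result ramps linearly between them.
print(np.round(reconstruct(11, {0: 0.0, 5: 1.0, 10: 0.0}), 2))
```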
Bill Spees, PhD, Forensic Software Engineer, Division of Electrical and Software Engineering, Center for Devices and Radiological Health, Office of Science and Technology, Food and Drug Administration, and Practitioner Faculty, University of Phoenix Online
Introducing State Machines in Everyday Devices
Finite state machines (FSMs) were developed to keep simple control simple. A state machine is a conceptual or “paper” machine; it has no preference as to its realization. State machines localize control to a “state,” a domain much smaller and simpler than the whole problem. A state is designed to handle only a certain set of “events” that can happen. The state knows how to handle its limited vocabulary of events by performing a (possibly empty) set of actions and going to a particular state (which may not be different from the present state). Events that aren’t handled are rejected. Almost every digital electronic device is designed as a state machine, including cell phones, cruise controls, TV remote controls, clocks, watches, and call management systems. FSMs have so reduced the cost and complexity of developing technology that “high tech” wouldn’t exist without them. Simple systems would have more parts; new designs would be discouraged; and devices would be more discrete because integration would pose a risk of complete project failure. In this short presentation we will explore the states of a phone call management system based on a state machine. Along the way, we will talk about how to recognize the state machines in other technology that we encounter every day, with a view to clarifying our understanding of the capabilities and limitations of state machines.
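A call manager of the kind described above can be sketched as a small table-driven FSM. The states, events, and actions below are hypothetical illustrations, not the presenters' actual demonstration system; the point is that each state accepts only its own small vocabulary of events and rejects everything else.

```python
class CallFSM:
    # Transition table: (state, event) -> (next_state, action).
    # Any (state, event) pair absent from the table is rejected.
    TRANSITIONS = {
        ("idle", "incoming"):    ("ringing", "play ringtone"),
        ("idle", "dial"):        ("dialing", "send digits"),
        ("ringing", "answer"):   ("connected", "open audio"),
        ("ringing", "reject"):   ("idle", "stop ringtone"),
        ("dialing", "connect"):  ("connected", "open audio"),
        ("connected", "hangup"): ("idle", "close audio"),
    }

    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            return f"rejected: {event!r} in state {self.state!r}"
        self.state, action = self.TRANSITIONS[key]
        return action

fsm = CallFSM()
print(fsm.handle("incoming"))  # play ringtone
print(fsm.handle("hangup"))    # rejected: 'hangup' in state 'ringing'
print(fsm.handle("answer"))    # open audio
```

Because all control logic lives in one table, adding a state or event is a one-line change, and illegal sequences (hanging up a call that was never answered) are rejected by construction.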

 

AMERICAN SOCIETY OF CYBERNETICS

Stuart Umpleby, Professor, Department of Management and Director of the Research Program in Social and Organizational Learning in the School of Business, The George Washington University
From Complexity to Reflexivity (underlying logics used in science)
This talk describes the basic features of the theories of complexity and reflexivity, their early history, their evolution, and reactions to date. Although complexity is a major change from previous modeling methods, it does not violate any informal fallacies or any assumptions underlying the philosophy of science. Reflexivity does. Accepting reflexivity as a legitimate movement in science will require an expansion of the conception of science which still prevails in most fields. A shift from Science One to Science Two is now being discussed. The talk explains what is being proposed.
Ia Natsvlishvili, Associate Professor at Tbilisi State University, Tbilisi, Georgia, and Visiting Scholar at The George Washington University, Washington, D.C.
Georgia’s Actions to Become Integrated into the International Community: A Test of Social Science Knowledge
The paper argues that for the improvement of the economic and
social state of transitional countries institutional reforms and integration
into the international community have great importance because successful
institutional reforms encourage investment flow. However, funds from local
sources in transitional countries, particularly in Georgia, are limited. That
is why the main priority is to attract foreign investments. But foreign
investments require important institutional changes. Successful institutional
reforms in Georgia were based on the concepts of the “Washington Consensus”.
The term “Washington Consensus” refers to the macroeconomic policies that
scientists and policymakers in Washington believe governments of emerging
market economies should follow in order to promote their development.
Remarkable efforts are also being made to integrate into the European higher
education area. Georgia is in the process of successfully implementing the
recommendations of the ambitious European Project, the “Bologna Process,” that
will support Georgia’s integration into the international educational
community. The paper describes Georgia’s actions to become integrated into the
international community as a positive test of the “Washington Consensus”
policies and the “Bologna Process”. The paper describes the Georgian economy
and industries where foreign direct investments are being made to illustrate
how institutional reforms can create an attractive business environment and
support economic growth. Georgia remains an attractive country for foreign
investors for several reasons: adequate institutional reforms and a free market
oriented economic policy, an attractive macroeconomic environment, competitive
trade regulations, a liberal tax code, an aggressive privatization policy,
modernized business licensing, an adequate technical regulation system, a
strategic geographical location, a competitive and dynamic banking sector, an
ancient culture and traditions, steady transformation to a market economy, and
diverse investment opportunities. The paper argues that despite the current
global financial crisis and the 2008 Russia–Georgia war, Georgia remains an
attractive place for investments because of successfully implemented
institutional reforms.
Kent Myers, SAIC and Office of the Director of National Intelligence
The Reflexive Practitioner
Turbulent society requires a high rate of learning and adaptation.
The professionals who mediate institutional responses to turbulence are
increasingly ineffective, in part because they employ anachronistic mindsets
that limit sensemaking. Three such mindsets are: Rational, Principled, and
Interested. We describe a Reflexive mindset that works better today. It does
not offer the (illusory) satisfactions of the other mindsets. It can also be
criticized as having parted from science, traditionally understood. It is,
however, well matched with “phronetic social science” and other recent
deviations. We briefly discuss how reflexive practice could improve the
performance of national intelligence, a practice area where turbulence is
undeniable and where traditional social science has had little to offer.
Lowell F. Christy Jr., Ph.D., Chairman, Cultural Strategies Institute
Notes from the Field: Applied Reflexive Systems Thinking and Wicked Problems (Abstract Only)
The way-we-think about problems matters. The “mind helps” of the
scientific method have opened new ways of understanding our physical world. The
scientific method has helped transform the world around us via technology and
applied knowledge based on our understandings. From Francis Bacon to Thomas
Kuhn and the “Structure of Scientific Revolutions” our understanding of
thinking patterns has provided great leverage to impact problems of change.
Simple change entailing physical force and use of scientific knowledge to
create new processes and materials as well as complicated change like the teams
of engineers to put a man on the moon are informed via existing paradigms of
thinking. But in living systems, particularly human systems, our rational
thought and professional experts have not had such success. The human landscape
lies in tatters. This is not merely a matter of simple or complicated change
but complex change. Apart from the theories of reflexive systems thinking how
can we impact complex, “wicked problems?” This presentation will report on
applied reflexive thinking in the Cultural Strategies Institute’s “Outposts of
Innovation.” The field notes from Afghanistan, Jordan and Uganda will be used
as examples of how cybernetics and systems thinking can be applied to create
those levers of change. Concrete examples of design of interventions for
systemic change, the limits of thought, unintended consequences and metrics for
systemic change for wicked problems will be outlined for discussion.
Joseph M. Firestone, Ph.D., Managing Director, Knowledge Management Consortium International
The Relevance of Reflexivity (Abstract Only)

AMERICAN SOCIETY FOR TECHNICAL INNOVATION

Thomas Meylan, Digital Clones, Inc.
An Algebra for Determining the Value Produced or Consumed by Executive Behavior
The behaviors of an organization’s executive team (as well as
those of lower management) will always cost the organization in some vital
resource. For major organizations, such as Fortune 500 companies and
governmental organizations (from the large metropolitan area to the US
Government), the two most commonly affected resources are capital and public
opinion. The question then becomes, “What is the Return On Investment
(ROI) made to support a given executive’s or manager’s
behaviors?” Based on the Drive Satisfaction Strategy Theory of
organizational formation and evolution, an “algebra” quantifying the
ROI of two specific classes of in-house, executive behavior is defined, with
theoretical justifications. The algebra will be applied to “adversarial
executive behavior” and “collaborative executive behavior” to
demonstrate how it can be used to determine the time-dependent
“deltas” in corporate value generated by such behaviors. The terms
“adversarial” and “collaborative” are to be understood in
reference to the mission of the organization, i.e., that an executive’s
behaviors result in diminished organizational success or in enhanced success.
These terms are therefore to be understood also as relating to measurable
performance metrics, as yet to be defined. To be explicit, a positive ROI on
executive behavior indicates that the investment to support an executive
behavior led to greater success for the organization. The algebra supplies an
objective, quantifiable means of determining that value.
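As a toy numeric illustration of the ROI framing above: the formula below is a hypothetical stand-in of my own, since the abstract does not spell out Meylan's actual algebra. It only shows the sign convention the abstract describes, where collaborative behavior yields a positive ROI and adversarial behavior a negative one.

```python
def behavior_roi(value_delta, support_cost):
    """ROI of an executive behavior: (value produced by the behavior,
    minus the cost of supporting it) relative to that support cost."""
    return (value_delta - support_cost) / support_cost

# Collaborative behavior: produces more value than it costs -> positive ROI.
print(behavior_roi(150_000, 100_000))   # 0.5
# Adversarial behavior: consumes value on top of its cost -> negative ROI.
print(behavior_roi(-40_000, 100_000))   # -1.4
```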
Gene Allen, Decision Incite Inc.
Opportunities and Challenges in Technology Commercialization
We have all benefited from the ability to apply new technologies to
improve the quality of human existence. While there have always been challenges
to technology commercialization, the challenges have evolved. I will be sharing
recent experiences in efforts to improve the engineering process through the
use of commodity computing. This will be addressed in the context of the
economic disruption we are experiencing, with a hypothesis presented on the underlying cause: (1) we are victims of our own success, in that a small percentage of the world’s population can now not only feed societies but produce all the material goods they need; (2) as a result, the U.S. business culture has shifted focus from wealth generation to wealth manipulation, generating economic bubbles that create perceived wealth rather than real wealth.
Demographics, markets, and incentives will be reviewed for what will be needed
to provide food, energy, and environmental security in a politically stable
world. A process for addressing contingencies will be included in the
discussion. The presentation will set the stage for technology deployment
opportunities we will need to enable the continued progression of humanity.
Geoffrey P. Malafsky, Phasic Systems Inc.
Data Manufactory — NextGen Unified Data Management
Data management is notoriously expensive, complicated, and
error-prone. Too many organizations spend large amounts of money and time on
software, consultants, technical staff, and seemingly endless business
improvement projects only to end up where they began. Data is disconnected
across business units; data cannot be trusted; data reports have wrong
information; and, the supposedly same data means different things to different
groups. These problems are widely recognized and have spawned new industry
efforts in enterprise architecture, Master Data Management (MDM), and data
governance. Although each of these efforts proffers good guidance and real
benefits, a single coordinated data lifecycle is needed instead of fragmented
processes. Data must be treated as the critical organizational asset it is.
Data manufactory is this coordinated management process recognizing that
organizations manufacture data and can dramatically improve their efficiency
and quality by adopting manufacturing best practices. Outside of the data
management world, a revolution occurred in manufacturing and management that
vastly increased efficiency, productivity, and quality. Global businesses
operate in far-flung locations producing parts for a single product with higher
quality and cost-efficiency than a typical moderately sized enterprise data
system. Within the data management world, businesses have been forced to choose
products from a moribund industry out-of-step with modern practices.
Organizations can no longer muddle through because of the critical role data
plays in executive decision-making, manufacturing, and increasingly, regulatory reporting. (Abstract Only)
F. D. Witherspoon, HyperV Technologies Corp.
Status of Plasma Guns for the Plasma Liner Experiment (PLX)*
High velocity dense plasma jets have been under development at
HyperV Technologies Corp. for the past several years for a variety of fusion
applications such as magneto-inertial fusion, refueling of magnetic confinement
devices, high energy density plasmas, plasma thrusters, and others. In
particular, a spherical array of minirailgun plasma accelerators is planned for
the Plasma Liner Experiment (PLX) to be located at Los Alamos National
Laboratory, and to be performed in collaboration also with the University of
Alabama (Huntsville), the University of New Mexico (Albuquerque), and the
University of Nevada (Reno). The imploding plasma liner will be formed via the
merging of 30 (or more) dense, high Mach number plasma jets arranged in a
spherically convergent geometry in a 9-foot-diameter spherical vacuum chamber
on loan from NASA. Small parallel-plate railguns are being developed for this
purpose, due to their reduced system complexity and cost, with each gun planned
to operate at 400-500 kA peak current, and launching up to 8000 micrograms of
high-Z plasma using high voltage pulse forming networks. We describe the
operation of these minirailguns, their development, their current and projected
performance, and their use in the PLX experiment. *Work supported by the U.S.
DOE Joint Program in HEDLP. (Abstract Only)
Robin
Stombler, Auburn Health Strategies, LLC
Where’s My Nobel Prize and
Other Public Relations Faux Pas
Intellectual honesty and smarts, enthusiasm, a commitment to pursue
an idea for the long-haul, openness to exploration, and creativity are all
important traits for a good scientist to possess. Translating science from the
laboratory to commerce requires these same elements. Yet, sometimes scientists
stop acting like scientists past the point of discovery. This presentation will
discuss why many scientific ideas and exciting research efforts fail to garner
much public attention. It will outline strategies all scientists may employ in
the pursuit of improved public relations.
Richard
F. Hubbard, Plasma Physics Division, Naval Research Laboratory, Washington,
DC
An Overview of Research at the Naval Research Laboratory’s
Plasma Physics Division
The Naval Research Laboratory (NRL) is designated the Navy’s
Corporate Laboratory and is widely considered one of the nation’s leading
research institutions. The NRL Plasma Physics Division performs a broad range
of both basic and applied research that addresses key problems for the Navy and
the nation. Plasmas are ionized gases that occur in many natural or laboratory
environments. The Division has several high-power laser facilities that
generate plasmas for applications in inertial fusion energy (IFE), directed
energy weapons, remote detection of explosives or weapons of mass destruction
(WMDs), compact particle beam accelerators, and triggered lightning discharges.
In the case of the krypton-fluoride (KrF) laser used for IFE, the gain medium
for the laser is also a plasma. The Division also has several large pulsed
power devices, which generate high voltage electrical pulses that are used for
flash x-ray radiography, WMD detection, and electromagnetic launchers
(railguns). The Division also has a unique materials processing facility that
uses electron beam-generated plasmas that can be used to modify fragile
materials such as polymers and graphene that would be damaged by conventional
discharge produced processing plasmas. Finally, there is an extensive effort in
space plasma physics and “space weather”. This area is of
considerable interest to the Navy and DOD because of space plasma effects on
communications and space-based assets.
Supported by the Office of Naval
Research (ONR). To be presented at CapSci2010, Washington Academy of Sciences,
Washington, DC, 27 March 2010. (Abstract Only)
John Bosma, ArcXeon LLC Baltimore, MD
Multifunctional Electronic Textiles for Accelerating Poor-Nation
Development- Mass-Producible High ‘Technology-Churn’ Platforms for Disruptive,
Communications-Driven Development
The concept is to use a single high-tech platform with multiple
embeddable functions to solve massive development problems fast while
unleashing local entrepreneurial talents. This concept turns the traditional
development paradigm on its ear: (1) typical ‘stovepiped’ overseas development
approaches all ‘infrastructure’ functions (e.g. housing, food/water, sewage,
waste, electric power, communications, literacy, health/medical, security)
separately; (2) this means that deploying each function or ‘utility’ demands a
separate champion, funder, approval or buy-in chain, local builders, etc. –
because traditional poor-nation development mandates this stovepiping; (3) the
result is a top-down (vs. bottom-up) development axis, enormous transaction
costs, corruption incentives, and old technologies preferentially fielded; (4) a
better idea is to aggregate these functions for simultaneous deployment via one
platform and launch their new owners and users into the Internet age
immediately for accelerated development; (5) the idea is to select a single
platform that can draw on US, European, Asian-Pacific Rim consumer and business
constituencies and contributors, with full access to ‘technology churn’ that
commoditizes utility functions at progressively lower cost; (6) ‘Disruptive
Development’ via early communications/Internet capability is seen by
development experts like Clayton Christensen (Harvard Business School) as a
new model for poor-nation economic development. (Abstract
Only)
Bob
Kolodney
Clearing Fog and Smog – A Potential
Solution
This talk discusses a new venture based on a patented device that
clears fog and smog and causes precipitation. The device represents the cutting
edge of atmospheric precipitation technology. The equipment generates negative
ions and directs them into the atmosphere. This causes the intensification of
existing clouds and creates clouds where none exist, and in due course, as the
clouds become more dense with water droplets, there is precipitation. The
device appears to be effective in air pollution control, and in creating rain.
It is generally possible to disperse fog within a few hours, and initiate
precipitation within 24-48 hours of starting up the equipment – after a period
of 2-3 weeks of preliminary preparation (analysis, establishment of operating
algorithms). Adjustments in operating algorithms and equipment set-up are made
according to circumstances of temperature, atmospheric pressure, wind,
humidity, geography. A group of 3 to 5 units of equipment can serve an area of
20 by 80 kilometers. The Russian inventors are able to get results under many
different meteorological conditions, and expect that they will be able to
provide assistance in circumstances of fog around airports, smog in cities,
droughts, and forest fires. They are particularly hopeful that they can provide
assistance to farmers faced with drying conditions due to global warming. The
first generation of this equipment was conceived to clear fog around airports
in the former Soviet Union, and improvements were made in stages over a period
of 30 years until initiation of precipitation became possible. The company is
negotiating with several potential customers for a comprehensive demonstration
project. (Abstract Only)

 

INSTITUTE OF
ELECTRICAL AND ELECTRONICS ENGINEERS (IEEE), DC AND NORTHERN VIRGINIA
SECTIONS

Steve Tracton, Consulting Meteorologist
Do Solar Storms Threaten Life as We Know It?
(Abstract Only)
As severe as the possible effects of global warming might be, many
of the worst-case impacts are not likely to occur on timescales less than a
decade or so. Of perhaps more immediate concern to civilization as we know it
— quite literally — is the threat posed by the expected increase in solar
activity starting around 2011-2012, which could disrupt many aspects of life
that societies now take for granted and depend heavily upon for their daily
existence. Electric power grids, communications and navigation systems
(including GPS), and satellites (including weather) could be damaged beyond
repair for many years. The consequences could be devastating for commerce,
transportation, agriculture and food stocks, fuel and water supplies, human
health and medical facilities, national security, and daily life in
general.
Gerard
Christman, Program Manager & Sr. Systems Engr. Office of the Secretary of
Defense Technical Services FCI
In Support of Complex Humanitarian
Emergencies: A Model for Net-Centric, Federated Information Sharing Amongst US
Interagency, Non-Governmental and International
Organizations
In November 2006, the US Department of Defense issued a new policy
entitled Stability, Security, Transition and Reconstruction Operations (DoDD
3000.05). This policy mandated that the US military must treat SSTR Operations,
now shortened to Stability Operations, on par with major combat operations.
Recent efforts in Haiti indicate there remain significant challenges to
civil-military coordination. On the critical path to successful accomplishment
of Stability Operations is the ability to communicate, collaborate, translate
and engage with the civil portion of the calculus. From a military perspective,
neither will the civil side be commanded nor will it often be controlled.
Therefore, traditional C2 methods are not applicable in managing processes that
cross the civil-military boundary while engaged in Stability Operations. This
paper focuses on research into a methodological approach to bridging civil and
military systems that support their distinct business processes, with a view
towards enhancing shared situational awareness, establishing a common assessment
framework, providing a common basis for planning, and synchronizing the ability
to execute those plans.
Nick E.
Tran, CEO & Founder, Oceanoco Inc.
UV Emissions from
Sonoluminescing Microbubbles
In this presentation, we will provide the first direct evidence of
UV emissions resulting from cavitational collapse of microbubbles in water at
ambient pressure. The microbubbles were observed to emit photons of energies
exceeding 6 eV as they underwent asymmetric collapse at the silicon photodiode
detector surface. Each photoemission event consisted of 10⁶ to 10⁷ photons with
irradiance intensities in excess of 1.4×10⁻⁶ W/cm². The calculated curve fits
from black-body radiation and Bremsstrahlung radiation indicate that the
cavitational collapse of the microbubble generated a plasma temperature around
250,000 K. To our knowledge, this is the highest temperature that has ever been
recorded for this phenomenon since it was first reported by Frenzel and
Schultes in 1934.
Tim Weil,
Director, IEEE DC Section
Preserving Our Section History with the IEEE
Global History Network (GHN). Excerpted from the November 2009 IEEE History
Journal.
The Washington, D.C. Section offers a wonderful example of how an
IEEE organizational unit can use the GHN to preserve and showcase its
institutional memory. Many of the Section’s early records have been lost. But
fortunately, back in the early 1950s, someone in the Section put together a
large scrapbook of photographs and extensive summaries of the records spanning
the fifty year period from 1903 to 1953. Recently, this Section has placed all
these documents on the GHN. Not only does this material offer a fascinating
glimpse into the Section’s past, but it also offers a window on to the early
history of AIEE, the development of the engineering profession in the U.S., and
even the history of the District of Columbia. One photograph found in the
Washington, D.C. Section archives page on the GHN is a rare aerial view of the
D.C. urban landscape, taken from a U.S. Army Signal Corps balloon. Another photo
shows a 1908 laboratory at the National Bureau of Standards. In September 1904,
the International Electrical Congress was held in St. Louis. Prominent
electrical engineers made up the delegations from the world’s industrialized
nations. During their stay in the U.S.A., most visited the nation’s capital, and
the Washington D.C. Section was the official host. Preserving institutional
“memory” and making it easily accessible to all is essential to the long-term
continuity of IEEE. The GHN offers IEEE’s technical and geographical units an
easy to use platform to both preserve important historical documents and to
share them easily with the membership at large. We encourage Societies,
Regions, and Sections to preserve and showcase their memories on the GHN. If
your organizational unit has not done this yet, we hope you will follow the
examples of those that have done so. As an illustration of what can be done,
examine the pages created for the Washington, D.C. Section on the GHN. This
presentation will describe the process by which 50 years of IEEE Section
History (Washington Section) was digitally archived and made available to our
members. The talk will also focus on the values and continuity of a major
project to preserve institutional memory for future generations of Electrical
Engineers.
Haik Biglari, Zareh Soghomonian, and Zaven
Kalayjian
Past, Present and Future of the Electrical Power Grid
(Abstract Only)
The twentieth century witnessed the birth of the Electrical Power
Grid (EPG) among many other discoveries and innovations. The EPG has served as
the backbone of economic strength in many industrialized and developing
countries. Considering that EPG covers vast geographical regions with large
variations in environmental conditions, the system has proven to be fairly
robust, with only a few outages per decade in the USA. This robustness was built
into the system by carefully matching the amount of power generation to the
electrical power demand of various load types. The need for larger power
generation and reduction in CO2 emissions is putting a greater demand on the
capabilities of EPG. The key to CO2 reduction is moving towards renewable
energy. The robustness of EPG was due to predictability and centralized
controllability of the energy resources. Unpredictability of renewable
resources makes centralized control achievable only by construction of massive
energy storage devices. The construction of large energy storage devices has
proven to be economically and technically infeasible. The alternative is a
decentralized control and further automation at the lowest possible load
levels. It is clear that the lower the load level at which automation is
applied, the larger the scale of that automation. An EPG that is compatible
with such a large-scale automation level and also incorporates renewable energy
resources efficiently is said to be a Smart Grid (SG). The legacy of the
twenty-first century will be the birth of the SG.
Robert
Noteboom, Raj Madhavan, Gil Blankenship, and Ted Knight
Autonomous Robot
Speedway Competition
For the past two years, the Washington/Northern Virginia Chapter
of the IEEE Robotics & Automation Society and the University of Maryland
Electrical and Computer Engineering (ECE) Department have held an Annual
Autonomous Robot Speedway Competition (ARSC) at the University of Maryland
– College Park. The most recent iteration of the competition, held in October
2009, invited teams of IEEE junior members, university students, and robotics
club members to acquire a deeper appreciation of the state-of-the art and
challenges that are currently the focus of research in robotics and automation.
Competing teams were required to build and demonstrate a robot capable of
traveling one mile on an oval track outlined with orange cones. The competitors
were scored on speed and distance traversed as well as on a technical
presentation about their robot. This presentation will provide an overview of
the competition and a look at some of the competitors and winners from the past
two years.
James C.
Tilton, NASA Goddard Space Flight Center
Image Segmentation Analysis
for NASA Earth Science Applications
NASA collects large volumes of imagery data from satellite-based
Earth remote sensing instruments. Nearly all of the computerized image analysis of
this data is performed pixel-by-pixel, in which an algorithm is applied
directly to individual image pixels. While this analysis approach is
satisfactory in many cases, it is usually not fully effective in extracting the
full information content from the high spatial resolution image data that is
now becoming increasingly available from these sensors. The field of
object-based image analysis (OBIA) has arisen in recent years to address the
need to move beyond pixel-based analysis. The Recursive Hierarchical
Segmentation (RHSEG) software developed by the author is being used to
facilitate moving from pixel-based image analysis to OBIA. The key unique
aspect of RHSEG is that it tightly intertwines region growing segmentation,
which produces spatially connected region objects, with region object
classification, which groups sets of region objects together into region
classes. No other practical, operational image segmentation approach has this
tight integration of region growing object finding with region classification.
This integration is made possible by the recursive, divide-and-conquer
implementation utilized by RHSEG, in which the input image data is recursively
subdivided until the image data sections are small enough to successfully
mitigate the combinatorial explosion caused by the need to compute the
dissimilarity between each pair of image pixels. RHSEG’s tight integration of
region growing object finding and region classification is what enables the
high spatial fidelity of the image segmentations produced by RHSEG. This
presentation will provide an overview of the RHSEG algorithm and describe how
it is currently being used to support OBIA for Earth Science applications such
as snow/ice mapping and finding archaeological sites from remotely sensed
data.
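The region-growing core described above can be sketched compactly. The toy Python below is an illustration only, not the RHSEG code: the function name, the 4-neighbor adjacency, and the mean-difference dissimilarity measure are all assumptions. It repeatedly merges the most similar pair of spatially adjacent regions; RHSEG additionally subdivides the image recursively and groups non-adjacent regions into region classes, which this sketch omits.

```python
# Toy best-merge region growing on a tiny grayscale image. Illustrative
# only: RHSEG also recursively subdivides the image and groups
# non-adjacent regions into region classes, both omitted here.

def segment(img, n_regions):
    h, w = len(img), len(img[0])
    # One region per pixel to start: region id -> (pixel set, value sum)
    regions = {(y, x): ({(y, x)}, float(img[y][x]))
               for y in range(h) for x in range(w)}
    label = {p: p for p in regions}          # pixel -> region id

    def mean(rid):
        pixels, total = regions[rid]
        return total / len(pixels)

    while len(regions) > n_regions:
        # Find the most similar pair of 4-adjacent regions.
        best = None
        for (y, x), a in label.items():
            for q in ((y + 1, x), (y, x + 1)):
                if q in label and label[q] != a:
                    d = abs(mean(a) - mean(label[q]))
                    if best is None or d < best[0]:
                        best = (d, a, label[q])
        _, a, b = best
        pixels_a, sum_a = regions[a]
        pixels_b, sum_b = regions.pop(b)      # merge region b into a
        regions[a] = (pixels_a | pixels_b, sum_a + sum_b)
        for p in pixels_b:
            label[p] = a

    ids = {rid: i for i, rid in enumerate(regions)}
    return [[ids[label[(y, x)]] for x in range(w)] for y in range(h)]
```

On a 2×3 image whose right column is bright, `segment(img, 2)` groups the four dark pixels under one label and the two bright pixels under another.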
Paul
Cotae, University of the District of Columbia
Work in Progress: Teaching
Wireless Sensor Network Communication through Laboratory Experiments

Wireless communications is becoming a transparent technology with
which incoming college students most certainly have vast firsthand experience
as users. Wireless Sensor Network Communications often proves to be a quite
challenging subject to teach because many students appear to find the subject
too technical. In this paper, we present some ongoing capstone design
projects and laboratory experiments to provide the students of wireless
communication and networking with a hands-on experience. The motivation of this
approach is twofold. First, the projects pertain to the area of wireless sensor
networks where rapid technological changes in wireless sensing devices have
changed the types of work electrical and computer-engineering students are
likely to do in their careers. Second, student groups come up with their own
project applications and problem statements for which to design a system.
Kiki
Ikossi, Ph.D. DTRA, R&D-CB
Modeling and Measurement of Contact
Parameters in Nanostructures
Emerging areas of nano-electronics, nano-optics, and molecular
electronics have a common challenge: the transfer of the information from the
nano-scale to our macro world. Early demonstrations of advanced nano-devices
incorporate accessible contacts to nanostructures. In order to observe the
nano-scale phenomena, the contact parameters and the way the contacts interact
with the nanostructures need to be accurately evaluated. In this work, a
3-dimensional multilayer distributed network model is developed that allows the
extraction of contact and interface parameters. The model accounts for the
interactions between adjacent layers and relates the parameters to transmission
line model measurements.
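The transmission line model (TLM) measurements mentioned above have a well-known one-dimensional baseline, sketched below in Python. This is not the authors' 3-dimensional distributed network model; it is the textbook extraction such models extend, and the function name and synthetic values are illustrative. Measured resistance between contact pads grows linearly with pad spacing d, R(d) = 2·Rc + Rsh·d/W, so a least-squares line yields the contact resistance (half the intercept) and the sheet resistance (slope times pad width).

```python
# Textbook 1-D TLM extraction (illustrative baseline, not the 3-D model
# presented in the talk): fit R(d) = 2*Rc + (Rsh/W)*d by least squares.

def tlm_fit(spacings_um, resistances_ohm, width_um):
    """Returns (contact_resistance_ohm, sheet_resistance_ohm_per_square)."""
    n = len(spacings_um)
    md = sum(spacings_um) / n
    mr = sum(resistances_ohm) / n
    slope = (sum((d - md) * (r - mr)
                 for d, r in zip(spacings_um, resistances_ohm))
             / sum((d - md) ** 2 for d in spacings_um))  # slope = Rsh / W
    intercept = mr - slope * md                          # intercept = 2 * Rc
    return intercept / 2.0, slope * width_um

# Synthetic pads: Rc = 5 ohm, Rsh = 100 ohm/sq, pad width 10 um.
d_um = [5.0, 10.0, 20.0, 40.0]
r_ohm = [2 * 5 + 100 * d / 10 for d in d_um]
rc, rsh = tlm_fit(d_um, r_ohm, 10.0)   # recovers rc = 5.0, rsh = 100.0
```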
Barry C
Tilton, P.E. Chair, IEEE Northern VA Section
Survey of implications of
new geospatial standards and enhanced observables to next generation Geospatial
Information Systems
(Abstract not
available)

 

INSTITUTE OF
INDUSTRIAL ENGINEERS, NATIONAL CAPITAL CHAPTER/WASHINGTON CHAPTER OF THE
INSTITUTE FOR OPERATIONS RESEARCH AND THE MANAGEMENT
SCIENCES

Anand Subramanian(1), Ram R. Bishu(2),
Jeffrey E. Fernandez(1), and Deepak Subramanian(3)
Does Lean-Six Sigma
(LSS) Effort Help Predict the Quality of the Product and Increase
Profitability? A Healthcare Industry Study
(Abstract Only)

(1)JFAssociates, Inc. Vienna, VA (2)Department of
Industrial Engineering University of Nebraska-Lincoln Lincoln, NE (3)Accenture
India Chennai, India
Traditionally, the terms quality and process improvement have been
associated with the manufacturing industry. Improving the quality of a product
or service has increased customer satisfaction and the profitability of
organizations. The importance of quality in health care has been recognized and
analyzed in recent years. Hospitals across the United States are beginning to
embrace lean and Six Sigma business management strategies in attempts to reduce
costs and improve productivity. While applied in manufacturing
extensively—and applicable to all industries—these management methods
have moved into healthcare recently, but with little substantive data available
for hospitals to assess the worth of the methods. This presentation discusses
a methodology for using Six Sigma tools in the health care sector. It also
explores the premise that “process quality leads to product quality” and how it
translates into lower healthcare costs, additional savings, and improved
patient outcomes, thereby increasing the revenue per bed. Additionally, it aims
to present some of the critical reasons why hospitals have problems
implementing Lean-Six Sigma methods. This study implements the most widely
used model for quality in healthcare, the “Structure-Process-Outcome” model; the
methodology and preliminary results of the implementation process at a health
care facility in the Southwest United States will be presented.
Anil R Kumar, Brandy Ware and Jeffrey
Fernandez, JFAssociates, Inc
Virtual Office Ergonomics Evaluation: A
Cost Effective Green Office Implementation Method
(Abstract
Only)
In the office environment, musculoskeletal disorders such as carpal
tunnel syndrome are prevalent due to the presence of injury risk factors. Injuries
often begin as discomfort and are caused by the exposure over a period of time
to the repetitive nature of work (long term keyboard and mouse use) or
inappropriately adjusted equipment (i.e. chair and desk). More often than not,
the factors causing discomfort can be assessed through a workplace evaluation
performed by a qualified individual such as an ergonomist. Many companies do
not have the expertise to conduct evaluations in-house and instead hire a
qualified ergonomist. The process followed by the ergonomist during the assessment
includes visiting the workplace to interview the client, collecting
measurements of the client and workplace, and making recommendations. During
the on-site evaluation, the client must host the ergonomist by meeting them,
escorting them to their work area, and giving them undivided attention while the
ergonomist is there. This time away from their work decreases productivity.
From the ergonomist’s perspective, there is time spent commuting, and
collecting measurements at the workplace. In both of these cases, the cost is
transferred to the client. The speed of business today leads clients to expect
a shorter delivery time for the evaluations. Due to these factors, there is a
need for a cost effective approach to solving office ergonomics related
problems. One approach is to provide ergonomic evaluations through a virtual
ergonomist (i.e. not on-site). The evaluation is initiated by the client
contacting the service provider and concludes with the virtual ergonomist
following up
with the client after recommending the necessary modifications. The evaluation
of the workstation includes all of the same basic elements of an in-person
evaluation along with the same deliverables. A virtual system requires an
understanding of the information needed, comprehensive mastery of the
fundamental office ergonomics principles and their application, and an
appropriate interpretation of the data provided by the client. In most cases,
only a certified and qualified ergonomist is an appropriate choice for these
evaluations. The most advanced application of a virtual ergonomist involves the
use of the internet as a platform for providing the aforementioned services.
With that said, a website with interactive screens was developed to solicit
information. Additionally, the website provided the user with the capability to
upload pictures and videos for review. The presentation provides details of the
system that was developed and implemented to provide virtual office ergonomic
evaluations. The critical ergonomic considerations applied in designing the
questions presented on the interactive screens are also described.
Charles D. Burdick, Lockheed Martin Inc.
Overview of the DoD Analytic Agenda (Abstract Only)
Most of the large Government analysis organizations have
traditionally maintained one or more large campaign models to simultaneously
evaluate all their equipment and procedures and justify changes. To feed these
models, these organizations had to put together scenarios of likely situations
where both the concepts and equipment interacted as they would be expected to
work in wartime situations. The results of simulating these scenarios have then
been used to determine the relative value of portrayed systems in these
operational concepts and to make trade-offs among the alternatives that provide
an optimal force. All of these different campaign models are represented as
being joint, meaning they involve the forces of the Army, Navy, Air Force, and
Marines; however, the scenarios chosen for them in the past have generally
reflected the different points of view of each military service. Thus, when
data was collected for the models, it often came from a variety of sources and
timeframes, making it difficult to merge into a common simulation environment.
The various analysis teams usually ended up employing their own Subject Matter
Experts (SME) to build unique scenarios for each project and analysis task.
These scenarios then had to be defended along with the analysis itself. To
confound the situation, warfare and the way we conduct it has been rapidly
changing since the end of the cold war with new technologies, new missions, and
unpredictable enemies engaging our forces not only in traditional warfare, but
also counterterrorism, Homeland Defense, and irregular warfare. The Department
of Defense (DoD) Analytic Community has recognized the problem and has
implemented a formal scenario development process that is designed to cover
current and future threats in the most likely environments. The scenarios that
make up the Analytic Agenda are to be used by all the military components
involved in the evaluation of existing and future forces. The objective is to
provide a suite of scenarios of sufficient breadth to evaluate the force in its
expected environments against a range of possible enemy situations. In
conjunction with the scenarios, the Joint Data Support office was established
to maintain and redistribute the data generated as part of the Analytical
Baselines, making it available for OSD, Agency, and Service studies,
wargames, and simulations. This presentation explains the process of developing
the DoD Analytic Agenda scenarios, describes the current status of the program,
and discusses some of the challenges that have been encountered along the
way.
Charles D. Burdick, Lockheed Martin Inc.
Addressing Hybrid Warfare in a Campaign Environment (Abstract
Only)
Campaign level models have traditionally focused on the
participating combatants waging war in a conventional combat scenario.
Increasingly, however, as stated in the JFCOM Capstone Concept for Joint
Operations (CCJO), Operational Art is becoming the arranging and balancing of
combat, security, engagement, relief and reconstruction, and training
activities to achieve the objectives of the joint operation or
campaign—and their continual rearrangement as that operation or
campaign unfolds. This presentation describes the use of agent-based simulation
objects to simultaneously represent all of the above functions and to shift the
allocation of the available forces among these functions in response to
automated assessment algorithms estimating the success of the campaign using a
combination of hard (e.g. casualties) and soft (e.g. popular support) factors.
The participating units on each side adapt by automatically changing their
doctrinal rule sets in response to achieving both short- and long-term goals. As
new data is continuously generated on Irregular Warfare (IW) by
human-in-the-loop wargames and insightful inputs from both current and
historical IW campaigns, the opportunity exists to expand the capability of
agent-based campaign models to analytically represent all aspects of
operational art in rapidly executing stochastic simulations.
Steven Wilcox, Serco, Inc.
Simulation Modeling as a Paradigm for Quantitative Sociological Research
(Abstract
Only)
The conventional approach to quantitative sociological research is to
posit a theory discursively, discuss the alternative theories, and construct a
test of the theory in which one is interested in only one of the causal paths
in the system, usually by testing a regression parameter. However, the concepts
employed are often system-level attributes such as social disorganization and
the use of prose argumentation to understand how the various effects can be
separated in a regression model is subject to error. Another problem is that
the theoretical entities tend to be fuzzy sociological constructs and poorly
suited for direct measurement. Hence indirect estimation is needed wherein the
simulation model uses fuzzy constructs to reflect sociological theory but is
estimated using measurable entities. To address these concerns, we consider a
methodology for making simulation modeling substitute for the quantitative
reasoning and inference chain and give an example in the area of criminology.
The basis for the new paradigm is a method of flexibly calibrating integrated
sociological/economic/psychological simulation models to available data. In
addition to testing the parameters of interest, it produces useful model
identification information. It sheds light on the sufficiency of measures to
identify the coefficients employed, whether there is redundancy, and whether
the model needs refinement in view of actual data. To illustrate this method,
we employ it to estimate an agent-based model of social influence on
neighborhood crime that incorporates the major theories and fills in the
missing pieces required to form an integrated theory. This model is summarized
as a causal network of concepts with influences between them—a hairball
diagram at the individual level, which is estimated using available published
data. The analysis of the results from estimating the model reveals redundant
parameters based on the simulation runs performed, while showing that one
measure of the actual data cannot be adequately accounted for despite the
model’s having numerous parameters.
Neal F. Schmeidler, OMNI Engineering &
Technology, Inc.
Staffing Model Development (Abstract
Only)
Many organizations are anxious about the anticipated retirement
bubble of baby boomers. Nervousness about this approaching reality is
unnecessary. Human capital planning, based on sound staffing models, provides
the data required to formulate the plan to prevent catastrophic results.
Staffing models are formulas or mathematical models used to estimate the
number of journey-level personnel needed to perform one or more functions for a
specific planning period. Staffing models facilitate budget formulation, cost
control, alignment of resources with output expectations, workforce expansion/
contraction planning, performance measurement, and more. This presentation,
including a case study, describes the steps that lead to credible, defensible
staffing models.
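The abstract does not reproduce its models, but the simplest common form of such a staffing formula divides annual task workload by the productive hours one person can supply per year. The Python sketch below is a hypothetical illustration of that form only; the function name, task mix, and 1,760-hour availability figure are assumptions, not the presenter's case study.

```python
# Minimal staffing formula (illustrative assumption, not the case study):
# required FTEs = total annual workload hours / productive hours per FTE.

def required_fte(tasks, productive_hours_per_fte=1760.0):
    """tasks: iterable of (annual_volume, hours_per_unit) pairs."""
    workload_hours = sum(volume * hours for volume, hours in tasks)
    return workload_hours / productive_hours_per_fte

# 12,000 cases at 0.5 h each plus 800 inspections at 4 h each = 9,200 h.
fte = required_fte([(12000, 0.5), (800, 4.0)])   # about 5.2 FTEs
```

Real staffing models layer onto this base the leave, training, and attrition factors that the presentation's case study addresses.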
Nastaran
Coleman and Ellis Feldman, Federal Aviation Administration
Estimating
Conflict Detection Counts in Air Traffic
The number of potential conflicts that must be detected and resolved between pairs of aircraft reflects sector complexity in the en route environment. It also contributes to air traffic controller workload. A linear programming model was developed in ILOG/OPL to detect potential conflicts between any two aircraft, taking positional uncertainties into account. A set of rules was defined to filter out aircraft pairs having no chance of a conflict. This reduced the number of linear programming iterations from hundreds of millions to tens of thousands. Processing time was further reduced by preventing memory leaks in the modeling environment.
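The abstract does not publish the filtering rules themselves; a hedged sketch of one plausible rule is a coarse reachability check that discards pairs too far apart to lose separation within the lookahead window. All thresholds here are invented for illustration:

```python
import math

def may_conflict(p1, p2, lookahead_s=1200.0, sep_nm=5.0,
                 uncertainty_nm=2.0, max_speed_kt=600.0):
    """Coarse prefilter: could this aircraft pair possibly lose separation
    within the lookahead window? Positions are (x, y) in nautical miles.
    All thresholds are illustrative assumptions, not the FAA's rules."""
    dist = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    # Worst case: both aircraft close head-on at the maximum speed.
    max_closure_nm = 2 * max_speed_kt * (lookahead_s / 3600.0)
    return dist <= sep_nm + uncertainty_nm + max_closure_nm

# A pair 500 nm apart cannot conflict within 20 min; a pair 30 nm apart might.
print(may_conflict((0, 0), (500, 0)))  # → False (filtered out)
print(may_conflict((0, 0), (30, 0)))   # → True (passed to the full LP)
```

Only pairs that survive such a cheap test would need the expensive linear-programming check, which is consistent with the iteration reduction the abstract reports.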
Russell R. Vane III
Modeling an Adaptive Competitor (Abstract Only)
This talk addresses the challenges of modeling a finite but adaptive competitor in order to predict behavior and achieve better-than-game-theoretic outcomes. The competitor is considered to be human (subject to psychological preconditioning) with limited physical and cognitive resources. Using simple observations in a simple game, the audience will be coached in how to observe the choices and outcomes of any competition; plan for, and frequently achieve, victories at work and play; or avoid playing games that are not valuable. Using competition theory, evolving strategies based on behavioral indicators allows a player to anticipate others’ strategy shifts. No knowledge of game theory is presumed. The examples use only arithmetic, and a new notation representing the current game considerations will be revealed.

 

JOHN W. KLUGE CENTER OF THE LIBRARY OF CONGRESS

Mary Lou Reker, Special Assistant to the Director, Office of Scholarly Programs, Library of Congress
Where Scholars Gather (Abstract Only)
The Kluge Center at the Library of Congress presents an opportunity
to attract to Washington the best available minds in the scholarly world. It
facilitates their access to the Library’s remarkable collection of the world’s
knowledge, and engages them in conversation with the U.S. Congress and other
public figures. It funds research in the humanities and social sciences,
including the History of Science. The Center seeks to bring a group of the
world’s best young researchers and senior thinkers into residence, to
stimulate, energize, and distill wisdom from the Library’s rich resources. This
session will be an opportunity to learn more about the Kluge Center and the
research it sponsors.

 

MARIAN KOSHLAND SCIENCE MUSEUM OF THE NATIONAL ACADEMY OF SCIENCES

Nancy Huddleston and Ian Kraucunas, National
Research Council; Jes Koepfler or Joe Heimlich, Institute for Learning
Innovation; Sapna Batish, Marian Koshland Science Museum of the National
Academy of Sciences
Communicating Climate Change (Abstract Only)
A recent survey by the Yale Project on Climate Change and the
George Mason University Center for Climate Change Communication revealed that
over half of the American public is either alarmed or concerned about
anthropogenic climate change. How do we provide the information they need to
make decisions related to climate change? Recent and upcoming work from the
National Research Council addresses many of the important policy issues on this
topic. During this session, speakers will discuss different approaches to
communicating about climate policy to teens and adults through print products,
web sites, and museum exhibitions.

 

NATIONAL CAPITAL SECTION/OPTICAL SOCIETY OF AMERICA & IEEE/PHOTONICS SOCIETY

 

George Simonis
Opening Remarks on the Early Developments of the Laser

No abstract available
Ron Driggers, Naval Research Laboratory
Overview of Optical Sciences and Laser-related R&D at NRL
No abstract available
Mike Krainak, NASA Goddard Space Flight Center
NASA: Lasers in Space
NASA continues to develop and deploy lasers in space primarily for
Earth and planetary science. Recently the Ice, Cloud and land Elevation
Satellite (ICESat) mission used the Geoscience Laser Altimeter System (GLAS)
science instrument to complete 2 billion measurements of the Earth. The Lunar
Orbiter Laser Altimeter (LOLA) science instrument on the Lunar Reconnaissance
Orbiter (LRO) satellite has already made over 1 billion measurements of the
lunar topography with unprecedented precision with more coming. Plans are
underway for the second Ice, Cloud and land Elevation Satellite (ICESat -2)
mission with the Advanced Topographic Laser Altimeter System (ATLAS) science
instrument under development for launch in 2016. Meanwhile, research and
development efforts, including airborne laser remote sensing instruments,
continue on global measurement of carbon dioxide and methane to provide
important information on Earth ecosystem greenhouse gases as well as data for investigating the possibility of life on Mars. New applications for lasers in space include: 1) interplanetary optical communication, with the Lunar Laser Communication Demonstration (LLCD) flight mission on the Lunar Atmospheric Dust Environment Explorer (LADEE) scheduled for a 2013 launch; 2) gravity-wave measurements using the Laser Interferometer Space Antenna (LISA); 3) Earth ecosystem measurements, including vegetation canopy and biomass; 4) global measurement of Earth atmospheric winds; and 5) high-resolution full-Earth topography.
John Degnan, Sigma Space Corp.
3-D Imaging Lidar
The first successful photon-counting airborne laser altimeter was
demonstrated in 2001 under NASA’s Instrument Incubator Program (IIP).
Although the tiny microchip laser transmitter emitted only 2 mJ at a few kHz
rate, the “micro-altimeter” successfully recorded single photon
ground returns in daylight from altitudes as high as 6.7 km. Sigma Space Corporation has subsequently developed second-generation 3D imaging lidars for use in small aircraft or mini-UAVs. From altitudes of 1 km, the lidar
generates contiguous 3D maps with 15 cm horizontal and a few cm vertical (range)
resolution. A frequency-doubled Nd:YAG microchip laser produces a 22 kHz train
of 6 mJ, sub-nanosecond pulses at 532 nm, permitting underwater imaging and
bathymetry. A Diffractive Optical Element (DOE) breaks the beam into a 10×10
array of ground spots, imaged by the receiver onto individual pixels of the
focal plane array detector. Each pixel is then input to one channel of a 100
channel timer for a 2.2 Megapixel/sec data rate. The multiple stop capability
of the detector and range receiver permits daylight operation with large range
gates and enhances penetration of tree canopies, ground fog, water columns etc.
The dual wedge optical scanner characteristics are tailored to provide
contiguous coverage of a ground scene in a single overflight. Recent ground and
flight tests have produced outstanding 3D images with minimal point cloud
processing. By increasing the laser power (P) and/or the telescope aperture
(A), the lidar can be scaled for high altitude operation (40,000 to 60,000 ft)
in support of large-scale national mapping missions. A photon-counting lidar is being seriously considered for NASA’s ICESat-II mission and has been
suggested for high resolution, globally contiguous mapping of the Jovian and
Saturnian moons.
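As a back-of-envelope check, the 2.2 Megapixel/sec figure follows directly from the numbers in the abstract: the DOE's 10×10 spot array gives 100 pixels per pulse at the 22 kHz pulse rate:

```python
# Data-rate check using the abstract's own figures.
pixels_per_pulse = 10 * 10     # DOE breaks the beam into a 10x10 array of spots
pulse_rate_hz = 22_000         # 22 kHz microchip laser pulse train
pixel_rate = pixels_per_pulse * pulse_rate_hz
print(pixel_rate)  # → 2200000 pixels/s, i.e. 2.2 Megapixel/sec
```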
John Wood, NASA Goddard Space Flight Center
Measurements of the Polar Ice
January 2000 to December 2009 was the warmest decade on record.
Greenland is showing a 15-year climate-related trend in ice sheet melt area, with a 5-year periodicity in melt. Antarctica shows significant thinning in the
margins, and growth in the interior for a net loss of ice. ICESat-2 will
continue ice thickness measurements begun in 2003 with the launch of ICESat.
Mr. Ward Trussell, Night Vision Laboratory
Development of Compact Lasers for Army Applications at NVESD
The Laser Technology Team of the Night Vision and Electronic
Sensors Directorate has been active in the development of compact, low cost,
solid state lasers over the last 15 years. Key goals have been reducing the
size, weight and power consumption of laser systems while maintaining the
ability to operate over the full military environment. The development has
resulted in new lasers for laser rangefinders, laser designators and active
imaging applications. This briefing will give a summary of the technologies
developed and show examples of new products that are now in limited production.
Gary Wood, United States Army Research Laboratory
DOD High-Energy Solid State Lasers and Selected Laser-Related Efforts at ARL
The Army Research Laboratory (ARL) is the Army’s corporate research laboratory and as such provides the underpinning research and development for US Army materiel needs. Lasers and enabling technologies are important
components of many Army systems and possible future systems. This presentation
will briefly outline some of the laser R&D within ARL and the motivation.
In addition, this presentation will briefly survey solid state laser
development toward directed energy weapons throughout DOD.
Grace Metcalfe, United States Army Research Laboratory
Generation & Utilization of Optically-Generated THz Radiation
I will give an overview of optically generated pulsed and continuous-wave terahertz (THz) radiation, as well as THz applications in imaging and spectroscopy. Topics will include photomixing, photoconductive antennas, electro-optic sampling, and the photo-Dember effect. I will also discuss current research on THz radiation at the Army Research Laboratory, including the development of nitride semiconductors as novel, efficient THz sources or detectors based on built-in in-plane electric fields, and high-resolution CW spectroscopy.
Ron Waynant, Food and Drug Administration
Light Therapy and Free Radical Production: Their Role in Cell Function and Disease
Light therapy or laser therapy has been around for more than 100
years without a known mechanism, but recent research is showing that radiation,
perhaps broadband, can influence the mitochondria, the organelles in most of
our cells. Depending upon the dose, light or other radiation falling on the
cells can generate free radicals and influence the reaction of the cells in
both positive and negative ways. Free radicals are believed to play a role in malignant diseases, diabetes, atherosclerosis, neurodegenerative diseases, rheumatoid arthritis, HIV infection, ischemia and reperfusion injury, and obstructive sleep apnea; reactive oxygen species (ROS) levels also increase in old age. This talk will give an overview of the progression of this field.
Ilko Ilev, Food and Drug Administration
Advanced Multifunctional Sensing and Imaging Approaches in Biophotonics and Nanobiophotonics
Biophotonics and nanobiophotonics are emerging fields in modern
science and biomedical technology which have opened up new horizons for many
unique practical applications in various areas ranging from minimally-invasive
diagnostics and imaging to development of novel nanobiosensors and
nanobiomaterials. In these fields, there has recently been great impetus for bioimaging and sensing of intracellular structures and functions, as well as for obtaining quantitative information on light-tissue interactions at the cellular and intracellular level in the sub-wavelength nanometer range. The presentation
will cover fundamental principles, recent developments and trends in advanced
biophotonics and nanobiophotonics techniques. It will discuss novel concepts
for ultrahigh-resolution bioimaging and sensing, which are based on alternative
fiber-optic confocal microscope methods. This technology can be employed in
various key biophotonics and nanobiophotonics applications such as developing independent test methods for noninvasive pre-clinical evaluation of the safety and effectiveness of novel medical devices and technology; for studying fundamental
mechanisms of light-tissue interactions at cellular/molecular levels; for
probing/monitoring intracellular structures and functions; and for
characterizing fundamental properties of various nanobiomaterials.

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY (NIST) – PHYSICS DEPARTMENT

Panel Presentation: Climate Change and its Mitigation: The Role of Measurement
(Abstracts not available)

1. Gerald (Jerry) Fraser, Chief, NIST Optical Technology Division
Examples of Satellite Calibration work

2. Yoshi Ohno, Group Leader in Optical Technology Division
Examples of Solid-State Lighting work

3. Hunter Fanney, Chief, NIST Building Environment Division
Examples of Solar Panel work
Panel Presentation: The Second Quantum Revolution: Putting Weirdness to Work
A video of this session is at http://washacadsci.org/videos/nist.htm

1. Steven L. Rolston, Professor of Physics, University of Maryland; Co-Director, Joint Quantum Institute
Planck to the Present

How we got to where we are, from the origins of quantum
mechanics to the grand challenges of the 21st century: A brief summary of the
fundamental components of quantum mechanics and quantum-information
science.

2. Luis A. Orozco, Professor of Physics, University of Maryland; Co-Director, JQI Physics Frontier Center
The Quantum Frontier Today

How JQI uses a multidisciplinary approach to three broad goals: basic research, quantum simulations, and device applications. Among the questions being pursued: What is the ideal qubit, and how can one control decoherence and transfer information?

3. Carl J. Williams, Chief, Atomic Physics Division, National Institute of Standards and Technology; Co-Director, Joint Quantum Institute (JQI); Adjunct Professor, University of Maryland
Applications for Tomorrow

What kinds of problems a quantum information-processing system could address, including secure data encryption, unstructured database searches, and simulation of quantum systems. Why these are intractable using classical tools, and how quantum computing would impact science and society.

PHILOSOPHICAL SOCIETY OF WASHINGTON

Eugenie V. Mielczarek, Professor Emeritus, Department of Physics, George Mason University
The Nexus of Physics and Biology (Abstract Only)
Genetics specifies the organism but all of its functions are
determined by gravity, electromagnetism, thermodynamics and quantum mechanics.
From the mechanics of fueling cells, to the automotive-like clutches of E. coli bacteria, to the ability of geckos to walk on the ceiling, the physics of
living organisms is remarkable. In 2008 the National Academies of Sciences
published “Inspired by Biology: From Molecules to Materials to Machines,”
examining how research at the intersection of physics and biology will lead
to new materials and devices, with applications ranging from nanotechnology to
medicine. This lecture will describe several of these systems and the physics
which governs their motion.
Kenneth Haapala, Executive Vice President, Science and Environmental Policy Project (SEPP)
Nature Rules the Climate: The Physical Evidence
Many organizations have uncritically accepted the reports of the
United Nations Intergovernmental Panel on Climate Change (IPCC) as the
authoritative work on the subject and have accepted the IPCC’s central premise
that carbon dioxide is the principal driver of global warming and climate
change. Yet, thousands of independent scientists have not. Many of these
scientists assert that significant physical evidence contradicts the IPCC’s
central premise. This lecture will cover how the methodology used by the IPCC
results in an overestimate of human influence on global warming and some of the
physical evidence either ignored or dismissed by the IPCC. Kenneth A. Haapala has spent much of his career critiquing quantitative methods similar to those used by the IPCC. Over thirty years ago, he demonstrated that the US Federal Energy Administration used inappropriate models to predict the world would run out of oil and the US would run out of natural gas by the end of the 20th century. He is the Executive Vice President of the non-profit Science and Environmental Policy Project (SEPP) and a contributor to the reports of the Nongovernmental International Panel on Climate Change (NIPCC).
Larry Millstein, Practicing Biotechnology Patent Law – Millen, White, Zelano & Branigan, PC
Sequencing Single DNA Molecules and NexGen Genomics
Revolutionary technologies for determining DNA sequences have been
developed in the last ten years and are being rapidly commercialized. The first
of these “NexGen” devices has already increased the throughput of DNA
sequencers more than 100 fold, and greatly reduced the costs per base. A new
generation of devices now being developed promises to improve performance
another 100 fold and further decrease costs. These technologies will make
possible routine whole genomic studies of individuals. Already, “direct to
consumer” genomics companies have been formed that provide extensive
genetic profiles for just a few hundred dollars. Shortly, such companies will
be able to provide complete individual DNA sequences at relatively modest
prices. The information from these new DNA sequencing technologies will
profoundly alter our understanding of the biological universe that surrounds
us, our own genomes and the roles of human genetic variation in health,
disease, and therapeutic responsiveness. This lecture will describe in some
detail the technology of several second and third generation DNA sequencing
platforms, focusing especially on those that operate on individual (single) DNA
molecules. It also will briefly explore some ways the new sequencing technologies will be used to study genomic variation, gene expression, epigenomics, and metagenomics.
Major Catherine M. With, Legal Counsel, The Armed Forces Institute of Pathology
The Legal Landscape of Personalized Medicine
The term “personalized medicine” has been used to refer to
health care that is tailored to the individual. More recently, the term has
been used to refer to genetically-based health care. There are numerous
apparent benefits to both patients and health care providers when choices about medications, surgery, prevention, and other medical interventions can be made by taking into consideration each patient’s unique circumstances.
This advancement in medicine is derived from the tremendous efforts of the
Human Genome Project, and now genetic tests are increasingly seen as the key to
dramatic improvements in clinicians’ ability to individualize health care.
While the concept of using genetic information to individualize health care is
intuitively appealing, such use of genetic information presents unique legal
and ethical issues. This presentation will discuss the legal landscape of
various U.S. Federal and State laws that have bearing upon the interests of
patients, health care providers, and the scientific community as we move
forward into the realm of personalized medicine.

 

POTOMAC CHAPTER OF THE HUMAN FACTORS AND ERGONOMICS SOCIETY

William A. Schaudt(1), Darrell S. Bowman(1), Joseph Bocanegra(1), Richard J. Hanowski(1), and Chris Flanigan(2)
Enhanced Rear Signaling (ERS) for Heavy Trucks: Mitigating Rear-End Crashes Using Visual Warning Signals
(1)Virginia Tech Transportation Institute, Blacksburg, VA 24061
(2)U.S. Department of Transportation, Federal Motor Carrier Safety Administration, Washington D.C. 20590
In 2006, there were approximately 23,500 rear-end crashes involving
heavy trucks on our roadways. Of these crashes, 135 resulted in fatalities and
1,603 resulted in incapacitating injuries. The Federal Motor Carrier Safety
Administration (FMCSA) contracted with the Virginia Tech Transportation
Institute (VTTI) to investigate methods to reduce or mitigate those crashes
where a heavy truck has been struck from behind by another vehicle. This particular collision type results in higher-than-usual rates of fatalities and injuries compared to rear-end crashes in which the lead vehicle is a light vehicle. The most prevalent contributing factor is the
following-vehicle driver looking away, either into the vehicle interior or to
the outside (but not the forward view). Most previous work on prevention of
rear-end crashes has been directed toward attention-getting and eye-drawing;
that is, trying to get the following-vehicle driver to look forward instead of
continuing to look away. The Enhanced Rear Signaling (ERS) for Heavy Trucks
project investigated many categories of rear-end crash countermeasures which
included both visual and auditory warning signals. The purpose of introducing a
visual warning signal, the focus of this paper, was to redirect the driver’s
attention and visual glance to the forward view. Visual warnings have been
shown to be effective, assuming the following driver is looking directly at the
warning display or has his/her eyes drawn to it. This paper will provide an
overview of testing performed with visual warning signals positioned on the
rear of a heavy truck trailer. These visual warning signals were tested using a
static method (parked vehicles with individuals not driving) to determine how
well various configurations of visual warning signals would provide improved
eye-drawing capabilities. Two static experiments were performed to down-select
several visual warning signal configurations prior to dynamic testing on the
Virginia Smart Road. Each experiment and the results obtained will be
discussed.
Gerald P. Krueger, Ph.D., Krueger Ergonomics Consultants, Alexandria VA
Effects of Medications, Other Drugs, and Nutritional Supplements on Driving Alertness and Performance
This presentation: (1) reviews what the literature conveys about the effects that various chemical substances, such as prescribed or self-administered medications, other drugs like stimulants or hypnotics, and nutritional supplements, including energy drinks, have on roadway driving alertness and performance; (2) addresses issues concerning the use of such chemicals for maintaining alertness, especially in terms of affecting safe driving performance of commercial long-haul truck and bus/motorcoach drivers; and (3) provides recommendations about issues of concern to highway safety advocates, employers, and the driving public, including commercial drivers.
Nicole E. Werner, David M. Cades, Deborah A. Boehm-Davis, Matthew S. Peterson, Sahar J. Alothman, and Xiaoxue Zhang, George Mason University
Where was I and what was I doing? Individual differences in resuming after an interruption and implications for real-world distractions
Interruptions infiltrate our lives in a myriad of ways. Emails,
instant messages, cell phones, and other devices as well as people all vie for
our attention on a daily basis. Studies show such interruptions can be
detrimental to performance and potentially to our safety. In the office
environment one study found in 40% of interrupted situations, people failed to
resume the original task. Interruptions are also detrimental to high risk
environments such as aviation, driving, and in healthcare settings. Hospitals
report distraction was a contributing risk factor in 126 incidents of wrong
site, wrong person, or wrong procedure events. It is important to try to understand the cognitive mechanisms underlying how we attempt to recover from interruptions. A better understanding may lead to better methods to mitigate
the negative effects of interruptions. A study exploring the ways people
recover from interruptions found two distinct groups of performers when the
content and location of tasks were manipulated upon resumption of task
performance. One group was fastest to resume from an interruption when the task
type and location were changed. The other group performed slowest in this
situation. A follow-up study aimed to determine whether cognitive measures of individual differences in working memory and spatial ability could predict the group into which individuals fall. Surprisingly, neither working memory
capacity nor spatial abilities reliably predicted which group people were in
based on their resumption performance, suggesting that for some types of tasks,
performance may be dictated by the task itself, or by some strategic approach
to the task not affected by common cognitive traits such as working memory or
spatial ability. Understanding the differential negative impact of different
types of interruptions has implications for environments such as office work,
driving, and healthcare where distractions are common. In provision of
healthcare, better understanding of the specific mechanisms of interruption and
resumption of task performance will certainly aid in creating a safer medical
environment.
Erik Nelson, David G. Kidd, and David M. Cades, George Mason University
The effect of repeated exposures to simulated driving on ratings of simulator sickness
Simulator-based research is a principal method for exploring human
behavior and performance in dangerous environments. Examples include examining
aviation crew performance while responding to critical failures on the aviation
flightdeck; or assessing performance during distracted driving. However, one
drawback of conducting research in motion-base simulators employing dynamic
computer graphics is the occasional occurrence of simulator sickness, which may
lead some subjects to have to drop out of studies before their performance is
assessed adequately. For those subjects who choose to continue despite the discomforts of simulator sickness, performance may also be adversely affected. While considerable research has been conducted to develop symptom
scales and other measures for screening out people who may be prone to
simulator sickness, few studies have explored how a participant’s
susceptibility to simulator sickness changes with experience. That is, how
their symptoms and performance change over multiple exposures to training or
testing in a simulator. The purpose of this study was to see how the expression
of simulator sickness symptoms changes with repeated exposures to a simulated
driving environment, i.e. do participants learn to cope with or overcome
simulator sickness as a function of continued training? Over the course of
three days, test participants were exposed to 10 separate sessions in a
motion-base driving simulator. Subjective ratings of simulator sickness were
highest during the first simulated drive, and the symptoms seemed to stabilize
over subsequent days. Participants who reported prior motion sickness while
sitting in the rear seat of a vehicle reported higher levels of simulator
sickness during their first exposure to a driving simulator. This research
demonstrates that it may be possible to pre-screen and identify individuals who
may be more prone to simulator sickness than others. Potentially, these
individuals could be “inoculated” against simulator sickness by providing them
with repeated brief exposures to simulated environments.

 

POTOMAC OVERLOOK REGIONAL PARK AUTHORITY (A property of the Northern Virginia Regional Park Authority)

Martin Ogle, Chief Naturalist
Viewing Nature through the “Lens” of Energy
Energy is the greatest “nexus” between human beings and the rest of
the living systems of Planet Earth. At Potomac Overlook Regional Park, in
Arlington, VA, a new exhibit suite called the “Energerium” explores this relationship, viewing nature through the “lens” of energy. In the coming decades, society will need to address large and unprecedented challenges related to energy use, and it is important that the public be aware of and conversant on these issues and how to solve them. This program will discuss the
flow and main points of the Energerium and the relevance and importance of this
message for school curricula and public education facilities. The exhibit suite
is divided into four main areas. The first area, titled “It’s All Energy,”
introduces energy as central to all life on Earth (including all human
systems). The second area, “Nature Transforms Energy,” takes classic ideas of
ecology (trophic levels, production, consumption, decomposition, etc.) and
weaves them together as a cycle driven by the sun’s energy (literally in a
physically circular layout). This area includes an introduction to Gaia Theory,
the scientific idea of Earth as a single living system. Visitors then find
information on the history of human energy use (especially locally) in the
third area, called “Energy Transformation and You,” and explore energy
challenges and solutions in the final area entitled “Refocusing Our Energy.”
This program will also describe the working solar energy, energy efficiency, and conservation systems employed at Potomac Overlook not only to educate the public, but to minimize the park’s energy consumption.

 

SALISBURY UNIVERSITY, WASHINGTON ACADEMY OF SCIENCES STUDENT CHAPTER

Chuck Davis, Faculty Advisor: Dr. Mark Holland, Department of Biology, Salisbury University
Effects of nitrogen availability on lipid production in Neochloris oleoabundans
In order to reduce America’s dependence on foreign oil, biodiesel
has been suggested as an alternative that can be produced domestically.
Cellulosic and seed sources, however, require vast acreage and influence the
price of food. Lipid rich algae are a source of biodiesel that can be produced
in large quantities in a small area. In order to develop a strategy for increasing the productivity of algae farms, one of two types of pink-pigmented facultatively methylotrophic bacteria (PPFMs), either wild-type Methylobacterium mesophilicum or a B-12 overproducing mutant, was co-cultivated with the microalga Neochloris oleoabundans. After
measuring algae growth, significant differences were noted between treatments
with added bacteria and those grown without, as well as between the two strains
of bacteria. Analysis shows that the addition of even a small number of PPFMs
enhances the growth of algae. Fourteen days into the trials, however, the
chlorophyll content of all cultures decreased dramatically and cell size
increased. These are symptoms of depletion of nitrogen in the media. Without
nitrogen, algae are unable to synthesize chlorophyll or proteins. Without
nitrogen the cells are forced to convert new photosynthate into hydrocarbons
and simply store them. It has been shown that under nitrogen limiting
conditions N. oleoabundans responds with increased lipid production. The
purpose of this study is to understand the growth of N. oleoabundans
when co-cultivated with PPFMs under varied nitrogen concentrations and to
develop a protocol for optimum cell growth and lipid production that will be of commercial significance.
Katie Pflaum and Justin McGrath, Faculty Advisor: Dr. Elizabeth Emmert, Department of Biology, Salisbury University
Use of Bdellovibrio bacteriovorus to control infection in Caenorhabditis elegans
Bdellovibrio bacteriovorus is a motile, predatory Gram-negative bacterium that feeds on various types of other Gram-negative bacteria.
While Bdellovibrio occupies the periplasmic space of a host bacterium,
it facilitates the breakdown of host macromolecules in order to produce
progeny. After exhausting host nutrients, Bdellovibrio lyses the host
cell and searches for new prey to feed on. Caenorhabditis elegans has
been proven to be an ideal animal model for the study of bacterial pathogens,
including pathogens that affect humans. A wide range of studies has shown that once ingested, pathogenic bacteria reside in the intestinal lumen of C. elegans, eventually causing death of the worms. We exposed C. elegans to pathogens for approximately forty-eight hours to establish
intestinal infections. Then we treated the nematodes with a liquid
Bdellovibrio solution for fifteen minutes, allowing the
Bdellovibrio and the pathogen to interact in the intestinal lumen of the
C. elegans. Our lab has demonstrated that worms exposed to
Bdellovibrio in this manner survive longer than control worms exposed to
the pathogen. Additionally, we have examined persistence of Bdellovibrio
on the treated worms. Following the Bdellovibrio treatment, the worms
were ground in a solution and plated out. Preliminary data from our lab shows
that Bdellovibrio remains present in C. elegans for an average of
four days at a high concentration.
Christina M. Martin; Faculty Advisor: Ryan Taylor, Department of Biology, Salisbury University
More than meets the ear: male position relative to foam nests
influences female mate choice in the túngara frog, Physalaemus
pustulosus
Vocalizations of male frogs are critical for attracting mates
during courtship. Females express strong mate preferences for males who produce
vocalizations with specific properties (e.g. fast rates or lower frequencies).
This female preference produces a selection pressure favoring males with
particular calls. Recent work has demonstrated that visual signals are also
important for mate attraction, even in nocturnally active species.
Túngara frogs, Physalaemus pustulosus, are tropical frogs that
build conspicuous white foam nests in which eggs are deposited. Foam nests
persist for several days and on subsequent nights, male frogs are often
observed calling adjacent to these foam nests. We tested the hypothesis that
females preferentially approach the vocalization of a male adjacent to a foam
nest. We conducted a two-stimulus choice test where a call was broadcast
antiphonally from each speaker. We placed a Petri dish in front of each
speaker; one contained a foam nest and the other contained water only. A female
was placed under a funnel equidistant from each speaker and allowed to
acclimate to the calls before raising the funnel. We scored a female’s choice
when she approached to within 5 cm of a speaker. Females expressed a
significant preference for the speaker with the foam nest. These data are the
first to show that males may position themselves next to visually conspicuous
objects in the environment (foam nests) to improve their probability of
attracting a mate. Thus, sexual selection in frogs is likely to be more
complicated than simple female attraction to male vocalizations.
Denise
L. Tweedale, Hannah Greene, and Lauren Kopishke Faculty Advisors: Dr. Mara
Chen, Dr. Barbara Wainwright, Dr. Veera Holdai, Departments of Geography and
Geosciences & Math and Computer Sciences, Salisbury University
Analysis of the Maryland Residential Housing Sales data, 1995 –
2005
The recent housing market crash has contributed significantly to
the economic recession. It is imperative to develop a better understanding of
factors that affect the housing market in the wake of the economic crisis. This
study analyzed housing data from the years 1995 to 2006 on 12 different
variables such as the number of house sales, median sale price, average
interest rate, and various population and socioeconomic factors. Cluster
analyses were done to separate the state into different regions based on the
annual changes of housing sales data. Regression modeling was then carried out
to examine the effects of a common set of factors on the residential housing
markets in each of the cluster regions.
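The cluster-then-regress workflow described above can be sketched as follows. This is a minimal illustration, not the authors' code: the data are randomly generated stand-ins, the number of counties (24), the cluster count (2), and the two "socioeconomic factors" are all made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the data described in the abstract:
# rows = Maryland counties, columns = annual change in housing sales.
sales_change = rng.normal(size=(24, 11))

# Step 1: cluster counties by their sales trajectories (simple 2-means sketch).
def kmeans(X, k=2, iters=50):
    centers = X[:k].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each row to its nearest center
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

labels = kmeans(sales_change)

# Step 2: within each cluster, regress median sale price on a common set of
# predictors (here: two made-up factors) by ordinary least squares.
price = rng.normal(size=24)
factors = rng.normal(size=(24, 2))
for j in range(2):
    m = labels == j
    X = np.column_stack([np.ones(m.sum()), factors[m]])
    coef, *_ = np.linalg.lstsq(X, price[m], rcond=None)
    print(f"cluster {j}: coefficients {coef.round(2)}")
```

The point of the two-stage design is that a single statewide regression would average over regionally distinct markets; clustering first lets the same predictor set carry different coefficients in each region.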
Rebecca
L. Flatley and Frederick D. Bauer Faculty Advisors: Michael S. Scott, PhD and
X. Mara Chen, PhD, Department of Geography and Geosciences, Salisbury
University
The Risk and Vulnerability Impact Assessment of Sea Level
Rise for Wicomico County, Maryland
Over half of the world’s population lives near or along the
coastline, which is also the case for Wicomico County, Maryland. It is situated
between the Chesapeake Bay and Atlantic Ocean. Consequently, it is extremely
susceptible to the sea level rise and has experienced frequent flooding. In
addition, the flood impact can be potentially catastrophic since it is the
cultural, economic, and transportation center of Maryland’s Eastern Shore. The
objective of the study is to assess the risk and vulnerability of sea-level
rise to Wicomico County. The overall structural damage and monetary loss within
the 100-meter buffer zone of the 100-year floodplains are evaluated and
assessed by assuming worst-case scenarios of two to seven millimeters of
sea-level rise per year for the next 50 and 100 years. These projections are
explored using the Federal Emergency Management Agency’s (FEMA) loss estimation
software, HAZUS-MH MR4 and ESRI’s ArcGIS. The research findings are of
importance to local community planning and government decision making
processes, and the research method is applicable to the studies of other
coastal areas.
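For scale, the worst-case rates quoted above compound linearly to the following totals (simple arithmetic on the abstract's stated assumptions, not a result of the study):

```python
# Linear sea-level-rise totals implied by the abstract's assumed rates
# of 2-7 mm/yr over 50- and 100-year horizons.
for rate_mm in (2, 7):
    for years in (50, 100):
        total_m = rate_mm * years / 1000  # convert mm to meters
        print(f"{rate_mm} mm/yr over {years} yr -> {total_m:.2f} m")
```

That is, the scenarios span roughly 0.1 m to 0.7 m of total rise, which is what the HAZUS-MH loss estimates are built on.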
Kayla
Pennerman Faculty Advisor: Dr. Sam Geleta, Department of Biology, Salisbury
University
Cuscuta transmission and secondary infection of Fusarium
wilt
Fusarium oxysporum f. sp. lycopersici (Sacc.) W.C. Snyder
and H.N. Hansen infects the vascular systems of tomatoes, resulting in Fusarium
wilt. This disease is generally considered monocyclic, as the fungus remains
within the host until tissue death. For the disease to be polycyclic, the
fungal propagules must originate from an infected host plant and be transferred
to a new host. One possible mode of transmission may be through Cuscuta.
Species of this genus are rootless vine-like parasitic angiosperms, which form
intimate aerial relationships with the vascular systems of a wide range of
plants. Symplastic connections exist between the parasite and host xylems and
phloems. Through these, vascular-colonizing viruses and prokaryotes are able to
utilize Cuscuta as a vector. This study aims to demonstrate the
possibility of fungal transmission via a parasitic angiosperm. Given that the
symplastic connections are large enough to allow entry of hyphae and/or
microconidia and Cuscuta does not have efficient defense mechanisms
against the proposed fungus, it is conceivable that a pathogenic fungus could
be transmitted. If transmission is successful, the second goal will be to
demonstrate the possibility of secondary infection of F. oxysporum f. sp.
lycopersici
in tomatoes. The results will further characterize the
relationship between C. pentagona and its hosts; and they will have
implications on possible transmission mechanisms and on the disease cycle of
Fusarium wilt in tomatoes.
Nicole S. Massarelli Faculty Advisor: Dr. Don
Spickler, Department of Mathematics and Computer Science, Salisbury
University
The Mathematics Behind Anamorphic Art (Abstract Only)
Anamorphic art is created by distorting an image so that it is only
revealed from a single vantage point or from its reflection on a mirrored
surface. This artistic process was first attempted during the Renaissance and
became exceedingly popular during the Victorian Era. The earliest known
examples come from the notebooks of Leonardo da Vinci. He successfully sketched
an eyeball in 1485 that could only be discerned when looking at the drawing
from a certain angle. Artists can achieve this illusion by drawing the image on
a distorted grid or looking at the mirror image while drawing on a flat
surface. More modern artists using these techniques include Julian Beever, Hans
Hamngren, and István Orosz. Hamngren and Orosz use the mirrored cylinder
technique while Beever creates three-dimensional illusions on sidewalks using
chalk. This project explores how to use mathematics to distort a given image so
that it appears correctly on a mirrored surface. Using mainly vector calculus
and ray tracing techniques we determined an algorithm for printing the
distorted image for the cylinder and the sphere. We then moved on to
triangulation methods that can be used for surfaces defined as a mesh. In this
talk we will discuss the general procedure for mapping the original image to
the distorted image for simpler objects, such as the sphere and cylinder.
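The abstract's cylindrical-mirror mapping can be illustrated with the classic flat approximation below. This is not the authors' ray-traced algorithm: the radius `R`, the angular span, and the row-to-radius scaling are all made-up parameters, and a faithful reconstruction requires the vector-calculus/ray-tracing treatment the talk describes.

```python
import math

R = 1.0          # cylinder (mirror) radius, assumed
SPAN = math.pi   # angular width the unwrapped image occupies, assumed

def distort(u, v, width, height):
    """Map a pixel (u, v) of the source image to a point (x, y) on the
    paper: columns become angles around the cylinder, rows become radial
    distance outward from the mirror's footprint."""
    theta = math.pi / 2 + SPAN * (0.5 - u / width)  # centered in front
    r = R * (1.0 + 2.0 * v / height)                # image starts at the mirror
    return r * math.cos(theta), r * math.sin(theta)
```

Placing a mirrored cylinder of radius `R` at the origin and viewing from above approximately restores the image: each ring of constant radius on the paper unwraps to one row of the original.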
Sabrina
E. Kunciw Faculty Advisor: E. Eugene Williams, Department of Biological
Sciences, Salisbury University
Temperature-induced changes in the
expression of enzymes involved in membrane restructuring in Coho salmon cells
With the current uncertainty over climate change, it is
increasingly important to study how species and individuals respond to changing
temperature. Tolerance to variations in temperature is a key attribute for
survival during periods of geologically rapid climate change. Because their
components are held together by non-covalent interactions, biological membranes
are exquisitely sensitive to temperature. Many animals routinely survive
changing environmental temperatures (e.g., daily, seasonal) in part because
they have the ability to adjust the physical characteristics of their cell
membranes. The phospholipids of their membranes are restructured, through
specific enzymatic reactions, such that only those phospholipids with physical
properties appropriate for the prevailing temperature are included in the
membrane. Our goal is to use molecular biology techniques to examine the
temporal patterns of the expression of the genes that code for the enzymes
responsible for phospholipid restructuring during temperature change.
Biochemical evidence suggests that there may be a distinct temporal pattern in
the expression of these enzymes; some appear to be switched on early to effect
“emergency” measures, while others are activated later in the acclimation
process to supplant or augment the emergency changes. We have designed primers
for many of these enzymes and tested them using cDNA synthesized using mRNA as
a template. The mRNA was extracted from Coho salmon (Oncorhynchus kisutch)
embryo (CHSE) cells maintained in culture. In a series of experiments, cells
were maintained at 22°C, then transferred to 5°C for periods from 2 hours to 27
days. mRNA was extracted, converted into cDNA and the cDNA was probed for the
enzyme sequences. We have found that indeed, messages for some enzymes of
membrane restructuring are constitutively produced (e.g. the delta-6 desaturase
[linoleoyl-CoA desaturase, EC 1.14.19.3]) while others are activated after
different times of cold exposure. There are also differences in the expression
of enzymes of the de novo synthetic pathway and the phospholipid in situ
restructuring pathways. Our results suggest that the temperature-induced
restructuring of cell membranes in the Coho salmon occurs in a temporal
hierarchy.
Jordan
Estes and Shelby Smith Faculty Advisor: Dr. Kimberly Hunter, Department of
Biology, Salisbury University
Nordihydroguaiaretic Acid in the Polyploids
of Larrea tridentata: Effects of Temperature and Developmental
Stage
North American Larrea tridentata is a long-lived shrub that
has three ploidy levels in three distinct regions: Chihuahuan Desert
–diploid; Sonoran Desert – tetraploid; Mojave Desert –
hexaploid. Nordihydroguaiaretic acid (NDGA) is a secondary metabolite of
Larrea found in high concentration in the leaves and has antiviral,
antimicrobial and antioxidant properties. However, NDGA’s function in Larrea is
unknown. The relationship between NDGA expression and polyploidy was
investigated in greenhouse grown plants of all ploidy levels and hexaploid
individuals at early developmental stages. A new method was developed to
measure NDGA concentration in a leaf pair. NDGA was extracted using methanol
and quantified using reverse-phase HPLC. In greenhouse grown leaf samples and
hexaploid seedlings, NDGA concentrations vary with ploidy level and
temperature. At lower temperatures in the winter months, an increase in NDGA
concentrations correlates with an increase in ploidy level in adult plants and
developmental stage in seedlings. At higher temperatures in the summer months,
NDGA concentration decreases and is nearly equivalent across ploidy levels and
developmental stages. The biosynthesis of NDGA appears to be turned on during
the transition from seedling to mature plant with expression levels linked to
environmental temperature. The positive correlation with increased ploidy
suggests that NDGA is likely a protective molecule important for conferring
tolerance to high temperature or light.
Catherine
M. Walsh Faculty Advisor: Dr. Michael J Bardzell, Department of Mathematics and
Computer Science, Salisbury University
The Dynamics of Finite Cellular
Automata with Null Boundary Conditions
Cellular Automata (CA), a type of discrete dynamical system, are
often studied with periodic boundary conditions over a finite lattice of cells.
While there are numerous results about periodic boundary conditions, results
can also come from null boundary conditions. Consider CA over finite lattices
of cells, where cells take on values from a finite alphabet G. The values of
these cells are updated in discrete time steps using a local rule. If G is an
abelian group, the states of the cellular automaton form a group, and the time
evolution map is often a group homomorphism. The kernel of this evolution
homomorphism will reveal information about the dynamics of the underlying
system, which is represented by a state transition diagram. The nodes of the
diagram represent states and the arrows represent time evolution. For example,
if the kernel is trivial, then the evolution map is one-to-one and the CA is
reversible. Furthermore, the kernel can suggest how long or how many steps
until each state within the system will hit a fixed point, or hit a cycle. The
state transition diagram’s geometry ties in with the kernel of the evolution
homomorphism and determines whether the diagram is a collection of rooted
trees, cycles, and/or products of cycles with rooted trees. Computations of the
kernel are analogous to techniques used in linear algebra. However, care must
be taken since these systems are not always defined over finite fields, but
over abelian groups.
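The setup above can be made concrete with a small sketch (my own illustration, not the talk's code): a linear CA over the alphabet G = Z_2 on a 4-cell lattice with null boundary conditions, where each cell updates to the XOR of its neighbors. Enumerating the kernel of the evolution map then decides reversibility directly.

```python
import itertools

n = 4  # lattice size, made-up for illustration

def step(state):
    # each cell becomes the XOR of its two neighbors;
    # null boundary conditions: missing neighbors count as 0
    return tuple(
        (state[i - 1] if i > 0 else 0) ^ (state[i + 1] if i < n - 1 else 0)
        for i in range(n)
    )

# kernel = states mapped to the all-zero state by one time step
kernel = [s for s in itertools.product((0, 1), repeat=n) if step(s) == (0,) * n]

# trivial kernel <=> the evolution homomorphism is one-to-one <=> CA reversible
reversible = len(kernel) == 1
print(len(kernel), reversible)
```

For this rule the behavior depends on the lattice size: on 4 cells the kernel is trivial and the CA is reversible, while on 3 cells the same rule has a nontrivial kernel, exactly the kind of size-dependence the kernel computation exposes.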
Robert Figliozzi Faculty
Advisor: Dr. Miguel Mitchell, Department of Chemistry, Henson School of Science
and Technology, Salisbury University
Synthesis of Butyrylcholinesterase
Inhibiting Nordebromoflustramine B
(Abstract
Only)
Butyrylcholinesterase (BChE) has been implicated in recent studies as a
contributor to adult dementia and Alzheimer’s disease. Inhibition of this
enzyme has been shown to improve the memory of Alzheimer’s patients because
of BChE’s association with the neurofibrillary plaques of the disease.
Nordebromoflustramine B is a compound that has significant Alzheimer’s
treatment potential because of its BChE inhibition. Through computer modeling
and docking software, (+)-nordebromoflustramine B and other debromoflustramine
B congeners have shown greater BChE inhibition and therefore show substantial
potential as an Alzheimer’s treatment. We have synthesized
(+)-nordebromoflustramine B in a three-step process and resolved the racemate into
its enantiomers. This process is non-reductive, which will make possible the
synthesis of future analogs. The information attained through these processes
will not only lead to future AD treatments but will also encourage further
research on the subject because of the simplicity of the proposed method.
Steven Sanders, Christine Davis and Brett
Spangler Faculty Advisor: Dr. Kimberly Hunter, Department of Biology, Salisbury
University
Genetic Variability in Five Species of Tree Ferns Collected
from Cusuco National Park (Honduras)
(Abstract
Only)
Five different species of tree ferns, Cyathea divergens, Alsophila
salvinii, Cyathea bicrentata, Cyathea valdecrenata, and Sphaeropteris horrida,
have been collected from Cusuco National Park in Honduras. Operation Wallacea
led an expedition into the park, where specimens were collected within three
defined collection sites. These samples were collected using FTA cards and
extracted with DNeasy Qiagen kits. The geographical location and height were
recorded for every individual, and heights were used to identify the age
classes within the sites. Inter-Simple Sequence Repeats (ISSRs) were used to
define the genetic fingerprint of the specimens. The bands are scored as
present or absent in the agarose gels, and then converted to 0 and 1 for
analysis in Excel. The program Popgene 3.0 was used to generate a genetic
diversity index of the populations, while TESS 2.3.1 analyzed the genetic
structure within the given population based on the genetic markers and
geographical locations of the individual specimens. This program looks at
dominant markers to seek out any discontinuities within a continuous
population, revealing the genetic variability. Small scale genetic variability
within these given populations has been detected using TESS 2.3.1. A
phylogenetic analysis was conducted using PAUP 4.0.
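The 0/1 band-scoring step described above can be sketched as follows. This is my own minimal illustration, not the study's pipeline: the presence/absence matrix is invented, and the statistic shown (proportion of polymorphic loci) is only one simple diversity measure; the real analyses used Popgene 3.0 and TESS 2.3.1.

```python
# rows = individuals, columns = ISSR loci scored as present (1) / absent (0)
bands = [
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [1, 0, 0, 1],
]

def polymorphic_fraction(matrix):
    """Fraction of loci at which both band states (0 and 1) are observed."""
    n_loci = len(matrix[0])
    poly = sum(
        1 for j in range(n_loci)
        if len({row[j] for row in matrix}) > 1  # both 0 and 1 present
    )
    return poly / n_loci

print(polymorphic_fraction(bands))  # two of the four loci vary -> 0.5
```

Because ISSR bands are dominant markers, each column carries only presence/absence information, which is why programs like TESS look for spatial discontinuities in these 0/1 profiles rather than allele frequencies.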

SCIENCE AND ENGINEERING APPRENTICE PROGRAM, GEORGE
WASHINGTON UNIVERSITY

Anh Dao,
Thomas Jefferson High School for Science and Technology Mentored by: Dr.
Ramchandra S. Naik, Walter Reed Army Institute of Research
Quantification of Paraoxonase Activity in Animal Serum to Study Nerve
Toxicity
Organophosphorus (OP) nerve agents function by irreversibly
binding to and inhibiting the action of acetylcholinesterase (AChE), resulting
in an excessive accumulation of the neurotransmitter acetylcholine (ACh) in
synaptic clefts that can culminate in death by asphyxiation. The doses of these
toxic chemicals required to induce lethality in half of all experimental
animals (LD50, measured in µg/kg) are dependent upon the extent of AChE
inhibition in the body, and vary from species to species according to the
presence of other enzymes/proteins found in the bloodstream called
bioscavengers (e.g. butyrylcholinesterase (BChE), carboxylesterase (CaE),
paraoxonase (PON), albumin, etc.). Bioscavengers reduce the concentration of
free OPs potentially available to inhibit AChE by binding to or hydrolyzing
their toxic components, thus increasing the amount of nerve agent required to
induce morbidity. Therefore, knowledge of levels of activity/quantities of all
bioscavengers present in blood is essential for the determination of LD50s as
well as the selection of animal models used to test nerve agent toxicity.
Levels of PON activity in the blood of experimental animals were reported by
various investigators using different substrates and dissimilar assay
conditions. Therefore, the objective of this study is to examine the
comparative levels of PON activity in plasma/serum from various animals (mouse,
rat, guinea pig, monkey and human) under standard conditions by using
diethyl-p-nitrophenylphosphate (paraoxon) as the substrate. The buffer solution
contained a high concentration of salt to prevent interference from other
enzyme/proteins exhibiting esterase activity. Under these conditions, no
paraoxon hydrolytic activity was exhibited by high concentrations of albumin,
cholinesterases (AChE or BChE, up to 4 U/ml), or CaE (up to 22 U/ml).
Preliminary results indicated the highest levels of PON activity in humans and
comparably lower levels of activity in rats, cynomolgus monkeys, mice, and
African green monkeys. No PON activity was detected in guinea pig and rhesus
monkey plasma. Experiments are underway to determine the PON levels in fresh
serum or plasma samples prepared in the absence of ethylenediaminetetraacetic
acid (EDTA).
Nader
Salass, Washington International School and Ahmad Yassin, Lincoln Memorial
University Mentored by: CPT Jacob Johnson, Walter Reed Army Institute of
Research and Dr. Geoffrey Dow, Walter Reed Army Institute of
Research
Hit-to-Lead Evaluation of Antihistamines for Use in Treatment of
Malaria
Rapid development of resistance to antimalarial drugs is emerging
as a detrimental threat to endemic areas. This is leading to a decrease in
viable effective antimalarial treatments and an increase in morbidity and
mortality associated with Plasmodium falciparum (Pf) and vivax infections.
After utilizing an algorithm to filter through > 5000 pharmaceutical
substances we discovered the potential utility of antihistamine compounds as
antimalarial pharmaceuticals. Therefore we hypothesized that antimalarial
activity of antihistamine compounds could be predictively modeled via
development of an in silico pharmacophore. In order to use antihistamines as
antimalarial pharmaceuticals they must demonstrate potent nanomolar bioactivity
across Pf strains and show no clinically significant QT prolongation or
sedation. Additionally, we hypothesized that QT prolongation and sedation
effects associated with many antihistamines are distinct from their
antimalarial bioactivity. To test these hypotheses, we organized the compounds
in a database that included chemical structure, mode of action, drug class,
side effects, toxicology, hepatic activity and in vitro antimalarial activity.
All information was retrieved from Walter Reed Army Institute for Research
(WRAIR) Chemical Information System (CIS), except for in silico hERG data
obtained in collaboration with University of Pittsburgh and Adverse Event
Reporting System (AERS) data from the Pharmacovigilance Center. For compounds
missing data fields, the appropriate tests were ordered through the CIS. Next,
we computed the conformational models starting with the optimal spatial
antihistamine structures, performed analyses of chemical functions mapping on
to the structures, and calculated predictive 3D-QSAR pharmacophore models. The
different pharmacophore models developed to test our hypotheses were: A)
antimalarial activity of antihistamine compounds against the
mefloquine-resistant D6 Pf laboratory strain (Trial 1) and
chloroquine-resistant W2 Pf laboratory strain (Trial 2); and B) estimation of
the QT prolongation using the in silico hERG data (Trial 3) and estimation of
the QT prolongation using the clinically relevant QT prolongation reported from
AERS (Trial 4). As further data is obtained for the antihistamines, the
pharmacophore will be iteratively updated, generating a robust picture.
Pharmacophore results may lead to optimization and the discovery of new active
molecules useful against malaria without the prolonged QT interval and sedation
effects of antihistamines.

 

WASHINGTON
CHAPTER OF THE INSTITUTE FOR OPERATIONS RESEARCH AND MANAGEMENT SCIENCES (see
Institute for Industrial Engineers)

 

WASHINGTON
SOCIETY FOR THE HISTORY OF MEDICINE

STEPHEN GREENBERG
Do Come Play: A Demonstration of Online Resources in the History of
Medicine from NLM
(Abstract Only)
The National Library of Medicine is the
world’s largest medical library, and its History of Medicine Division offers
unique online resources for students and researchers. Stephen Greenberg, the
Division’s Coordinator of Public Services, will offer a hands-on demonstration
of these online resources, including PubMed, IndexCat, LocatorPlus, and Images
from the History of Medicine. He will also describe the NLM’s new digitization
program, the Medical Heritage Library, being created in conjunction with
Harvard’s Countway Library, Yale’s Cushing-Whitney Library, Columbia
University’s Health Sciences Library, and the New York Public Library.
ALAIN TOUWAIDE Digitizing
Renaissance Herbals – The PLANT Program
Beginning in the last decades of the 15th century,
printers produced illustrated herbals. This production increased rapidly, in
both quantity and quality, and became a significant part of 16th-century
printing and publishing activity. Although this production has been abundantly
studied from the repertoire of Pritzel to the analytical list of Nissen, and
many such books are available in a digital format on scattered Web sites on the
Internet, it has not been systematically screened, taking advantage of the many
resources offered by modern information technologies. The Web site PLANT
(acronym of PLantarum Aetatis Novae Tabulae, that is, Renaissance Plant
Illustration) aims to compensate for this lacuna by systematically listing,
digitizing, and indexing the representations of plants in 15th- and
16th-century printed herbals. The communication will illustrate both the
content and the promises of the program.
CHRISTIE MOFFATT AND
SUSAN SPEAKER
The Profiles in Science Project at the National Library of
Medicine
(Abstract Only)
In 1998, the NLM launched its Profiles in
Science site to make the historical records of eminent biomedical researchers
and clinicians available on the World Wide Web. We will discuss the development
of this pioneering digital archives project, demonstrate some of the website’s
features, and then provide more detailed stories about several of our favorite
profiled scientists.
ELIZABETH FEE
A Rapid Romp through the History of the World Health Organization
(Abstract
Only)
The World Health Organization was created
in 1946–1948 in the social medicine tradition and a brief flush of
optimism in the immediate post-war period. Soon, however, the Cold War had an
immense impact on WHO policies and personnel. When the Soviet Union and other
communist nations walked out of the United Nations and thus out of WHO, the
United States was able to exert a dominating influence. This presentation will
examine the changing political conditions and their impact on WHO and its
programs as it gradually moved from being the unquestioned leader of
international health to an organization in crisis, facing budget shortfalls,
threatening new diseases such as HIV/AIDS, Ebola, SARS and these days, the
likely impact of climate change on population health. New and powerful players
on the international health scene have sometimes overshadowed WHO’s role such
as the World Bank, the Gates Foundation, and a multiplicity of public-private
partnerships created to evade both the bureaucracy and the democratic control
of a struggling but still idealistic global health organization.
PATRICIA TUOHY, HEAD,
EXHIBITION PROGRAM
Medicine Ways: creating an exhibition about Native
peoples’ concepts of health and illness
(Abstract Only)
In 2006, the National Library of Medicine
began the process of developing an exhibition about Native peoples’ concepts of
health and illness. The journey has taken members of the exhibition team to
downtown Washington DC and the National Museum of the American Indian, to
Juneau and Anchorage, to Santa Fe and Seattle, and to Honolulu, Hawaii. We have
spoken with Native physicians, nurses, healers, public health advocates, clinic
directors, educators, community leaders, and mothers, fathers, sisters and
brothers. Through these discussions, stories emerged about the origins of
medicine, different ways of healing, about healing places and healing people,
and about individuals choosing Native medicine and Western medicine to best
care for their communities and families. Today, the Exhibition Program is
drawing together these experiences and focusing its efforts on developing
individual “case studies” that will make visible to visitors through media and
traditional displays the stories of Native medicine ways.
JIWON KIM, EXHIBITION
PROGRAM, HISTORY OF MEDICINE DIVISION, NATIONAL LIBRARY OF MEDICINE, BETHESDA,
MD.
History, Literature and Science: Engaging Educators and Students in
Harry Potter’s World
(Abstract Only)
The Exhibition Program develops companion
web sites for its exhibitions that incorporate and present multiple
disciplines—science, history, society, technology, medicine, health,
literature, etc. These online exhibitions remain accessible on the Web,
even after the physical exhibitions close or finish traveling. The online
exhibition also features education resources that are developed by and for
educators from K-12 and higher education institutions. This presentation
introduces the process and results of developing the education resources on the
companion web site of the Harry Potter’s World: Renaissance Science, Magic, and
Medicine banner exhibition. The Harry Potter’s World online exhibition provides
education resources—i.e., English and Science lesson plans, a higher
education module, online activities, and a bibliography, which are developed in
collaboration with educators who have already incorporated and taught various
themes from the “Harry Potter” series written by J. K. Rowling. The
presentation highlights how working with experienced educators helps produce
resources that reach diverse audiences beyond traditional patrons of the
National Library of Medicine.
PAUL THEERMAN
The History of
Tropical Medicine, as seen in the Images and Archives Collections of the
National Library of Medicine
One of the strengths of the manuscript,
still image, and film collections of the History of Medicine Division of NLM is
its documentation of tropical medicine. I will provide a short history of
20th-century efforts to understand tropical diseases and control them,
illustrated with materials from these collections, and also provide information
about HMD’s two subject guides on these topics: “Guide to Tropical Disease
Motion Pictures and Audiovisuals at the National Library of Medicine,” just
released, and “Tropical Medicine Manuscript Collections in the History of
Medicine Division of the National Library of Medicine.”