
Capital Science 2006

AAAS –
DISCUSSION ON THE EFFECT OF PATENT LEGISLATION ON SCIENCE RESEARCH

Reception Sunday 4:00PM
Speakers

  • Mary Webster, M.S., J.D., Partner,
    Schwartz, Sung & Webster
  • Lawrence Sung, Ph.D., J.D., Law
    School Professor and Director, Intellectual Property Law Program, University of
    Maryland School of Law and Partner, Schwartz, Sung &
    Webster
  • Theodore O. Poehler, M.S., Ph.D.,
    Vice Provost for Research, The Johns Hopkins University

The Effects of Patent Reform Legislation on
the Conduct of Scientific Research


Major patent reform legislation is on the horizon in U.S. lawmaking. The Patent Reform Act (H.R. 2795), introduced in 2005, proposes sweeping changes to current patent law which, if adopted, will likely affect the conduct of academic research, the dissemination of knowledge, and the accessibility of that knowledge in the public domain. One such change is the move from the “first to invent” system to the more universal “first to file” system, which may increase competition for patents as well as shorten the grace period for applying for a patent after a publication.
Would such a change lead researchers to concentrate more on patenting than
publishing? Another change would redefine what constitutes “prior art”, the body of preexisting, publicly accessible knowledge for which patents are unavailable. Though scarcely discussed, that redefinition is no less important an issue in patent reform and could have significant or unforeseen consequences for the placement of scientific knowledge in the public domain. This session brings
together the perspectives of multiple experts on the “first to file” and “prior
art” aspects of the Patent Reform Act. Speakers will address whether those
proposed changes represent major areas of concern to the scientific community
and, if so, what potential effects the changes could have on the conduct of
scientific research. Those presentations will in turn open a discussion among the speakers about their respective viewpoints. Ample time will be allowed for members of the audience to pose questions and to participate in the discussion.

 

AMERICAN ASSOCIATION OF PHYSICS TEACHERS,
CHESAPEAKE SECTION

David M. Schaefer (presenter), Cameron Bolling, Towson University, John Sunderland, Rajeswari Kolagani, Department of Physics, Astronomy and Geosciences, Tyler Bradley, Towson High School, Towson, MD 21252, Bonnie Ludka, James Madison University, Physics Department,
Undergraduate Experiments in Nanolithography
Saturday 9:00AM
The continued miniaturization of devices and components has
produced an urgent need for fabrication techniques on a nanometer length scale.
Nanolithography using the atomic force microscope (AFM) is emerging as a
promising tool for nanotechnology. In this presentation, we discuss experiments
using the AFM to perform nanolithography in an undergraduate laboratory. We
report our results of AFM-induced nanoscale surface modifications in thin films of the colossal magnetoresistance (CMR) manganite material La0.7Ba0.3MnO3. CMR manganite materials
have been demonstrated to be useful for a variety of technological applications
including magnetic sensors and bolometric infrared detectors.

 

Lincoln E. Bragg, Simple Black Hole
Formation Model
Saturday 9:20AM
The first level of black hole formation ideas is purely physical and is suitable for anyone to whom the idea of black holes can be mentioned. More depth is suitable for an audience comfortable with (x,t) history diagrams of light-ray paths. A later level is suitable once the Schwarzschild black hole model has been introduced and the audience knows about changing coordinate systems in a plane.

 

Carl E. Mungan, Physics Dept, U.S. Naval
Academy, Annapolis, MD,
Escape Speeds and Asteroid
Collisions
Saturday 9:40AM
Simultaneous conservation of momentum and energy determines the
relative speed of a pair of gravitationally attracting bodies as a function of
the distance separating them. This has applications such as solar-system
satellite escape and asteroid-earth collisions. It is not necessary to start
from an infinite-earth-mass approximation. Careful choice of reference frames
eases the calculations.
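
An illustrative sketch of the relation behind the abstract (not taken from the talk itself): in the center-of-mass frame, with reduced mass mu = m1*m2/(m1 + m2) and v_inf the relative speed at very large separation, conservation of momentum and energy give

    \frac{1}{2}\mu v^{2} - \frac{G m_{1} m_{2}}{r} = \frac{1}{2}\mu v_{\infty}^{2}
    \qquad\Longrightarrow\qquad
    v(r) = \sqrt{v_{\infty}^{2} + \frac{2G (m_{1} + m_{2})}{r}},

so two bodies falling from rest toward each other meet with relative speed \sqrt{2G(m_{1}+m_{2})/r}; the familiar escape speed \sqrt{2GM/r} is recovered only when one mass dominates, which is why no infinite-Earth-mass approximation is needed.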

 

William T. Franz, Randolph-Macon College,
Ashland, VA,
Bottle Rockets, Teacups and the Real World: A senior
seminar to bridge the gap between physics student and life after
college
Saturday 10:00AM
One of the peculiar aspects of being a professional physicist is the authority we all seem to have to comment on ‘real life’ phenomena. I have
been asked about everything from divining rods to space junk during my career.
The senior seminar at Randolph-Macon College is designed to be a culminating
experience that asks students to synthesize their course and research
experience and improve their presentation skills. The most recent iteration
focused on problems that varied from urban legends to wacky theories with an
emphasis on laboratory measurement, practical calculation, and presentation of
results. Methods for heating water to make tea, the practicality of launching
people with bottle rockets, and the use of aluminum helmets to prevent
brainwashing will be discussed.

 

Eric Kearsley, NRAO, Green Bank, WV and A.
Einstein HS, Kensington, MD, K. O’Neil NRAO, Green Bank, WV,
From 20 cm to 1 µm: Measuring the Gas and Dust in Nearby Massive Low Surface
Brightness Galaxies
Saturday 10:20AM
Archival data from the IRAS, 2MASS, NVSS, and FIRST catalogs,
supplemented with new measurements of HI, are used to analyze the relationship
between the relative mass of the various components of galaxies (stars, atomic
hydrogen, dust, and molecular gas) using a small sample of nearby (z<0.1),
massive low surface brightness galaxies. The sample is compared to three sets
of published data: a large collection of radio sources (Condon, 2002) from the
UGC having a radio continuum intensity >2.5 mJy; a smaller sample of low
surface brightness galaxies (Galaz, 2002); and a collection of NIR LSB galaxies
(Monnier-Ragaigne 2002). We find that if we naively assume that the ratio of the dust and molecular gas mass relative to the mass of HI is a constant, we are unable to predict the observed ratio of stellar mass to HI mass, indicating
that the HI mass ratio is a poor indicator of the total baryonic mass in the
studied galaxies. HI measurements obtained during this study using the Green
Bank Telescope also provide a correction to the velocity of UGC 11068.

 

David Wright, Tidewater Community College,
Virginia Beach, VA,
Physics in the Courtroom
Saturday 10:40AM
Two very different cases, a car accident and a spotlighted
helicopter, will be used to illustrate how physics can be used in the
courtroom. The cases will be presented to the audience for consideration. Will
their judgment match that of the judge?

 

Rhett Herman, Radford University, Radford,
VA,
Learning astronomy at the Green Bank National Radio Astronomy Observatory
Saturday 11:00AM
Physics students and faculty from Radford University have taken
advantage of using the 40-foot-diameter educational radio telescope at Green
Bank NRAO for the past several years. We have found that even on a “getaway”
weekend such as this, students tend to put in a great amount of work in
learning how to use the equipment, and to interpret and process the data. The
question has arisen as to whether course credit should be offered for these
weekend trips. And the answer is …

 

Deonna Woolard, Randolph-Macon College,
Ashland, VA,
Learning astronomy at the Green Bank National Radio Astronomy Observatory
Saturday 11:20AM
We have been conducting the Force Concept Inventory Pre and Post
tests for the last six years. Preliminary analysis of the data shows a downward
trend of pretest scores. Might this be attributed to more students going to
college and taking intro physics, for example, as compared to years past where
the college environment was geared towards a certain type of student? I would like to know others’ opinions on this and the actions that they are taking to address the situation.

 

Business Meeting of the CS-AAPT
Saturday 11:40AM

 

AMERICAN
METEOROLOGICAL SOCIETY

Student Representatives, Overview of Programs at Area Universities, Sunday 2:00PM
The greater Washington-Baltimore area is one of the world’s
centers for research and operations in the atmospheric and oceanic sciences.
Numerous local universities are integrally linked to these efforts, offering
vibrant graduate programs where students and faculty make new contributions
every day. At this session of the DC Chapter of the American Meteorological
Society, student representatives have been invited from several universities
(University of Maryland College Park, University of Maryland Baltimore County,
George Mason University and Howard University) to provide program overviews.
This will offer students the opportunity to learn about neighboring programs,
meet their peers and identify research connections and possible synergies. In
addition, Dr. Ron McPherson, Executive Director Emeritus of the AMS, will
present an overview of the services and scholarships available to students
through the AMS, as well as resources the AMS provides to all of its members. Both
students and interested WAS affiliate members are invited to attend this
session.

 

ASSOCIATION FOR
SCIENCE TECHNOLOGY AND INNOVATION

John Bosma, Synthesis Partners,
Advances in Rapid Manufacturing
Saturday 9:00AM
Advances in “rapid manufacturing” (RM) – which includes “rapid
prototyping” and “rapid tooling” (RP, RT) — permit us to turn out diverse
products at low cost and high quality – e.g.: 1) aircraft wing structures up to
14 ft in one dimension, using laser-sintered titanium-aluminide alloys; 2)
large composite tools (molds) with dimensional tolerances of 0.01 inch in 5
minutes, also large composite airframe sections up to 8 x 10 ft within minutes;
3) small to large complex structures — from small sonar transducers to ship
deckhouses — by extruding (like toothpaste) various material mixes — from composites and plastics to metal powders — using simple gantries, then
following the deposition nozzle with curing and finishing tools; 4)
single-process production of electronic hardware like TV remotes and handheld
controllers; 5) rapid forward repair of damaged helicopters, trucks and tanks
with parts made to order from data generated from video-based coordinate
measuring machines that a forward mechanic uses to ‘map’ damaged parts; and 6)
finely structured metal and composite structures (channels, grids, honeycombs).
There are more than 100 RM-RP-RT processes available around the world – yet
they have not yet penetrated major industries like car and truck manufacturing,
much less the aircraft industry. At the same time, these processes greatly
reduce the startup costs for small manufacturing firms and allow them to
quickly enter “high end” markets. The author believes that rapid manufacturing
could become the single greatest competitive edge the US develops — yet
competition is fierce from newer entrants into this field like China.

 

Jim Burke, Northrop Grumman Information
Technology Intelligence (TASC),
Wild Cards and Weak
Signals
Saturday 9:30AM
Have you ever been surprised by something in science, politics or your own organization and wondered why it was a surprise? Why are these things
such surprises to so many folks? Are people not paying attention because the
in-box is so full? Don’t folks trust forecasters and futurists who offer
insights about the future? These are some of the questions that Jim Burke,
Northrop Grumman Information Technology Intelligence (TASC) will be talking
about in his presentation–real surprises, wild cards, weak signals and
unintended consequences. He will do this in a rich stew of subjects–science,
the environment, robotics, as well as some practical tools for forecasting. Jim
heads up a group that assesses future technologies and organizations and is a
former president of the Capitol Area World Future Society chapter, as well as a
member of the Association of Professional Futurists.

 

Gene Allen Director, Collaborative
Development MSC Software Corporation,
Stochastic Simulation – A New
Engineering Process Demonstration Using MSC Robust Design
Saturday 10:00AM
Mr. Allen will provide background on a new engineering analysis process now in use that takes advantage of recent advances in computer capabilities. The process incorporates the natural variability and uncertainty
that exists in reality into computer simulations. The process, referred to as
stochastic simulation, uses advanced Monte Carlo techniques. The results of a
stochastic simulation are displayed in a cloud of points that represents the
reality of the physics being modeled, with each point representing a possible
situation. Cause and effect information is quickly derived from the results and
displayed in Decision Maps. Design improvements can be realized by using the
Stochastic Design Improvement (SDI) process to move the cloud towards design
targets. The process enables users to get an order of magnitude more
information from computer models. Some companies are using stochastic
simulation in product design with significant success. EADS-CASA reduced the weight of a satellite launch dispenser from 500 to 337 lbs by changing the composite layup. Alenia has used the process to reduce weight in commercial aircraft. Application of this process in the auto industry has resulted in improved crashworthiness along with weight reduction: BMW reduced weight in a car model by 33 pounds, Nissan by 35 pounds, and other cars at other companies had weight reductions of 55, 40, and 13 pounds.
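
As a minimal illustration of the Monte Carlo idea behind stochastic simulation (a generic sketch, not MSC's Robust Design software; the toy response model and every parameter value below are invented for the example):

    import random

    def simulate_design(thickness, density, load):
        """Toy response model: one realization returns (mass, stress)."""
        area = 0.5                           # m^2, fixed geometry assumed for the example
        mass = thickness * area * density
        stress = load / (thickness * area)
        return mass, stress

    def monte_carlo(n_samples=10_000):
        """Sample the input variability and collect the response 'cloud'."""
        cloud = []
        for _ in range(n_samples):
            thickness = random.gauss(0.010, 0.0005)   # m, manufacturing scatter (assumed)
            density   = random.gauss(2700.0, 50.0)    # kg/m^3, material scatter (assumed)
            load      = random.gauss(5.0e4, 5.0e3)    # N, service-load scatter (assumed)
            cloud.append(simulate_design(thickness, density, load))
        return cloud

    cloud = monte_carlo()
    masses = [m for m, _ in cloud]
    stresses = [s for _, s in cloud]
    print(f"mean mass  = {sum(masses) / len(masses):.2f} kg")
    print(f"max stress = {max(stresses):.3e} Pa")
    # Each (mass, stress) point is one possible realization; design targets are
    # compared against the whole cloud rather than against a single nominal run.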

 

Thomas Meylan, Ph.D., EvolvingSuccess
Hyattsville, MD,
The Formation of Collaborative Sub-cultures Within a
Pervasively Competitive Business Culture
Saturday 10:30AM
Every human culture, including cultures in business organizations,
is built on a basis of interpersonal competition, and of competition between
human groups. For corporate success against other groups, internal competition
has to be replaced with collaboration. This provides highly leverage-able
advantages relative to other organizations which retain in-house competition as
a primary culture characteristic. The willingness of individuals to give up
interpersonal competition in favor of collaboration is completely dependent on
the individual’s primary Drive Satisfaction Strategy (DSS). All DSSs contribute
to the formation of competitive business cultures, but two DSSs can also
contribute to the formation of collaborative business cultures. While it is not
possible to build an organization with a completely collaborative culture, it
is possible to increase the number and sizes of the pockets of collaborative
culture housed within a pervasively competitive business culture.

 

Richard H. Smith, MS Flexible Medical
Systems, LLC,
Nanotechnology Moves from the Lab to Medical
Practice
Saturday 2:00PM
Two years ago, Nanotech commercializer Richard Smith described how
nanotechnology might be exploited to fit the needs of the military. This year,
he will describe the progress of a nano-based diagnostic device that is now
moving from the lab into medical practice. The uses for this device will range
from insulin management for diabetics to shock/trauma care for soldiers in the
field to a “canary-in-the-mine” detector for Asian flu and other potentially pandemic infectious diseases.

 

Dr. Geoffrey P. Malafsky, TECHi2,
Scalable Ontological Sense Matching
Saturday 2:30PM
The dramatic increase in availability of information is inundating
people with enormous quantities of data and information that must be sifted
quickly and accurately enough to support decision making and actions. Despite
the significant improvements in technologies, they still lack the ability to
determine the meaning and sense of the information. This is especially acute
for Knowledge Discovery since knowledge is differentiated from information by
context, confidence, pedigree, and relationships to other information. A main
gap is representing knowledge in a manner that is accurate and scalable for
large-scale computer processing. Current techniques of knowledge representation
with ontologies rely on expensive and time-consuming efforts to produce static upper-, middle-, and lower-level ontologies. This large level of effort and the
resulting static nature of the ontologies make them difficult to scale to
realistic operations where the volume is extremely large, and the concepts and
knowledge change at a rate faster than can be accommodated with this approach.
We developed a new ontology framework that uses repeatable ontology templates
aligned with natural organizational boundaries for roles, responsibilities, and
domain knowledge. The templates define domain and term level senses for
concepts using both controlled and domain specific vocabularies that express
functional domain knowledge. This provides a unified logical design with a
distributed physical system which is the hallmark of a services oriented
architecture. The sense matching ontology is part of an integrated architecture
that connects and traces domain knowledge to ontologies to business rules to
semantic metadata. It provides a single unified specification of knowledge
expressed in an ontology, rules distilled from an ontology, and metadata
annotating data and information with rules and knowledge.

 

Martin Schwab, Author, consultant and member
of the Aerospace Technology Working Group (ATWG), chartered by NASA
Headquarters in 1990,
The Future of Humans in Space: Alternative Views
and Strategies
Saturday 3:00PM

ASSOCIATION FOR
WOMEN IN SCIENCE, DC-METRO CHAPTER

Mentoring: Making the Most of Scientists
Symposium
Saturday

 

Rachael Scholz, Senior Consultant, Booz Allen
Hamilton and Caren Chang, Associate Professor, Department of Cell Biology and
Molecular Genetics, University of Maryland,
Mentoring Graduate
Students
9:00AM
The relationship between the graduate student and mentor is
multifaceted and can be central to the success of the student. The mentor
ultimately serves as a role model for the student, both professionally and
personally. Therefore, it is crucial that the student select a lab/mentor that
best fits the needs and personality of the student. It is imperative that
mutual respect and good lines of communication are established early in the
mentor-mentee relationship. The expectations of the mentor should be clearly
defined in the beginning of the student’s tenure in the lab. The mentor should
recognize the student’s strengths, weaknesses and learning style, set
appropriate goals, and provide encouragement and support. The role of the
mentor changes over time and serves to help the student make the transition to
an independent scientist. During this time, it is important that the mentor and
student have regular meetings to discuss progress, outside funding
opportunities, and career guidance. In the event that problems arise between
them, then both the mentor and mentee need to be equipped with options and ways
to deal with the problems. These and other issues will be explored in this
panel discussion.

 

Jennifer Shen, Postdoctoral Fellow, National
Cancer Institute and
Jonathan Wiest, Associate Director for Training and
Education,

Center for Cancer Research,
National Cancer Institute,
Mentoring Postdocs
9:45AM
The origin of the term mentor is traced to ancient Greek mythology, when a
friend of Odysseus, Mentor, was entrusted with the education of Odysseus’s son
Telemachus. In modern times, the role of mentor can be defined as a trusted
guide or counselor for his or her protégé. During the
postdoctoral training period, successful mentoring can lead to productive
working relationships, as well as enhancing professional career development for
both mentor and protégé. While the mentor and postdoctoral fellow
might have distinctive career goals, it is possible, and beneficial, to build a
successful mentoring relationship. Both mentor and trainee ought to strive to
achieve open communication based on trust, respect, and compromises. While a
successful mentoring relationship could be rewarding – perhaps leading to a
lifelong friendship – unexpected negative outcomes might result. As long as the
mentor and protégé start building the relationship with a clear
understanding of each other through open communication and well-defined
expectations, a successful mentoring relationship will benefit the professional
careers of both.

 

Donna J. Dean, Senior Science Advisor,
Lewis-Burke Associates, LLC and President, Association for Women in Science and
Kathryn L. Beers, Research chemist, Combinatorial Methods Center, Polymers
Division, National Institute of Standards and Technology,

Mentoring Professionals
10:45AM
By the time young scientists reach their first professional
position, there are likely many different roles that they must learn to
balance. As a result, there are many types of mentoring relationships that can
enhance awareness, accelerate success and enrich the experiences of someone
during the course of their career. We will discuss several types of
mentor-mentee relationships, how to establish new mentoring relationships with
mutually agreeable expectations, and the important balance between setting
realistic goals and creating personal challenges through the relationship. We
will also discuss some of the unique issues facing the career professional,
including different organizational cultures, workplace diversity, and long-term
strategies for career planning. Finally, we will discuss the evolution of
mentor-mentee relationships as individuals progress in their careers, including
the challenges of mentoring upwards and managing the changing needs of mid- and
late-stage career professionals.

 

Laurel L. Haak, Program Officer,
Committee on Science, Engineering, and Public Policy, The National Academies,
Moderated Discussion
11:30AM

 

INSTITUTIONAL PANEL – Moderated by Laurel L.
Haak
2:00PM
  • Roosevelt Johnson, Program Director,
    Alliances for Graduate Education and the Professoriate, National Science
    Foundation

A primary component of all AGEP projects is significant capacity
building with respect to administrative infrastructure. More specifically, the
successful and effective implementation of strategies to coordinate innovative
graduate education activities across multiple departments at participating
institutions and across multiple partnering institutions requires the
establishment of new administrative infrastructure (i.e., policies, practices,
offices, and staffing). These newly established administrative infrastructures
involve a variety of resources, including (but not limited to) space,
equipment, and staff. This administrative infrastructure will exist after the
term of the NSF-supported activity. Transition and graduate success strategies
tend to be focused in multiple departments across partner institutions
(intra-alliance activity), and the career development strategies are more
likely to involve inter-alliance activity. Each level of interconnectivity
poses exciting challenges and opportunities for developing new paradigms for
the vertical integration of research and education that will lead to
significant increases in the pool of minority professionals interested in and
prepared for careers in the STEM professoriate. In addition to addressing the
enhanced preparation of Ph.D.s for the professoriate, AGEP alliances have taken
significant steps toward proactively recruiting from that well-nurtured pool to
fill future faculty positions. Through AGEP, an unprecedented community of
institutions committed to acting cooperatively at the graduate level is being
created. AGEP provides an opportunity for participating institutions to
leverage their resources with a community of other institutions sharing a
commitment to enhance recruitment, retention, advancement and long-term career
success of students.

  • Irelene P. Ricks, Director of Minorities
    Affairs, The American Society for Cell Biology

The ASCB MAC programs were awarded the 2004 NSF Presidential
Award for Excellence in Science, Mathematics, and Engineering Mentoring. Under
the auspices of a National Institutes of Health (NIH), National Institute of
General Medical Science (NIGMS) Minority Access to Research Careers (MARC)
grant, the ASCB MAC hosts a variety of activities. The ASCB MAC recruits
members and students for participation in programs at the ASCB Annual Meeting
and sponsors 50-60 minority students and faculty to attend and present a
research poster for competitive judging at this meeting. MAC provides financial
support for minority students to complete summer coursework at Marine
Biological Laboratory and Friday Harbor Laboratory and MAC’s Linkage Fellows
Program provides fellowships to junior or associate level faculty from minority
serving institutions. ASCB MAC’s mentoring work includes hosting a mentoring
symposium and a Junior Faculty Workshop, which invites minority junior faculty
and postdoctoral fellows to discuss issues related to tenure track positions,
publications, grant writing, and service to the community. ASCB MAC also
participates in conferences aimed at studying minorities in the workforce and
is involved in conducting an assessment of all MARC and non-MARC programs.
Independent of NIH NIGMS MARC support, ASCB MAC hosts minority outreach and
networking activities. This session will discuss the details of these and other
ASCB programs.

  • Phyllis Robinson, Professor of Biology
    and Co-PI on ADVANCE Grant, University of Maryland Baltimore County

In 2003 the University of Maryland, Baltimore County (UMBC)
received a very prestigious Institutional Transformation Award from the
National Science Foundation (NSF) to support the ADVANCE Program. ADVANCE at
UMBC is designed to enhance policies and practices affecting the recruitment,
retention, advancement, and leadership of women faculty in science, technology,
engineering, and mathematics (STEM), and is instrumental in developing and
implementing mentoring activities campus-wide. Mentoring and outreach are key
components of the Faculty Horizons Program, a professional development
opportunity for upper level graduate students and post-doctoral fellows,
particularly women, with an emphasis on women from underrepresented groups, in
STEM. Participants of this program gain the knowledge and tools needed to build
successful and productive faculty careers. A key vehicle for mentoring junior
STEM women faculty is the Eminent Scholar Mentor Program. This ADVANCE
initiative pairs UMBC women faculty with researchers eminent in their fields,
developing a broader connection to their research community and supporting the
faculty member’s future success. Thus far, several successful matches have
developed, with the mentor visiting UMBC the first year and the junior faculty
visiting the mentor’s institution the second to continue the relationship as
well as conduct research presentations. Another successful mentoring effort
includes work with the WISE (Women in Science and Engineering) group, which
predates ADVANCE at UMBC. Informal mentoring takes place within this network of
STEM women faculty. This informal mentoring has branched out to include the
formalized Faculty ADVANCEment Workshop Series, which is open to all STEM
faculty at UMBC.

 

CHEMICAL
SOCIETY OF WASHINGTON (CSW)/NSF INSPECTOR GENERAL’S
OFFICE

Scott J. Moore and Kenneth L. Busch, Office
of Inspector General National Science Foundation,
Chemistry’s “Creative
Commons”: Changing Perspectives for Plagiarism and Intellectual Property
Theft
Saturday 2:00PM
The term “creative commons” can be used to refer to information (in
all its forms) that most would consider to be in the public domain, and thought
to be free for use without the need for specific citation of a source or an
individual. The wide availability of information on the internet, both with and
without subscription, and allied with or separate from classical print
publication, has engendered disparate views of just what the chemistry
“creative commons” might encompass, and just what the standards of scholarship
for appropriate use might be. Such divergences of view are apparent in
allegations of plagiarism or intellectual property theft forwarded to our
office. Our assessment of each allegation must include the standards of the
relevant research community, as well as the often pointed opinions of those
directly involved in specific issues, as complainants, subjects, investigators,
or adjudicators. As examples of the chemistry “creative commons,” and what it
may or may not include, we will present some recent case-based perspectives
that touch on the issues of 1) appropriate use of text and information from the
internet; 2) faculty/student interactions; 3) figures, photos, and
fabrications; and 4) proposal and manuscript review processes.

 

INSTITUTE OF
ELECTRICAL AND ELECTRONICS ENGINEERS (IEEE), WASHINGTON AND NORTHERN VIRGINIA
SECTIONS

Haik Biglardi, PhD, Senior Director of
Electronics Systems and Controls, Fairchild Controls Corporation,
The
Genesis of Quaternions
Sunday 10:00 AM
Complex numbers have been studied for nearly five centuries. The
first hint of the existence of such a number is recorded in the work of Nicholas Chuquet in 1484. In solving general equations he showed that some equations lead to imaginary solutions, but dismissed them (“Tel nombre est ineperible”).
Then Cardano in 1545 writes Ars magna on the solutions of cubic and quartic
equations. In it, there are solutions to polynomials which lead to square roots
of negative quantities. Cardano calls them “sophistic” and concludes that it is
“as subtle as it is useless.” In 1637, Descartes coins the term “imaginary”. By
1747 imaginary numbers are well accepted and Euler presents his famous
identity. The period from 1825-50 and the work of Cauchy is considered to be
the beginning of modern complex analysis. And finally in 1843, Hamilton
presents the Quaternion numbers. For nearly a century the Quaternion numbers
remained useless. In the mid-twentieth century the Quaternions were applied to attitude control problems, and soon after they became very useful for computer
graphics applications. It is interesting to note that for as long as quaternions have been with us, so have Gaussian integers and their byproducts, the Golden Triangles or Golden Gaussian integers. Then a natural question arises: Is there any Golden Quaternion, that is, a quaternion whose components are integers and whose magnitude is a perfect square? The answer is emphatically yes!
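
Read literally, the closing question asks for quaternions with integer components whose magnitude is itself a perfect square; a brute-force search under that reading (an assumed interpretation of the definition, not the speaker's construction) can be sketched in Python:

    from itertools import product
    from math import isqrt

    def golden_quaternions(limit=5):
        """Find (a, b, c, d) with positive integer components whose magnitude
        sqrt(a^2 + b^2 + c^2 + d^2) is an integer that is a perfect square."""
        found = []
        for a, b, c, d in product(range(1, limit + 1), repeat=4):
            norm_sq = a*a + b*b + c*c + d*d
            mag = isqrt(norm_sq)
            if mag * mag != norm_sq:
                continue                     # magnitude is not even an integer
            root = isqrt(mag)
            if root * root == mag:           # magnitude is a perfect square
                found.append((a, b, c, d, mag))
        return found

    print(golden_quaternions())
    # e.g. (2, 2, 2, 2) has magnitude 4 = 2^2, one candidate under this reading.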

 

Kiki Ikossi, PhD, Adjunct Professor of
Electrical and Computer Engineering, George Mason University and President of
I-Cube Inc.,
The Next Generation Electronics
Sunday 10:40 AM
With the invention of the transistor, electronic devices quietly took a prominent role in our modern lives. Technological advancements and the demand for faster and more efficient computations and communications gave the impetus to move beyond crystalline-silicon-dominated electronics. Today’s
advanced materials and nanosize dimensions allow for new physical concepts to
be directly implemented in optical and electronic devices. In this presentation
we will examine the potential for some of the most recent advancements in the
antimonide based semiconductor material system and structures. New
hetero-structures allow the development of electronic devices with 10-fold
reduction in energy consumption and 3-fold increase in high frequency
performance. Furthermore, the application of these materials in photovoltaic
devices allows for the development of IR sensors and solar cells. The
photovoltaic devices can be made so that they respond to the whole solar
spectrum instead of a narrow wavelength resulting in more efficient solar
cells. The potential for significant energy conservation has broad
ramifications in biomedical, remote sensing, communications, military and
consumer applications. Technical, economic and political barriers for bringing
this technology to use are examined.

 

Shahid Shah, CEO, and Chief Software
Architect of Netspective Corporation,
Service Orientation in
Modern Software Architectures
Sunday 11:20 AM
As our computer and software systems become more and more complex, engineers are finding that specialization, standardization, and scalability lessons from the real world of services are more and more applicable to the world of software. Just as entire service industries (such as transportation, retail, and telecommunications) cropped up as the business world became more complex, companies that use a service-orientation approach in the design and architecture of their computer systems can benefit enormously from
specialization. This paper will introduce SOA and take a look at the impact of
SOA in various industries.

 

Jonathan Ward, PhD, Project
Leader, Nantero, Inc. and Murty Polavarapu, Senior Principal Engineer, BAE
Systems
,
Carbon Nanotube-based Nonvolatile
Memory Device for Space Applications
Sunday 2:00 PM
There exists a great need for a high density
radiation-hard non-volatile random access memory (RH-NVRAM) with performance
comparable to that of a Static Random Access Memory (SRAM) and with the
non-volatility of an electrically erasable programmable read-only memory
(EEPROM) for Department of Defense, Space and other national security
applications. Current solutions for filling the void are Ferroelectric RAM
(FeRAM), Magnetic RAM (MRAM) and Chalcogenide RAM (CRAM). None of these
technologies has the all of the desired attributes of performance, density,
radiation hardness and non-volatility. Nantero’s Nanotube RAM (NRAM™)
is a revolutionary memory scheme developed to overcome many of the limitations
of current RH-NVRAM. This technology exploits the unique mechanical properties
of carbon nanotubes to realize a non-volatile electromechanical memory element.
NRAM is also virtually immune to radiation events because of the absence of any
charge storage element. The concept of NRAM and its integration into a state of
the art CMOS technology will be discussed.

 

Panel Discussion: The University of
Maryland Solar Decathlon: Past, Present, and Future
Sunday
2:40 PM
Panelists:
   Harry R. Sauberman, Senior Professional Engineer in FDA’s Center for Devices and Radiological Health (CDRH), Office of Device Evaluation (Panel Moderator)
   Dan Vlacich
   Dan Feng
   Rifat Jafreen

INSTITUTE OF
INDUSTRIAL ENGINEERS, NATIONAL CAPITAL CHAPTER/WASHINGTON CHAPTER OF THE
INSTITUTE FOR OPERATIONS RESEARCH AND MANAGEMENT
SCIENCES

Meet the Officers and Members of IIE and
WINFORMS.
Saturday 9:00AM

 

Richele R. Scuro, P.E., Manager of Process
and Methods at the Fredericksburg Distribution Center of CVS Pharmacy,
Optimizing System and Operational Capital
Saturday 10:00AM
The objective of the presentation is to provide the analytical
tools to mechanically and operationally balance distribution-related systems.
The presentation will cover details about different designs of merges, curves,
turns, belts, and manpower associated with maximizing equipment throughput
potential. Speed and spacing at each mechanical point is critical to mechanical
operational balancing to optimize a system that requires capital expense.
Manpower at input and at output points is also critical to optimizing the
system, but does not require capital expense. All points will be discussed in
detail with examples.

 

Richele R. Scuro, P.E., Manager of Process
and Methods at the Fredericksburg Distribution Center of CVS Pharmacy,
GAP Analysis, The Best Way to Discover Your Own Best
Practice
Saturday 11:00AM
The objective of the presentation is to provide the analytical
tools and subject matter to complete a gap analysis. Gap analyses are used to
determine variations between two operations, systems or processes. In this presentation, details will be discussed covering the staffing, processes, tools, and parameters that should be included in a complete analysis. Criteria for selecting which operations to compare will also be covered. This presentation will benefit engineers, analysts, managers, and others that have similar sites at various performance levels that cannot be compared to any industry standards.

 

Dr. Russell Vane, Senior Researcher, General
Dynamics Advanced Information Systems and Dr. Douglas Griffith, Principal
Cognitive Psychologist, General Dynamics Advanced Information Systems,
The Origins of Hypergame Theory: Why Decision Theory and Game
Theory Rarely Capture the Actual Decision Problem
Saturday 2:00 PM
Hypergame theory, invented in 1979 by Peter G. Bennett, is re-emerging as an analytic tool in post-9/11 strategic reasoning. Attendees will receive a brief history and a survey of the properties of this theory, which promotes thinking about the competitor’s mindset, information, and constraints when generating and choosing options. This work borrows heavily from Dr. Vane’s
doctoral dissertation at GMU in Fall 2000. Dr. Griffith will discuss how
hypergames could decrease cognitive errors, such as decision-maker
“confirmation bias.”

 

Dr. Russell Vane, Senior Researcher, General
Dynamics Advanced Information Systems and Dr. Douglas Griffith, Principal
Cognitive Psychologist, General Dynamics Advanced Information Systems
,
A Step by Step Guide to Hypergame Theory
Saturday 3:00 PM
Hypergame theory is discussed as an approach to account for
uncertainty, good and bad luck, the possible contexts of competitors, how to
collect evidence about competitors, and how to incorporate the inherent
fragility or robustness of different options. A stock market example will be
explained that considers when to accept or eschew growth-oriented portfolios.
The advantages and caveats of hypergame theory are revealed.
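
A toy sketch of the core idea, namely that each side may be reasoning within a different perceived game (this is only an illustration of the concept, not Dr. Vane's hypergame expected-utility formulation, and all payoffs are invented):

    # Row player's payoffs in the TRUE game (rows = our options, columns = rival's).
    true_game      = [[3, -2],
                      [1,  1]]

    # The game the row player BELIEVES is being played (a mistaken view of the rival).
    perceived_game = [[3,  2],
                      [1,  1]]

    def best_row(game):
        """Pick the row that maximizes the worst-case payoff (simple maximin rule)."""
        return max(range(len(game)), key=lambda r: min(game[r]))

    chosen = best_row(perceived_game)      # decision made inside the perceived game
    worst_true = min(true_game[chosen])    # what that choice risks in the true game
    print(f"row chosen under the perceived game: {chosen}")
    print(f"worst-case payoff in the true game:  {worst_true}")
    # Under the perceived game, row 0 looks safe (worst case 2); in the true game the
    # same choice can yield -2. The gap between the two games is the hypergame risk.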

 

Dr. Russell Vane, Senior Researcher, General
Dynamics Advanced Information Systems and Dr. Douglas Griffith, Principal
Cognitive Psychologist, General Dynamics Advanced Information
Systems,
A Practical Example about Hypergame Theory, Reasoning
about New US Policies Concerning Cuba
Sunday 10:00AM
Hypergame theory has great power in structuring reasoning about
future actions and policies. In this session, attendees are invited to
participate in a highly interactive exploration of the pros, cons, and emergent properties of a hypothetical shift in US strategy that involves opening Cuba to US tourist trade, investment, and a new era of cooperation, as well as some potential pitfalls.

 

Dr. Russell Vane, Senior Researcher, General
Dynamics Advanced Information Systems and Dr. Douglas Griffith, Principal
Cognitive Psychologist, General Dynamics Advanced Information Systems,
Hypergame Theory, Participant Practice
Sunday 11:00AM
Attendees will be challenged to attempt to set up a hypergame to
explore their own domains of expertise and to gain hands-on experience about
hypergame theory. Participants will receive paper and pencil templates and an
Excel spreadsheet on CD-ROM that aids the process. Ten doctoral dissertation
topics will be revealed in a handout that can be discussed during lunch.

 

Dr. Douglas A. Samuelson, Senior Analyst at
the Homeland Security Institute, Arlington, Virginia, President of InfoLogix,
Inc., and an adjunct faculty member at the University of Pennsylvania. He is
the incoming President of WINFORMS for 2006 and the author of the ORacle
column in ORMS Today,
The Hyper-Hypergame: Issues in
Evidence-Based Evaluation of Social Science
Sunday 2:00PM
Advanced social science methods produce models and other
analytical structures whose complexity approaches that of the actual systems of
interest. For such analyses, traditional methods of validation are often
ineffective. Also, in some cases, theoretically sound validation experiments
are ethically indefensible. These difficulties force us to reconstitute our
methods of assessing model quality, tracing these methods back to philosophical
fundamentals. In particular, with reference to examples from current social
science, we reexamine such questions as: What is evidence? How compelling is a
piece of evidence? How is the value of a piece of evidence affected by context?
How do we recognize inappropriate specifications of context and other overly
restrictive assumptions? What degree of interaction with a complex system
(violating traditional notions of maintaining perpetual separation between
observer and observed) is appropriate to understand sufficiently how the system
works? What degree of uncertainty must be accepted as intrinsic to the system
we observe, and/or to the process of observing it? To what extent can and
should we use our understanding of complex social systems to predict how our
analyses will be received? Most important, as we move beyond conventional
physical science answers to these questions, how do we best advance social
science and still maintain scientific objectivity and credibility?

 

Richard Leshuk, Program Manager, IBM Federal
Systems (retired), Technology Transfer Society Education Director – USDA
Graduate School Course in Technology Transfer,
Technology
Transfer Trends and Implications
Sunday 3:00PM
The pace of technology transfer is clearly increasing. A number of subtrends can be identified, e.g., the advent of business process and software
patents, which have implications for the long-term effectiveness of technology
transfer. This paper will examine both qualitative and quantitative changes in
the flow of innovation into commercialization and will consider the impacts likely to evolve.

 

INTERNATIONAL
ASSOCIATION FOR DENTAL RESEARCH, WASHINGTON SECTION

Abstracts not yet available Sunday 10:00AM

NATIONAL
CAPITAL SECTION/OPTICAL SOCIETY OF AMERICA &
IEEE/LEOS

Chair: Jim Heaney
F.A. Narducci, Naval Air Systems Command, EO
Sensors Division,
Atom Wave-Guiding using Bi-chromatic
Fields
Saturday 9:10AM
Under the right conditions, bi-chromatic fields can exert strong
dipole forces on neutral atoms. In this paper, we theoretically explore the
possibility of atom guiding using bi-chromatic fields, and in particular,
fields whose frequency difference is close to the ground state splitting of the
atoms being guided. Implications for the all-optical guiding and hollow-core fiber experiments being performed at the Naval Air Systems Command will be discussed.

 

J.P. Davis and F.A. Narducci, Naval Air
Systems Command, EO Sensors Division,
Effects of Frequency-Chirp
in Electro-magnetically induced Transparency
Saturday 9:25AM
Electro-magnetically induced transparency (EIT) and coherent
population trapping (CPT) are phenomena that have been studied now for quite
some time. In contrast to the usual studies based on the steady state solution
of the equations of motion, we have simulated more closely a typical
experimental arrangement by making the laser detuning a dynamic variable and
solving the equations as a function of time. We present our simulations that
not only show the expected decrease in the contrast of the EIT and CPT
resonances against the Doppler line, but we also found, under certain
conditions, an unexpected “ringing” in the EIT and CPT resonances. We
demonstrate that this is a curious interplay between the spectrum of the laser
fields and the width of the two-photon resonance and not a direct result of
quantum coherence. Experimental results will also be presented.

 

Ming-Chiang Li, Spooky
Phenomenon Initiated by Two Laser Sources
Saturday 9:45AM
The physical process behind the spooky phenomenon [1] initiated by two laser sources was discussed in a Physical Review article more than twenty-five years ago. Around 1990, there were very active experimental pursuits of such a spooky phenomenon initiated by a single laser source with the help of crystal parametric down-conversion. The present talk will review the similarities and differences between these two distinct processes. Spooky phenomena form the physical basis for quantum computation and quantum encryption. Experimental investigations of the phenomenon initiated by two laser sources will lead to better understanding of spooky phenomena, and provide clues for the successes of quantum computation and quantum encryption as well as for high-precision measurement of various atomic energy levels.
Reference: [1] Ming-Chiang Li, Phys. Rev. A 22, 1323 (1980).

 

Dr. Donald J. Michels, Naval Research
Laboratory and the Catholic University of America
,
Using the
Sun to Teach the Teachers
Saturday 10:05AM
A new approach to improving the preparation of K-8 teachers for
the all-important role of teaching and inspiring the next generation of
scientific leaders is under development at the Catholic University of America
in Washington. A course specially designed to capture the interest of
early-grade teachers and to promote confidence in their ability to teach
physical science is now in its third year. Course content is focused on the
concept of force fields (gravitational and electromagnetic) and it capitalizes
on the current wealth of stunning solar and space imagery to illustrate on
astrophysical scales the application of simple concepts taught hands-on in the
classroom and laboratory. Students work with elementary hands-on materials as
well as with real-time data from operating space missions such as the SOHO and
other major space observatories. A poster and slide show for Physics 240 – Sun and Earth: Concepts and Connections will be presented throughout the day (Saturday 3/25) in the foyer outside Room 375.

 

Brian Redman; Barry Stann; William Ruff;
William Lawler; Mark Giza; Paul Shen; Nuri Emanetoglu; Keith Aliberti; John
Dammann; Deborah Simon; and William Potter, Army Research Laboratory,
Chirped AM Ladar Development at ARL
Saturday 10:45AM
The Army Research Laboratory has developed ladars based on the
chirped amplitude modulation (AM) technique. A summary of other ladar
techniques is presented first, followed by an explanation of the chirped AM
technique. This technique is based on the linear frequency modulation –
continuous wave (FM-CW) waveform, which has been employed in radars and
coherent ladars for decades. In the chirped AM technique, the linear frequency
modulation is applied to the frequency of the amplitude modulation of the
transmitted laser power, rather than to the electromagnetic field as in radar
and coherent ladar. This enables the ladar to use optical direct detection with
coherent mixing in the radio-frequency electronic domain. Unlike optical
coherent detection, optical direct detection is easier to scale to large format
detector arrays and wide fields-of-view, is unaffected by speckle and
turbulence induced phase noise and loss of coherence, is compatible with lower
cost broadband laser and LED transmitters, and is much more tolerant of
misalignments and platform vibrations. Unlike radar, ladar has excellent
spatial resolution for relatively small receiver apertures, and rarely suffers
from multi-path interference even in highly cluttered environments. Like radar
and coherent ladar, the chirped AM ladar technique enables simultaneous
measurement of range, velocity, and vibrations. Unlike pulsed direct detection
ladars, the chirped AM technique down-converts the wideband (~GHz) signal
necessary for good range resolution to much lower frequencies (<MHz) for
compatibility with standard bandwidth readouts. Lastly, 3D imagery,
range-Doppler tracking, and ladar vibrometry results from various chirped AM
ladar prototypes are presented.
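
For orientation, the textbook FM-CW relation underlying the technique maps the measured beat frequency linearly to range; the sketch below uses illustrative values only, not ARL system parameters:

    C = 3.0e8  # speed of light, m/s

    def range_from_beat(f_beat_hz, chirp_bandwidth_hz, chirp_period_s):
        """FM-CW relation: the round-trip delay 2R/c offsets the received chirp,
        giving a beat frequency f_b = B * (2R/c) / T, so R = c * f_b * T / (2B)."""
        return C * f_beat_hz * chirp_period_s / (2.0 * chirp_bandwidth_hz)

    # Assumed parameters for illustration: a 1 GHz AM chirp swept over 1 ms.
    B, T = 1.0e9, 1.0e-3
    print(range_from_beat(150e3, B, T))            # ~22.5 m for a 150 kHz beat
    print("range resolution ~", C / (2 * B), "m")  # c/(2B), ~0.15 m here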

 

Clifford M. Krowne, Microwave Technology
Branch, Electronics Science & Technology Division, Naval Research
Laboratory,
Negative Refracting, Negative Index, and Left-Handed
Materials: Physics, Optics and Electronics
Saturday 11:05AM
There has been a tremendous amount of research theoretically,
numerically and experimentally into the area of materials and structures which
display properties whereby the conventional association of the phase front
velocity and power flow of a wave is not what we would regularly find in
ordinary materials. Such materials, I will indicate, have been known for quite
a while, and their properties are much affected by the boundary conditions of
the structures containing them. What is striking, though, is not that there
have been indications of such behavior for so long [1], but rather that new
technologies allowing micro-fabrication, nanomaterials, and small scale
heterostructure layering, are making room for constructing truly artificial
multi-dimension materials which have the projection of the phase velocity on
the power flow direction being negative. Dramatic effects occur, including
subwavelength imaging of waves from the microwave regime to the optical regime
in unusually shaped lenses, completely reconfigured electromagnetic fields [2],
[3] which are totally unlike anything seen for conventional electronic structures
made with ordinary materials, and electronic quasi-lumped element circuits
which can have new properties including zero electrical length. I will discuss
what we have been doing at NRL [4], [5], [6], and finish by suggesting where
research and development efforts ought to go next in this broad field.
[1] C. M. Krowne, “Left-Handed Materials for Microwave Devices and Circuits,” in Encyclopedia of RF and Microwave Engineering, Wiley, Vol. 3, pp. 2303-2320, 2005.
[2] C. M. Krowne, IEEE Trans. Microwave Th. Tech. 51, 2269, Dec. 2003.
[3] C. M. Krowne, Phys. Rev. Lett. 92, 053901, 3 Feb. 2004.
[4] C. M. Krowne, Phys. Rev. Lett. 93, 053902, 30 July 2004.
[5] C. M. Krowne, J. Appl. Phys., 15 Feb. 2006.
[6] C. M. Krowne, Am. Phys. Soc. Meet., Bull. APS 49, 15 Mar. 2006.

 

H. John Wood and Tammy L. Brown, NASA Goddard
Space Flight Center,
New Developments at NASA’s ISAL
Saturday 11:25AM
NASA’s Instrument Synthesis & Analysis Laboratory (ISAL) has
developed new processes to provide an instrument study in one “virtual” week.
The week-long study is spread out over a period of three weeks to add fidelity
to the mechanical model so that structural analysis is possible, and to add
confidence to the cost model. By staggering the effort between different
engineering disciplines, while still maintaining a collaborative work
environment, each engineer still contributes a week of their time to the
process to maintain the study costs. The benefits are added value, fidelity,
and confidence in the multi-discipline instrument design.

 

Richard B. Gomez, George Mason University,
School of Computational Sciences,
Self-Adaptive Hyperspectral
Application Strategies Using Smart Satellites
Saturday 2:05PM

 

Dr. John J. Degnan, Sigma Space Corporation,
Laser Ranging to Satellites, the Moon, and the Planets
Saturday 2:25PM
The first successful laser tracking of an artificial satellite
carrying an array of passive retroreflectors took place at the NASA Goddard
Space Flight Center in Greenbelt, MD on October 31, 1964. Over the next four
decades, international scientists and engineers developed a global network of
almost 40 satellite laser ranging (SLR) stations which have provided precision
tracking to over 60 international space missions. Over the same period, the
ranging precision has improved approximately three orders of magnitude, from a
few meters in 1964 to a few millimeters today. The scientific data gleaned from
these experiments has contributed immensely, either directly or indirectly, to
our understanding of the Earth’s gravity field, plate tectonics, regional
crustal deformation, ocean circulation, global warming, etc. In the late 1960’s
and 1970’s, the manned US Apollo and unmanned Soviet Lunakhod lunar missions
placed a total of five retroreflectors on the Moon. Ranging data from a few
select stations (notably in the US and France) to these reflectors has been
used by analysts to study Earth-Moon and Solar System dynamics and to provide
unique tests of General Relativity. Recent laser ranging experiments to the
Messenger and Mars Global Surveyor spacecraft, carried out in 2005 at distances
of 24 and 80 million km, respectively, have demonstrated the ability of lasers
to make decimeter or better measurements throughout the inner solar system.
Such capabilities have important consequences for interplanetary ranging and
time transfer, spacecraft navigation, planetary ephemerides, solar mass and
gravity field studies, more accurate tests of general relativity, asteroid mass
distribution, and interplanetary optical communications.

 

Eric P. Shettle and Rangasayi N. Halthore,
Naval Research Laboratory, Remote Sensing Division,
Lunar
Backscattering Measurements of Aerosol Optical Depth at Night
Saturday 2:45PM
Previous satellite measurements of the atmospheric aerosol optical
depth (AOD) have utilized measurements of backscattered sunlight, starting with
the AVHRR [Advanced Very High Resolution Radiometer] measurements on the NOAA
polar-orbiting satellites. The National Polar-orbiting Operational Satellite
System [NPOESS] currently being developed, will also use backscattered solar
radiation with the Visible Infrared Imaging Radiometer Suite [VIIRS] to measure
the AOD. However, the VIIRS instrument also includes a Daytime/Nighttime
Visible Imagery band or the Day-Night Band (DNB) which is a broad band channel
covering from about 500 to 900 nm, with sufficient dynamic range to provide
calibrated cloud imagery from quarter moon to full sunlight. We have used
MODTRAN to simulate the lunar radiances scattered by the atmosphere under
different conditions and investigate the possibility of using the DNB to
provide nighttime measurements of AOD. Preliminary simulations indicate that
there is sufficient signal-to-noise to measure the AOD with an uncertainty of
±0.08 to ±0.09 for a lunar phase angle of 60 degrees, near nadir, assuming
a calibration accuracy of about 10%. The uncertainty increases towards the edge
of the across-track scan. This would provide improved temporal information on
the aerosol transport or other transient phenomena (such as Land/Sea breeze),
due to the twice-a-day revisit frequency, instead of once. These retrievals of
AOD would be limited to over the ocean where the surface albedo is small and
well known. They could not be extended over vegetated surfaces using the “dark
pixel” method, because the DNB includes much of the high reflectance in the
near-IR due to vegetation. This means that the contribution to the at-sensor
radiances reflected by the vegetated surface would be much larger and more
poorly known than that due to the ocean surface so that the uncertainties in
the surface contribution could be greater than the magnitude of the radiances
backscattered by the atmospheric aerosols. The presence of the aurora could
mask any aerosol signal.

 

A. N. Chryssis; C. Stanford; Jui Hee; Prof.
W. E. Bentley; Prof. P. DeShong; S. S. Saini; S. M. Lee and Prof. M. Dagenais,
University of Maryland, College Park,
High Sensitivity
Bio-Sensor Based on Etched Fiber Bragg Grating
Saturday 3:05PM
We have developed a high sensitivity chem/bio sensor based on an
etched fiber Bragg grating. This sensor permits the precise measurement of the
index of refraction of liquids. The sensor is based on the evanescent wave
interaction of the etched fiber core with the surrounding environment. A high
index resolution of 7.2 × 10^-6 was achieved by monitoring the wavelength shift
of the Bragg grating peak. Moreover, the performance can be significantly enhanced
by utilizing the third-order propagation modes in the fiber, which are more than
three times more sensitive to index changes and can provide measurements
robust to temperature and stress. The sensor surface can be functionalized with
single-strand DNA (a 20-oligomer), and the complementary strand has been
successfully detected. The attachment of protein molecules (Con-A) to a
functionalized surface using glucose is also currently under investigation and
will be described.
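For orientation, the Bragg condition λ_B = 2 n_eff Λ means a change in the surrounding index shifts the reflected peak through its effect on the effective index. The small Python sketch below converts a measured peak shift into an index change using an assumed linear sensitivity; both the sensitivity and the wavelength resolution are hypothetical values, not taken from the abstract:

    # Hypothetical numbers for illustration only; the sensitivity and the
    # wavelength resolution below are NOT from the abstract.
    # Bragg condition: lambda_B = 2 * n_eff * Lambda (grating period).

    SENSITIVITY_NM_PER_RIU = 100.0   # assumed peak shift per refractive-index unit

    def index_change(delta_lambda_nm):
        """Surrounding-index change inferred from a Bragg-peak shift (linear model)."""
        return delta_lambda_nm / SENSITIVITY_NM_PER_RIU

    # With an assumed 1 pm (0.001 nm) wavelength resolution, the smallest
    # resolvable index step would be about:
    print(f"index resolution ~ {index_change(0.001):.0e} RIU")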

 

Vincent T. Bly and Maria D. Nowak, NASA Goddard Space Flight Center,
Single Crystal Silicon Instrument
Mirrors
Saturday 3:25PM
We describe a process for fabricating light weight mirrors from
single crystal silicon. We also report ambient and cryogenic test results on a
variety of mirrors made by this process. Each mirror is a monolithic structure
from a single crystal of silicon. Mirrors typically weigh 1/3 to 1/6 that
of an equal-diameter solid quartz mirror. We avoid print-through of the
underlying support structure by light-weighting after the optical surface has
been formed. Because of the extraordinary homogeneity of single-crystal
silicon, distortion of the optical surface by the light-weighting process is
negligible for most applications (<1/40 wave RMS at 633 nm). This
homogeneity also accounts for the near zero distortion at cryogenic
temperatures.

NATIONAL
INSTITUTE OF STANDARDS AND TECHNOLOGY (NIST)

Carl J. Williams, Chief, Atomic Physics Division, NIST,
An Introduction to Quantum Information Science
and Its Future Technological Implications
Sunday 2:00PM
Quantum information science is described as a revolutionary
development that has the potential to impact the 21st century in a manner
similar to the impact the laser and transistor had on the 20th century. How much of
this is hype, and why should one think that this might be a possibility? This
talk will give a brief introduction to quantum information and its potential
for affecting technology and innovation in the 21st century. I will try to
give an argument for why such radical statements might be true and how difficult
achieving them might be. After explaining why one might expect that quantum
information could be revolutionary and describing some of its better-known
impacts, I will provide a brief introduction to quantum information. This will
be followed by a high level overview of the Quantum Information Program at
NIST. The remainder of the session will focus on two of the major applications
– quantum communication and quantum computing.

 

J. Bienfang; D. Rogers, X. Tang; L. Ma, A.
Mink; A. Nakassis; B. Hershman; D. S. Su; C. J. Williams; and C. W. Clark,
Quantum Communications Systems*
Sunday 2:40PM
The application of quantum mechanics to information technology has
resulted in devices with capabilities not available by classical means. In
particular, recent research in quantum communications has focused on the
development of cryptographic systems that provide verifiable security over
unsecured communications channels. Such systems exploit the unique laws
governing quantum-state measurement to give users sensitivity to the actions of
an eavesdropper, resulting in the equivalent of a natural wax seal on the
channel. The revolutionary aspect of this technology is that its security is
based on the laws of physics, as opposed to computational complexity and
attendant assumptions about an adversary’s technological ability. While quantum
cryptography addresses a real problem in communications networks, it has also
served as a vehicle for the development of techniques in quantum engineering
and device physics, particularly single photon sources, detectors, and the
design of systems based on manipulating quantum states. This talk will present
an overview of the physics upon which this technology is based, as well as a
survey of the state of the art in quantum cryptographic systems.
* Work
supported by DARPA/QuIST and NIST.
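As background for the kind of quantum key distribution described above, the toy Python simulation below runs the textbook BB84 protocol on an ideal, noise-free channel. It is a generic illustration rather than a model of the NIST systems, and the intercept-resend attack shown is only one simple eavesdropping strategy:

    import random

    def bb84(n_bits, eavesdrop=False, seed=0):
        """Toy BB84 run: returns (number of sifted bits, number of bit errors)."""
        rng = random.Random(seed)
        alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
        alice_bases = [rng.choice("ZX") for _ in range(n_bits)]
        photons = list(zip(alice_bits, alice_bases))

        if eavesdrop:  # intercept-resend: Eve measures in a random basis and re-sends
            intercepted = []
            for bit, basis in photons:
                eve_basis = rng.choice("ZX")
                measured = bit if basis == eve_basis else rng.randint(0, 1)
                intercepted.append((measured, eve_basis))
            photons = intercepted

        bob_bases = [rng.choice("ZX") for _ in range(n_bits)]
        bob_bits = [bit if basis == bb else rng.randint(0, 1)
                    for (bit, basis), bb in zip(photons, bob_bases)]

        # Sifting: keep only positions where Alice's and Bob's bases agree.
        sifted = [(a, b) for a, ab, b, bb in
                  zip(alice_bits, alice_bases, bob_bits, bob_bases) if ab == bb]
        errors = sum(a != b for a, b in sifted)
        return len(sifted), errors

    print("no eavesdropper :", bb84(2000))        # error count should be 0
    print("intercept-resend:", bb84(2000, True))  # roughly 25% of sifted bits in error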

 

D. Leibfried, J. Britton; R. B. Blakestad;
R. Epstein; W. M. Itano; J. D. Jost; E. Knill; C. Langer; R. Ozeri; R.
Reichle+; S. Seidelin, J. Wesenberg, and D. J. Wineland
National Institute of Standards and Technology, Boulder, Colorado,

Quantum Information Processing in a System of Trapped Ions*
Sunday 3:20PM
Recent theoretical advances have identified several computational
algorithms that can be implemented on a system utilizing quantum information
processing (QIP) with an exponential speedup over all known algorithms on
conventional computers. QIP makes use of the counter-intuitive properties of
quantum mechanics, like entanglement and the superposition principle (being in
more states than one at a time). Unfortunately, nobody has so far been able to build a
practical QIP system that outperforms conventional computers. Atomic
ions confined in an array of interconnected traps represent a potentially
scalable approach to QIP. All basic requirements have been experimentally
demonstrated in one and two qubit experiments. The remaining task is to scale
the system to hundreds and later thousands of qubits while minimizing and
correcting errors in the system. While this requires extremely challenging
technological improvements, no fundamental roadblocks are currently foreseen.
The talk will give a survey of recent progress in implementing simple quantum
algorithms with up to six ions in trap arrays. The prospects and challenges of
scaling this particular approach towards a large scale computing device will
also be summarized.
* Work supported by ARDA/NSA and NIST. + present
address: University of Ulm, Germany
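For readers unfamiliar with the terms, the superposition and entanglement mentioned above can be illustrated with ordinary state vectors; the NumPy snippet below is textbook quantum mechanics, not a model of the trapped-ion hardware:

    import numpy as np

    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])

    # A single qubit in an equal superposition of |0> and |1>:
    plus = (ket0 + ket1) / np.sqrt(2)

    # Two entangled qubits (a Bell state): neither qubit has a definite value alone,
    # yet their measurement outcomes are perfectly correlated.
    bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

    print("single qubit outcome probabilities:", np.round(plus ** 2, 3))   # [0.5 0.5]
    print("Bell state outcome probabilities  :", np.round(bell ** 2, 3))   # [0.5 0.  0.  0.5]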

NATIONAL SCIENCE
FOUNDATION (NSF)

The National Science Foundation workshops will provide an overview
of the Foundation, its mission, priorities, and budget. They will cover the NSF
proposal and merit review process and NSF programs that cut across disciplines.
On Saturday, George Wilson, Legislative Specialist from the Office of
Legislative and Public Affairs, will join David Friscic of the Office of Polar
Programs in the morning and Almadena Chtechelkanova (Program Director,
Computing and Communication Foundations) in the afternoon. Dr.
Chtechelkanova will give a short presentation and highlight NSF's role in the
High End Computing Interagency Task Force initiative. On Sunday morning, Anita
Klein will discuss the Directorate for Biological Sciences and cross-cutting
programs, and on Sunday afternoon, Julias Palais of the Office of Polar
Programs will be available for discussions.
Saturday 10AM-noon and 2:00PM-4:00PM

Sunday 10AM-noon and 2:00PM-4:00PM

 

NORTHERN
VIRGINIA REGIONAL PARK AUTHORITY

Martin Ogle, Chief Naturalist, NVRPA,
Gaia Theory: The Fullest Expression of Earth Science
Sunday 2:00PM
This paper introduces the Gaia Theory, a compelling scientific
context for understanding life on our planet. The theory asserts that the
organic and inorganic components of Earth form a seamless continuum – a single,
self-regulating, living system. British scientist James Lovelock, who was
commissioned by NASA to determine whether or not there was life on Mars,
developed the Gaia Theory in the 1970s. Ironically, this theory has yielded
some of the most “cutting edge” insights into life on Planet Earth. For
example, Lovelock found ways in which the Gaian system regulates surface
temperature, ocean salinity, and other conditions at levels necessary for life
to survive. This paper includes discussion about the value of the Gaia Theory
for science and society.

 

William Folsom, William B. Folsom
Photography Inc.,
The Butterflies at Meadowlark
Sunday 3:00PM
Native host plants played a crucial role in Meadowlark Botanical
Garden’s extraordinary success in attracting native butterflies to the Garden
at a time when many butterfly populations have drastically diminished. Thanks
to the creation of the Potomac Valley collection, and its philosophy, many
native plants have been reintroduced at Meadowlark Botanical Gardens. This has
allowed the number of native butterflies seen at the Gardens to double in the
past five years.

PHILOSOPHICAL SOCIETY OF WASHINGTON

Robert Hershey, Energy
Trade-Offs
Sunday 10:00AM
Energy choices involve trade-offs. The costs and availability of
energy sources must be considered, as well as the environmental effects.
Generally, these factors will be site-specific to the area being served. This
lecture will address these issues in relation to a project the author analyzed
in the Czech Republic. Various scenarios were considered for heating the town
of Cesky Krumlov, 90 miles south of Prague. Based on the trade-offs, an
economical method was chosen to provide heating for the buildings while
achieving a significant reduction in particulate emissions.

 

Kenneth J. Haapala,
Provost General, Northeast, Brotherhood of the Knights of the Vine,

Adding a Bit of Scientific Rigor to the Art of Finding and Appreciating Fine
Wine
Sunday 11:00AM
The murky history, local traditions, customs of wine making, and
the lineage of vines and wines often confuse those who appreciate wines. One
wine variety may have many different names. Conversely, the name of a wine may
be the same in several locations, but the wine and vine may be totally
different. A history of viniculture and why it is so convoluted are briefly
discussed. The fickle characteristics of the vines are explored; the diseases that
almost destroyed the industry are explained; several common misconceptions are
dismissed; and 20th-century efforts to rigorously identify vines and wines are
briefly discussed. Rigorous efforts to understand the innumerable sensations of
appreciating wines will be presented. Three individual experiments to better
understand one's palate and food and wine combinations will be described.

POTOMAC
CHAPTER OF THE HUMAN FACTORS AND ERGONOMICS SOCIETY – Medical Errors: Reducing
Risk and Enhancing Safety

Session Chairman: Doug Griffith,
PhD, General Dynamics Advanced Information Systems,
Introduction
Saturday 9:00AM

 

Colin F. Mackenzie, MD., National Study
Center for Trauma & EMS,
Videos of Emergency Care Show Challenges
for Patient Safety
Saturday 9:10AM
Traditional data collection methodologies have
difficulty capturing fleeting events, subtle cues, brief utterances or team
interactions and communications. There is a paucity of data about what occurs
in uncertain emergency medicine workplaces, where risky but beneficial
procedures are carried out, often in non-optimal circumstances. Such data may
be critical to the identification of unsafe acts, precursor events, accident
opportunities, and latent and systems failures. This presentation will
illustrate how patient safety shortcomings in the emergency medical domain can
be identified, and potentially rectified, through a video-based data collection,
analysis, and educational feedback approach. Preventive strategies will be
discussed to avoid patient and clinician safety performance problems that could
only have been revealed using this robust, inexpensive technology, through which
fine-grained data analyses are possible. This approach to improving patient care
outcomes in healthcare, used in a systematic manner, can identify many
deficiencies in knowledge about precursor events and error opportunities, and
can provide solutions for correcting those deficiencies. Video clips will be
presented from our extensive video library
and 11-year experience of video data collection and analysis methodologies for
emergency care of trauma patients. We will characterize the challenges in
identification of safety, organizational, and systems-based problems in
technical work in emergency care, using human factors and ergonomic methods. We
will describe a multidisciplinary approach that includes experienced trauma
clinicians, experts in industrial engineering, psychology and applied
technology.

 

Marilyn Sue Bogner, PhD., Institute for the
Study of Human Error, LLC,
It’s Not Who in 98,000 Medical Error Deaths,
It’s What!
Saturday 10:00AM
In 1999 the Institute of Medicine reported that
44,000 to 98,000 hospitalized patients die annually due to medical error.
Following recommendations in that report, Congress appropriated funds for
research on provider accountability to reduce the incidence of error by 50% in
5 years. The products of that $250 million of research are characterized as
follows: “Efforts to attain the 50% reduction in error not only did not meet that goal,
the impact of those efforts on error is negligible.” That should not have been
a surprise if one considers what error is. Error is behavior, and
behavior – as attested by centuries of empirical literature in the physical and
social sciences as well as millennia of philosophical writings – be it of a person
or a particle, is the product of the interaction of the individual entity with
factors in its environment. An empirically-based approach to addressing such
environmental factors, the Artichoke systems approach, is described. Cases
illustrating the power of the Artichoke approach in reducing error and examples
of actual error-inducing factors in today’s health care are discussed.
Implications for changing the paradigm for addressing health care error from
“who” to “what” are underscored.

 

Gerald P. Krueger, Ph.D., CPE, Krueger Ergonomics Consultants, Alexandria, Virginia,
Fatigue, Drowsy
Decision-Making and Medical Error: Issues of Quality Health Care

Saturday 11:00AM
Health care providers must give careful attention to important
life-sustaining details, such as monitoring critical vital signs of patients in
emergency or intensive care, diagnosing ailments, administering correct
levels of anesthetic gases to prepare a patient for surgery, or
dispensing proper levels of prescribed medications. Like other workers,
health-care providers are affected by physiological, psychological and
behavioral variables associated with their jobs and their particular
lifestyles. Often, interns, residents, nurses, and other critical care
providers obtain insufficient quantity and quality of sleep; they participate
in lengthy work shifts in excess of 10 hours; they engage in overtime work,
work through the night, or serve on call at the hospital in excess of 24 hours
at a stretch. Worker fatigue creeps in, drowsiness occurs, performance
degrades, mood and attitudes swing to lower levels, and from time to time these
health care providers approach critical care decision-making and the
administration of medical care while they are at less than 100% effectiveness
due to the effects of fatigue. This talk will outline some of the quality of
health care issues pertaining to the impact of different shiftwork schedules,
the lengthy hours of work associated with internship or residency training,
around-the-clock nursing care, the effects of circadian rhythms and physiology,
sleep loss, and fatigue. It will address performance expectations of health
care providers who are expected to maintain continuously high levels of
attention in sustained monitoring of patients; the likelihood of
fatigue-related error, and other safety aspects of providing institutional
health-care services.

SCIENCE AND
ENGINEERING APPRENTICE PROGRAM, GEORGE WASHINGTON
UNIVERSITY

Cycle Hawkins. Mentors: Dr. Margery Anderson,
Mr Walter Bryant, Dr. Jose Hernandez-Rebollar School: School Without Walls
Senior High School
Automated Wi-Fi Decryption Device
Sunday 9:00AM
In analyzing wireless networks, it is not only possible to break
the encryption of a wireless network in its most common forms; it is
formulaic. Wired Equivalent Privacy (WEP), along with Wi-Fi Protected Access
(WPA), which is slated to replace WEP, transmit the keys to both of these standards
within a large percentage of the transmitted packets. Designing a piece of networking
equipment using off-the-shelf components and commonly available software, to
increase the ease of constructing it, is no easy task. Thus, the piece
of equipment had to use modular components such as Wi-Fi radios, commonly found
on Peripheral Component Interconnect (PCI) or Mini-PCI cards, as well as
Compact Flash memory cards to store information and the instruction sets on a
non-mechanical medium. The end user has access to the device
through Secure Shell (SSH) via Ethernet and serial ports, and through an LCD and
push buttons mounted on the front of the case. The minimum requirements to run a
full desktop operating system such as Microsoft Windows 2000 are met; however,
due to the headless nature of the device, it runs an embedded derivative
of the Linux 2.4 kernel, which also allows for a more secure device. To
maintain the mobility and usability of the device, it employs both
an AC adapter and a battery interface. With additional work on the firmware of
the device, the overall user interface of the device, and shrinking and automating
the decryption applications, the device is feasibly completed.

 

Bhuvanesh Govind. Mentor: Dr. Madhusoodana
Nambiar School: Thomas S. Wootton
Demonstration of Airway Absorption as
an Important Route of Exposure to Nerve Agents in Addition to Alveolar Gas
Exchange Following Inhalation Poisoning in Guinea Pigs
Sunday 10:00AM
Inhalation is a common route of exposure to toxic nerve agents such
as soman and VX. Previous studies of inhalation exposure to soman vapors
indicated non-linear kinetics with respect to dose and time. We have developed
a microinstillation technique in guinea pigs to assess lung injury following VX
exposure. During VX exposure (0.5 – 0.9 LD50), some animals exhibited twitching
of the upper limb but no twitching of the lower limb was observed. This
indicates a site-specific increased localization of the nerve agent
specifically at the airway and thoracic region. To further elucidate
differential localization of the nerve agent following inhalation exposure, we
determined the level of acetylcholinesterase (AChE) inhibition in various
tissues, including samples close to the lung (diaphragm, chest muscle,
esophagus, heart), intermediate (spleen, liver, intestine), and distant (thigh
muscle, kidney, testis). In the diaphragm, nearly 50% inhibition of AChE was
observed at 0.5 LD50, and the inhibition gradually increased with the dose of
VX. However, thigh tissue showed no significant inhibition of AChE compared
to untreated animals. These results suggest that VX is absorbed from the airway
and the lung microcapillaries and is transmitted to surrounding tissues. Thus
inhalation exposure to VX is localized to tissues surrounding the lung region.
Site-specific localization of VX supports the notion of airway absorption in
addition to alveolar gas exchange as an important route of nerve agent
exposure.

 

Student: Erica Price Mentor: Dr. Sachin Mani
School: Rutgers University
The Role of RNA quality in the identification
of gene markers in Staphylococcal enterotoxin-A (SEA) induced incapacitation
Sunday 11:00AM
Staphylococcal enterotoxin A (SEA) is a protein superantigen
produced by Staphylococcus aureus. SEA causes food poisoning; when
ingested, it results in profuse vomiting, spurting diarrhea, severe dehydration,
and, in grave cases, lethal toxic shock. Contamination of food with SEA is a
serious health issue because it can incapacitate large numbers of people in a
short amount of time and the likelihood of a successful recovery is reduced
depending on the length of exposure without diagnosis and treatment. In this
experiment blood from a SEA intoxication model is used to isolate RNA using the
PAXgene RNA isolation method. It is important to have a good quality of RNA to
determine the genes that are specific to SEA exposure. These genes or
biomarkers act as molecular targets for developing targeted therapy against
SEA. After isolation, the quality of RNA is confirmed using native gel
electrophoresis and the Agilent 6000 system. Both these tools have demonstrated
that the PAXgene RNA isolation method produces good-quality RNA which can be
utilized to identify gene markers specific to SEA exposure.

 

SOCIETY FOR
EXPERIMENTAL BIOLOGY AND MEDICINE

Curtis R. Chong; Xiaochun Chen; LiRong Shi;
Jun O. Liu; and David J. Sullivan, Jr., Department of Pharmacology, The Johns
Hopkins University School of Medicine,
Identification of Astemizole as
an Antimalarial Agent by Screening a Clinical Drug Library
Saturday 9:00AM
The high cost and protracted timeline of new drug discovery is a
major roadblock to creating therapies for diseases common in the developing
world. One way to sidestep this barrier is to identify new uses for existing
drugs. Because the toxicity and clinical properties of existing drugs are
known, any novel therapeutic use identified can be rapidly translated into the
clinic. We created and screened a library of 2,687 existing drugs for
inhibitors of the human malaria parasite P. falciparum using [3H]-hypoxanthine
incorporation. The non-sedating antihistamine astemizole and its principal
human metabolite desmethylastemizole potently inhibit chloroquine-sensitive and
multidrug-resistant parasites in vitro and in three mouse models of malaria.
Like the quinoline antimalarials, astemizole inhibits heme crystallization,
concentrates within the P. falciparum food vacuole, and co-purifies with
hemozoin from chloroquine-sensitive and multidrug-resistant parasites.
Importantly, astemizole is equally effective against multidrug-resistant P.
falciparum. In mice infected with chloroquine-resistant P. yoelii, astemizole
and desmethylastemizole reduced parasitemia with an apparent IC50 of 15 mg/m2,
which is near the dose used to treat allergic rhinitis. These results suggest
astemizole is promising for the treatment of malaria, and highlight the
potential of finding new treatments for diseases of the developing world by
screening libraries of existing drugs.

 

Ion Cotarla and Michael Johnson, Oncology,
Lombardi Comprehensive Cancer Center, Georgetown Univ.; Ji Luo and Lewis
Cantley,Cell Biology, Harvard Medical School; and Priscilla A. Furth, Oncology,
Lombardi Comprehensive Cancer Center, Georgetown Univ.,
Loss of Both
Alleles of the P85Alpha Regulatory Subunit of PI3 Kinase Results in Impaired
Mammary Gland Development, Whereas Loss of Only One Does Not Alter Mammary
Tumorigenesis
Saturday 9:15AM
Activation of PI3 kinase (PI3K) pathways contributes in many organs
to both normal physiology and the complex process of tumorigenesis through a
plethora of pathway-specific functions. In this study, a genetic approach using
knockout mice was taken to investigate the effects of loss of PI3K activity on
mammary gland development and tumorigenesis. First, mammary development was
examined through mammary gland transplantation, as Pik3r1-/- mice (which lack
all isoforms of the p85alpha regulatory subunit of PI3K) die perinatally.
Ultrasound and GFP imaging (actin-GFP transgenic mice were bred into Pik3r1
mice), histology and molecular analyses were employed to assess the development
and terminal differentiation of the mammary gland. Endogenous mammary gland
development was normal in Pik3r1+/- mice through terminal differentiation and
lactation, and in Pik3r1-/- mice through embryogenesis and the first postnatal
week. Nonetheless, mammary glands from newborn Pik3r1-/- mice grown as
transplants, either in cleared mammary fat pads or mid-abdominal regions of
host nude mice, showed marked underdevelopment during puberty in comparison to
Pik3r1+/+ or Pik3r1+/- transplants. Mammary whole-mount analysis using
MetaMorph® software demonstrated a statistically significant decrease in
mammary epithelial density of Pik3r1-/- transplants compared to Pik3r1+/+ and
Pik3r1+/- transplants (reduced total ductal length: p=0.01, and reduced ductal
branching: p<0.001). Decreased rates of proliferation, but not apoptosis
found in Pik3r1-/- mammary transplants may explain this phenotype.
Interestingly, Pik3r1-/- mammary transplants were able to terminally
differentiate during late pregnancy. This was indicated by the presence of
lipid vacuoles inside the epithelial cells on H&E-stained sections, and
expression of molecular markers, such as phosphorylated-Stat5 or milk proteins
WAP and beta-casein, detected by Western blotting. Mammary epithelial density,
measured on H&E-stained sections using MetaMorph®, remained, however,
reduced in Pik3r1-/- transplants at late pregnancy, although the difference
between the two groups was not as obvious as at puberty. Statistically
significant decreases in E-cadherin and p63 (myoepithelial marker) mRNA
expression, measured by real-time RT-PCR, confirmed the decreased epithelial
content of Pik3r1-/- transplants. Brca1, and to a lesser extent ERalpha and Bcl-2,
mRNA levels were statistically significantly reduced in Pik3r1-/- transplants
compared to the other group. No change in PR and Cyclin D1 mRNA expression or
phosphorylation levels of Akt was found. Second, the impact of haploid loss of
Pik3r1 locus on mammary tumorigenesis was explored in a mouse model of breast
cancer (WAP-TAg). Reflectance confocal microscopy, GFP imaging and
H&E-stained sections were employed to visualize and analyze the tumors.
Haploid deficiency of Pik3r1 did not alter the onset of mammary tumorigenesis
(WAP-TAg, Pik3r+/+ = 208.5 days; WAP-TAg, Pik3r1+/- = 204 days; n=24). Also,
haploid deficiency of Pik3r1 did not alter mammary tumor burden or tumor
grading in the WAP-TAg model (p>0.05). In summary, complete loss of the
p85alpha regulatory subunit of PI3 kinase resulted in impaired mammary gland
development, especially during puberty, and reduced Brca1 expression during
late lactation. Haploid deficiency of Pik3r1 did not alter mammary
tumorigenesis in a mouse model of breast cancer.

 

N. Farkas; R. Aryal; E. A. Evans; and R. D.
Ramsier, Departments of Physics, Chemistry and Chemical and Biomolecular
Engineering, The University of Akron; L. V. Ileva and S. T. Fricke Department
of Neuroscience Georgetown Univ.; J. A. Dagata, Precision Engineering Division
National Institute of Standards and Technology,
Patterned Iron Thin Film
and Microfluidic Phantoms for Quantitative Magnetic Resonance Imaging
Saturday 9:30AM
Magnetic resonance imaging (MRI) studies often use magnetic
particles as contrast agents to identify and monitor cell movement. Although
most of this research demonstrates imaging of groups of labeled cells, a
growing number of articles report the detection of single cells and even single
superparamagnetic iron oxide particles. However, since the size and iron
content of these particles exhibit large variations, quantitative assessment of
MRI sensitivity and resolution is difficult, and few attempts have been made to
investigate the correlation between signal attenuation and iron concentration.
To address these unavoidable issues, our ultimate goal is to design and
construct MRI standards for instrument calibration. The first group of phantoms
consists of systematically patterned iron nitride thin films generated by a
high-voltage parallel writing technique we developed recently to modify
refractory metal thin films. The volume of the iron nitride features can be
precisely controlled during the intrinsically simple fabrication, which
provides a means to correlate iron content with the resulting MRI signal
decrease and to determine detection threshold limits as well. The next phantom
design includes irregular iron patches forming clusters of different sizes that
resemble labeled cells or iron oxide particles with respect to agglomeration
and iron distribution. Magnetic force microscope characterization confirms the
permanent magnetic nature of all the iron thin film phantoms. As an integral
part of the systematic calibration procedure, quantification of magnetic
susceptibility artifacts from these iron patterns is obtained over a broad iron
content range of 100 pg-0.01 mg. We show that the proposed synthetic
ferromagnetic phantoms could facilitate standardization and development of new
MRI protocols to advance single cell detection and tracking. To extend our
efforts toward establishing accurate spatial resolution and sensitivity
measurements of various solution-phase MRI contrast agents, we make use of
microfluidic devices as high-resolution MRI phantoms. Continuous wedge and step
phantoms with bifurcating 300-800 micron wide channels machined into Plexiglas
and sealed with PVC gaskets have been designed for instrument calibration at
the micrometer scale. To demonstrate that these microfluidic devices provide a
convenient way to administer different contrast agents in-situ, MR imaging of
laminar-flow-induced diffusion of positive and negative contrast agents is
presented. Since the manufacturing processes described here do not require
complicated instrumentation, they promote rapid prototype production and
subsequent MRI testing.

 

Curtis R. Chong; David Z. Qian; Fan Pan;
Roberto Pili; David J. Sullivan; Jr.; and Jun O. Liu, Department of
Pharmacology, The Johns Hopkins University School of Medicine,
Identification of Type I Inosine Monophosphate Dehydrogenase as an
Antiangiogenic Drug Target
Saturday 9:45AM
Angiogenesis, the formation of new blood vessels, plays a central
role in the pathophysiology of diseases such as metastatic cancer and diabetic
retinopathy. In an effort to rapidly discover clinically useful angiogenesis
inhibitors, we created and screened a library of 2,687 existing drugs for
inhibition of endothelial cell proliferation using [3H]-thymidine
incorporation. Because the toxicity, pharmacokinetic, and clinical profiles of
existing drugs are well established, novel hits identified in the library can
rapidly be moved into the clinic. Mycophenolic acid (MPA), an immunosuppressive
drug, inhibits endothelial cell proliferation with an IC50 of 100 nM and causes
G1/S cell cycle arrest. In mouse models, MPA inhibited VEGF- and bFGF-stimulated
angiogenesis, blocked tumor-associated angiogenesis, and decreased tumor volume
and metastasis. Inhibition and cell cycle arrest are overcome by addition of
guanosine, suggesting the de novo nucleotide synthesis pathway, and more
specifically, inosine monophosphate dehydrogenase (IMPDH), as the target of MPA
in endothelial cells. As MPA is equally potent against the two known IMPDH
isoforms, we selectively knocked down IMPDH-1 or IMPDH-2 using siRNA to
determine which is essential for cell cycle progression. Knockdown of IMPDH-1
in endothelial cells caused cell cycle arrest similar to that observed with MPA
treatment, while IMPDH-2 knockdown had no effect. In contrast, knockdown of
IMPDH-2 in T-cells caused cell cycle arrest while IMPDH-1 siRNA treatment had
no effect. These results suggest MPA may be useful as an anti-angiogenic drug,
and that IMPDH-1 specific inhibitors could block angiogenesis without causing
immunosuppressive side-effects.

 

Joe Garman and Alexis Dixon, Department of
Medicine, Georgetown Univ. and Christine Maric, Department of Medicine and
Center for the Study of Sex Differences in Health, Aging and Disease,
Georgetown University Medical Center,
Protective effects of Omega-3
Fatty Acids in an Animal Model of Diabetic Nephropathy
Saturday 10:30AM
Background: Omega-3 fatty acids have been shown to have
beneficial cardiovascular effects; however, their effects on kidney function,
especially in diseases such as diabetic nephropathy, are unknown.
Objectives: Walnuts are high in antioxidants and omega-3 fatty acids.
This study examined the effects of walnuts, as a source of omega-3 fatty acids,
on the kidney in an animal model of diabetic nephropathy. Methods: The
study was performed in non-diabetic and diabetic (induced by streptozotocin –
STZ, 55mg/kg) male Sprague-Dawley rats fed either a normal diet or walnut
supplemented diet. The diabetic rats were treated with 4 Units of slow acting
insulin (glargine) 3 times per week to maintain body weight and prevent
ketoacidosis. For the duration of the experiment, animals were fed sodium
deficient rat chow (Harlan 90228), either with added walnuts (W diet, 40% of
calories) or without walnuts (R, regular diet). Both diets were supplemented
with salt to provide approximately 20mg Na+/rat daily. Food consumption was
monitored daily, body weights and plasma glucose levels were measured weekly,
and urine production and albumin content were measured monthly. Results:
After 10 weeks of diabetes, body weights for the diabetic rats on the walnut
diet, STZ-W, were significantly greater than the diabetic rats on the regular
diet, STZ-R, (382 +/- 28 vs. 281 +/- 26g). There was no difference in body
weight between non-diabetic rats fed the regular or walnut-supplemented diet.
Blood glucose levels were lower in the STZ-W group vs. STZ-R (296.0 +/- 9.2 vs
363.3 +/- 70.3 mg/dl, at 10 weeks). The urine albumin excretion rate was lower
in the STZ-W vs. STZ-R group (15.3 +/- 17.0 vs 38.2 +/- 11.1 mg/day). No difference
in albumin excretion rates was observed in the non-diabetic rats fed either
diet. Conclusion: STZ-induced diabetic rats maintain greater body
weight, lower plasma glucose, and less albuminuria when fed a diet containing
walnuts compared to diabetic rats on a regular diet. We conclude that a diet
supplemented with walnuts, as a source of omega-3 fatty acids, is beneficial in
attenuating renal dysfunction associated with early diabetes.

 

Alexis Jeannotte, Interdisciplinary Program
in Neuroscience and Anita Sidhu, Dept. of Biochemistry, Georgetown Univ.,
The Norepinephrine Transporter is Bimodally Regulated in Vitro and Ex
Vivo by Alpha-Synuclein
Saturday 10:45AM
Plasma membrane norepinephrine transporters (NET) are the primary
means of terminating noradrenergic synaptic transmission and are targeted by
several therapeutic agents for cognitive, affective, and motor disorders.
Several proteins act in concert to regulate the function and trafficking of
NET; however, a definitive mechanism has yet to be defined. A novel interaction
with the cytosolic protein alpha-synuclein (SYN) has been identified that modulates
NET activity in a concentration-dependent manner. SYN was found to inhibit
[3H]-NE uptake by 58% in Ltk- fibroblasts cotransfected with a 3:1 ratio of
SYN:NET DNA, which mimics the expression level in several brain regions, including
the prefrontal cortex (PFC), a major destination of noradrenergic terminals.
When these cells were cotransfected with a 0.5:1 ratio of SYN:NET, [3H]-NE
uptake was found to be increased by 48%, representing a change that is novel to
NET and unseen in other transporters. This bimodal regulation of NET was dependent
upon the integrity of the microtubular network. In synaptosomes isolated from
the PFC and treated with nocodazole, which disrupts microtubule dynamics, there
was a relief of the tonic inhibition of NET, resulting in increased [3H]-NE
uptake. In contrast, there was no significant difference in striatal
synaptosomal uptake. Primary neuronal cultures isolated from the rat brainstem
at E20 displayed a similar trend for an increase in [3H]-NE uptake with
nocodazole treatment. This novel bimodal regulation of NET indicates that the
transporter is under differential regulation by SYN, both in vitro and from
tissue and neurons isolated from varying brain regions.

 

Curtis R. Chong and David J. Sullivan, Jr.,
The Johns Hopkins Bloomberg School of Public Health
Inhibition Of Heme
Crystal Growth By Antimalarials And Other Compounds: Implications For Drug
Discovery
Saturday 11:00AM
During intraerythrocytic infection, Plasmodium falciparum
parasites crystallize toxic heme (FP) released during hemoglobin catabolism.
The proposed mechanism of quinoline inhibition of crystal growth is either by a
surface binding or a substrate sequestration mechanism. The kinetics of heme
crystal growth was examined using a new high-throughput crystal growth
determination assay based on the differential solubility of free versus
crystalline FP in basic solutions. Chloroquine (IC50 = 4.3 uM) and quinidine
(IC50 = 1.5 uM) showed a previously unrecognized reversible inhibition of FP
crystal growth. This inhibition was decreased by increasing amounts of heme crystal
seed, but not by greater amounts of FP substrate. Crystal growth decreases as
pH rises from 4.0 to 6.0, except for a partial local maximum reversal from pH
5.0-5.5 that coincides with increased FP solubility. The new crystal growth
determination assay enabled a partial screen of existing clinical drugs.
Nitrogen heterocycle cytochrome P450 (CYP) inhibitors also reversibly blocked
FP crystal growth, including the azole antifungal drugs clotrimazole (IC50 =
12.9 uM), econazole (IC50 = 19.7 uM), ketoconazole (IC50 = 6.5 uM), and
miconazole (IC50 = 21.4 uM). Fluconazole did not inhibit. Both subcellular
fractionation of parasites treated with subinhibitory concentrations of
ketoconazole and in vitro hemozoin growth assays demonstrated copurification of
hemozoin and ketoconazole. The chemical diversity of existing CYP inhibitor
libraries that bind FP presents new opportunities for the discovery of
antimalarial drugs that block FP crystal growth by a surface binding mechanism
and possibly interfere with other FP-sensitive Plasmodium pathways.
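IC50 values like those quoted above are typically estimated by fitting a sigmoidal dose-response curve to assay readings. The Python sketch below fits a standard four-parameter logistic to synthetic data with SciPy; it is only a generic illustration of such a fit, not the authors' actual assay or analysis:

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic4(conc, bottom, top, ic50, hill):
        """Four-parameter logistic dose-response curve."""
        return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

    conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])        # inhibitor, uM (synthetic)
    growth = np.array([0.98, 0.95, 0.85, 0.60, 0.30, 0.12, 0.05])   # crystal growth, normalized (synthetic)

    popt, _ = curve_fit(logistic4, conc, growth, p0=[0.0, 1.0, 3.0, 1.0])
    print(f"fitted IC50 ~ {popt[2]:.1f} uM")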

 

Arindam Mitra and Suman Mukhopadhyay, VA-MD Regional College of Veterinary Medicine, University of Maryland,
Multiple Regulators Modulate Quorum Sensing In E. Coli
Saturday 11:15AM
Cell-cell communication requires incorporation of multiple cues
from the environment. Quorum sensing, a process of cell-cell communication, is
mediated through the production and recognition of extracellularly secreted
molecules called autoinducers (AI) that accumulate in the environment in
proportion to cell density. E. coli is known to synthesize a signal termed
autoinducer-2 (AI-2), synthesis of which is known to be dependent on luxS gene
products. This study showed that a single-copy chromosomal luxS-lacZ reporter
fusion was regulated by the BarA-UvrY two-component system and other
regulators. CsrA, the global regulatory RNA binding protein, negatively
regulated luxS expression, and positively regulated AI-2 uptake, demonstrating
the existence of a balance between the synthesis and transport. Hfq, another
RNA binding protein, also regulates luxS expression in a growth-dependent
manner. RpoS, the stationary phase sigma factor, also regulated the growth
phase-dependent expression of luxS. Furthermore, we also show that known cyclic
AMP-CRP-mediated catabolite repression of luxS expression was UvrY-dependent.
These findings suggest that the regulation of luxS expression is complex, and
operates at the level of both transcription and translation. Thus, multiple
metabolic regulators seem to modulate quorum sensing in E. coli.

 

Phillip L. Van, MS; Vladimir K. Bakalov, MD;
and Carolyn A. Bondy, MD, Developmental Endocrinology Branch, National Institute
of Child Health and Human Development, National Institutes of Health,
Monosomy For The X-Chromosome Is Associated With An Atherogenic Lipid
Profile
Saturday 11:30AM
Background: Differential X-chromosome gene dosage may
contribute to gender-specific differences in physiology and longevity. Men
typically have a more atherogenic lipid profile than women, characterized by
higher LDL-cholesterol and triglyceride levels and reduced lipid particle size,
contributing to a greater risk for coronary disease. To determine if
X-chromosomal gene dosage affects lipid metabolism independent of sex steroid
effects, we compared lipid profiles in age- and body mass-matched young women
with ovarian failure, differing only in X-chromosome dosage. Methods:
Women with ovarian failure associated with monosomy X, or Turner syndrome (TS,
n = 118) were compared to women with karyotypically-normal, premature ovarian
failure (POF, n = 51). These women were normally on estrogen replacement
treatment, but discontinued the estrogen two weeks prior to study. We examined
fasting glucose, insulin and lipid levels, and NMR lipid profiles in study
subjects at the NIH CRC. Results: Average age, body mass, and insulin
sensitivity were similar in the two groups. Women with TS had higher
LDL-cholesterol (P = 0.001) and triglyceride levels (P = 0.0005), with total
cholesterol slightly higher (P = 0.06) and HDL-cholesterol slightly lower (P =
0.1) compared to women with 46,XX POF. Also among women with TS, average LDL
particle size was reduced (P < 0.0001) and LDL particle number increased,
with a two-fold increase in the smallest particle categories (P < 0.0001).
Conclusions: 45,X women with ovarian failure exhibit a distinctly more
atherogenic lipid profile than 46,XX women with ovarian failure, independent of
sex steroid effects. The observation, that the direction and magnitude of the
differences in lipid levels and lipid particle characteristics between TS and
POF parallel the differences seen between normal men and women, is consistent
with the view that the second X-chromosome normally contributes to a reduced
risk for ischemic heart disease in women.

 

Curtis R. Chong, Shri Bhat, KeeChung Han, and
Jun O. Liu, Department of Pharmacology, The Johns Hopkins University School of
Medicine,
Manganese-Specific Methionine Aminopeptidase Type II
Inhibitors Induce Cell Cycle Arrest In Endothelial Cells And Block In Vivo
Angiogenesis
Saturday 11:45AM
Angiogenesis, the formation of new blood vessels, plays an
important role in the pathogenesis of numerous diseases such as cancer and
arthritis. TNP-470, the first anti-angiogenic drug, halts endothelial cell
proliferation by covalently inactivating the MetAP2 enzyme, effectively blocks
tumor growth and metastasis in animal models, and showed promise in preliminary
clinical trials. Its widespread use as an anticancer agent is limited, however,
by neurotoxic side effects and a plasma half-life measured in minutes. In an
effort to identify potent inhibitors of MetAP2 as antiangiogenic drug leads, we
screened a library of 270,000 compounds for inhibitors of this enzyme. A
preliminary screen of 70,000 compounds against cobalt-substituted MetAP2
revealed nanomolar inhibitors which lacked activity against endothelial cell
proliferation. Repeating the screen with the entire library against
manganese-substituted MetAP2 identified the quinolinol and quinolinyl carbamate
classes as potent inhibitors, selective over the cobalt-substituted enzyme and
MetAP1, that also inhibited endothelial cell proliferation. The most potent
hits, JHU-1b and 4c, induced G1/S phase cell cycle arrest in endothelial cells,
activated p21 transcription, and inhibited bFGF and VEGF induced angiogenesis
in the mouse Matrigel model.

 

Greater Baltimore-Washington Graduate Student
Research Forum Awards Ceremony
Saturday 2:00PM

SOIL AND WATER
CONSERVATION SOCIETY

Peter Hildebrand, Chief, Hydrospheric and
Biospheric Sciences Laboratory,
Global Water Cycle: Floods, Droughts and
Water Resources
Sunday 10:00
The presentation addresses how NASA’s unique capabilities and new
measurements are helping to advance hydrologic science and applications
world-wide, tying together Earth’s lands, oceans, and atmosphere. The
presentation will include changes in the global water cycle and precipitation
distribution.

 

Robert Bindschadler, Chief Scientist,
Hydrospheric and Biospheric Sciences Laboratory and NASA’s lead scientist for
the International Polar Year, 2007-2008
Who Left the Freezer Door
Open?
Sunday 10:00
The talk will address how satellite data have documented ice shelves
disintegrating, permafrost melting, and other large-scale environmental
changes. Changes in the cryosphere – the coldest areas of the planet where water
and soil are frozen – are altering our world. Climate models predict that Polar
Regions will warm more than elsewhere. Satellites provide a comprehensive
record of these changes from disintegrating ice shelves, accelerating glaciers,
melting permafrost and diminishing sea ice and snow cover. Why are these
changes taking place and how will the changes taking place so far away affect
us?

WASHINGTON
EVOLUTIONARY SYSTEMS SOCIETY

Jerry LR Chandler, President,
WESS
Introductory Remarks to the Symposium, “Emergence of
Designs”.
Room 375 Saturday 9:00AM

 

Roulette William Smith, PhD Institute for
Postgraduate Interdisciplinary Studies,
Evolution and Emergences of
Designs in Genetic, Immune and Brain Systems
Room 375
Saturday 9:05AM
In recent presentations, Smith (2006a; 2006b) distinguishes
between evolution per se and evolution of species. DNA appears to be the
underlying and parsimonious repository for evolution, with proteomic DNA
providing an underlying basis for speciation. In addition to proteomic/genetic
evolution, Smith posits that evolution has engendered two additional, albeit
interdependent, evolutionary pathways. One pathway is associated with the
immune system. The other pathway is associated with the brain. Although concrete
heuristic systems may underlie the three evolutionary schemas, transmission
mechanisms differ in quite fundamental ways in these systems. Transmission
mechanisms associated with speciation occur via germline genetic and
epigenetic strategies. Transmission mechanisms associated with the other two
systems may involve non-proteomic and non-germline approaches. This report
explores emergences in designs and heuristics among these three interdependent
evolutionary subsystems and in their associated transmission mechanisms. In
addition, we discuss countercurrent chemistry, structures and physiology in
living systems, and in ecology (e.g., oceans and wetlands) to underscore
potential parsimony in designs for both the living and non-living. In summary,
opportunities for novel anticipatory evolutionary systems are explored,
including approaches for quantifying nature and nurture. Roles of heuristics in
emergences of designs in living systems also are explicated.
References
Smith, R. Wm. (2006a). Evolution and Long-Term Memories
in Living Systems: Using molecular biology to resolve three great debates
… Lamarck versus Darwin, Nature versus Nurture, and the Central Dogma.
Presentation at the Winter Chaos 2006 Conference / Snowflake Forum
(<www.blueberry-brain.org/winterchaos/snowflake2006.html#roulette>)
[Pittsburgh, PA – February 3-5].
Smith, R. Wm. (2006b). Evolution and
Long-Term Memories in Living Systems: Using molecular biology to resolve three
great debates … Lamarck versus Darwin, Nature versus Nurture, and The
Central Dogma. Presentation to the San Francisco Tesla Society
(<www.sftesla.org/Newsletters/newslett2006.htm>) [San Francisco, CA
– February 12].

 

Paul C. Kainen, Georgetown University,
Emergences in Graph Theory and Category Theory
Room 375
Saturday 9:30AM
Design concerns itself with both abstract architecture and
concrete detail. Mathematics is the abstract language par excellence, and thus
mathematics is fundamental to design. In particular, the concept of a graph (a
set of vertices and edges, nodes and links) underlies many designs. It
describes networks of relationships, system components, parts and connections,
and applies to such areas as traffic flow, communications, social relations,
work scheduling, organic systems (neural, immune, genetic, ecological),
molecular models, electrons and photons in a Feynman diagram, verbs and nouns
in the diagram of a sentence: the list goes on. I will discuss examples, and
review specializations and enrichments of the notion of a graph, such as trees,
digraphs, categories, and the metagraphs which result from taking objects of
the lower orders as vertices at the next level of abstraction. A new concept,
adjointness, arises as a relation between certain pairs of arrows in
the metagraph. It connects categories, logic, algebra, and analysis, providing
evidence of the grand design of mathematics.
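As a concrete anchor for the graph vocabulary above (vertices, edges, links), the short Python sketch below stores a directed graph as an adjacency list; the example vertices and edges are invented for illustration:

    from collections import defaultdict

    class Digraph:
        """A directed graph: a set of vertices and a set of arrows between them."""

        def __init__(self):
            self.adj = defaultdict(set)   # vertex -> set of successor vertices

        def add_edge(self, u, v):
            self.adj[u].add(v)            # arrow u -> v
            self.adj[v]                   # touching the defaultdict registers v as a vertex

        def vertices(self):
            return set(self.adj)

        def edges(self):
            return {(u, v) for u, vs in self.adj.items() for v in vs}

    g = Digraph()
    for u, v in [("design", "prototype"), ("prototype", "test"), ("test", "design")]:
        g.add_edge(u, v)
    print(g.vertices())
    print(g.edges())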

 

Robert Artigiani, History Department, U.S.
Naval Academy,
Emergence and Ethics: Steps to a Science for
Survival
Room 375
Saturday 10:00AM
From David Hume through T.H. Huxley to G.E. Moore, philosophical
analysis has concluded that ethics cannot be rooted in scientific descriptions
of nature. Assuming a qualitative difference between the material of nature and
the judgments of ethics, Hume summarized the situation with the pithy
conclusion that an ought cannot be derived from an is. At the start of the 20th
century, Moore revisited the analysis and said those who thought otherwise
perpetuated the “naturalistic fallacy.” In a similar vein, Huxley argued that
ethics were human attempts – which he lauded – to oppose the laws of nature – which
he knew would inevitably prevail. These conclusions reflect the “Modern”
conviction that limited science to describing the motions of physical bodies
using deterministic laws. In this reductionistic world there were quantitative
measures but not qualitative values, pleasures and pains but not rights and
wrongs. One consequence of this paradigm was that ethics became either
epiphenomenal or subjective. In the first case, life became meaningless because
ethics were illusions; in the second case, life was regularly terrorized
because ethics were playthings of emotional enthusiasms. As Prigogine often
said, by the late 20th century we were in desperate need of a “new
rationality,” a science friendly to humanity because able to account for the
creation of new information. The science of Complexity seems poised to meet
this challenge. Through interactions it can show how information is created,
and through self-organization it can show how new information is stored.
Interaction and self-organization are the processes by which life emerges. They
may also track processes creating ethical information that is then stored in
human social systems. If so, Hume’s skeptical conclusions can be finessed, for
it is now possible to see that when ethics emerged as societies self-organized,
nature itself derived oughts. Moreover, because of variations in initial
conditions and the unpredictability of human creative acts, no two societies
are shaped by exactly the same ethical rules. Thus, there is ample variety for
natural selection to operate upon, and it may be possible to show that slight
differences in ethical constructions provide advantages. This presentation will
argue that when human behaviors are guided by rules more-or-less analogous to
those Complexity finds operative in nature, societies are more likely to locate
opportunities and threats, globalize information, and take adaptive
initiatives. Being more the tinkerer than the designer, to paraphrase Francois
Jacob, natural systems do not generally lock themselves into idealized
outcomes. Rather than acting as if the final outcome of an evolutionary process
is what makes it meaningful, nature focuses on the means by which the
evolutionary process is sustained. As Stuart Kauffman puts it, nature evolves
in favor of the ability to evolve. Natural systems evolve by finding ways to
use whatever opportunities exist to compute solutions to whatever problems
arise. In other words, natural systems survive by distributive processing, by
individualizing, liberating, and empowering components. Thus, an ethics based
on nature would be one that achieved and encouraged adaptability by emphasizing
individuality, diversity, and freedom – essential human values. Although this
conclusion is intellectually comforting, it is not clear that human beings are
biologically equipped to deal with constant change, unpredictability, and
heightened self-awareness. If Complexity can be persuasively shown to support
life at the “edge of chaos,” however, perhaps its successful regrounding of
consciousness in logic and fact will provide a rationality able to replace
passionate commitment.

 

Gary Berg-Cross, Ph.D., Engineering,
Management and Integration,
Ontological Design, the Ultimate Design?
Issues and Concepts
Room 375
Saturday 10:30AM
Information systems and their design are a major part of modern
technology; however, design principles for information and underlying knowledge
have often been ad hoc due to an inadequate foundation. Recent attempts to
formalize a theory of information and information flow between “systems” have
relied on the development of formal ontologies as explicit specifications of
the terms in the domain and relations among them (Gruber 1993). Formal
ontologies make clear the conceptual commitments that underlie their
specification. Such efforts employ a still uncertain mix of philosophy, formal
logic, cognitive science and informatics which, taken together, shift the
design problem of artifacts to what is often assumed to be a universal level of
design: the design of knowledge. However, there is still much debate on how to
properly design the foundational knowledge of an upper ontology. The current
consensus is to carefully rationalize a space of alternative choices reflecting
different focuses and purposes on a specific lower ontology that a foundation
serves. Within this framework formal logic is used as a truth preserving
process, and Tarski’s model-theoretic approach is used as a base to make use of
general mathematical concepts such as sets. However, upper ontologies under
development may not be universal designs, although they may be as general a set
as can be accomplished with our current level of understanding. A fundamental,
somewhat philosophical problem, with this base is that it lacks real world
semantics. Such semantics are acquired by the epistemological experience of
ontological analysis. Ontological analysis provides such concepts as 3D and 4D
worlds empirically suitable for differing problems. For example, ontologies of
biological structures (anatomy) reflect a commitment to a 3D world of
biological objects, while biological processes require a 4D ontological
commitment. Biology is currently one of the most active areas of ontological
development. For example, in the DOLCE ontology a fundamental object “Person”
can be committed to as a type but is subsumed by two different concepts –
Organism and Causal_Agent. In turn, Organism can be conceived as a type as
well, but Causal Agent takes two directions. It can be a type, making Person
similar to other causal agents such as chemical catalysts. It can also be
understood as a role that a person plays. A Person is also a sub-type of
Individual, and some idea of the complexity of foundational ontology can be seen
in a portion of a taxonomy of a Common Upper Ontology from an effort to merge
several upper ontologies (e.g., DOLCE and BFO).
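As a very rough illustration of the subsumption relations sketched above, the Python classes below stand in for ontology concepts, with Person subsumed by both Organism and Causal_Agent; this is a simplification for illustration, not the actual DOLCE or BFO axiomatization:

    # Simplified stand-in for ontology subsumption using Python classes; this is
    # NOT the actual DOLCE/BFO axiomatization, only an illustration of the idea
    # that one concept can be subsumed by two different parents.

    class Entity: ...
    class Organism(Entity): ...
    class CausalAgent(Entity): ...            # could instead be modelled as a role
    class Person(Organism, CausalAgent): ...  # Person subsumed by both parents

    print([c.__name__ for c in Person.__mro__])
    # ['Person', 'Organism', 'CausalAgent', 'Entity', 'object']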

 

Michael Lissack, Institute for the Study of
Coherence and Emergence,
The Overlooked Role of Cues in
Design
Room 375
Saturday 11:00AM
Cues are not codes. They require different analysis and different
tolerances. Stories often work because of cues. Communication often fails
because of cues. Those who operate in the world of codes are less than tolerant
of the vagaries suggested by cues. Those vagaries are suggestive of
inconsistencies and incompleteness that bother the code people. By contrast,
those who are more comfortable in the world of cues are less bothered by the
assertions of the coders that there is such a thing as exact meaning and that
lookups are appropriate. In reality both groups make use of the conceptual
framework of the other, but the cuers are usually more explicit when making use
of codes, and the coders are usually more emotional (though not declaring it
themselves) when making use of cues. Design tools include infrastructure and
story. Code-based viewpoints suggest that only infrastructure matters and that
stories are ineffective. Cue-based viewpoints suggest that stories are of high effect. But
high effect does not translate to high accomplishment of intent. Intention is
as often overridden by situation as it is by bad choices. Indeed, there may be
a chicken and egg problem here. The selection of infrastructure may depend upon
the stories told around it and the selection of stories may be demanding that
the infrastructure be pre-selected first. Such is the overlooked role of cues
in design.

 

Stuart Umpleby, The George Washington
University,
Quadrant Diagrams, Levels of Conceptualization and Requisite
Variety
Room 375
Saturday 11:30AM
In the field of architecture students are taught to make many
sketches, to “think with your pencil.” For those interested in social change or
the “design of intellectual movements” how does one “think with one’s pencil”?
Quadrant diagrams are one answer. Quadrant diagrams are used in social science
fields to organize many observations on two dimensions. This paper will present
several examples of quadrant diagrams, illustrating the wide range of their
uses in describing social systems and how they change. Quadrant diagrams can be
used to depict the additional complexity that results from introducing an
additional dimension, for example Fukuyama’s diagrams showing concern with the
strength of government in addition to the scope of government (e.g., size of
welfare programs). Quadrant diagrams also offer a way to compare the evolution
of two or more countries, for example the choice of Russia to emphasize
democracy over markets in the early 1990s and the choice of China to expand
markets before permitting more democracy. In the philosophy of science, the
correspondence principle says that science grows by adding a new dimension. If
a second dimension is added, four possibilities are created. The utility of
quadrant diagrams is that they bring at least an appearance of order to a large
number of cases. A higher level of conceptualization improves the chances that
requisite variety will be achieved.
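
As a minimal illustration of the technique, the sketch below plots a handful of hypothetical cases on two dimensions and divides the plane into four quadrants; the case names, scores, and axis labels are invented for the example and are not taken from the paper.

```python
# Illustrative quadrant diagram: the cases and their scores are hypothetical.
import matplotlib.pyplot as plt

cases = {
    "Case A": (0.8, 0.3),   # (dimension 1, dimension 2)
    "Case B": (0.2, 0.7),
    "Case C": (0.6, 0.9),
    "Case D": (0.3, 0.2),
}

fig, ax = plt.subplots()
for name, (x, y) in cases.items():
    ax.scatter(x, y)
    ax.annotate(name, (x, y), textcoords="offset points", xytext=(5, 5))

# The two dividing lines create the four quadrants.
ax.axhline(0.5, linewidth=0.8)
ax.axvline(0.5, linewidth=0.8)
ax.set_xlabel("Dimension 1 (e.g. scope of government)")
ax.set_ylabel("Dimension 2 (e.g. strength of government)")
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
plt.show()
```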

 

Ted Goranson, Narrative and
Self-Organizing Systems
Room 375
Saturday 2:00PM
One can find mechanisms for self-organization in many disciplines
and situations, some apparently better than others. Usually, these posit three
elements, the components which organize, the system into which they tend and
the mechanisms themselves, ineluctable laws. We propose to look at them
literally as SELF organizing, with only two elements, the constituents and the
wholes toward which they tend. Moreover, we propose attempting a single
mechanism that applies at physical, chemical, biological and social levels as
examples of a universal approach. Mixed natural and synthetic organizations are
intended. In addition, we’d like to have the same mechanism applied to the
externalities of human understanding of natural systems and the internalities
of those systems. Core notions of semiotics have been bent to organizing
imperatives with some limited success. We propose a system using a similar
metaphoric framework: that of the urge to narrative. Constituents have an urge
to “narrative” and construct those via context with others in a collaborative
manner. While the concept is human-oriented in perspective, an attempt is made
here to devise a new set of formal abstractions at each “layer.” Detailed
scenarios will be handed out, but not discussed to save time. Identity is seen
as inherently category theoretic, including matters of self and urges as
functions. Aggregation as structure is seen as group theoretic, dimensional,
with typical group operators in play to produce the new “layers” (chemical to
biological for instance). The comprehension logic needs to be extended from “ordinary” first-order logic to situation theory. Some maps to other thinkers will be provided. This talk focuses on new abstractions for scientific theory, proposing an organizing notion of narrative that at the human level sees “narrative” in the ordinary way. It will be paired with a companion talk by Beth Cardier, which offers the complement from the perspective of human narrative.

 

Beth Cardier, Impulses Dragged into
Words
Room 375
Saturday 2:30PM
A single story can showcase a range of logical systems, depending
on your perspective – linguistic, cultural, psychological, genre. My current
writing project matches metaphors across disciplines, so my own perspective is
different again. When trying to find common patterns in incompatible fields,
conflicts of notation are always an obstacle, so I focus on the subterranean
logic instead. Identifying the invisible dynamics of narrative has since become
part of my storyline, as well as the tools used to build it. A story is a
multidimensional system made visible by two-dimensional notations (language).
Words are simply the visible nodes of this network, the most conspicuous edge
of an emergent structure. When I speak of structure I am not referring to
linguistics, but the constellations of elements that create meaning at an
emotional level. These constellations manifest as images, and are conveyed by
words. I look for the key points within them and identify the dynamics they
describe, because in this abstracted form they are easier to match across
fields. The most important aspect of this process is to know what you are
looking for. The internal patterns of a narrative system are not made from
rules, but from urges. Evidence of the emergent ‘urge’ of a story can be found
at every level – from characters and plot, to the structure of the text, and
ultimately down to my own impulses as a person (this is a common causal tree in
fiction, although it is usually spoken of differently). For my purposes, the
urge and the form are the same thing. The narrative’s structure is actually a
form of the impulse. The words simply indicate its specific shape, in order to
enable a resonance within a reader. In my novel, parallels are drawn between
the writerly battle to drag human impulses into words and the scientist’s
struggle to find similar information-carrying structures. Working with
Goranson, I am mapping narrative ecosystems in exchange for his mathematical
formalizations of a function-based, self-organizing world (as opposed to a
being-oriented mechanism). Our theme for this presentation, therefore, is:
“It’s narrative all the way down.”

 

Dail Doucette, Washington, D.C.,
Designing the “Science of Information Institute”
Room 375
Saturday 3:00PM
Information is a part of every scientific and academic discipline.
It is emerging as a trans-disciplinary area of current research and is
beginning to develop its own theoretical foundation concepts. There is
international interest and participation in this effort. This paper will
propose a process and structure to coordinate and correlate the existing
theoretical studies on information being done across the major scientific and
academic disciplines and to promote the Science of Information as a focused
field of study in itself.

 

Jerry LR Chandler, Krasnow Institute for
Advanced Study,
The Emergences of Designs of Synthetic Symbol
Systems
Room 375
Saturday 3:30PM
Human communication based on a grammar may use one of several
different symbol systems. Examples of systems designed to transmit messages
include, among many others, alphabets, mathematics, chemical structures,
musical scores, and dance. Apparently, the utility of each symbol system is to
preserve meaning across time and space such that communication is not dependent
on the simultaneous presence of all participants. Early cultures appear to have used
a single symbol system to convey informative messages. The emergence of
separate symbol systems from such a primitive system was dependent on the
genesis of new meanings and of new symbols. Grammatical symbol systems
necessitate specialized training in order to create the competence necessary to
encode and decode messages for transmission across time, place and space. The
design of a new symbol system does not require the design of a completely new
collection of structural symbols. A symbol may be used in two or more symbol
systems with different meanings by changing the grammar or positions of the
symbols. The symbol systems for alphabets, mathematics and chemical structures
often use the same symbols with different meanings, leading to substantial
confusion and potentially erroneous conclusions about the relations among the
natural sciences and between the natural sciences and society. In fact, the
chemical community designed a poly-modal grammar, a perplex number system, and
a subject-based existential logic to correspond with the existence of objects
in the external world. The commutative rules for relating the poly-modal
grammar of chemical symbol systems to alphabetic and mathematical grammars will
be briefly introduced. A priori, the illations of the chemical grammar emerged
reflexively within a self-referential framework. The perplex number system
encodes quantity such that compositions of structural codes become the basis
for genetic, anatomical, and biological codes. Subsequent to human development,
the perplex chemical compositions (synonymous with the structural hierarchy of
biological anatomy) become the generative source of the alphabetic and
mathematical codes that in turn become the generative sources of invariant
relations of both thermodynamics and quantum mechanics. The historical precedence, self-reflexivity, and self-referentiality of the perplex number system suggest that the abstractions of the chemical symbol system are a progenitor of other grammatical symbol systems.

 

Andrew Vogt, Axiomatics
Reconsidered
Room 375
Sunday 10:00AM
The speaker will review the progress of mathematical axiomatics
from Euclid to Gödel, and offer some ideas on the arc of that progress, on
the mysterious effectiveness of rule-based reasoning, and on the shortcomings
of this grand enterprise.

 

Robert M. Cutler, Ph.D. Senior Research
Fellow Institute of European and Russian Studies, Carleton University, Ottawa,
Organizational Design and Transformation in a Complex
World.
Room 370
Sunday 10:00AM
This paper sets out a framework of organizational design and
transformation that it designates the “paradox of intentional emergent
coherence” (PIEC). The example to which it turns for discussion is the modern
(political) state, although the framework is in fact applicable to any
organization or structured system, including evolutionary systems and not
excluding sentient beings. The paper unfolds through a brief introduction,
three main parts plus a conclusion. The introduction sets the stage for the
main argument by invoking the difference between first-order (“observed”) and
second-order (“observing”) cybernetic systems. With appropriate literature
references to the political and social sciences, it indicates the
significance of this for analysis of political and social systems. The first
main part of the paper sets out three very clear differences between the
epistemology of system-transformation from the standpoint of second-order
cybernetic systems on the one hand, and, on the other hand, from that of
first-order cybernetic systems. The latter perspective is typified by
Levi-Strauss’s structuralism, with special attention to Piaget’s well-known
exegesis of how that has been adapted into the political and social sciences.
The second main part of the paper reduces the messiness of the social-science
structural (-functional) approach to systems transformation, by using
complexity-science epistemology so as to simplify categorically the “conceptual
epicycles” required for sense-making. Since the specific example discussed is
the political executive of the modern state, the paper’s third main part
indicates intrinsic sources of sub-optimization of the use of information in
organizational decision-making. It relates these directly and continuously to
the epistemological categories of second-order cybernetic systems set out in
the introduction and first part of the paper. The conclusion to the paper draws
all these threads together, concisely summarizes their practical implications
and indicates which directions suggested by this approach are likely to be most
fruitful for further development.

 

John E. Gray Systems Research and Technology
Department, Naval Surface Warfare Center,
Strangeness in Proportion: A
Mathematical Approach to Design as an Inverse Problem
Room 375
Sunday 10:30AM
Creating a design, artistic or engineering, that comprises disparate elements of composition requires a fusion of the atomic elements that make it up. Thus, any design that has “atoms of composition” is a fusion of these elements into an overall design.
The method of fusion is based on the individual and group aesthetic that comes
from the artistic temperament. While it is very difficult to arrive at a mathematical criterion that helps us understand the process of composition, it is
possible to solve the inverse problem to understand the composition once it has
been completed. Each atomic unit can be regarded as a distinct element of the
composition that has an overall weight in the whole of the composition. Thus,
the design regarded as an inverse problem has mathematical elements based on
the weights of the individual elements that constitute the composition. Thus,
the composition is a fusion of individual weighted components that have two
properties: 1. The numerical weights of the individual atomic components are either positive or zero. 2. The sum of the weights can be
normalized, so the individual weights add to be equal to one. Since these two
properties are mathematically equivalent to a probability, they can be treated
as probabilities. Thus a design represents an instance or sample drawn from a
probabilistic space of possible compositions. Any assignment of probabilities
can always be viewed as the solution to a Bayesian Maximum Entropy problem.
Given that is the case, then we can ask the question: “What is the Maximum
Entropy problem that an assignment of design weights is the composition an
answer to?” Being able answer this question for various designs, would allow us
classify them in the same fashion. This would lead to an ordering of design
algorithms in terms of the complexity of the problem they solve. This is one
method to analyze a design mathematically. There is another. Note that the
fusion of weighted information from different sources is equivalent to creating a physical interaction system that has an underlying physics we can strive to understand. Thus a fusion of design elements, when looked at this way, is a physics model based on interpreting our weighting of the data from different sources as an underlying interaction model. This provides an
entirely new method to analyze designs from the inverse problem perspective and
might point to a new direct approach to design.
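
A minimal numerical sketch of the two stated properties, using hypothetical element names and raw weights: the weights are kept non-negative, normalized to sum to one, and the Shannon entropy of the resulting distribution is computed as one simple index for comparing compositions. The full Bayesian Maximum Entropy formulation is not reproduced here.

```python
# Illustrative only: the element names and raw weights are hypothetical.
import numpy as np

raw_weights = {
    "line": 3.0,
    "color": 1.5,
    "texture": 0.5,
    "negative_space": 1.0,
}

w = np.array(list(raw_weights.values()), dtype=float)
w = np.clip(w, 0.0, None)          # property 1: weights are positive or zero
p = w / w.sum()                    # property 2: normalized to sum to one

# Because p is a probability distribution, standard information measures apply.
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))

for name, prob in zip(raw_weights, p):
    print(f"{name:15s} {prob:.3f}")
print(f"Shannon entropy of the composition: {entropy:.3f} bits")
```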

 

William Wells, Comprehending Nature’s
Design: A Challenge Towards Discernment in the Role of Infinity
Room 370
Sunday 10:30AM
The challenge of communicating the active role of the design function attempts to mirror the Complexities of Nature's Design Laws. In particular, the infrastructure of postmodern urban design suggests that we are in danger of creating habitat space as museum plazas to be occupied by a life form of the elite rather than by and for the people. The challenge to radically develop designs cognizant of human security will require a different financial and management style. In general, an emerging spiritual consciousness for the evolutionary development of man's future will recognize why the flow of humanity is a function of determinism with chance. Ignoring the risks of ecological impacts on populations by political consensus challenges infinite hyper-real marketing. Poor communication patterns only bewilder and muddle the urgency of comprehending the nature of life at scalar levels inherent to secure but potentially volatile political systems.

 

Gary G. Nelson, Systems Engineer, and Stanley
Salthe
Organizational Evolution, Life Cycle Program Design: Essential
Issues in Systems Engineering and Acquisition of Complex
Systems
Room 375
Sunday 11:00AM
A society of purposeful agents (humans and now their artificial symbionts) self-organizes into partitions of activity (nations, corporations,
agencies, services) and a scale hierarchy of governance. This complex-adaptive
society is properly evolutionary. Such a society designs “projects” that create
purposeful artifacts with a developmental life cycle. As the projects become
more and more ambitious, the real and conceptual boundary between the
evolutionary and the developmental, between the project and the society, blurs.
Over just the last seven decades, this growth in “designed complexity” has
spawned the design of formal processes of systems engineering, concurrent with
formal processes of acquisition (allocation of and accountability for
significant social resources). The most interesting examples of such projects
are the information systems for decision support that become the social linkage
of indefinite extent, and the very means of collective design and acquisition
of projects. So, we have two essential problems: 1) The blurring of designing
subject and designed object, and; 2) A recursive relation between the object
and the designing process. Empirically (speaking from experience with major
federal programs), there is a vast conceptual confusion between evolution and
development (including the design phase) in this case. This is reflected in
persistent problems with the formal engineering and acquisition processes. Some
principles of complex systems, with special reference to decision support
systems, are articulated to identify and ameliorate the very practical problems
encountered in the life-cycle development of complex designs by complex
organizations.

 

Jerry Zhu, Ph.D., Unified Content
Methodology, A living systems approach for better, faster and cheaper
software.
Room 370
Sunday 11:00AM
Cancelled projects and maintenance cost well over tens of billions of dollars yearly in the U.S. alone. Lack of sound practice, poor requirements, poor quality control, and lack of project management tool suites, among other factors, are commonly considered sources of software failure. However, current efforts to eliminate sources of software failure are not only impractical because of their high cost but also elusive; for instance, can we say positively that the requirements are complete and immutable? Statistics have shown that the larger the software, the higher its failure rate. This paper claims that the source of failure is the deficiency of the methodology itself, not a failure to perfect its use. Today's prevailing software methodology, Unified Process, along with its variations such as JAD and Agile, is not sufficient, owing to intrinsic deficiencies. Innovation in the design of software methodology, rather than in tools and standards, is the best and only way to overcome software failure and to develop faster, better, and cheaper software. Unified Content, as opposed to UP, is proposed in the paper. Rather than viewing software development as a series of practices and the normalization of their organization, as seen in UP, the new methodology views software development as producing correct and complete artifacts along the shortest path. The design of
the new methodology is the content structure organizing the flow of process as
opposed to the process structure organizing the flow of content. The content
structure is modeled as five levels of artifacts: business, requirements,
architectures, designs, and software. The creation of artifacts at each level
is guided by a set of principles. One important principle applicable to all levels is the open/closed principle: artifacts are open to extension and closed to modification. The governing law of the new design is the law of economy: strive to reduce the length of the process to the smallest number of steps. Rather than being iterative and incremental, delivering software one portion at a time in a piecemeal approach, Unified Content applies a living-systems approach, producing and testing the content structure one level at a time in its entirety. The iterative and incremental approach in producing and testing
artifacts is applicable only between two levels with lower-level artifacts as
“requirement” and higher-level artifacts as “product”. The completed “product”
will become new “requirement” for still higher-level artifacts until the
completion of the entire content structure. The paper claims that the new methodology will eliminate resources spent on rework, with significant improvement in both quality and speed over UP. It holds great potential to eliminate software failures and to significantly reduce not only development but also maintenance costs.
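
A minimal sketch of the open/closed principle the abstract invokes, with illustrative artifact classes that are not taken from Unified Content: new artifact kinds are added by extension (new subclasses) rather than by modifying existing code.

```python
# Illustrative sketch of the open/closed principle: new artifact kinds are
# added by extension, not by modifying existing code. Names are hypothetical.
from abc import ABC, abstractmethod

class Artifact(ABC):
    """A completed artifact at one level of the content structure."""

    @abstractmethod
    def validate(self) -> bool:
        """Check the artifact against its lower-level 'requirement'."""

class RequirementsArtifact(Artifact):
    def __init__(self, statements: list[str]):
        self.statements = statements

    def validate(self) -> bool:
        return all(s.strip() for s in self.statements)

class ArchitectureArtifact(Artifact):
    def __init__(self, components: list[str]):
        self.components = components

    def validate(self) -> bool:
        return len(self.components) > 0

def complete_level(artifacts: list[Artifact]) -> bool:
    # This function never needs to change when a new Artifact subtype is added.
    return all(a.validate() for a in artifacts)

if __name__ == "__main__":
    level = [RequirementsArtifact(["store user data"]),
             ArchitectureArtifact(["db", "api"])]
    print(complete_level(level))
```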

 

Ely Dorsey, Department of Physics, Quincy
College, Quincy, Massachusetts,
On the Completion of Quantum Mechanics
as a Theory of Probability
Room 375
Sunday 11:30AM
We pose that Quantum Mechanics is a complete theory and as such is
closed to further interpretation as a theory of probability. Using the
classical EPR paradox and the John Bell theorems we show that this classical
debate closed Quantum Mechanics. Causality is examined and we pose a quantum
epistemology, logical realism, as a theory of design reflecting on these
results. (This paper is a continuation of the work presented at the 2005
Annual Meeting of the American Society for Cybernetics at George Washington
University, Washington, DC. That presentation was entitled, “Is Quantum
Epistemology Epistemic?”)
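
For orientation, a standard statement of the Bell result the abstract relies on (the CHSH form, not the speaker's own formulation) is:

```latex
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad
|S| \le 2 \ \text{(local hidden variables)}, \qquad
|S| \le 2\sqrt{2} \ \text{(quantum mechanics, Tsirelson bound)}.
```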

 

Horace Fairlamb, University of
Houston-Victoria,
Justice as an Emergent System of
Constraints
Room 370
Sunday 11:30AM
1. The ideal of justice involves coordinating collective and
individual interests and teleologies. 2. Biology provides only a limited
starting point for theories of justice if we take biology to mean spontaneous
or teleonomic systems of design as distinguished from teleological designs.
Still, there are several ways in which biology continues to exert a powerful
influence over theories of justice. 3. Primarily, human biology suggests a
unique orientation toward maximum individual autonomy, at least potentially,
through innate abilities to learn, symbolize, and think strategically. This
suggests that individual autonomy is the biological telos of human nature. 4.
Classical Greek theories of justice erred in making society rather than the
individual the telos of human development. 5. Biological systems provide the
main prototype of systems of ethics and justice insofar as equilibrium systems
anticipate theories of prudential ethics and cost/benefit models of right
action. 6. The lawful physical or informational constraints on biological
systems provide the prototypes of rights systems that place categorical or
apriori limits on collective action. 7. A system adequate to preserve
individual freedom within collective actions must be structured by four kinds
of principles: (a) rights constraints on action; (b) collective policies
seeking maximum social welfare; (c) the inheritance of cultural values and
meanings; and (d) the individual’s construction of a life project.

 

Lawrence de Bivort, Ph.D.,
Evolutionary Incubation of Terrorist Groups
Room 375
Sunday 2:00PM
Many counter-terrorist strategies and tactics have had the
inadvertent effect of creating evolutionary environments for terrorist groups
that favor their rapid learning, evolving operational effectiveness, and
growth. Near-perfect evolutionary incubators for terrorist groups have been
created by their counter-terrorist foes in many areas, including Israel and
Palestine, Sri Lanka, Iraq, and Kashmir. This presentation will contrast the
evolutionary characteristics of terrorist groups with those of
counter-terrorist organizations. I will suggest that the terrorist groups have
superior evolutionary characteristics and, resource-for-resource, are winning
the contest between the two. Essential characteristics include: performance
pressure, timely learning, decision-making and implementation speed, goal
flexibility, alliances and net-works, and loss and recruitment rates. This
presentation is based on primary sources and discussions with both groups.

 

Anne Washington, School of Business George
Washington University,
Software Interfaces: Designing for Continuous
Change
Room 370
Sunday 2:00PM
Design principles provide continuity to software interfaces that
are under constant change. The computer software interface is a window into
what is stored on the computer. The interface facilitates a series of
interactions between a person and a computer. Computer interface design is
inherently open to modification. Changes in computing technology bring changes
in interface design. Each version of computer software brings an opportunity or
a need for a new design. Interface designs are created for a singular moment
when certain hardware and software are available. In the next moment, there
will be new advances that will lead to the creation of another design. Because
of the constantly shifting landscape, design principles become a vital point of
stability. Otherwise, every upgrade of Microsoft Word would send millions of
people scrambling to relearn how to open a file. Traditional design principles
can be also applied to computer software interfaces. Architecture provides
examples of understanding the impact of navigation and environment. Systems
theory and cybernetics provides a philosophical basis for designing concurrent
systems and subsystems. I will discuss computer software interface design and
its relationship to principles from architecture and systems theory.

 

John J. Maher, Arlington, VA, CILLI:
(Correctness Intrusively Localizing Legitimate Inquiry) & Emergence of
Design. (See Glossary at end)
Room 375
Sunday 2:30PM
Design, for humans, is remedial. Some problem needs to be
remedied. Correctness limits (localizes, establishes boundaries within which a
situation can be seen as problematic, and what can be seen as an acceptable
resolution). Correctness localizes what can be designed and how it can be
designed. This localizing is so restrictive that optimization is precluded in
favor of some situationally satisficing (Simon) termination of an emergency
[some bad escape valve (Okun)]. By cognitive processes that are known to
science but unknown to or disregarded by most Americans, even many with good
educations, this bad escape valve (Okun), is transmogrified (transformed by a
strange (poorly understood) process) into a correct principle, value, or
ritual. Asked if he was not discouraged by how much people did not know, Daniel Boorstin, retiring as Librarian of Congress, replied that he was more concerned about what people do know. Thorstein Veblen wrote of “trained incapacity”, or, as Kenneth Burke rephrased it, “A way of seeing is a way of not seeing”. Often
attributed to ill will or crass ignorance (“there are none so blind as those
who will not see”), this negative impact of knowledge is every bit as
“natural”, as biologically inherent, as are the beneficial impacts of
knowledge. “Johnny can’t read (or do math)” is often cited as emblematic of
(and synonymous with) the failure of education in America, whereas failure at
that level is hardly the tip of the iceberg of our educational failure. I
assert and attempt to demonstrate that fundamentalism, not exclusively religious, is the principal fruit, as well as a principal cause, of pervasive failure from cradle to grave, exempting no cadre, including the scientific
community. Discussed: “is government ‘by the people & for the people’ an
oxymoron?”; “No one in his right mind, trying to design a machine for taking in evidence and ‘meting out justice’ in a civil or criminal law case, would ‘wire’ it the way the human brain is wired.” “When anthropologists discovered chimps
forming marauding parties to punish defectors, why didn’t they attribute that
behavior to ‘the devil’?” Was Darwin, in amplifying “Descent with Modification”
to “Variation and (adaptive) Natural Selection”, a victim of correctness, of
transmogrification? “Can religious correctness so localize inquiry that design
emerges from design-oids (Dawkins)?” GLOSSARY: Correctness: Unquestionable
application of a norm, not because of any demonstrable net benefit associated
with it, but because it is historico-situationally self-evident. Localizing
Legitimate Inquiry: limiting the field of inquiry to that within which a set of
correct norms is seen as self-evident.

 

David Abel, Life Origin: The Role of
Complexity at the Edge of Chaos
Room 370
Sunday 2:30PM
The computational genetic algorithms that organized life not only
predated humans, they produced our very brains and minds. Biological
cybernetics cannot be reduced to human knowledge, psychology, and epistemology.
Life origin is historically ontological and objective. Both Turing and von
Neumann got their computing ideas directly from linear digital genetics and
molecular biology. The study of life origin provides our best hope of
elucidating the emergence of Dawkins’ “the appearance of design” in a natural
world. Life-origin models frequently point to “complexity” in explanation of
self-organization. What exactly is “complexity”? By what mechanism does
complexity generate genetic algorithmic control? Rapid-fire succession of
Prigogine’s momentary dissipative structures can generate sustained
physicodynamic structure out of apparent randomness. This lecture examines the
role of such self-ordering phenomena at Stuart Kauffman’s “edge of chaos” in
the emergence of genetic instruction. The roles of fractals, quantum factors, and yet-to-be-discovered “laws of organization” are discussed. Self-ordering
phenomena are contrasted with self-organization. In modeling the derivation of
cooperative integration of biochemical pathways and cycles into holistic metabolism, we examine: the linear digital nature of genetic instruction and its inheritance; physicodynamic base-pairing vs. dynamically inert sequencing; messenger-molecule “meaning” (binding function in a metabolic context); the encryption/decryption of coded genetic instructions; the many-to-one Hamming “block coding” used in triplet-nucleotide ordering of each amino acid to reduce noise pollution of the signal; the essential requirements of any sign/symbol system; the problem of representationalism in a purely physical, “natural” reality; and infodynamics vs. cybernetic prescription/instruction. Are
three-dimensional protein conformations sufficient to explain life? The
nucleotide selection in polymerizing primordial RNA strings is rigidly
(covalently) bound before weak hydrogen-bond folding begins. The sequence of
ribonucleotides determines secondary and tertiary structure via minimum free
energy sinks. Each nucleotide selection corresponds to the setting of a
four-way configurable switch. How were these switches programmed ahead of time
so as to achieve computational halting (biofunction) only after secondary
folding of the string? Can genes be programmed by chance given large enough
periods of time? We examine the theories for how chance and/or necessity may
have generated utilitarian selections at successive decision nodes. Constraints
are contrasted with controls. Laws are contrasted with rules. Do prions and
examples of epigenetic inheritance invalidate information analogies in biology?
What about the fact that any nucleotide “works” at many sites? Life origin
studies provide the most pristine, parsimonious, elegant models of the origin
of “apparent design.”
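
As a minimal illustration of the many-to-one “block coding” mentioned above, the sketch below uses a small, deliberately partial excerpt of the standard genetic code to show that most third-position substitutions in a codon leave the encoded amino acid unchanged, which is one way the code tolerates noise.

```python
# Illustrative, partial excerpt of the standard genetic code (RNA codons).
CODON_TO_AA = {
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    "GCU": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",
    "CCU": "Pro", "CCC": "Pro", "CCA": "Pro", "CCG": "Pro",
    "GUU": "Val", "GUC": "Val", "GUA": "Val", "GUG": "Val",
    "AUG": "Met",
}

def third_position_mutants(codon: str) -> list[str]:
    """All codons differing from `codon` only in the third (wobble) position."""
    return [codon[:2] + base for base in "ACGU" if base != codon[2]]

if __name__ == "__main__":
    codon = "GGU"
    original = CODON_TO_AA[codon]
    # Many-to-one coding: these third-position substitutions are silent.
    for mutant in third_position_mutants(codon):
        aa = CODON_TO_AA.get(mutant, "?")
        print(codon, "->", mutant, ":", aa,
              "(silent)" if aa == original else "")
```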

 

Frederick David Abraham Blueberry Brain
Institute (USA) & Silliman University (Philippines),
Cyborgs,
Cyberspace, Cybersexuality and the Evolution of Human Nature.
Room 375
Sunday 3:00PM
Human nature resides at the fractal imbrications of the individual
and culture, which are evolving in some very rapid ways. Advances in science
and technology drive much of this evolution. Some of these advances are in
computer systems (cyberspace); some are in the hybridization of the human body
with robotics (cyborgs); and some are in communications, artificial
intelligence, cloning, genetic manipulation, stem-cell ontogenetic manipulation
(which can now eliminate the male from reproductive participation),
pharmaceutical and molecular manipulation, nanotechnology, and so on. This
evolution influences the programs of emancipation suggested by postmodern
social theory and philosophical hermeneutics. Cybersexuality (a philosophical, literary, scientific genre) provides a prime example. This evolution also
involves some very fundamental human motivations. For example, the desire to
optimize knowledge and stability, to know our origins and destinies, our
meaning; the ontological-existential quests. The quests for truth and for
stability are at once two sides of the same tapestry, sometimes in competition
with each other, and sometimes synergistic, but always interactive, playing in
the same attractors. Creativity lies in exploring where and how to weave within
these fractal imbrications. And creativity requires instability. How does the
tension between the need for stability and instability resolve itself? Or put
another way, why does stability require instability? Herein is some commentary on this evolution within the context of a thread of literary and philosophical work that Jenny Wolmark calls Cybersexualities.

 

Dragan Tevdovski and Stuart Umpleby, A
Method for Designing Improvements in Organizations, Products, and
Services
Room 370
Sunday 3:00PM
A Quality Improvement Priority Matrix (QIPM) may be used for
identifying priorities for improving an organization, a product, or a service.
This paper reports on the use of the QIPM method by the members of the
Department of Management Science at The George Washington University and the
Department of Management at Kazan State University in Kazan, Russia in the year
2002. Features of a Department, such as salaries, teaching assistants, computer
hardware, etc. (a total of 52 features), were evaluated on the scales of
importance and performance. We also computed the importance/performance ratio
(IPR) and clustered the items by their IPR scores into four groups – urgent,
high priority, medium priority and low priority. Recent research has
significantly improved the method as a way of determining priorities,
monitoring progress, identifying consensus or disagreement, and comparing two
organizations. This paper discusses additional statistical improvements and
ways of presenting the results of statistical analysis. The QIPM method is a
way of achieving agreement among a group of people on the most important
actions to be taken.
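
A minimal sketch of the IPR computation, with hypothetical feature names and scores; the four priority labels follow the abstract, but the quartile cut-offs used here are an assumption rather than the published procedure.

```python
# Illustrative only: feature names, scores, and cut-offs are hypothetical.
import numpy as np

features = {
    # feature: (importance, performance), e.g. on 1-10 scales
    "salaries": (9.0, 4.0),
    "teaching assistants": (7.0, 6.0),
    "computer hardware": (6.0, 7.0),
    "office space": (5.0, 5.0),
}

# Importance/performance ratio: high importance and low performance => high IPR.
ipr = {name: imp / perf for name, (imp, perf) in features.items()}
cuts = np.quantile(list(ipr.values()), [0.25, 0.5, 0.75])

def priority(r: float) -> str:
    if r > cuts[2]:
        return "urgent"
    if r > cuts[1]:
        return "high priority"
    if r > cuts[0]:
        return "medium priority"
    return "low priority"

for name, r in sorted(ipr.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} IPR={r:.2f}  {priority(r)}")
```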

 

Andy Schneider-Munoz, Academy of Child and
Youth Care,
The Emergence of Civic Structures In The World of Domestic
Non-Profits and Global Non-Governmental Organizations.
Room 375
Sunday 3:30PM
This presentation uses the theories of non-linear system dynamics
to examine the emergence of the structures that operate large domestic
non-profits and global non-governmental organizations. Events like 9/11 and
Katrina have served as strange attractors for the large upsweep of young people
willing to give volunteer service in response to these extreme situations.
Rather than becoming cynical or afraid, the youth are more hopeful and desire a
level of new engagement in youth service organizations and in the democracy as
a whole. Many non-profits and NGOs have been unprepared to provide such
intensified opportunities in the chaos and in fact have rigidly maintained outmoded standard operating procedures, at the expense of making large-scale social change. In fact, large-scale social sector research, and the resulting organizational implementation in non-profits and NGOs, remains an artifact of
the operating platforms that were established during the War on Poverty Years.
These frameworks for gathering data and recommending action fail to address the
“unit of analysis” that exists today driven by the complexities and diversities
of cultures and socio-economics. Models for understanding social problems as
addressed by the services of non-profits and NGOs are locked in the old
coherence of one race compared to another or rich compared to poor when the
reality of every day life engenders boundary-crossing; trans-migration within
urban areas, bicultural experience, and so forth. For example while we could
once think of the poor Hispanic in relationship to the rich suburban white,
there are more than sixteen Hispanic American experiences living across many
micro-climates including cross-state and cross-border and interwoven in the
complexity of suburban, urban and rural environments. In these situations
strange to the old platforms, the events of every day life cannot be controlled
but immediate response for the youth to help one another across cultural and
economic boundaries provides fractals of structure within the chaos, fairly
stable relational structures that entrain within the unstable environment to
provide youth opportunities and experiences in which youth reclaim a sense of
identity and control over their most immediate environment through the
developmental activities that are generated. Operating at the nexus of
non-profit and business, this paper will utilize the organizational history and
data from City Year, a large non-profit and NGO in which 1,200 youth serve
89,000 children in 16 American cities and South Africa through a full time year
of peace-corps like volunteer service focused on community building, mentoring
and tutoring. Examples will be drawn both from 9/11 and from the most current
work in the Katrina relief effort with the large number of children under
twelve years old still living in youth shelters and emergency housing. City
Year operates in new and emergent organizational design called an action tank,
which merges the policy research of an institute with the action research of
making social change and transformation in the field. The action tank is
supported by a core of interdisciplinary researchers and practitioners who
utilize non-linear dynamic systems to apply solutions to social problems. These
solutions are generated in a knowledge core that incorporates the ability to
gather data, transfer knowledge, and diffuse best case practice strategies
across the nation’s social infrastructure including transformation in the
change over time of the civic skills of the youth through the ecology of youth
networks, that configure and reconfigure as affinity groups, as well as, at the
organizational and national levels through large scale service events that draw
thousands of people together to take action by serving the nation. The
intervention is not only entrepreneurial but inclusive, every youth adult team
doing service for the nation through City Year is constructed to reflect the
diversities of the nation in 2050. City Year’s knowledge core, called Research
and Systematic Learning, is grounded not only in non-linear dynamics but propelled by organic frameworks: the organization differentiates and then integrates much like a biological organism, and it patterns directly on some of the same pathway analysis as neural networks as they frame ecological interactions. The presentation will also observe the phenomenon in its history
of having reversed from an unstable organization in a stable strategic
environment to maintaining and sustaining stability in an unstable strategic
environment by consistently configuring and reconfiguring, therefore producing reverse model drift: the core operates much faster than the parts, but the parts at the furthest reach are the most stable. Typically in a large organization, the headquarters has the greatest cohesion and structure while the constituent sites that are furthest removed least represent the culture. In this case, formal and informal feedback systems, which configure and reconfigure around affinity groups in a large organization, have resulted in a rapidly spinning center with great stability in the implementation at the furthest reaches. This paper
synthesizes developmental frameworks such as attachment theory, biological
models such as theory of the development of organisms, and applies these
constructs to organizational thinking as new models emerge for the leadership and implementation of non-profits and NGOs, grounded in complexity as the lens which drives growth.

 

Richard Evans George Mason University
Department of Computer Science,
Design Design: The Design and Designing
of Systems
Room 370
Sunday 3:30PM
Designs originate in the mind as a mental image. The system design
task is to perfect that image. While design can be greatly assisted by
machines, it is seen as a wholly human activity. People are the only source of
ideas and are conduits of ideas rather than containers. Ideas somehow flow
through us, although it is admittedly not at all clear where the ideas, the
mental images, come from or how they enter. Given humans as pipes and wires
through whom ideas flow, a focus in design is on enabling, refining, and
maturing that flow of ideas. Designing is seen as a holistic set of concurrent
design decisions among alternatives that emerge from within whole spheres of
interactive, interrelated, and interdependent perspectives; including
perspectives about the designing activity itself: an overall self-design that
is called Design Design. A central Design Design idea is the aim of
comprehensive consideration of viable alternatives on all applicable
perspectives. To enable that aim, one of the ideas in Design Design is to
recognize that every design decision exists in a sphere of potential
perspectives. To address a sphere of multiple possible perspectives in some
tractable way, it is accepted that any sphere can at least be “represented” by
three orthogonal elements, or, in the case of a sphere of perspectives, by
three orthogonal perspectives.
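
One hedged way to unpack that closing claim, not spelled out in the abstract, is the elementary fact that any direction on a sphere is fixed by its projections onto three orthonormal reference directions:

```latex
\mathbf{u} \;=\; (\mathbf{u}\cdot\mathbf{e}_1)\,\mathbf{e}_1
          \;+\; (\mathbf{u}\cdot\mathbf{e}_2)\,\mathbf{e}_2
          \;+\; (\mathbf{u}\cdot\mathbf{e}_3)\,\mathbf{e}_3,
\qquad \|\mathbf{u}\| = 1 .
```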