Capital Science 2004
The first Cap Sci was held on the weekend of March 20-21, 2004 at the National Science Foundation in Arlington, Virginia. Following are abstracts of the presentations. Please contact the authors for more details about these excellent presentations.
Abstracts
A Paradigm Shift? by Dr. Harold Morowitz, George Mason University
Sunday 4:00PM
The recent discovery from DNA sequences that genes are rather promiscuously horizontally transferred among prokaryotes refocuses the already difficult question of what we mean by a “species” within this taxon. This species problem may extend to eukaryotic taxa. We are simultaneously discovering that the core of the chart of intermediary metabolism is very old, ubiquitous and extremely robust to change. This results in a phenotype (the chart of intermediary metabolism) that is stable against a fluctuating, noisy genetic background. The stability forces a reexamination of the central dogma of molecular biology and raises the question of which way the information flows. This possible paradigm shift has implications for evolutionary theory, biogenesis, Lamarckism and our general philosophic view of biology.
Engineering is for Children and Not Just for Adults by Dr. Leigh Abts, Deputy Director of Outreach, Johns Hopkins University, Whiting School of Engineering
Sunday 9:00AM
Engineering is a bridge for pre-college students to play and tinker with mathematics and science concepts. Experiences that allow pre-college students to design and build projects that illustrate principles learned in a classroom can strengthen a child’s understanding of abstract concepts. However, before children can learn, their teachers need to understand the connections between engineering and the mathematics and science learning goals that are taught in classrooms. The National Science Foundation has funded programs at higher education institutions, such as Johns Hopkins University, to support in-service programs for teachers that focus on engineering and research. This program couples research faculty with secondary school educators. These educators become engaged in real research and engineering experiences that can be applied to classrooms. The presentation will describe the JHU program, its partnership with AAAS, and practices transferred into classrooms across the country.
ACOUSTICAL SOCIETY OF AMERICA, WASHINGTON CHAPTER
Underwater Sound from the Whale’s Point of View by Paul Arveson, The Balanced Scorecard Institute
Saturday 9:00AM
There have been numerous reports in the recent literature of apparently stressful effects on marine mammals due to sonar experiments. But another man-made source — the radiated noise from ships — contributes significantly to ocean ambient noise, nearly everywhere and all the time. The technical basis for this talk is a set of accurate and detailed measurements of the radiated noise of a typical cargo ship [P. Arveson and D. Vendittis, “Radiated Noise Characteristics of a Large Cargo Ship,” J. Acoust. Soc. Am., Jan. 2000]. However, the talk will be a popular-level demonstration and a (necessarily) fictitious narrative of acoustical experiences from a humpback whale’s point of view. Room acoustics permitting, the audience should be able to gain an experiential insight into the environmental impact of shipping noise on the life and habits of these creatures.
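A rough quantitative feel for how far ship noise carries can be had from the standard passive-sonar spreading law, RL = SL - 20 log10(r). The sketch below is illustrative only; the source and ambient levels are assumed round numbers, not the measured values reported by Arveson and Vendittis.

```python
import math

# Illustrative use of the spherical-spreading law to see how far a
# ship's radiated noise stays above a quiet-ocean ambient level.
# Both levels below are assumptions for demonstration.
SOURCE_LEVEL_DB = 180.0   # broadband source level, dB re 1 uPa @ 1 m (assumed)
AMBIENT_DB = 80.0         # quiet deep-ocean ambient in the band (assumed)

def received_level(range_m, spreading=20.0):
    """RL = SL - N*log10(r); N = 20 corresponds to spherical spreading."""
    return SOURCE_LEVEL_DB - spreading * math.log10(range_m)

for r in (100, 1_000, 10_000, 100_000):
    rl = received_level(r)
    print(f"{r:>7} m: {rl:5.1f} dB (ambient + {rl - AMBIENT_DB:.0f} dB)")
```

Under these assumed numbers the ship remains audible above the quiet ambient out to roughly 100 km, which is the intuition behind the claim that shipping noise raises the ambient "nearly everywhere and all the time."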
Nuclear Explosion Monitoring: An Overview with a Focus on Acoustics by Zachary Upton, BBN Technologies
Saturday 10:00AM
The Comprehensive Nuclear-Test-Ban Treaty was adopted by the United Nations on September 10, 1996. The treaty bans all nuclear weapons testing by its member countries. Statistics of signature and ratification include:
Once the Comprehensive Nuclear-Test-Ban Treaty
AMERICAN ASSOCIATION OF PHYSICS TEACHERS, CHESAPEAKE SECTION
Goals of Physics Education for Non-Majors by Dr. Robert Ehrlich, George Mason University
Saturday 9:00AM
We have surveyed three groups on what they believe the appropriate goals should be for a college-level (algebra-based) physics course for nonmajors. These groups include: (a) physics faculty at George Mason (a more or less conventional department), (b) a group of physics faculty who make use of physics education reform curricular materials, and (c) the students taking algebra-based physics at George Mason. We compare the results of the goals survey for these three groups and various subgroups.
The World Year of Physics 2005: An Opportunity to Increase the Public’s Awareness and Appreciation of Physics by Dr. Warren W. Hein, American Association of Physics Teachers
Saturday 9:35AM
The year 2005 marks the 100th anniversary of Albert Einstein’s “miraculous year,” in which he published three important papers describing ideas that have since influenced all of modern physics. This year provides the opportunity to celebrate Einstein, his great ideas, and his influence on life in the 21st century. The World Year of Physics (WYP 2005) is a worldwide celebration of physics and its importance in our everyday lives. Physics not only plays an important role in the development of science and technology but also has a tremendous impact on our society. The goal of the WYP is to raise public awareness of physics and physical science, and this presentation will discuss ways in which everyone in the physics community can help achieve this goal.
Improving Teacher Preparation, Society Style by John W. Layman, University of Maryland
Saturday 10:10AM
PhysTEC (Physics Teacher Education Coalition) is an NSF- and FIPSE-sponsored collaborative effort of the American Physical Society, the American Association of Physics Teachers, and the American Institute of Physics to engage physics departments, in collaboration with their colleagues in education, in creating more and better-prepared K-12 science teachers. We are in our third year of operation, and a report of our progress and aspirations will be provided.
Science and Technology in James Madison’s World by Dr. William H. Ingham, James Madison University
Saturday 10:45AM
Many scholars have rightly examined and celebrated the scientific interests and activities of Franklin and Jefferson. In this presentation, we examine James Madison’s scientific and technological interests during his long life. By doing so, we hope to illuminate the role and progress of science in a young and rapidly expanding nation.
Model of the 2003 Tour de France by John Eric Goff, Lynchburg College
Saturday 11:20AM
Working with Lynchburg College senior Benjamin Hannas, we modeled the 2003 Tour de France bicycle race using stage profile data for which elevations at various points in each stage are known. Each stage was modeled as a series of inclined planes, and we accounted for aerodynamic drag and rolling resistance on the bicycle-rider combination. Our calculated total of the stage-winning times differed from the actual total by just 0.03%.
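The flavor of such a model can be captured in a short sketch: each segment of a stage is treated as an inclined plane, and the rider's speed on it follows from a power balance against gravity, rolling resistance, and aerodynamic drag. Every parameter value below is an illustrative assumption, not a value from the Goff and Hannas study.

```python
import math

# Illustrative inclined-plane stage model; all constants are assumed.
RHO = 1.2      # air density, kg/m^3
CDA = 0.35     # drag area Cd*A, m^2
CRR = 0.004    # rolling-resistance coefficient
MASS = 78.0    # bicycle + rider, kg
G = 9.81       # gravitational acceleration, m/s^2
POWER = 400.0  # assumed steady rider power, W

def segment_speed(grade):
    """Solve P = m*g*(sin t + Crr*cos t)*v + 0.5*rho*CdA*v^3 for v by bisection."""
    theta = math.atan(grade)
    resist = MASS * G * (math.sin(theta) + CRR * math.cos(theta))
    lo, hi = 0.01, 40.0
    for _ in range(60):
        v = 0.5 * (lo + hi)
        if resist * v + 0.5 * RHO * CDA * v**3 > POWER:
            hi = v
        else:
            lo = v
    return v

def stage_time(profile):
    """profile: list of (horizontal_distance_m, elevation_gain_m) segments."""
    total = 0.0
    for dist, climb in profile:
        grade = climb / dist
        length = math.hypot(dist, climb)  # distance along the incline
        total += length / segment_speed(grade)
    return total

# Example: a 10 km flat run followed by a 5 km climb at an 8% grade.
hours = stage_time([(10_000, 0.0), (5_000, 400.0)]) / 3600
print(f"{hours:.2f} hours")
```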
Scanning Probe Microscopy in the Undergraduate Curriculum: Nanolithography by David Schaefer, Towson University and Brian Augustine, James Madison University
Saturday 2:00PM
Scanning probe microscopy has become an established tool for performing nanoscale research in industry and academia. These instruments have also proven to be excellent tools for undergraduate instruction. This presentation will focus on the use of scanning probe microscopy for nanolithography studies and their applications.
Barcode Reader by James O’Connell, Frederick Community College
Saturday 2:30PM
The Universal Product Code (UPC) printed on containers and packages of commercial products is a barcode identifying the contents. The code is represented by black and white stripes of varying widths. When the bars are scanned with a laser beam, the code is translated into a serial number that is used to look up relevant information about the product, such as its description and price. This talk describes a demonstration with physics laboratory equipment that mimics a commercial laser UPC reader and illustrates the application of simple physics to a modern technology used in everyday life.
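One well-documented piece of the UPC scheme is its mod-10 check digit, which lets a reader catch most misreads; the short sketch below computes it for the standard 12-digit UPC-A format.

```python
def upc_check_digit(digits11):
    """Compute the UPC-A check digit for the first 11 digits.

    Odd-position digits (1st, 3rd, ...) are weighted by 3 and
    even-position digits by 1; the check digit brings the total
    to a multiple of 10.
    """
    odd = sum(int(d) for d in digits11[0::2])
    even = sum(int(d) for d in digits11[1::2])
    return (10 - (3 * odd + even) % 10) % 10

# "03600029145" is a commonly cited example; its check digit is 2.
print(upc_check_digit("03600029145"))  # -> 2
```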
Scientific Reasoning Competency Assessment in Higher Education in Virginia by Harold Geller, George Mason University
Saturday 3:10PM
In 2002 the State Council of Higher Education for Virginia (SCHEV) requested all state institutions of higher education to submit institutional plans for assessing the scientific and quantitative reasoning competency of students. These institutional plans or proposals must include the institution’s definition of competency and a description of how the institution will assess it. I will address the approach being taken at George Mason University, including the primary goals in the definition of scientific reasoning competency, and the implementation plans being considered to meet the SCHEV requirements.
Iron Artifact Conservation: Science on the Cheap by Rhett Herman, Radford University
Saturday 3:45PM
Radford University has worked with the town of Saltville, Virginia, to help understand and publicize its role in US history, including the Civil War. As a major producer of salt in the 1700s and 1800s, Saltville built a number of ‘saltworks’ to remove salt from the high-salt-content groundwater found in certain locations in the area. Much of this involved large iron vats and other iron artifacts that today are either buried or are rusting out in the open. We are using a simple electrolysis process, the same as that used on the Civil War ironclad USS Monitor, to remove the years of encrustation as well as to reconstitute the iron structure of the artifacts themselves. The equipment for this procedure is easy to assemble and can be used on iron artifacts of various sizes and ages. Calculations involving this reaction will show why these processes take such a long time to complete.
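As a hedged illustration of why electrolytic conservation is slow, Faraday's law relates the charge that must pass to the amount of iron reduced. All numbers below are rough assumptions for demonstration, not measurements from the Saltville project.

```python
# Back-of-the-envelope Faraday's-law estimate; all inputs are assumed.
F = 96485.0           # Faraday constant, C/mol
M_FE = 55.85          # molar mass of iron, g/mol
ELECTRONS_PER_FE = 3  # reducing Fe(III) corrosion products to Fe

def reduction_days(rust_mass_g, current_a, efficiency=0.1):
    """Days to reduce a given mass of Fe(III) at a given current.

    `efficiency` reflects that most of the current goes into
    electrolyzing water rather than reducing iron (assumed 10%).
    """
    moles_fe = rust_mass_g / M_FE
    charge = moles_fe * ELECTRONS_PER_FE * F      # coulombs required
    seconds = charge / (current_a * efficiency)
    return seconds / 86400

# Roughly 1 kg of corroded iron at 2 A:
print(f"{reduction_days(1000, 2.0):.0f} days")    # on the order of a year
```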
A US-Canadian Aquatic Inventory and Invasive Species Warning System by Donna Turgeon and Gary Matlock
Saturday 9:30AM
Natural resource managers need to know when an alien species is introduced to their region and where they can get information to help formulate response strategies. Although there are many nuggets to be gleaned from the literature and answers can be found within a myriad of databases and websites, currently there is no comprehensive website that integrates the data, synthesizes it in a usable format, and makes it readily available. For that reason, the National Ocean Service, the U.S. Geological Survey, the Smithsonian Institution, and many other partners initiated in FY02 a project that will result in a credible inventory of U.S. and Canadian aquatic species, a reporting and verification system for species not on the inventory, timely warnings for species new to aquatic ecosystems, risk assessments, and other information on alien species. Implemented as a Hawaiian Pilot Inventory and Warning System, the Pilot is now being tested. Data are already being added from other regions of the United States and Canada to enhance the effort. A draft U.S. and Canadian inventory and warning system could be ready as early as FY08. Visitors to the Pilot website can ‘ground truth’ new collections against an inventory of existing U.S. and Canadian species, map distributions, and get in-depth information on invasive species. If a species not on the inventory is confirmed as alien, a warning will be posted automatically to managers. With such warnings and information, managers will be better prepared to prevent alien species introductions and mitigate their impacts. Reducing the potential for a species becoming established in aquatic ecosystems should also help maintain habitat structure, function, and diversity for critical fisheries habitats.
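The core "ground truth and warn" logic described above can be sketched in a few lines; the species names, inventory contents, and notification hook below are hypothetical placeholders, not the Pilot's actual interface.

```python
# Minimal sketch of checking a new collection against an inventory and
# warning managers when an alien species is confirmed. All data are
# hypothetical illustrations.
INVENTORY = {"Mugil cephalus", "Callinectes sapidus"}  # known established species

def report_collection(species, confirmed_alien, notify):
    """Check a new collection against the inventory; warn managers if alien."""
    if species in INVENTORY:
        return "already on inventory"
    if confirmed_alien:
        notify(f"WARNING: alien species detected: {species}")
        return "warning posted"
    return "pending verification"

# Chinese mitten crab, a well-known aquatic invader, as the example report:
print(report_collection("Eriocheir sinensis", confirmed_alien=True, notify=print))
```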
AMERICAN METEOROLOGICAL SOCIETY
Education in the Atmospheric and Related Sciences in the Washington Area by Dr. Eugenia Kalnay, Distinguished University Professor, Department of Meteorology, University of Maryland
Saturday 3:00PM
The Washington area has one of the strongest concentrations of atmospheric, oceanic and environmental sciences education programs in the country. It includes the graduate programs in atmospheric and oceanic sciences at the University of Maryland (College Park), and the programs in atmospheric physics at Howard University and University of Maryland (Baltimore County), as well as a climate program at George Mason University. We will discuss these programs and their relative strengths, as well as other opportunities that they provide in related areas.
The Research Enterprise in the Atmospheric and Related Sciences in the Washington Area by Dr. Franco Einaudi, Director, Earth Sciences Directorate, NASA Goddard Space Flight Center
Saturday 3:35PM
The Washington area has one of the largest concentrations of research activities in the country, covering a broad range of pure and applied research. These activities take place in the numerous universities and in several government laboratories. An attempt will be made to describe some of the research efforts taking place in this area within the Department of Commerce, the Department of Energy, the Department of Defense, and NASA. Some of the efforts designed to fill the gap between research and operations activities will be discussed.
Advances in Weather and Climate Prediction: Local Contributions by Dr. Louis Uccellini, Director, National Centers for Environmental Prediction, National Oceanic and Atmospheric Administration
Saturday 4:05PM
The application of Newton’s laws, the laws of thermodynamics, Planck’s law, the Stefan-Boltzmann law, and advanced numerical techniques to the creation and successful application of numerical weather forecast models is one of the major intellectual achievements of the 20th century. Today, data from around the globe are assimilated into global numerical models run on one of the most powerful computers in the world with grid resolutions of 33 miles (55 km) that predict weather out to 15 days in advance. The accuracy of today’s 5-day forecast is equivalent to that of a 2 1/2-day forecast issued 15 years ago. Extreme weather events including hurricanes, snowstorms and severe weather outbreaks are predicted routinely 5 days in advance. Snowfall amounts and rain-snow-ice boundaries are predicted days in advance with a resolution down to the county level. In this presentation, the role of the NOAA/National Weather Service’s National Centers for Environmental Prediction (NCEP — located in Camp Springs, MD) in fostering these breakthroughs in numerical weather prediction will be discussed. The nature of numerical models and how they have advanced hurricane and snowfall prediction will be emphasized, with the linkages to the scientific advances related to observing systems, data assimilation and numerical prediction highlighted. Prospects for improved short-range climate prediction (seasonal to interannual forecasts) will also be discussed.
Policy Considerations for the Atmospheric and Related Sciences by Dr. William Hooke, Director, Atmospheric Policy Program, American Meteorological Society
Saturday 4:35PM
Most people might think of atmospheric science and public policy as existing in two separate realms. In fact, atmospheric policy issues are numerous, threaded throughout the national agenda, and stubbornly resistant to resolution. This talk provides an overview of the stark and forbidding atmospheric policy landscape, and its implications for mankind.
AMERICAN SOCIETY FOR MICROBIOLOGY
Identifying Clostridium perfringens Toxins by Microarray Hybridization by K. M. Myers (presenter), D. Villanueva, S. F. Al-Khaldi, and A. Rasooly, Center for Food Safety and Applied Nutrition, Food and Drug Administration, College Park, MD; D. Volokhov and V. Chizhikov, Center for Biologics Evaluation and Research, Food and Drug Administration, Rockville, MD
Saturday 2:05PM
Multiple oligonucleotide microarray hybridization is a relatively new technology that has shown potential in genotyping and characterizing pathogenic bacteria. C. perfringens, a pathogenic bacterium found in soil and the intestinal tract of vertebrates, was characterized here by this microarray technology in 17 isolated strains. The strains produce many toxins, six of which were genotyped by the multiple oligonucleotide microarray technology: iA (iota toxin), cpa (alpha toxin), cpe (enterotoxin), etxD (epsilon toxin), cpb1 (beta toxin 1), and cpb2 (beta toxin 2). Three oligonucleotide probes (oligoprobes) were developed for each toxin from complementary sequences and were immobilized on a glass slide. Multiplex PCR was performed to provide amplified regions of ssDNA from each virulence gene, and fluorescently labeled (Cy5 or Cy3) DNA was hybridized to its complementary oligoprobe on the microarray chip. Fluorescence on the chip was analyzed to determine the presence or absence of toxins in each strain of C. perfringens. Results of the study, verified by single PCR amplification shown in gel electrophoresis, indicate the reliability and potential usefulness and efficiency of the multiple microarray-based technology in the genotyping of microorganisms.
Mathematical modeling of in vitro systems to predict the outcome of in vivo exposure to biological threat or infectious pathogenic agents by Rasha Hammamieh, Chanaka Mendis, Shuguang Bi, Sachin Mani, Roger Neill, Rina Das and Marti Jett (presenter), Walter Reed Army Institute of Research, Division of Pathology, Silver Spring, MD
Saturday 2:25PM
The system we are studying focuses on the use of “host” gene responses to various biological threat agents in order to obtain early assessment of exposure. Our aim is to identify signature genes that will indicate exposure to each one of these threat agents even soon after exposure. Using gene response profiles from B. anthracis exposure in non-human primates (NHP), we have already demonstrated a unique signature of exposure within 24 h, and we are moving back to 6 h post exposure. Classical methods have been unable to detect such exposure until 3 days post exposure. Obtaining patient samples for HIV, flu, and malaria is not difficult and is a critical part of designing new detection and therapeutic approaches. Human samples from exposure to biothreat and emerging pathogenic agents are rare to non-existent. For many, the only animal models that exist are non-human primates (monkeys), and there are enormous problems in their use, especially since they are in such short supply, are quite expensive, and retain their wild characteristics, hiding signs of illness. The efforts of our program are to determine how we can use mathematical predictive modeling from data obtained with in vitro exposures of human white blood cells. In this in vitro system, we can investigate dose, exposure times and other variables in order to thoroughly understand the host responses. We have NHP studies with which to model these predictive algorithms. Obviously, those animal studies are impractical for the range of studies that we carry out in vitro. Mathematical and bioinformatics approaches are providing the means to identify gene patterns in vitro that predict in vivo progression of illness/exposure.
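As a toy illustration of this kind of predictive modeling (not the group's actual algorithm), a new expression profile can be assigned to the nearest per-agent signature centroid learned from in vitro exposures; all gene values and agent names below are hypothetical.

```python
# Nearest-centroid classification of an expression profile; a minimal
# sketch with hypothetical data, not the WRAIR pipeline.

def centroid(profiles):
    """Mean expression vector of a list of equal-length profiles."""
    n = len(profiles)
    return [sum(vals) / n for vals in zip(*profiles)]

def nearest_agent(sample, signatures):
    """Return the agent whose centroid is closest (Euclidean) to the sample."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(signatures, key=lambda agent: dist(sample, signatures[agent]))

# Hypothetical training profiles (rows: replicate exposures, columns: genes).
signatures = {
    "agent_A": centroid([[2.1, 0.3, 5.0], [1.9, 0.4, 4.6]]),
    "agent_B": centroid([[0.2, 3.8, 1.1], [0.3, 4.1, 0.9]]),
}
print(nearest_agent([2.0, 0.5, 4.8], signatures))  # -> agent_A
```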
The Presence of Influenza Viral RNA and Cytokine mRNA in the Lungs and Brains of Mice at the Onset of the Hypothermic Response to Virus by Jeannine A. Majde, Cottrell Consultants, LLC, Arlington, VA, and Stewart G. Bohnet, Georgeann A. Ellis, Abdur Rehman, Deborah Duricka and James M. Krueger, all from Department of VCAPP, Washington State University, Pullman, WA
Saturday 2:45PM
Influenza pneumonitis causes severe systemic symptoms in mice, including hypothermia, anorexia and excess slow wave sleep. The association of extrapulmonary virus, particularly virus in the brain, with the onset of such disease symptoms has not been investigated. C57BL/6 male mice were infected intranasally with high doses of purified mouse-adapted influenza virus (strains PR8 or X-31) under inhalation anesthesia. Core body temperatures were monitored continuously by radio-telemetry, and tissues (whole blood, spleens, brains and upper lung lobes) were harvested at the time of onset of hypothermia (13-24 hr post infection). In some experiments brains were dissected into cortex, hypothalamus and brainstem. All tissues were examined either by single-step reverse-transcriptase polymerase chain reaction (RT-PCR), two-step nested RT-PCR (nRT-PCR), or quantitative real-time RT-PCR (RT2-PCR) for negative-polarity (genomic) or positive-polarity (replication intermediates, cRNA or mRNA) strands of PR8 or X-31 nucleoprotein. RT-PCR detected viral genomic RNA and viral replication intermediates only in the lung. However, nRT-PCR and RT2-PCR detected both viral genomic RNA and viral replication intermediates in all tissues examined except brain cortex. RT2-PCR also revealed increased mRNA for proinflammatory cytokines in lung, and the same brain regions expressing viral replication intermediates also expressed interleukin-1β mRNA. Controls receiving completely heat-inactivated virus expressed only viral genomic RNA, and that only in lung. Therefore, the onset of virus-induced hypothermia is associated with possible viral replication in selected brain regions. In addition, hypothermia onset is associated with increased inflammatory mediator mRNAs in lung and in the same brain regions expressing viral replication intermediates. We propose that viral symptoms may result from rapid transport of the virus or viral RNA into the central nervous system from the respiratory tract.
Bacteria and Phytoplankton, Nutrients and Organics; Or How Estuaries Really Work: Evidence from the Chesapeake Bay and the Chao Phraya River, Thailand by R.B. Jonas (presenter), Envr. Sci. and Policy, George Mason Univ., Fairfax, VA, L.J. Hamdan, US Naval Research Laboratory, Washington, DC, and K. Ruchiwit, Faculty of Allied Health Science, Thammasat Univ., Bangkok, Thailand
Saturday 3:05PM
Estuaries are known to be very productive ecosystems. Aspects of both grazer and detrital food webs play a major role in various estuarine trophodynamics. While bacteria are major catalysts driving detrital food webs, evidence from the Chesapeake Bay, the Potomac River and the Chao Phraya River estuary in Thailand indicates that the roles of microbially labile dissolved organic matter produced in situ (autochthonous) and of the bacterioplankton community, as a consumer of that dissolved organic matter, have been very seriously underestimated. The magnitude of the misunderstanding is such that on average more than 50% of the organics fueling community respiration are not accounted for in many predictive ecosystem models. These three estuaries were studied because each is highly enriched with inorganic nutrients (N and P), each exhibits severe, long-term seasonal hypoxia and anoxia, but none develop unusually high phytoplankton abundances. How is it then that severe oxygen depletion occurs? Data from these estuaries indicate that high concentrations of functionally dissolved, microbially labile organic matter (DiMLOC), composed largely of dissolved carbohydrates and dissolved amino acids, occur in the water column. Experimental (inhibition of bacterial activity with antibiotics) and microscopic evidence indicates that in situ phytoplankton production is the source of this DiMLOC. Dissolved carbohydrates and amino acids were significantly correlated with phytoplankton biomass, bacterial abundance and bacterial production. In the absence of bacterial metabolism, high concentrations of these dissolved organics accumulated in experimental microcosms. However, while unprecedented bacterial abundances occur in the Chesapeake and Potomac ecosystems (often > 30 × 10⁶ cells/ml), abundances in the Chao Phraya estuary do not exceed 7 × 10⁶ cells/ml. It seems clear that microbially labile, dissolved organic matter in these ecosystems must be accounted for directly in order to develop realistic models of their function.
Temporal and Spatial Variations in Bacterial Community Composition in the Mesohaline Potomac River Determined by Amplicon Length Heterogeneity by J.M. Classen (presenter), P.M. Gillevet, M. Sikaroodi and R.B. Jonas, all at Dept. Envr. Sci. and Policy, George Mason Univ., Fairfax, VA
Saturday 3:30PM
The Chesapeake Bay ecosystem has changed drastically in the past 50 years, mostly due to anthropogenic causes. The Bay, the largest estuary in the United States and once considered the nation’s most productive, has become less biologically diverse, more susceptible to disturbances, and less resilient after them. Data from the mainstem and Potomac River show unprecedentedly high bacterial abundances and rapid metabolism. In situ produced microbially labile dissolved organic matter fuels this bacteria-dominated ecosystem. Elucidating bacterial community structure and dynamics is vital to understanding this ecosystem and to developing management models of overall function. The goal of this work was to investigate the composition of that bacterial community and its temporal and spatial dynamics. The mesohaline Potomac River estuary was selected for this work because it is a physical, hydrodynamic and biological model of the mainstem Chesapeake Bay. Water samples were collected between March and November 2002 from the top, middle, and bottom depths of one mid-river station and two near-shore sites located along a cross-river transect near Ragged Point. The Length Heterogeneity Polymerase Chain Reaction (LH-PCR), which interrogates mixed bacterial communities based on the amplicon length of variable regions of the 16S rDNA gene, was used to investigate the biocomplexity of the bacterial community on temporal and spatial scales. The results suggest that there is only limited spatial variation in bacterial community composition within this ecosystem and that there is more variability on a temporal scale. However, community composition was clearly quite different in zones of anoxic, deep water in the mesohaline Potomac as compared with both oxic and hypoxic areas. This finding may be important in understanding how the Chesapeake functions under the anoxic conditions which develop each summer. Amplicons from the bacterial genomic DNA were cloned and sequenced. Comparison of these cloned sequences with known bacterial rDNA sequences (at 90% identity) indicated that most of the bacteria in these samples appear to be novel, uncharacterized species.
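In data terms, an LH-PCR run reduces a community to peak areas at discrete amplicon lengths. The sketch below shows one conventional way such profiles are normalized and compared; all lengths and areas are hypothetical, not values from this study.

```python
# Minimal sketch of summarizing and comparing LH-PCR community profiles.

def profile(peaks):
    """peaks: {amplicon_length_bp: peak_area} -> relative abundances."""
    total = sum(peaks.values())
    return {length: area / total for length, area in peaks.items()}

def dissimilarity(p, q):
    """Bray-Curtis-style dissimilarity between two relative-abundance profiles."""
    lengths = set(p) | set(q)
    return 0.5 * sum(abs(p.get(length, 0.0) - q.get(length, 0.0)) for length in lengths)

# Hypothetical profiles for an oxic surface sample and an anoxic deep sample.
surface = profile({312: 40.0, 315: 25.0, 321: 10.0, 340: 25.0})
anoxic_deep = profile({312: 5.0, 318: 55.0, 340: 40.0})
print(f"dissimilarity: {dissimilarity(surface, anoxic_deep):.2f}")  # -> 0.70
```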
AMERICAN SOCIETY OF PLANT BIOLOGISTS
Structure and function of a water-soluble carotenoid-binding protein by Cheryl Kerfeld, UCLA
Carotenoid-binding proteins function in light-harvesting and in photoprotection. The structure of the orange carotenoid-binding protein (OCP) isolated from the cyanobacterium Arthrospira maxima has been determined at a resolution of 2.1 Å. OCP is the first structural example of a protein that binds carotenoids exclusively. The OCP appears to be involved in photoprotection; microarray analysis indicates that levels of the OCP transcript increase more than 600-fold when cells are transferred from low to high light. Data from our lab also indicate that the OCP is an avid quencher of singlet oxygen. The structure of OCP is a novel composite of two domains. A proteolytic product of OCP, isolated in the laboratory, appears red instead of orange. OCP can also be converted into a red protein by exposure to low pH. Our data suggest that low pH changes the structure of the protein. Details of the interaction between the pigment and protein will also be discussed in the context of OCP’s putative function in photoprotection. We have also transformed Arabidopsis with the OCP. The potential for using OCP to enhance photoprotection in plants will also be discussed.
Sources of metabolic urea in plants: What does it have to do with soybean performance? by Joe Polacco, University of Missouri, Columbia
Soybean expresses two active ureases, an abundant seed isoform and a tissue-ubiquitous form expressed at a much lower level. The ubiquitous urease has an assimilatory function: mutants that lack it cannot utilize urea nitrogen (N) in cell culture and, at the whole-plant level, they accumulate urea in leaves and in seeds. What is the source of urea? Much of it is from breakdown of arginine (Arg). We showed, by independently manipulating the urease phenotype of the developing embryo and maternal seed coat, that the arginase reaction does not operate in developing embryos. This situation prevails in spite of measurable arginase activity in vitro and the generation of Arg-derived urea from cotyledons cultured with Arg as sole N source. An examination of mitochondrial Arg carrier proteins indicated that they were not a barrier to Arg entry into the mitochondrial matrix, the site of arginase. A second potential source of urea is ureides, the major form of N transported out of fixing nodules. It has been suggested that there are two routes of ureide degradation in soybean, differing in the generation of urea vs. ammonia, with plants exhibiting the urea-generating pathway also having more drought-tolerant N-fixation. We have used a urease-negative mutant and urease inhibitors to derive data contrary to this hypothesis, i.e., a tolerant (‘Maple Arrow’) and a sensitive cultivar (‘Williams 82’) exhibited no major difference in their ureide degradation routes. However, in both cultivars urea is indeed a product of ureide degradation, along with direct generation of ammonia, consistent with the scheme: allantoin → allantoate → ureidoglycolate (+ ammonia) → glyoxylate + urea.
Late-acting stylar factors in the self-incompatibility system in Nicotiana by Bruce McClure, Department of Biochemistry, University of Missouri-Columbia
Many plants possess genetically controlled self-incompatibility (SI) systems that allow them to recognize and reject self-pollen and pollen from closely related individuals. Nicotiana alata has a gametophytic SI system; pollen is rejected when its single S-allele is the same as either of the two S-alleles present in the diploid pistil. S-allele-specific pollen rejection is determined by S-locus products expressed in the pollen and the pistil. S-RNase is expressed on the pistil side. Each S-allele expresses a specific S-RNase that is secreted into the stylar extracellular matrix. S-RNase enters pollen tubes and acts as an S-allele-specific cytotoxin; pollen RNA is degraded in incompatible but not compatible pollinations. It has recently been shown that the pollen product is likely to be an F-box protein. The popular model for SI is that the pollen product provides for resistance to S-RNase, perhaps causing its ubiquitination and subsequent degradation in compatible pollinations. Pollen rejection is thought to result from a failure to degrade or otherwise inactivate S-RNase, allowing its cytotoxic action to be felt. While it is clear that S-RNase is the sole determinant of allelic specificity on the pistil side of the SI reaction, it is also known that other factors are required. For example, antisense inhibition of a small asparagine-rich protein called HT-B causes breakdown of SI but does not affect S-RNase expression. Absence of a factor designated 4936-factor results in a similar breakdown without affecting S-RNase expression. Pollination tests show that HT-B and 4936-factor only affect the pistil side of SI. We used fluorescence immunocytochemistry to test whether these factors affect the interaction between S-RNase and the pollen tube. Pollen from wild-type SI N. alata was used to pollinate plants defective for 4936-factor and antisense-suppressed HT-B plants. Styles were fixed, sectioned and treated with S-RNase antibodies and anticallose antibodies. The results show large amounts of S-RNase uptake in both normal and defective plants, even those where pollen rejection does not occur because of defects in pistil factors. Thus, HT-B and 4936-factor act late in the pollen rejection pathway. Since pollen tubes contain large amounts of S-RNase but exhibit no ill effect, we conclude that S-RNase is initially taken up in an inactive form. Thus, pollen rejection is more complex than is commonly appreciated. It is possible that the target of the F-box protein may not be S-RNase itself.
ASSOCIATION FOR SCIENCE, TECHNOLOGY AND INNOVATION
Stimulating Space Development through NASA’s Space Exploration Program by Eric Dahlstrom, InternationalSpace.com
Saturday 9:00AM
Government funded space science and exploration programs provide an opportunity to initiate large-scale space development. To accomplish a lasting effect on humanity, the new NASA Moon and Mars programs must be implemented in ways that stimulate more activity in space, rather than becoming the only activity. If we can implement the appropriate role for the government in the exploration program, even current NASA funding levels can be used to initiate the large-scale economic development of space. Taking the appropriate steps now can complete “the giant leap” for humanity, and fulfill the promise of Apollo.
The Carbon Fuel Cell’s Impact on the Future Global Energy Equation by John Bosma, Synthesis Partners, LLC
Saturday 9:30AM
The carbon fuel cell’s impact on the future global energy equation, including climate change, matches or exceeds that of the nuclear reactor because of its extraordinary end-to-end efficiency (65% to 80%), its electrochemical simplicity and its enormous fuel diversity. Using NO hydrogen, these fuel cells run DIRECTLY on carbon from diverse sources: coal, carbon black (e.g., from recycled tires), oil-refining chars and petroleum coke, and chars from fast-pyrolysis conversion of waste wood and high-cellulose organics (garbage) into liquid fuels and chemicals. Automotive CFCs using graphitized carbon have 4x the range of internal-combustion cars for the same tank volume. A CFC-powered Navy warship with electric propulsion would have 4-6 times the range of current gas-turbine ships with their costly engines and propeller shafts. CFC repowering of steam-cycle coal-fired generating plants could double or triple their efficiency while eliminating air toxics, carcinogenic particles and tens of millions of tons of now-lagooned coal ash and wet-scrubber sludges. CFC-repowered turboprop aircraft and helicopters could radically boost range and payload while replacing costly, massive gas turbines, gearboxes and shafting with mechanically simple electric motors. CFCs let poor nations power themselves with carbon waste (including fast-growing fuel-biomass crops) without importing costly conventional plants. A CFC-ization of US baseload power generation, cars and trucks, locomotives and diesel generators could turn a low-tech ‘commodity steam coal’ industry into very profitable producers of carbon electrodes and low-ash coal slurries.
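A back-of-the-envelope energy calculation suggests where a range multiplier of this order could come from: carbon is both more energy-dense per liter than gasoline and, in a fuel cell, used far more efficiently. The constants below are rough textbook-style assumptions, not figures from the talk.

```python
# Rough range-multiplier estimate for a carbon fuel cell vs. a gasoline
# internal-combustion engine. All constants are illustrative assumptions.
GASOLINE_MJ_PER_L = 32.0      # lower heating value, approximate
CARBON_MJ_PER_KG = 32.8       # 393.5 kJ/mol divided by 12 g/mol
GRAPHITE_KG_PER_L = 2.26      # density of graphite
PACKING_FRACTION = 0.7        # assumed usable fraction of tank volume

ICE_EFFICIENCY = 0.20         # typical tank-to-wheel, assumed
CFC_EFFICIENCY = 0.70 * 0.90  # cell efficiency times electric drivetrain

carbon_mj_per_l = CARBON_MJ_PER_KG * GRAPHITE_KG_PER_L * PACKING_FRACTION
multiplier = (carbon_mj_per_l / GASOLINE_MJ_PER_L) * (CFC_EFFICIENCY / ICE_EFFICIENCY)
print(f"range multiplier ~ {multiplier:.1f}x")   # roughly 5x under these assumptions
```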
The Innovation Gap: The Russians are Far Ahead with TRIZ – We can’t beat them; we should join them by Bob Kolodney, Klimek Kolodney & Casale P.C.
Saturday 10:00AM
An overview of TRIZ and its potential impact on our technology, our new products, our new ventures, and the development of our economy. The Russians have something that is not only world-class, it is world-leading, and they are considerably in advance of the United States when it comes to innovation. They have a practical problem-solving discipline named TRIZ (the Russian acronym for “The Theory of Inventive Problem Solving”) with 60 years of development behind it. Various countries with developed economies are ahead of the US in the adoption of TRIZ. Although the approach is starting to take hold in Corporate America, it has yet to be adopted to the extent that it merits. In the Capital region TRIZ is practically unknown. TRIZ makes it possible to maximize the ability of the inventor or product developer to achieve an optimum new product. It allows the orderly solving of problems by using techniques to understand problems; make use of existing resources; establish practical objectives; perceive the direction that improvements must take; identify constraints and use generic approaches to resolve them; apply standard solutions; and access many thousands of patent claims and scientific principles to provide solutions. At ASTI we see the potential usefulness of TRIZ in business development to help optimize innovative products and services, determine the feasibility of new ventures and plan them effectively. In these days of globalization it is important for the United States to marshal its resources to be competitive. Hopefully, TRIZ will help us to do this.
Materials Research to Meet 21st Century Defense Needs by Arul Mozhi, Ph.D., National Materials Advisory Board, The National Academies
Saturday 10:30AM
This paper presents the results of a recently completed National Materials Advisory Board study that examined the Department of Defense’s materials needs and research and development (R&D) priorities in five classes of materials: Structural and Multifunctional Materials; Energy and Power Materials; Electronic and Photonic Materials; Functional, Organic and Hybrid Materials; and Bioderived and Bioinspired Materials. This paper integrates the R&D priorities from all five materials areas and presents the study’s R&D recommendations. The study committee recognized that realizing the revolutionary new defense capabilities that materials science and engineering offer will depend on more than just R&D; innovative management will also be needed to reduce risks in translating fundamental research into practical materials, and to promote cross-fertilization of scientific fields. This paper also discusses these issues and presents the study’s recommendation for needed innovations in management.
The Speed of Gravity: The End of the Universal Speed Limit by Tom Van Flandern, Ph.D., MetaResearch
Saturday 11:00AM
No one disputes that the speed of gravitational waves must be the speed of light. However, the speed of propagation of the gravitational force is apparently very much faster than light according to all six available experiments that bear on the subject. We will discuss those experiments, their physical interpretation, and how this can be reconciled with the propagation speed of gravitational waves and with special and general relativity. Although nothing we will discuss implies the need for any changes in the math of relativity, we will see that the physical interpretation is not unique. The currently popular geometric interpretation may no longer be the best available way to understand the nature of gravitation. Meanwhile, the Le Sage model for the field interpretation has answered all challenges and provides an intuitive understanding of the phenomenon and distinguishing predictions. Naturally, this choice has implications for everything from quantum physics to Galactic exploration. This talk will use multimedia elements, and will be based in part on: “The speed of gravity - What the experiments say,” T. Van Flandern, Phys. Lett. A 250, 1-11 (1998); and “Experimental Repeal of the Speed Limit for Gravitational, Electrodynamic, and Quantum Field Interactions,” T. Van Flandern and J. P. Vigier, Found. Phys. 32(7), 1031-1068 (2002).
The National Nanotechnology Initiative (and, by the way, what is nanotechnology?) by Richard Smith, Nanoverse, LLC, The Nanotechnology Network, Nanotechnology Policy Foundation
Sunday 9:00AM
It is estimated that over $10B will be spent on nanotechnology R&D this year (about 1/3 of it in the United States). The hope for nanotechnology is immeasurable but the hype is also massive. Today’s nanotechnology products are useful but mundane: see-through high-SPF sunscreen and longer-lasting tennis balls. Tomorrow’s products are more interesting: high-strength, low-weight formable steel and highly sensitive diagnostic devices for medical care and homeland defense. The next decade will see more and more complex and valuable products entering the marketplace: cures for some cancers, materials that allow super-light and super-strong cars and ever-taller buildings, disassemblers to mine air and water pollution for valuable raw materials. And within the next fifty years, we MIGHT see molecular-scale computers and even robots that can manufacture macro-scale products or go inside the body to perform DNA repair. This presentation will define the likely stages of nanotechnology as it begins to permeate all areas of R&D and manufacturing and will help distinguish between the hype and the hope. The speaker will identify who’s spending what on what and how the audience can begin to take advantage of the coming nanotech boom.
Back-Engineering Biological Information Processing Systems in the Human Body via Evolutionary Psychology by Thomas Meylan, Ph.D., EvolvingSuccess
Sunday 9:30AM
Four primary information processing systems have been identified in the human body by utilizing primitive Darwinist principles for the analysis. All four of these systems have evolved to provide an animal system with increasingly effective mastery over the environment, but each of these systems operationalizes the principles of natural selection in vastly different ways. All four of these systems will be outlined, but this presentation will emphasize the relationship between the system which produces complex emotional signaling and the system which supports symbol-based abstraction. Practical implications of this interplay will be discussed briefly toward the end of the presentation.
Space Business Entrepreneurship and Exponential Innovation: Linking Public Sector Space Investments to Private Sector Economic Stimulus by Guillermo Sohnlein, Fortivo Consulting
Sunday 10:00AM
With President Bush establishing a new vision for NASA in the midst of the election-year focus on the national economy and a post-recession boom in global entrepreneurship, the stage is set to critically quantify the economic justifications for continued public investment in space programs. However, in order to conduct a comprehensive evaluation, a new analytical paradigm must be implemented to accurately capture the significant role of technological innovation and space business entrepreneurs. Traditional analytical methods amount to nothing more than a tracing of the linear diffusion of public funds throughout the supply chain or the expanded impact of this diffusion at each link in the supply chain. However, only by exploring the unique level of exponential innovation associated with space initiatives can one gain a complete and accurate assessment of the economic stimulus return on a public sector space investment.
Technology Transfer Trends: A 2004 Perspective by Richard Leshuk, P.E., President, Xfer Tech
Sunday 10:30AM
The modern era of technology transfer is usually dated from the Stevenson-Wydler Act of 1980. By the early 1990s, a series of legislative actions had significantly formalized the activity. This paper examines the maturation of technology transfer thinking over the past decade and quantifies evolving trends; these trends include changing attitudes towards metrics and a shift in corporate strategies.
A New Engineering Process by Gene Allen, MSC Software Corp
Sunday 11:00AM
Mr. Allen will provide background on a new engineering analysis process that takes advantage of recent advances in computer capabilities. The process incorporates into computer simulations the natural variability and uncertainty that exists in reality. The process, referred to as stochastic simulation, uses advanced Monte Carlo techniques. The results of a stochastic simulation are displayed as a cloud of points that represents the reality of the physics being modeled, with each point representing a possible situation. Design improvements can be realized by using the Stochastic Design Improvement (SDI) process to move the cloud towards design targets. Some companies are using stochastic simulation in product design with significant success. EADS-CASA reduced the weight of a satellite launch dispenser from 500 to 337 lbs by changing the composite layup. Application of this process in the auto industry has resulted in improved crashworthiness along with weight reduction. BMW reduced the weight of one car model by 33 pounds and Nissan by 35 pounds; other models at other companies saw reductions of 55, 40, and 13 pounds.
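In outline the method is simple: draw each uncertain input from a distribution, run the deterministic model once per draw, and study the resulting cloud of outcomes. The sketch below illustrates this on a textbook cantilever beam; the model and every distribution in it are assumptions for demonstration, not MSC Software's actual process.

```python
import random

# Minimal Monte Carlo "cloud" demonstration on a textbook cantilever beam.

def tip_deflection(load_n, modulus_pa, length_m, width_m, thick_m):
    """Tip deflection of an end-loaded cantilever: F*L^3 / (3*E*I)."""
    inertia = width_m * thick_m ** 3 / 12.0   # rectangular cross-section
    return load_n * length_m ** 3 / (3.0 * modulus_pa * inertia)

def monte_carlo(n_runs=10_000):
    cloud = []
    for _ in range(n_runs):
        load = random.gauss(500.0, 50.0)       # N, scatter in service loads
        modulus = random.gauss(70e9, 3.5e9)    # Pa, aluminum-like material
        thick = random.gauss(0.020, 0.0005)    # m, manufacturing tolerance
        cloud.append(tip_deflection(load, modulus, 0.5, 0.05, thick))
    return sorted(cloud)

cloud = monte_carlo()
print(f"median deflection: {cloud[len(cloud) // 2] * 1000:.1f} mm")
print(f"95th percentile:   {cloud[int(0.95 * len(cloud))] * 1000:.1f} mm")
```

Each point in `cloud` is one "possible situation" in the abstract's sense; design-improvement schemes like SDI then adjust nominal inputs so the whole cloud shifts toward the design targets.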
Making Information Systems Intelligent by Geoffrey P. Malafsky, Technology Intelligence International
Sunday 11:30AM
Artificial intelligence is often portrayed in mass media as machines being able to reason and sense their surroundings. While this long-term vision will continue to evolve, there is much greater need and opportunity for systems that proactively ingest, digest, and personalize great quantities of information into succinct and targeted knowledge for humans to use confidently and in real time. This capability is not provided by current technologies, however sophisticated and powerful. A new breed of intelligent systems is being developed that melds humans and computers into a single system where each complements the other. This design uses both state-of-the-art science and technology methods, like bio-MEMS and ontological reasoning engines, and the critical success factors that are most often overlooked in information systems, namely the reality of organizational dynamics and business process states. This talk will describe the components and architecture of this type of intelligent system, and show examples of early attempts and successes in using it in an operational environment.
ASSOCIATION FOR WOMEN IN SCIENCE, DC-METRO CHAPTER
Leaving the Ivory Tower: Social-Structural Causes of Doctoral Student Attrition by Barbara E. Lovitts, Ph.D., Research Scientist, Department of Sociology, University of Maryland
Saturday 2:05PM
Graduate schools have faced attrition rates of approximately 50 percent for at least the last half century. This study focuses on the social-structural factors that cause so many people to leave their programs without obtaining the Ph.D. The study involved surveying individuals who enrolled in doctoral programs in 1982-84 (both completers and noncompleters). The sample was drawn from two universities that are among the top Ph.D.-granting universities in the United States and, within each university, from nine departments. Telephone interviews were held with 30 noncompleters, approximately two from each department, in order to explore issues that could not be addressed adequately by the survey instrument. Telephone interviews were conducted with the Directors of Graduate Study from each participating department in order to obtain background information on the departments’ formal and informal structures and processes for educating graduate students. In addition, site visits were made to each university, and two senior faculty members from each participating department—one who had produced many Ph.D.s and one who had produced few or none—participated in face-to-face interviews in order to discern systematic differences in attitudes, beliefs, and behaviors of those most responsible for training graduate students. Differences between completers and noncompleters, and at-risk completers (completers who seriously considered leaving without completing their degrees) and on-track completers (completers who never considered leaving) were found to lie in the differential distribution of structures and opportunities for integration and cognitive map development within departments.
Promoting Leadership among Undergraduate Faculty in Science, Technology, Engineering and Mathematics (STEM) by Jeanne Narum, Director, Project Kaleidoscope
Saturday 2:30PM
A crucial factor in building and sustaining strong undergraduate STEM programs is the quality of leadership among the faculty and administrators who take responsibility for tackling this work. Project Kaleidoscope (PKAL) has spent more than a decade fostering such leadership within the undergraduate STEM community, taking a multifaceted approach through which the range of leadership opportunities for the STEM community has been explored. PKAL has spotlighted the work of leaders who understand the “why” and the “how” of transformation of undergraduate STEM, distilled lessons learned from these experiences, and drawn out promising practices that can inform the work of emerging generations of leaders. Considering the “why” calls for attention to the external context that affects the work of leaders: changing student demographics, new directions in science and technology, and emerging societal demands and opportunities for the scientific and technological communities. Considering the “how” calls for understanding the politics of institutional renewal: how to build informed, collaborating communities; keep connected to larger institutional visions; and provide the requisite resources of people, time, and space to support such efforts. Finally, the fundamental dimension of PKAL’s focus on leadership is that individuals take personal responsibility for making a difference. This presentation will highlight PKAL activities and approaches that have been particularly successful and could be adapted by other organizations and used in other environments.
Institutional Resources and Family Strategies among Early Career Academics by Roberta Spalter-Roth, American Sociological Association
Saturday 2:55PM
Research on scientific disciplines suggests that academic careers are based on male-breadwinner models. In order to promote gender diversity in these disciplines, work/family policies such as stopping the tenure clock, family leave, or modified teaching and service loads have been instituted at many universities. These policies tend to be underutilized by academic parents, who fear they will not be considered serious scholars if they use them. This paper examines access to other sorts of institutional resources, the use of resource-based or family-based strategies (such as child spacing), and their effect on the early career success of a cohort of new PhDs, especially those who become mothers. The research is based on data from a longitudinal study of a cohort of sociologists who received their degrees from U.S. universities between July 1996 and August 1997. The results show that access to institutional resources and the ability to use institutional resources in professional activities increase the odds of early career success. Child spacing strategies are significant for women who want careers at research universities.
PROGRESS for Women Chemists/Chemical Engineers by Felicia Dixon, American Chemical Society, and Helen Free, Bayer Corporation
Saturday 3:20PM
PROGRESS is an American Chemical Society three-year pilot project designed to support the advancement, participation, and leadership of women chemists and chemical engineers in the workplace. Its goals are to help entry-level professionals find employment and to support early- and mid-career professionals seeking advancement. Seven programs make up the PROGRESS Project: 1) Corporate Recognition; 2) Web-based Resource Center; 3) Be Visible: Funding Speaker Opportunities; 4) ACS Course on Business & Leadership Skills for Women in the Chemical Workforce; 5) Thriving in the Workplace Road Shows: An Awareness Program; 6) GROW: Grants for Renewal Opportunities for Women; and 7) Academic Awareness/Site Visits. Each program addresses the following issues in facilitating women’s participation and advancement in chemistry. Partnerships: cooperative interaction with other organizations to share success stories and best practices for advancing women’s careers. Reflection: use of data to monitor the level of involvement of women in the activities of the Society and the percentage of women in senior-level positions as it reflects their general representation in the graduate school population. Openness: communication about success strategies critical to career advancement and the challenges faced at career transitions. Grants: funding for new opportunities and experiences that can increase visibility and enhance scientific reputation. Resources: career development information, statistical information, development tools, etc. Education: structured career development. Site Visits: advisory teams to assist in identifying and addressing barriers to the attraction and retention of women. Successes: recognition of successes and best practices. The talk will focus on the activities and expected results of each PROGRESS program.
Supporting Women’s Employment Success: Findings From IWPR’s Research and Organizational Experience by Vicky Lovell, Ph.D., Institute for Women’s Policy Research
Saturday 3:45PM
The Institute for Women’s Policy Research works to improve women’s employment outcomes and promote women’s leadership capacity through its organizational practice and research portfolio. IWPR’s internship and research fellow programs and its networks throughout the women’s movement provide leadership development opportunities to the Institute’s employees. Strong relationships with grassroots activists offer additional avenues for strengthening and diversifying the research and advocacy communities. The Institute’s research supports the development of public and private employment practices to expand women’s job opportunities and improve women’s employment outcomes, highlighting barriers in job training and education, private-sector practices, and public policies that impede women’s professional success and earnings stability. Policies designed to promote low-income women’s economic security are a particular focus of the Institute’s work.
The Talent Imperative in Science and Engineering – A Two-Year Net Assessment by Wanda Ward, National Science Foundation
Saturday 4:10PM
The talent imperative in science and engineering remains a major challenge to the sustained vibrancy of the U.S. scientific enterprise, an enterprise that has fueled America’s economic competitiveness and standard of living for all Americans. This presentation describes the efforts of BEST (Building Engineering and Science Talent), a public-private partnership established to help build a stronger, more diverse U.S. workforce in science, engineering and technology. It addresses why the talent imperative matters, the case for meeting this national imperative, and major lessons learned from a two-year assessment of what works at the PreK-12 level, in higher education and in the workforce to broaden S&E participation. The assessment represents the rigorous examination of evidence to identify what works and the subsequent distillation of design principles underpinning effective programs. These principles provide the tools for expanding and adapting what works. The presentation then makes priority recommendations and describes the challenge to policy makers, educators, and the private sector for nation-wide action to build needed S&E capacity through broadening participation.
Best Practices for Recruiting, Retaining, and Advancing Women Scientists and Engineers in Industry by Mary C. Mattis, Ph.D., National Academy of Engineering
Saturday 4:35PM
Historically, women scientists and engineers have been disadvantaged compared to men in similar careers in academia and industry. Recently, industry has taken the lead in recognizing and acting on the need to identify and implement strategies to attract, retain, and advance women scientists and engineers, while academic institutions have been slow to recognize the need for greater diversity, and to adopt initiatives to address the chilly climate for women on science and engineering faculties. The presentation will: (1) contrast the cultures and work environments of academia and industry that impact the retention and advancement of women scientists and engineers; (2) examine possible differences in the motivations of academic and business organizations to undertake cultural transformation to eliminate barriers to the retention and advancement of women scientists and engineers; and (3) review characteristics of best industry practices for recruiting, retaining and advancing women scientists and engineers, along with the potential challenges of importing these practices into academic institutions. The presentation draws on quantitative data on women in engineering and science, as well as qualitative data from individual interviews and focus groups with women engineers and scientists, corporate representatives, and men and women engineering faculty and administrators.
BOTANICAL SOCIETY OF WASHINGTON
Botany in the Washington, D.C., Region: A Historical Overview by J. Douglas Ripley, Air National Guard, Andrews AFB |
Saturday 9:00AM |
The plants and ecological habitats of the Washington, D.C., region have long been studied by residents and visitors alike. American Indians relied on many native plant species, as well as farm crops such as corn and squash that came from elsewhere in North America. In 1607, Captain John Smith marveled at the Potomac’s rich, dense forests with many unfamiliar kinds of trees. Early botanists such as John Clayton, David Warden, and Samuel Rafinesque described many of the Washington area’s plants scientifically. More comprehensive views of the local flora emerged by the late 19th Century, and have been refined to the present day. For more than a hundred years, the Botanical Society of Washington has provided a forum for local botanists, addressing both diverse aspects of the local flora as well as the wide-ranging, worldwide interests of many Washington botanists. |
Botanical Diversity in the Washington, D.C., Region by Larry E. Morse, NatureServe |
Saturday 9:20AM |
The Washington, D.C., region, extending from the High Alleghenies of West Virginia to the ocean shore of New Jersey and the Delmarva Peninsula, offers diverse natural habitats and more than two thousand native plant species. The region’s geological and topographic variety, its range of climates and microclimates, and historical factors such as sea-level changes and proximity to the Pleistocene ice-age glaciers all underlie the geographical distributions of the region’s native plants. Two areas of particular note are the world-class freshwater-intertidal estuarine shores of Chesapeake Bay, and the Potomac River Gorge (from Great Falls to Georgetown and Rosslyn), one of eastern North America’s premier riverbank bedrock flood-scour ecosystems. |
Fall-Line Magnolia Bogs: A Distinctive Plant Habitat by Rod Simmons, Parks and Recreation Dept., Alexandria, Virginia |
Saturday 10:20AM |
In 1918, W. L. McAtee described the “magnolia bogs” as a distinctive habitat present in a few dozen places on the innermost Coastal Plain (near the Fall Line) in the Washington, D.C., area. Occurring where cool water seeps from hillside gravel deposits, these specialized wetlands are characterized by the presence of the native sweetbay magnolia (Magnolia virginiana) as well as other distinctive plants, such as peat moss (Sphagnum) and poison sumac (Toxicodendron vernix). While many of McAtee’s localities have been destroyed or badly degraded by development over the past decades, a few good examples remain (including one within the District of Columbia itself), and several additional high-quality magnolia bogs have recently been located, particularly in Charles Co., Maryland. |
Invasive and Other Non-Native Plants in the Washington, D.C., Region by Elizabeth F. Wells, George Washington University |
Saturday 11:00AM |
Most wild plants of the Washington, D.C., region are readily characterized as native (naturally occurring), or as non-native (present only due to direct or indirect human intervention, also called exotic, alien, or non-indigenous), with a few interesting cases such as the black locust (Robinia pseudoacacia) still being debated. Many additional kinds of plants from other lands have been grown in the Washington area for gardening and landscaping, agriculture, forestry, and other purposes, and yet others have arrived accidentally as weeds. However, length of cultivation in the local area does not correlate well with invasiveness; some plants grown by George Washington at Mt. Vernon are not known as escapes, while several species only recently introduced horticulturally have quickly spread to natural habitats. Once escaped, wild non-native plants can be further dispersed by wind, animals, and other means, as are native plants. Major floods have further contributed to the spread of non-native (as well as native) plants along the Potomac River. |
Resources for Botanizing in the Washington, D.C., Region by Edward M. Barrows, Georgetown University |
Saturday 11:30AM |
Information resources on the plants of the Washington, D.C., region range from traditional field guides to various Internet web sites. Illustrated field guides (such as Peterson or Newcomb) help with identification of the more common species, and various technical works are more comprehensive. Finding Wildflowers in the Washington-Baltimore Area provides directions to scores of publicly accessible sites, with notes on trees, shrubs, and wildflowers to be expected at each. Nature centers at many regional parks offer local expertise, and the Smithsonian Institution’s Naturalist Center provides a regional reference collection of pressed herbarium specimens for consultation. Various web sites provide more depth, including those of NatureServe (with information on classification, distribution, and conservation status); Georgetown University’s Biodiversity Database (including many photos); and the Botanical Society of Washington and other organizations sponsoring meetings and field trips. |
CHEMICAL SOCIETY OF WASHINGTON (CSW)
Green Chemistry: Principles and Practice |
Saturday 9:00AM |
This symposium will introduce the principles of green chemistry, provide specific examples of greener technologies, and highlight the economic benefits of adopting environmentally friendly processes. Recent advances in green chemistry education will also be discussed. Green chemistry, the design of chemical products and processes that reduce or eliminate the use and generation of hazardous substances, is the most fundamental approach to pollution prevention. Green chemistry addresses the need to produce the goods and services that society depends on in a more environmentally benign manner. Examples of green chemistry approaches include the use of alternative feedstocks, the use of alternative solvents and reaction conditions, and the design of safer chemicals. Through the design and implementation of one or more of these green chemistry approaches, chemists have found ways to remove millions of pounds of hazardous substances from the products and processes that society needs, without sacrificing scientific innovation and creativity. Pfizer, for example, eliminated 140 metric tons of TiCl4, 150 metric tons of 35% HCl, and 100 metric tons of NaOH in redesigning the synthesis of sertraline, the active ingredient in the antidepressant drug Zoloft®. The implementation of green chemistry technologies has eliminated waste, improved safety, and saved industry money. Equally important is the incorporation of green chemistry concepts and principles into the curriculum. Providing examples of green chemistry technologies throughout the curriculum is one of the best ways to promote the adoption of green chemistry across the chemical enterprise. |
ILYA PRIGOGINE TRIBUTE (Panel organized by WESS)
Prigogine’s Theories by Andrew Vogt, Department of Mathematics, Georgetown University |
Sunday 9:10AM |
The speaker will briefly review some of Ilya Prigogine’s theoretical accomplishments, including the law of minimum entropy production for near-equilibrium systems, the concept of dissipative structure in far-from-equilibrium systems, his formulation of nonequilibrium statistical mechanics, and his research on nonintegrability in Hamiltonian systems. |
Prigogine’s Concepts of Relations between Science and Society by Joseph E. Earley, Sr., Professor and Head (ret.), Department of Chemistry, Georgetown University |
Sunday 9:35AM |
Ilya Prigogine (1917-2003) had an unusual notion of the relationship of science to the rest of human culture. This non-standard view was made clear in La Nouvelle Alliance (Gallimard, 1979), co-authored by (then) graduate student Isabelle Stengers. The English version of this work, Order Out of Chaos (Bantam, 1984), is probably the best known of Prigogine’s many books for general readers. A number of physicists and chemists have found aspects of Prigogine’s work troublesome – even, for some, highly objectionable. This presentation summarizes some points of Prigogine’s general outlook and examines objections raised against them. |
Science, Hope and History by Robert Artigiani, Professor and Head, History Department, U.S. Naval Academy |
Sunday 10:00AM |
Prigogine’s pursuit of a “New Alliance” indicates just how revolutionary his scientific vision was. Convinced that the classical paradigm–even as modified by relativity and quantum physics–was unsatisfactory, he argued for a historical turn that would enable science to track qualitative change. To do so he introduced the idea of “dissipative structures” that emerged and evolved at symmetry-breaking discontinuities. Describing a nature of processes rather than things, Prigogine hoped his science would be as applicable to human history as to nature. This presentation will explore one way to interpret human history as a succession of self-organizing systems, whose products–conscious, free, and moral individuals–give the process meaning. It will also argue that the purpose of Prigogine’s revolution was to establish a “New Rationality” with the potential to reground ethics in science. |
INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS (IEEE), D.C. AND NORTHERN VIRGINIA SECTIONS
Nano-Bio Technology – An Overview by Dr. Anantha Krishnan, Program Manager, Defense Advanced Research Projects Agency (DARPA) |
Sunday 10:30AM |
This talk will review recent developments and future directions of nano-bio technology, a rapidly developing area of convergence of biotechnology, micro/nano technology and information technology. |
MAGLEV by Philip Holmer, Titan Systems, a frequent invited speaker at IEEE regional, technical, and university chapters. |
Sunday 11:15AM |
Maglev refers to super high-speed transport systems with a non-adhesive drive system that is independent of wheel-and-rail frictional forces. The name derives from superconducting MAGnetically LEVitated vehicles. |
INSTITUTE OF INDUSTRIAL ENGINEERS, NATIONAL CAPITAL CHAPTER
Business Improvement Methodology Integration by Sharon M. Valencia |
Saturday 2:00PM |
Today’s competitive business environment requires constant improvement in business processes in order to maintain competitive position. This presentation provides an overview of business process management and compares improvement methodologies such as Lean, Six Sigma, Design For Six Sigma (DFSS), Business Process Reengineering (BPR), Total Quality Management (TQM), and Kaizen. Techniques for enabling project success through methodology integration will also be covered. |
Rapid Development for Migration of Legacy Applications to a Web Environment by Dr. David L. Danner, P.E., IDEAMATICS, Incorporated |
Saturday 3:00PM |
Prior to Operation Iraqi Freedom, the United States Navy (USN) had demonstrated the viability of a mobilization processing and manpower requirement tracking system for the activation of reservists, based on a Microsoft Windows client-server system developed for the United States Marine Corps (USMC). The need to immediately deploy a system for use throughout the USN dictated a rapid development effort to migrate the legacy USMC application to a Web-based USN application. The Navy-Marine Corps Mobilization Processing System (NMCMPS) was programmed in four weeks’ time and was deployed nationwide in less than ten weeks from project start. The development team migrated 80,000 lines of code and added USN-specific functionality while reducing the amount of code to 10,000 lines. This presentation provides an explicit methodology using existing productivity-improvement tools for a rapid development approach to the migration of a legacy application to a web environment. Unlike approaches that simply try to replicate the application in a Web shell, this approach converts the application, providing for efficiencies in processing, database storage, and communications. The key to the methodology is the use of the .NET programming tools in concert with a systematic work plan and a formalized decision protocol. |
Various Types of Possible and Feasible Means of Ergonomics Training by Jeffrey Fernandez, PhD, PE, CPE, JFAssociates, Inc. |
Saturday 4:00PM |
When employees are well trained in ergonomics, businesses can realize economic benefits through reduced lost time and fewer workers’ compensation claims. Research demonstrates numerous positive benefits from ergonomics solutions, including increased productivity and work quality and decreased absenteeism. This presentation surveys the various types of possible and feasible means of ergonomics training, the focus of such training, and its contents. |
INTERNATIONAL ASSOCIATION FOR DENTAL RESEARCH, WASHINGTON SECTION
Structure-Property Relationships of Thermoset Methacrylate Composites as a Function of Resin Matrices, Nanofillers, and Nanofiller Surface Chemistries by Kristen Wilson, Elizabeth Wilder, and Joseph Antonucci, NIST Polymers Division (Support was provided from NIDCR/NIST Interagency Agreement Y1-DE-1021-03) |
Saturday 9:15AM |
The goal of this research is to better understand the interactions and relationships between nanoparticle fillers, their surface treatment chemistries, various dental resin matrices, and the resultant properties of these thermoset methacrylate composites. These methacrylate composites are primarily designed for use as restorative and sealant dental materials. For optimal clinical performance, properties such as high strength, facile polymerization, high degrees of vinyl conversion, low polymerization shrinkage, and processable viscosities are desirable. One of the composites of interest was a visible light-curable system containing a 50:50 by mass mixture of 2,2-bis[p-(2′-hydroxy-3′-methacryloxypropoxy)phenyl] propane (Bis-GMA) and tri(ethylene glycol) dimethacrylate, filled with 40 nm clustered silica particles that were silanized with various blends of two silane agents, 3-methacryloxypropyltrimethoxysilane (MPTMS – a typical coupling agent for glass-filled acrylic composites) and n-octyltrimethoxysilane (OTMS). Methacrylate conversion after two minutes of light irradiation was measured by near-IR spectrometry. Mechanical properties of these nanocomposites 24 hours after photopolymerization were measured by three-point bend and biaxial flexural strength tests. In general, it was found that increased concentrations of OTMS and decreased concentrations of MPTMS in the surface treatments of the nanosilica particles lowered the moduli and flexural strengths of the cured composite materials. However, the composites containing silica silanized with a 50:50 mixture of MPTMS and OTMS showed slightly higher moduli compared to the other composites. A second composite system of interest was a bioactive composite capable of sustained release of calcium and phosphate ions into aqueous environments. It consisted of an ethoxylated bisphenol-A dimethacrylate matrix filled with surface-treated amorphous calcium phosphate (ACP) fillers (40% by mass), which are clustered nano-sized fillers analogous to the nanosilica fillers. It was found that adsorption of poly(ethylene glycol) (PEG), PEG-dimethacrylate, or MPTMS onto ACP particles resulted in small increases in the moduli and flexural strengths compared to composites filled with untreated ACP. Ongoing experiments are investigating conversion and polymerization shrinkage as a function of filler concentration and surface treatment. AFM is also being used to probe the microstructure of these composites, and fracture surfaces are being assessed to determine the effect of filler types and surface treatments on fracture behavior. Finally, non-clustered, surface-treated, colloidal silica particles will be investigated as fillers for the purpose of assessing their effects on thermoset methacrylate composites. |
Synthesis and Characterization of PEG and PEG Urethane Dimethacrylate Hydrogels by S. Bencherif, J.A. Cooper, N.R. Washburn, J.M. Antonucci, S. Lin-Gibson, NIST Polymers Division, and Ferenc Horkay, Section on Tissue Biophysics and Biomimetics, Laboratory of Integrative and Medical Biophysics, NIH (Support was provided from NIDCR/NIST Interagency Agreement Y1-DE-1021-03) |
Saturday 9:30AM |
A key area in the repair and regeneration of tissues is optimizing the polymer scaffold-tissue response. This study is designed to better understand the relationships of polymer matrix structure and properties to cell response. The current study includes the preparation/characterization of a series of polyethylene glycol (PEG) dimethacrylates and PEG urethane dimethacrylates, their conversion in aqueous solution to hydrogels by photopolymerization, and a preliminary assessment of the correlation of mechanical and cell response to hydrogel structural variations. MALDI-TOF MS confirms the formation of oligomers of high purity and narrow mass distribution (PD < 1.02). Aqueous dimethacrylate solutions were photopolymerized to hydrogels. The gel structures are probed by small angle neutron scattering (SANS) and are correlated to the mechanical properties determined by rheology and uniaxial compression tests. Bovine chondrocytes, seeded in hydrogels, were used to assess the cell responses to the hydrogels. Preliminary studies showed varied mechanical responses, but cells remained completely viable in both types of hydrogels after two weeks. |
Improved Bioactive Polymeric Composites for Mineralized Tissue Regeneration by Walter G. McDonough, NIST Polymers Division, Drago Skrtic and Janet B. Quin, American Dental Association Foundation – Paffenbarger Research Center, NIST, and Da-Wei Liu and Joseph M. Antonucci, NIST Polymers Division |
Saturday 9:45AM |
Crystalline hydroxyapatite (HAP) is the structural prototype of the major mineral component of teeth and bones. In contrast to HAP, amorphous calcium phosphate (ACP), a postulated precursor to biological HAP, shows high solubility/degradability in aqueous media, readily liberating calcium and phosphate ions, and transforming readily to crystalline apatitic calcium phosphate. These properties suggested its use as a bioactive filler in polymeric dental composites derived from the photopolymerization of dental monomers such as 2,2-bis[p-(2′-hydroxy-3′-methacryloxypropoxy) phenyl] propane (BisGMA), triethylene glycol dimethacrylate (TEGDMA) and 2-hydroxyethyl methacrylate (HEMA). The objectives of this study were to evaluate the effect of: 1) using acidic comonomers in the resin matrices of ACP composites, especially with regard to the degree of vinyl conversion (DC) and mechanical strength after polymerization, and 2) using small amounts of short polyethylene fibers (2 µm to 3 µm) on the mechanical behavior of ACP composites, especially with regard to their ability to arrest crack propagation. The DC was not significantly affected by the addition of the acidic comonomers. The acidic monomers with relatively hydrophobic chemical structures improved the strength and durability of ACP composites. ACP composites containing polyethylene fibers maintained their strength while exhibiting improved fracture toughness by their ability to resist catastrophic failure. (Support by NIDCR/NIST Interagency Agreement Y1-DE-1021-04 and NIDCR grant 13169-05). |
TBA | Saturday 10:00AM |
TBA | Saturday 10:15AM |
Discussion Panel | Saturday 10:30AM |
It’s One Ocean After All by Barry Stamey, Chairman, Washington D.C. Section, Marine Technology Society |
Sunday 10:30AM |
We have “One Ocean” shared by the world’s community, and no matter what you do as an individual or organization in your involvement with our ocean, your actions influence all other actions that affect our global ocean. The future of our ocean is at a critical juncture, and the action, or inaction, of our global community over the next few years will determine our ocean’s vitality and value for at least the next several generations of mankind. There are many wonderful programs working through key aspects of this global challenge, but it is time for all of us across the entire ocean spectrum – government, industry, academia, researchers, educators, the public, and others who truly want to make a difference – to engage the future now and come together as colleagues of our global ocean community. We must stress the necessity of maximizing our understanding of our ocean, balancing our use and stewardship, and looking forward with new and exciting science and technology. The role of “ocean” professional societies, such as the Marine Technology Society, in both advancing our knowledge of our ocean and helping work through the science that supports myriad policy issues has now reached an unprecedented level of critical need. And even the definition of an “ocean” professional society is no longer as expected, because the future of our ocean touches on every aspect of science and technology and affects every member of our global society. We invite you to participate in this interactive session, which will examine these challenges, solicit your inputs and recommendations, and outline the planning that will culminate in an international landmark conference in Washington, D.C. in September 2005 – OCEANS 2005 MTS/IEEE – to share the knowledge that will benefit the world’s community and the future of our “ONE OCEAN.” |
NATIONAL CAPITAL SECTION/OPTICAL SOCIETY OF AMERICA
The Instrument Synthesis and Analysis Laboratory at NASA Goddard Space Flight Center by Dr. H. John Wood, NASA Goddard Space Flight Center |
Saturday 9:15AM |
This paper will address the development of a new instrument design laboratory at the NASA Goddard Space Flight Center. Engineering studies for pre-proposal space-science and earth-science instrument designs had typically taken 6 months or more in the past. A rapid design capability was clearly needed in the new competitive arena in which Goddard finds itself today. The Instrument Synthesis & Analysis Laboratory (ISAL) has unprecedented resources and can provide a rapid and sustainable instrument-development environment. Typical studies are complete in two weeks. The ISAL supports instruments at different maturity levels, such as direct Announcement of Opportunity (AO) responses, trade studies in advance of an AO, and Instrument Incubator Program projects. The ISAL has been an operational facility since the spring of 1999 and has completed more than 50 studies (X-ray telescopes to microwave radiometers) since its inception. A cadre of highly skilled discipline engineers is put together with the customer science team to develop the customer’s instrument concept. Detailed designs are derived, with significant analysis of the design performed during the study. In spring 2001, ISAL management and operations were unified with the Integration Mission Design Center (IMDC) to form the Integrated Design Capability (IDC). |
Modeling Studies for the MODIS Solar Diffuser Attenuation Screen by Eugene Waluschka, Xiaoxiong Xiong, B. Guenther, William Barnes and Vincent V. Salomonson, NASA Goddard Space Flight Center |
Saturday 9:35AM |
On-orbit calibration of the reflected solar bands on the EOS Moderate Resolution Imaging Spectroradiometer (MODIS) is accomplished by having the instrument view a high-reflectance diffuse surface, the solar diffuser (SD), illuminated by the sun. For some of the spectral bands this signal proves much too bright and saturates detectors designed for measuring low-reflectance (ocean) surface signals. A mechanical attenuation device in the form of a pinhole screen is used to reduce the signal so that these bands can be calibrated. The sensor response to solar illumination of the SD, with and without the attenuation screen in place, will be presented. The MODIS detector response to the solar diffuser is smooth when the attenuation screen is absent, but shows structure of up to a few percent when the screen is present. This structure corresponds to non-uniform illumination of the solar diffuser: each pinhole produces a pinhole image of the sun on the diffuser, and although each MODIS detector sees very many such overlapping images, the diffuser is no longer perfectly uniformly illuminated. This non-uniform illumination produces intensity variations on the focal planes. The results of a very detailed simulation will be discussed, showing how the illumination of the focal plane changes as a result of the attenuation and what the impacts on the calibration are. |
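As a toy illustration of the effect described above (the grid size, pitch, and image radius below are invented for illustration, not MODIS values), one can superpose the displaced sun images cast by a square array of pin holes and measure the residual ripple in the summed illumination:

```python
# Toy model: each pin hole casts a displaced image of the solar disc on
# the diffuser; summing the images leaves percent-level structure even
# though every point is covered by many overlapping images.
import numpy as np

n_grid = 400
x = np.linspace(-5.0, 5.0, n_grid)            # mm across the diffuser
X, Y = np.meshgrid(x, x)

pitch = 1.0          # mm between pin holes (assumed)
sun_radius = 0.6     # mm radius of each projected sun image (assumed)
illum = np.zeros_like(X)

holes = np.arange(-6.0, 6.0 + pitch, pitch)
for hx in holes:
    for hy in holes:
        illum += ((X - hx)**2 + (Y - hy)**2 < sun_radius**2)

# Peak-to-peak ripple over the central region, away from edge effects.
core = illum[n_grid // 4 : 3 * n_grid // 4, n_grid // 4 : 3 * n_grid // 4]
print(f"ripple: {100 * (core.max() - core.min()) / core.mean():.1f}%")
```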
CHARMS: The Cryogenic, High-Accuracy Refraction Measuring System by B.J. Frey, D.B. Leviton, NASA Goddard Space Flight Center |
Saturday 9:55AM |
The success of numerous upcoming NASA infrared (IR) missions will rely critically on accurate knowledge of the IR refractive indices of their constituent optical components at design operating temperatures. To satisfy the demand for such data, we have built a Cryogenic, High-Accuracy Refraction Measuring System (CHARMS) which, for typical IR materials, can measure the index of refraction accurate to ±5 × 10^-5. This versatile, one-of-a-kind facility can also measure refractive index over a wide range of wavelengths, from 0.105 µm in the far-ultraviolet to 20 µm in the mid-IR, and over a wide range of temperatures, from 10 K to 100°C, all with comparable accuracies. We first summarize the technical challenges we faced and engineering solutions we developed during the construction of CHARMS. Next we present our “first light,” cryogenic, IR index of refraction data for LiF, BaF2, and CaF2. Finally, we compare our data to previously published results for these materials. |
Electromagnetic Scalar Potentials: Overview and Historical Perspective by Martin J. Lahart, U.S. Army Research Laboratory |
Saturday 10:30AM |
It has been known for about a hundred years that electric and magnetic fields that satisfy Maxwell’s equations can be derived from two scalar functions. Scalar potentials have been investigated many times in the hope that analyses made in terms of the six interrelated components of the electric and magnetic fields could be simplified by describing fields in terms of two quantities that are relatively simple to compute. This paper reviews investigations of scalar potentials that have been made over the years. It discusses possible definitions of scalar potentials and describes their properties, including the requirement that scalar potentials have a preferred direction, along which their functional form differs from that in the other two directions. It describes applications to boundary value problems and to the computation of electric and magnetic fields generally. Restrictions on the coordinate systems in which calculations involving scalar potentials can be carried out are discussed. These restrictions, combined with the requirement of a preferred direction, limit the use of scalar potentials to specific geometries. The relationship of scalar potentials to charge and current sources of electromagnetic fields is described. It is shown that scalar potentials cannot always be used in regions where charges or currents are present. The limitations that these restrictions impose are discussed. Some computational examples of the use of electromagnetic scalar potentials are given. It is demonstrated that, when it is possible to describe a problem in terms of scalar potentials, their use can lead to a considerable simplification of the calculations. |
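For orientation, one standard two-scalar construction of this kind (a Hertz/Debye-type representation for a source-free, homogeneous region; sign conventions vary among the works reviewed) takes ẑ as the preferred direction:

$$
\mathbf{E} = \nabla\times\nabla\times(\psi_1\,\hat{\mathbf{z}}) - \mu\,\frac{\partial}{\partial t}\,\nabla\times(\psi_2\,\hat{\mathbf{z}}),
\qquad
\mathbf{H} = \nabla\times\nabla\times(\psi_2\,\hat{\mathbf{z}}) + \varepsilon\,\frac{\partial}{\partial t}\,\nabla\times(\psi_1\,\hat{\mathbf{z}}),
$$

where each scalar satisfies the wave equation $\nabla^2\psi_i - \mu\varepsilon\,\partial^2\psi_i/\partial t^2 = 0$. The axis ẑ appearing in both expressions is the preferred direction referred to above.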
Ordered Transactions Strategy for Optically Connected Multiprocessor Systems by Neal K. Bambha, US Army Research Laboratory and Shuvra S. Bhattacharyya, University of Maryland, College Park |
Saturday 10:50AM |
This paper presents techniques for efficiently mapping digital signal and image processing (DSP) applications onto processing architectures that are specifically streamlined to accomplish these tasks. These application-specific embedded systems provide significant advantages in terms of weight, power, cost, and computational power compared to general-purpose computers. As VLSI feature sizes shrink, interconnects between processing elements are becoming a limiting factor for high-performance systems. One way to solve this problem is to utilize optical interconnects to replace the longest metallic interconnects. Such hybrid optical/electronic designs are particularly promising for systems that have large computational requirements and that must satisfy these requirements in real time. DSP applications often possess a high degree of parallelism, and thus can potentially benefit from parallel processors. However, we show that interprocessor synchronization and communication (IPC) costs can quickly negate these advantages for architectures using electronic interconnects. Because DSP applications are characterized by limited control flow, we have the opportunity to perform extensive compile-time analysis in order to minimize IPC cost. We introduce a class of fiber-based architectures utilizing wavelength division multiplexing, and an efficient graph-theoretic ordered transaction technique for optimizing communication patterns in such architectures. We show that significant performance advantages for DSP applications can be achieved through the combination of the two. We compare simulation results for such systems. |
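A schematic sketch of the ordered-transaction idea (the graph and processor assignment below are invented for illustration; this is not the authors' tool): a topological sort of the dataflow graph fixes, at compile time, a single global order for all interprocessor communications, so that run-time synchronization handshakes can be dispensed with.

```python
# Derive a compile-time global ordering of interprocessor communication
# (IPC) operations from a topological sort of a dataflow graph.
from graphlib import TopologicalSorter

# Dataflow graph: task -> set of tasks whose outputs it consumes.
deps = {"A": set(), "B": {"A"}, "C": {"A"}, "D": {"B", "C"}}
proc = {"A": 0, "B": 1, "C": 0, "D": 1}   # task -> processor assignment

order = list(TopologicalSorter(deps).static_order())

# Every edge that crosses processors becomes an IPC transaction; its
# position in the topological order fixes the global transaction order.
transactions = [(src, dst) for dst in order for src in deps[dst]
                if proc[src] != proc[dst]]
print(transactions)   # [('A', 'B'), ('C', 'D')]
```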
OE Interconnects and OE Processing Based on VCSEL-CMOS 2-D Arrays by G. J. Simonis, W. Chang, J. J. Liu, P. Shen, P. Newman, N. Das, G. Dang, and M. Taysing-Lara, Army Research Laboratory |
Saturday 11:10AM |
This paper reports our results regarding the development of vertical-cavity surface-emitting-laser (VCSEL) 2-D arrays at 850- and 980-nm wavelengths with oxidized-aperture diameters of between 5 and 15 µm. It further reports their incorporation into dense optoelectronic (OE) flip-chip interconnect arrays hybridized onto CMOS drivers, along with III-V-detector-CMOS OE receivers. These interconnects are being integrated into experimental OE processing architectures such as a digital-half-tone image compressor. The VCSEL heterostructures are designed at ARL and grown in our MBE growth facilities. Individual VCSELs incorporate either GaAs quantum wells (850 nm and top-emitting) or InGaAs quantum wells (980 nm and either top- or bottom-emitting). The normal configuration of the VCSEL 2-D array is an 8×8 geometry with a 125-µm spacing between VCSELs, for an interconnect density of 64 interconnects/mm2. The VCSEL arrays are flip-chip mounted onto silicon-based CMOS (980-nm bottom-emitting VCSELs through the GaAs substrate) or silicon-on-sapphire (SOS) CMOS (850-nm top-emitting VCSELs through the SOS sapphire substrate). Individual VCSELs have been found to support data rates as high as 6 Gb/s per channel, for an array data flux density as high as 384 Gb/s/mm2 with the employment of appropriate CMOS or III-V drivers. The near-term incorporation of carbon p-type dopant and semi-insulating substrates will substantially increase the operational bandwidth of the VCSELs, reduce optical cavity losses, and reduce VCSEL current thresholds. Interconnect coupling is presently achieved with free-space lens optics but could also be accomplished with other media such as fiber image guide. The photo-receivers are based upon GaAs flip-chip PIN detector arrays for 850 nm and InGaAs/InP flip-chip PIN detector arrays for 980 nm. Optical cross-talk between channels is less than -20 dB. These dense optical interconnect arrays will be of interest for the movement of large bandwidths of data on various functional multi-sensor platforms and within OE processing architectures. |
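The quoted flux density is consistent with the array geometry; a quick check using only numbers from the abstract:

```python
# 8 x 8 VCSELs on a 125 µm pitch occupy 1 mm^2 -> 64 channels/mm^2;
# at 6 Gb/s per channel this reproduces the quoted array flux density.
channels_per_mm2 = 64
rate_per_channel_gbps = 6
print(channels_per_mm2 * rate_per_channel_gbps, "Gb/s/mm^2")   # 384
```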
Light Activated Medical Diagnostics and Therapies by Dr. Ronald W. Waynant, FDA/CDRH, Electro-Optics Branch, Dr. Ilko K. Ilev, FDA/CDRH, Electro-Optics Branch and Dr. Juanita J. Anders, Uniformed Services University of the Health Sciences |
Saturday 11:30AM |
Light has taken an important role in both the information age and the medical field. Not only does it have a dominant role in modern communications and information transmission, as well as in information storage such as CDs and DVDs, it is now moving to play an important role in modern medical diagnostics and therapeutics. In diagnostics, mid-infrared wavelength laser sources and quantum well detectors are beginning to play a role in identifying endogenous trace-gas biomarkers that signal abnormal conditions in the body. By non-invasively analyzing human breath, numerous indicators of disease and health status can be identified. Light may also play a role in the therapy needed to heal the body. Although more research needs to be done to determine precise dosage, nearly one hundred ailments that respond to light therapy have already been identified. Light treatment may have few side effects. In addition, in some cases it can help heal wounds or relieve pain that does not resolve with traditional medical practices. |
The Rayleigh-Sommerfeld Diffraction Integral is Superior to Fresnel-Kirchhoff by R. Lucke, Naval Research Laboratory |
Saturday 2:00PM |
The venerated Fresnel-Kirchhoff (FK) diffraction integral gives a different description of Poisson’s spot than does the less-familiar Rayleigh-Sommerfeld (RS) formulation. FK is obviously wrong, while RS is obviously right(?). Basically, FK is restricted to small diffraction angles (which is usually not an important limitation). The flaw in the derivation of FK is discussed (the FK solution is well known not to satisfy the boundary conditions under which it is derived), as is the fact that a derivation using Fourier propagation leads to RS, not to FK. Both FK and RS use the scalar wave approximation, which means that an acoustics experiment could prove which is right, an experiment that could be done by an enthusiastic undergraduate. |
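For reference, the two formulations differ only in their obliquity factors; for a monochromatic plane wave normally incident on an aperture Σ, a standard textbook comparison (not taken from the talk itself) reads:

$$
U(P) = \frac{1}{i\lambda}\iint_\Sigma U(Q)\,\frac{e^{ikr}}{r}\,K(\theta)\,dS,
\qquad
K_{\mathrm{RS}}(\theta) = \cos\theta,
\quad
K_{\mathrm{FK}}(\theta) = \tfrac{1}{2}\,(1+\cos\theta),
$$

where θ is the angle between the aperture normal and the line from Q to the observation point P. Both factors tend to 1 at small diffraction angles, which is why the small-angle restriction on FK usually does not matter; the two disagree only at large angles, precisely the regime probed on axis behind an opaque disc at Poisson’s spot.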
Hyperspectral and Multi-spectral Remote Sensing of Atmospheric Water Vapor and Cirrus Clouds by Bo-Cai Gao, Naval Research Laboratory, Washington, D.C. |
Saturday 2:20PM |
Through analysis of hyperspectral imaging data collected with the Airborne Visible Infrared Imaging Spectrometer (AVIRIS) during the late 1980s and early 1990s, it was found that narrow channels located within and around the 0.94-micron water vapor bands are useful for remote sensing of the columnar amount of atmospheric water vapor. It was also found that narrow channels located within the 1.38-micron and 1.88-micron strong water vapor absorption bands are useful for remote sensing of cirrus clouds. Based on these observations, several narrow channels near 0.94 micron have been selected and implemented on the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument for remote sensing of water vapor from space, and a narrow channel centered at 1.375 micron has also been implemented on MODIS for global measurements of cirrus clouds. Sample water vapor images derived from AVIRIS and MODIS data will be presented. Examples of global cirrus cloud reflectance images obtained from the MODIS data will also be presented. It will be shown that if an additional narrow channel near 1.88 micron is implemented on a future multi-channel meteorological satellite sensor, our ability to remotely sense cirrus optical depths and effective particle sizes globally will be improved significantly. |
Satellite Measurements of Multi-Decadal Trends in Noctilucent Clouds by Eric P. Shettle, Naval Research Laboratory, Remote Sensing Division; Matthew T. DeLand, SSAI; Gary E. Thomas, LASP; Sharon P. Burton, SAIC, MS 475, NASA Langley Research Center; John J. Olivero, Embry-Riddle University, Dept. of Physical Sciences; and Larry W. Thomason, NASA Langley Research Center |
Saturday 2:40PM |
Noctilucent clouds (NLCs) have been observed from the ground since 1885 and from satellites since 1969. They occur at high latitudes during a three-month period starting about one month before the summer solstice. They form in the upper mesosphere, at altitudes of 80 to 85 km. NLCs are composed of small ice particles, which form during the polar summer when the upper mesosphere reaches temperatures of less than 130 K, so that even the few parts per million of water vapor available at those altitudes become highly supersaturated. There are now several sets of satellite measurements of NLCs, each of which covers a decade or longer with the same instrument or multiple copies of the same instrument on different satellites. We will focus on two of these datasets: from the Solar Backscatter UltraViolet [SBUV and SBUV/2] instruments on the NOAA polar-orbiting meteorological satellites, and from the Stratospheric Aerosol and Gas Experiment [SAGE II]. While both of these instruments were developed primarily to measure stratospheric ozone, and in the case of SAGE II also to measure other trace gases and aerosols in the stratosphere, they have proven sensitive enough to measure NLCs. These measurements show that the number of NLCs observed each season exhibits a strong anti-correlation with the solar Lyman-alpha flux. While there is no clear indication of a long-term trend in the frequency of occurrence of all NLCs, there is evidence that the frequency of the brightest NLCs is increasing. There is also clear evidence that the average brightness or albedo of the NLCs observed by the SBUV instruments has increased over the past quarter century. Recent modeling results by Thomas et al. have shown that this increasing brightness is consistent with the increase in mesospheric water vapor over the same period. We will discuss the implications of these findings in terms of our understanding of global change. |
Linear and Nonlinear Magneto-Optical Rotation in Ultra-cold Sodium by J. Nash, Naval Air Systems Command; C. Adler, St. Mary’s College, Dept. of Physics; and F. A. Narducci, Naval Air Systems Command and St. Mary’s College, Dept. of Physics |
Saturday 3:00PM |
Recently, there has been an increased interest in polarization rotation in atomic media immersed in a magnetic field (see, e.g., [1]). Due to atomic coherence effects, resonances as narrow as 2π × 1 Hz [2] and rotation angles as large as 10 radians [3] have been reported in hot rubidium cells. These experiments rely on special wall coatings or buffer-gas cells to preserve the coherence, either by preserving the polarization as the atoms collide with the cell wall or by keeping the atoms in the coherence-inducing laser fields through non-depolarizing collisions with buffer-gas atoms. In this paper, we report on our measurements of polarization rotation in a novel medium that does not rely on either of these techniques. We report measurements of both linear and nonlinear polarization rotation in ultra-cold sodium and compare them to our theory, which includes the effects of atomic recoil. |
Exploring Sun-Earth Connections: A Physical Science Program for (K-8) Teachers by D.J. Michels, The Catholic University of America and Code 7660M, Naval Research Laboratory; S. M. Pickert, C. J. Montrose, and J. L. Thompson, The Catholic University of America. |
Saturday 3:20PM |
An experimental, inquiry-based and standards-referenced physical science curriculum for undergraduate, pre-service K-8 teachers is under development at the Catholic University of America in collaboration with the Solar Physics Branch of the Naval Research Laboratory and NASA’s Sun-Earth Connection missions. A feature of this program will be its use of solar data in the form of images and movies from ongoing space missions, to illustrate basic concepts of physics and to inspire student interest and curiosity. Courses will be team-taught by faculty from the University’s Departments of Education and Physics with active participation by researchers from the local solar physics community. Teaching goals will include pedagogical methods appropriate to the physics content. This is a progress report. |
Coffee break and Don Michels SOHO demonstration |
Saturday 3:30PM |
Twenty-five Years of Interferometric Fiber Optic Acoustic Sensors at Naval Research Laboratory by James H. Cole, Clay Kirkendall, Anthony Dandridge, Gary Cogdell and T. G. Giallorenzi |
Saturday 4:00PM |
Interferometric fiber optic acoustic sensors are based on measuring the phase change of light traveling in an optical fiber due to the strains developed in the fiber by a pressure field. Fiber interferometry is extremely sensitive, allowing detection of periodic length variations on the order of a few hundred femtometers (~10^-13 m). This paper will cover the development of these sensors from 1977 to the present. A brief introduction will describe the operation of interferometric fiber optic sensors, including component development and interferometric demodulation techniques. A discussion of the transduction mechanism from the pressure field to phase modulation in the fiber will follow. The mechanical design of various sensor configurations, including coated fibers, fiber coils, and solid and air-backed mandrels, will be reviewed. Recent results measured on acoustic sensors employing fibers coated with air-included polymers will also be presented. The significance of fiber optic sensor multiplexing for use in multi-element arrays will be discussed. Finally, a photograph of prototype sensors similar to those deployed on the new Virginia Class submarine will be presented. |
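A back-of-the-envelope check of that sensitivity scale (the wavelength and index below are assumed typical values, not figures from the talk):

```python
# Phase shift produced in a fiber interferometer by a ~300 fm length
# change: delta_phi = 2 * pi * n * delta_L / wavelength.
import math

wavelength = 1.55e-6   # m, typical telecom wavelength (assumed)
n_fiber = 1.46         # effective index of silica fiber (assumed)
delta_L = 3e-13        # m, "a few hundred femtometers"

delta_phi = 2 * math.pi * n_fiber * delta_L / wavelength
print(f"phase shift: {delta_phi:.2e} rad")   # about 1.8e-6 rad
```

Resolving microradian-level phase shifts is what the interferometric demodulation techniques described in the talk are designed to do.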
PHILOSOPHICAL SOCIETY OF WASHINGTON
The Economics of the Moon by Dr. Klaus P. Heiss, High Frontier, Inc. |
Saturday 11:00AM |
Applying von Thünen’s economic laws on the location of various economic activities (1826 and 1850), the author concludes that the principal economic uses of the Moon for benefiting Earth will be limited to observations, communications, cis- and trans-lunar space transportation and, potentially, energy applications. With the ability to deploy vast distributed-aperture observatories, spanning the electromagnetic spectrum, on a stable platform – the Moon – looking toward Earth, a revolutionary new era in Earth observations will be opened: long-dwell, high-resolution observations, passive and active, of the earth’s land and ocean areas, from infrared, near-infrared, optical, and microwave up to x-ray and gamma-ray observatories. Specific applications and the benefits deriving therefrom are discussed – based not on simulations but on quantitative statistical measurements and estimates – including crop measurement, monitoring, and forecasting, with ‘feedbacks’ to crop distribution and production decisions. The potentially vast economic uses of the Moon for Earth should not be prejudged by narrow preconceptions held by some in the scientific community because of a failure, or unwillingness, to explore these potentials by ‘looking back to earth’ from the moon, rather than only out to the stars. |
POTOMAC CHAPTER OF THE HUMAN FACTORS AND ERGONOMICS SOCIETY
Panel Discussion: Human Factors and Ergonomics: Improving the Technology of our Times. Panelists from the Potomac Chapter of the Human Factors and Ergonomics Society: Douglas Griffith, Richard L. Horst, Gerald P. Krueger, Jack I. Laveson, and John W. Ruffner |
Saturday 11:00AM |
Panel members will provide an introduction to and an overview of the field of human factors and ergonomics. The relevance of human factors and ergonomics to national security, public policy, safety, and consumerism will be discussed. Computer usability issues will be examined and solutions offered. The panel will field questions from the audience. There will be lively, interactive discussions among the panelists and between the panel and audience. |
SCIENCE AND ENGINEERING APPRENTICE PROGRAM, GEORGE WASHINGTON UNIVERSITY
The Effects of United States Ground Water Levels on J2 by David Price, Thomas Jefferson High School for Science and Technology. Mentor: Dr. Thomas Johnson. Lab: United States Naval Observatory |
Sunday 10:30AM |
This project examines whether net groundwater movements in the continental United States could account for part of the mass redistribution required to explain observed changes in the J2 rate. J2, also called “dynamic oblateness” or “flatness”, is a measure of the deviation of Earth’s shape from spherical and has important effects on satellite orbits and length of day. In 1998, the J2 of the Earth inexplicably began to increase after having decreased since measurements began around 1980. This research is based on well depth data from the US Geological Survey (USGS). The quality of the data varies, so only high-quality readings are selected for analysis. Next, meaningful geographic regions are defined such that changes in groundwater levels within each region have similar effects on J2. The USGS data are then used to estimate groundwater mass in each region. The effect of observed groundwater mass on J2 is then calculated. Continental US groundwater mass varied by approximately 3 × 10^15 kilograms between 1993 and 2003; the change in J2 rate due to this change in mass would be approximately 4 × 10^-12 per year. The observed global change in J2 rate over the same period was 3.2 × 10^-11 per year. Therefore, the continental US, with about 5% of the Earth’s land mass, contributed approximately 12% of the change in J2. Extrapolating these results to the global distribution of groundwater suggests that groundwater movements could explain much of the observed changes in J2. |
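A quick check of the arithmetic in the abstract (the numbers are taken directly from the text above):

```python
# Ratio of the US groundwater contribution to the observed global
# change in the J2 rate.
us_rate = 4e-12        # per year, from US groundwater mass changes
global_rate = 3.2e-11  # per year, observed global change in J2 rate

print(f"US contribution: {100 * us_rate / global_rate:.1f}%")   # 12.5%
```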
On Calculating the Moon’s Times of Perigee and Apogee and on the Minimum Distance Between Two Keplerian Orbits by Andrei Munteanu, Benjamin Banneker High School. Mentor: Dr. Marc Murison. Lab: United States Naval Observatory |
Sunday 10:45AM |
A web application was developed which reads JPL’s DE405 ephemeris file and determines times of perigee and apogee of the moon. This interactive web page will be included in the suite of astronomical data applications available at the USNO Astronomical Applications website. Moreover, research was continued on the problem of finding the minimum orbital intersection distance (MOID) between two Keplerian orbits. Specifically, a 2-D Newton-Raphson solver was implemented numerically to find the roots of a set of two bivariate trigonometric polynomials that are at the heart of the problem. |
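A minimal sketch of the idea behind such a 2-D Newton-Raphson MOID search (an illustrative Python sketch, not the student's implementation; finite-difference derivatives stand in for the bivariate trigonometric polynomials mentioned above): parameterize each ellipse by its eccentric anomaly and solve ∇D = 0 for the squared inter-orbit distance D(E1, E2).

```python
import numpy as np

def ellipse_point(E, a, e, R):
    """Point on an orbit with semi-major axis a, eccentricity e, and
    orientation matrix R (perifocal -> reference frame), at eccentric
    anomaly E."""
    p = np.array([a * (np.cos(E) - e),
                  a * np.sqrt(1.0 - e**2) * np.sin(E),
                  0.0])
    return R @ p

def D(u, orb1, orb2):
    """Squared distance between the two orbit points at anomalies u."""
    d = ellipse_point(u[0], *orb1) - ellipse_point(u[1], *orb2)
    return d @ d

def newton_moid(orb1, orb2, u0, h=1e-6, tol=1e-10, max_iter=50):
    """Newton-Raphson on grad D = 0; gradient and Hessian estimated by
    central finite differences. Converges to a nearby critical point."""
    u = np.asarray(u0, dtype=float)
    I = np.eye(2)
    for _ in range(max_iter):
        g = np.array([(D(u + h * I[i], orb1, orb2)
                       - D(u - h * I[i], orb1, orb2)) / (2 * h)
                      for i in range(2)])
        H = np.array([[(D(u + h * (I[i] + I[j]), orb1, orb2)
                        - D(u + h * (I[i] - I[j]), orb1, orb2)
                        - D(u - h * (I[i] - I[j]), orb1, orb2)
                        + D(u - h * (I[i] + I[j]), orb1, orb2)) / (4 * h**2)
                       for j in range(2)] for i in range(2)])
        step = np.linalg.solve(H, g)
        u -= step
        if np.linalg.norm(step) < tol:
            break
    return u, np.sqrt(D(u, orb1, orb2))

# Example: a circular orbit and an inclined ellipse (toy numbers).
inc = np.radians(30.0)
Rx = np.array([[1.0, 0.0, 0.0],
               [0.0, np.cos(inc), -np.sin(inc)],
               [0.0, np.sin(inc), np.cos(inc)]])
u_min, moid = newton_moid((1.0, 0.0, np.eye(3)), (1.2, 0.3, Rx), [0.1, 0.1])
print(u_min, moid)
```

Because D typically has several local minima, the iteration is run from multiple starting pairs (E1, E2) and the smallest resulting distance is kept.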
Neurotrophin Gene Expression Profiling in the Hippocampus and Amygdala of Acutely Stressed Mice by Renee Park, Montgomery Blair High School. Mentors: Wenling Eileen Chang, CPT Jose Pizarro, Dr. Lucille Lumley, Dr. James Meyerhoff. Lab: Walter Reed Army Institute of Research (WRAIR) |
Sunday 11:05AM |
The amygdala and the hippocampus are known to be involved in the regulation of anxiety and stress responses. Studies have shown that cell proliferation is inhibited by psychosocial stress, which may lead to the down-regulation of neurotrophin genes. Previously, our lab presented evidence that levels of brain-derived neurotrophic factor (BDNF) mRNA are significantly decreased in the mouse hippocampus after social defeat. In this study, total mRNA was extracted from both the hippocampus and the amygdala and used to examine changes in neurotrophin expression in these brain areas. We used the GEArray Q Series Mouse Neurotrophin and Receptors Gene SuperArray Kit to profile the expression of 96 neurotrophic genes that may be involved in social stress. ScanAlyze v2.50 and GEArrayAnalyzer v1.3 were utilized to analyze the SuperArray for each brain region. The Onto-Express program was used to create a functional profile of the 96 genes. The results support our earlier findings, and those of several other studies, linking a decrease in hippocampal BDNF expression with stress. BDNF was also found to be decreased in the amygdala. Other genes, such as corticotropin-releasing hormone (CRH), were determined to be down-regulated in both the hippocampus and amygdala. Conversely, persephin (Pspn) and neurotrophic tyrosine kinase receptor type 1 (Ntrk1) were found to be up-regulated in the amygdala, and neuropeptide Y receptor Y5 (Npy5r) and neurotrophic tyrosine kinase receptor type 2 (Ntrk2) were up-regulated in the hippocampus. It is interesting, however, to note that Pspn was down-regulated in the hippocampus despite being up-regulated in the amygdala. |
Automated Seizure Detection Using Principal Component Analysis and Discriminant Analysis by Sherri Geng, Montgomery Blair High School. Mentor: Dr. Lucille Lumley. Lab: Walter Reed Army Institute of Research (WRAIR) |
Sunday 11:25AM |
We present a new method of automated seizure detection in EEG data based on discriminant analysis for signal pattern classification, preceded by principal components analysis (PCA) for seizure feature extraction. We developed all necessary algorithms and implemented fully functional software in MATLAB. We use as our training dataset 100 sets of EEG records (0.5-second epochs with a digitization sampling rate of 250 Hz) consisting of both seizure and non-seizure signal patterns. Feature patterns from this set are extracted by statistical analysis within the PCA procedures. The dimensionality of the feature space is reduced by selecting the first 50 dominant eigenvectors. The extracted feature patterns are then classified with a Student discriminant analysis approach. After training the system with the dataset of 100 mixed EEG signals, we feed into the software a new and independently collected testing dataset of 100 EEG signals. Both the training and testing datasets are verified by experts. Our experiments show a false alarm rate of 2%, a false rejection rate of 0%, selectivity of 98%, sensitivity of 100%, and specificity of 98%. Automated systems promise a significant reduction in the amount of EEG data that must be reviewed or stored for further analysis, and may serve especially well in long-term EEG monitoring, simultaneously reducing susceptibility to human error, increasing efficiency in reading extensive records (weeks and months of continuous EEG waveforms), and potentially improving detection sensitivity. They may also complement the physical-observational approach of clinical EEG experts to achieve maximum specificity, selectivity, and sensitivity. The method we present may be paired with existing seizure detection systems, such as the Dataquest Acquisition and Analysis System from Data Sciences International, to achieve improved accuracy in seizure detection. Automated detection of seizure activity has become an increasingly prevalent theme in the domain of EEG analysis. We hope the study presented herein will contribute to the advancement of this promising but very challenging field. |
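The pipeline can be sketched in a few lines (a minimal illustration with placeholder random data, and scikit-learn's linear discriminant analysis standing in for the discriminant step; the epoch length, sampling rate, and 50-eigenvector reduction follow the abstract, everything else is assumed):

```python
# PCA feature extraction followed by discriminant-analysis classification
# of EEG epochs: 0.5 s at 250 Hz gives 125-sample feature vectors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Placeholder training data: 100 labeled epochs (0 = non-seizure,
# 1 = seizure). Real, expert-verified EEG records would be loaded here.
X_train = rng.standard_normal((100, 125))
y_train = rng.integers(0, 2, size=100)

pca = PCA(n_components=50).fit(X_train)   # keep 50 dominant eigenvectors
lda = LinearDiscriminantAnalysis().fit(pca.transform(X_train), y_train)

# An independently collected test set is scored the same way.
X_test = rng.standard_normal((100, 125))
print(lda.predict(pca.transform(X_test))[:10])
```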
WASHINGTON CHAPTER OF THE INSTITUTE FOR OPERATIONS RESEARCH AND MANAGEMENT SCIENCE
Competing Models: Mathematics of Context by Russell Vane, President, WINFORMS |
Saturday 9:00AM |
This talk provides a way to trade off the uncertainty of the modeling process against possible interpretations. Adding more and more factors to the model is not the answer. |
Early Operations Research in Washington by John Honig, Cofounder of WINFORMS |
Saturday 9:45AM |
This talk covers the early history of the operations research field in the seat of power of the US government. |
Ops. Research from Britain to New Battlefields by Adjunct Professor Gene Visco, George Mason University |
Saturday 10:30AM |
Part One: How Operational Research Crossed the Pond and Became Operations Research. A brief account of the origins of modern military analysis and its first few years in the New World. Part Two: Clausewitzian Friction is Alive and Well on the New Battlefields. A brief attempt to answer the question: Is network centric warfare likely to be the panacea? |
What Hath Reverend Bayes Wrought: Powerful Probabilistic Inference on Your Laptop by Dennis Buede, Principal, Innovative Decisions, Inc |
Saturday 11:15AM |
This seminar will reacquaint everyone with Bayes’ rule and its natural applications to medical diagnosis, systems testing, sensor fusion, spam filters, etc. Several worked examples will be given for medical diagnosis, systems testing, and sensor fusion. Finally, the topic of learning the joint probability distribution (i.e., the Bayesian network) from data will be presented and illustrated. |
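As a flavor of the kind of example the seminar promises, here is the classic Bayes'-rule diagnosis calculation (the prevalence, sensitivity, and specificity below are invented illustrative numbers):

```python
# Posterior probability of disease given a positive test, by Bayes' rule:
# P(D|+) = P(+|D) P(D) / [ P(+|D) P(D) + P(+|~D) P(~D) ].
prior = 0.01         # P(disease), assumed prevalence
sensitivity = 0.95   # P(positive | disease)
specificity = 0.90   # P(negative | no disease)

p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
posterior = sensitivity * prior / p_positive
print(f"P(disease | positive) = {posterior:.3f}")   # about 0.088
```

Even a fairly accurate test yields a posterior under 10% at 1% prevalence, which is exactly the sort of counterintuitive result that makes Bayes' rule worth revisiting.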
Surviving as an Analyst in an NQPT World by Kirk Yost, Senior Analyst, MITRE |
Saturday 2:00PM |
Major decisions are largely made by a class I refer to as “Non-Quantitative Policy Types” (NQPTs). Quantitative analysts tend not to understand NQPTs very well, which results in their analyses leading nowhere. This presentation offers advice on how to succeed (and fail) in an NQPT environment. |
Validation Challenges and Ethical Issues in Simulation Models of Organizations by Douglas A. Samuelson, Owner, InfoLogix, Inc |
Saturday 2:45PM |
Multi-agent simulation models enable us to study complex social and cognitive phenomena, including leadership and influence in organizations. In these studies, it is easy to generate experimental hypotheses that would be difficult to validate because any sufficiently powerful experiment would change the system of interest. In some cases, merely stating the research question could affect the object of study. This raises serious ethical concerns, as well. |
Bell, Sherlock, Mycroft and Homeland Security by Professor David Schum, George Mason University |
Saturday 3:30PM |
This talk explores the way that the reasoning techniques of Dr. Joseph Bell, Sherlock Holmes and Mycroft Holmes can be used to address homeland security issues. |
Human-Computer Symbiosis by Douglas Griffith, Scientist, General Dynamics |
Saturday 4:15PM |
This talk draws on Licklider’s, Heuer’s, and Kahneman’s work on mixed-initiative systems to propose design requirements that use psychological principles to account for a human being’s cognitive shortfalls and a computer’s semantic/context errors. |
WASHINGTON EVOLUTIONARY SYSTEMS SOCIETY
Opening Remarks: The Mysterious Nature of Abstractions by Jerry LR Chandler, Krasnow Institute for Advanced Study, GMU |
Saturday 9:00AM |
Story Telling and the Story of Man by Andrew Vogt, Department of Mathematics, Georgetown University |
Saturday 9:05AM |
To some extent, man’s place in the universe and man’s future can be understood through man’s history: the evolutionary path from the beginnings of life, through the proliferation of life forms, and along our particular branch in the great tree of life. For example, how does the human brain do the things that it does? The answer proposed by scientists is partly to be found in evolution. We are products of the evolutionary process, and our bodies and minds, our motor and sensory apparatuses, our feelings, our thoughts, our impulses for good and evil, our art and our science all result from how we have met the challenges and opportunities we encountered in surviving to be here today. Although our knowledge of the distant past is speculative, the stories that we put together and the lessons we draw from the evidence illustrate how our minds work and at the same time offer intriguing glimpses of how we might have gotten to be the way we are. The relationship of motor activity to sensation and the processing of sensation, the development of memory and the higher intellectual functions, our internal models of the world, the stories and myths we tell ourselves about the world, and our awareness of death can all be seen as arising from episodes in the struggle to survive. Contributing to survival is a sense of purpose or joy in life, and the question arises whether our stories and myths reflect a literal reality or a self-serving set of delusions. To what extent is true knowledge possible or desirable? My presentation will examine some of these issues, and is motivated chiefly by the writings of H. J. Jerison. |
Mystery, Abstraction, Dynamics by Frederick David Abraham, Blueberry Brain Institute, Waterbury Center, VT, and Psychology Department, Silliman University, Dumaguete City, Philippines |
Saturday 9:30AM |
A central theme in most philosophical lineages is the difference between the complexities of that which is observable and the presumed ideal unified forms and laws which may lie hidden and from which the complexities spring. These issues tend to recur in the philosophies of science and of society. From the early Greek enlightenment, through the European Renaissance and Enlightenment, and to the present, examples abound in myth, the Galileo affair, operationism, and the history of psychology, such as in Wundt’s distinction of two psychologies. Abstraction and various mathematical and computer methods have been used in attempting to represent the “truth”. Nonlinear dynamics attempts a resolution of holistic and analytical approaches, of convergent and divergent tendencies, in representing change (bifurcation) and emergence (self-organization) when multiple factors are interacting, but it does not resolve the basic differences. I will just graze briefly in these pastures. |
Mapping the Human World: Abstraction, Evolution, and Ethics by Robert Artigiani, History Department, U.S. Naval Academy |
Saturday 10:00AM |
Maps representing hills, valleys, plains, and rivers are familiar examples of abstraction. Because they represent geography in the simplified language of color and shape, we think maps are uniquely human modes of abstraction. But nature maps itself all the time. Perhaps the most obvious example is DNA, which maps organisms by abstracting all the information stored in them into a few chemical molecules. Uniquely human ways of abstracting involve symbols, which emerged to map social realities. Societies self-organized when population growth created problems individuals could not solve for themselves. Once biological survival depended on correlated behaviors, however, the rules of the evolutionary game changed. From this point on, to survive, individuals had to “fit in” to networks of interacting behaviors, and natural selection acted on social systems as well as organisms. The symbols that map social spaces and guide individuals into them are Values, Ethics, and Morals (VEMs). Morals map the meaning of action, for they symbolize how local actions are translated into global states. Since what the members of societies do depends in part on the symbols guiding their choices, competition between societies measures the value of moral information the way biological evolution measures the value of DNA. It is, therefore, possible to speak of an evolution of morals paralleling the evolution of genes. But now the evolution of morals leads to conclusions radically different from those drawn in the past. Previously, evolution seemed to authorize individually barbarous activities like those supposedly advantaged in jungles. But an irony appears once the focus of moral evolution shifts to societies, for it seems that, in the Darwinian world of inter-system politics, societies compete best that individuate, liberate, and empower their members most. This presentation will explore the relationship between abstraction and evolution, hopefully showing that a humanistic morality grounded in natural processes logically follows. |
Notational Systems and Abstractions by Jeffrey G. Long |
Saturday 10:30AM |
The notion of “abstractions” is used in many different ways. Before developing a taxonomy of abstractions it will be necessary to clarify the various kinds of entities that are often subsumed under the rubric of “abstractions.” This paper makes an attempt at defining the notion of abstraction, and distinguishing it from the many other kinds of entities that are often called abstractions, by looking at several notational systems that seem to reify or tokenize abstractions. |
Emergence, Thinking, and Abstraction in Hominid Evolution by Ann M. Palkovich, Krasnow Associate Professor, Krasnow Institute of Advanced Study, George Mason University |
Saturday 11:00AM |
Probing the deep history of humanity is a process where sweeping claims are grounded in the fragments of the few, literally. Our understanding of hominid evolution is based on a small group of individuals, scattered through millions of years of time, separated by hundreds of thousands of miles. And while each year the ranks of this fossil lineage modestly swell by a few specimens, we still confront the notion that most of our presumed evolutionary path is in fact an abstraction. Based on analogy, comparison, and metaphor, hominid evolution as we currently understand it is the product of abstract notions of evolutionary change, constrained and confounded by peculiar bits of empirical evidence. Shaped by the fashions of scientific inquiry and the happenstance of fossil preservation, hominid evolution is the story of scientific abstraction and interpretation. The abstractions relevant to paleoanthropology generally are drawn from evolutionary biology, geology, and argument structure. These abstractions allow us to account for fossils in time and space. Stratigraphic context places fossils in time. Spatial location is the basis for the geographic variability of specimens and their ecological settings. Comparative anatomy provides a reference for morphological variability. Evolutionary models then frame individual specimens as members of populations subject to the processes of natural selection. Finally, highly focused studies draw our attention to specific features of these hominid ancestors, presumably adding insight or challenging old truisms. I will briefly explore the nature of a few current trends in abstraction, ideas about emergence and cognition, as they are invoked to create the story of hominid evolution. For biology, these concepts are deeply embedded in contemporary evolutionary thought about the nature of change, relational characteristics, and the issues of categorization. Yet paleoanthropology characteristically stands apart from other biological inquiry. The arguments, ideas, explanations of evidence, and competing views appear to be bounded by scientific traditions within the field. How do new abstractions derived from emergence theory and issues of cognition fit with similarly sketched scientific abstractions in parallel fields? How do the trends in “evolutionary thinking,” novel in many contemporary scientific arenas, resonate with the well-worn abstractions that have traditionally formed our core understanding of our own origins in paleoanthropology? How do notions of emergence create new domains of abstraction from which we now consider the origins of thinking? How do we think about the origins of thinking? |
Universal Darwinism and Global Change by George Modelski |
Saturday 11:30AM |
Universal Darwinism is the idea that Darwinian principles of evolution are fundamental to all life anywhere. These core principles therefore also apply, jointly with auxiliary material bearing on specific domains, to the organization of social life on this planet, and each of these domains may be regarded as the locus of an equal instance of an evolutionary process, and not just as analogous to biological evolution. The powers of abstraction embodied in these concepts, and their universality, have of course been noticed for quite some time, but their recent specification by Henry Plotkin (1997) is particularly cogent. It may be summarized in the following terms: it deploys (1) an internalized notion of evolution that extends evolutionary principles to operate within, as well as without, the entities under analysis; (2) a hierarchical conception that maps the multi-level nature of evolutionary processes; (3) the Lewontin-Campbell (g-t-r) heuristic, which accounts for evolutionary change and its innovative character; and (4) the Williams-Dawkins (R-I-L) formulation, which brings out the role of replication, of continuity, and of reproduction, and which accounts for stability. The wide range of these principles will be illustrated by reference to global political evolution and world system change. |
Categorical Models of Abstraction with Possible Relevance for Biology by Dr. Paul C Kainen, Department of Mathematics, Georgetown University |
Saturday 2:00PM |
A view of category theory will be presented that regards commutative diagrams as a representation of “facts”. The speaker will give a brief review of the main ideas in category theory – focusing on the case of abelian categories (where the morphism sets have the natural structure of an abelian group). The central theme is that the concept of adjointness, which requires categorical language for a precise formulation, gives sufficient richness to explore abstraction as an aspect of evolutionary systems. The speaker has shown that under certain conditions, partial commutativity of diagrams implies full commutativity – that is, under suitable constraints, a partial model of some cognitive situation must be extendable to a complete model. This may allow an extension of the concept of adjointness. |
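For orientation, adjointness admits a compact standard statement; the following is the usual textbook formulation, not necessarily Kainen’s own notation. Functors F : C -> D and G : D -> C form an adjunction, with F left adjoint to G, when there is a natural isomorphism

    \mathrm{Hom}_{\mathcal{D}}(F(A), B) \;\cong\; \mathrm{Hom}_{\mathcal{C}}(A, G(B))

for all objects A of C and B of D. A diagram commutes when every directed path between the same pair of objects composes to the same morphism; this is the sense in which a commutative diagram records a “fact”, and in which partial commutativity may or may not force commutativity of the whole diagram.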
Flirting with Paradox: Emergence, Creative Process, and Self-transcending Constructions by Jeffrey Goldstein, Ph.D., Adelphi University |
Saturday 2:30PM |
I am proposing a new formalism to cover the varied processes involved in emergence in complex systems, namely that of self-transcending constructions (STC). Although this is a new formalism, it can also function as a generalization wide enough to include the many species of emergence, such as the emergence of new orderly regimes in simple, self-organizing physical systems; bifurcations and the emergence of new attractors in dynamical systems; the emergence of novel robust patterns with novel properties exhibited in artificial life; and other instances of collective-level emergence in evolutionary natural and social systems. STCs are many and varied, but the prototype I am using is derived from the anti-diagonal construction that was critical to the set-theoretical investigations of Georg Cantor as well as to the limitative theorems in mathematical logic achieved by Gödel and Turing (Machover, 1996). In fact, the anti-diagonal construction is implicated directly in several important approaches in the study of emergence, namely Ian Stewart’s and Jack Cohen’s so-called Existence Theorem for Emergence (Cohen and Stewart, 1994), Charles Bennett’s construct of logical depth, put forward as a complexity metric to remedy the limitations of algorithmic or Kolmogorov/Chaitin/Solomonoff complexity (Bennett, 1986), and John Holland’s call for a new mathematics showing a change in cardinality (Holland, 1998), and indirectly in John von Neumann’s work on self-reproducing automata (Burks, 1987) and Walter Fontana’s and Leo Buss’s work on algorithmic chemistry using proof theory (Fontana and Buss, 1996). The notion of STC is also meant as a replacement for the common meaning of self-organization, since I will demonstrate why the latter expression is inadequate for understanding emergence. The idea of construction as such was suggested by Philip Anderson’s Constructionist Hypothesis, put forward as a strong statement of emergence at the birth of modern complexity theory: the ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe (Anderson, 1972). Instead, at each new level of complexity, new properties appear that require new theoretical constructs appropriate to that level and as fundamental as those of any other level. The “self-transcending” part of STC refers to the manner in which these kinds of constructional processes of emergence “construct” the universe by building up new levels of complexity. This focus on construction rather than self-organization can also be understood in the light of Fontana’s and Buss’s (1996) critique of an unmodified phase-space model for understanding the coming into being of the computational emergence of artificial life. Since STCs will be understood as fundamentally creative processes, I will discuss certain research on creativity which highlights how truly original outcomes can come about (Finke, Ward, and Smith, 1992; Rothenberg, 1979). As creative processes, STCs will be shown to flirt with, but not embrace, paradox, a crucial distinction necessary in order to avoid logical inconsistency (Hofstadter, 1985; Melhuish, 1973). I will demonstrate how this flirtation with paradox is essential for the coming into being of novel entities and dynamics. The conceptual advantages accruing to the notion of STC will be elaborated by showing how it improves upon earlier generalizations of emergence, such as that devised by Alfred North Whitehead with his construct of process. In particular I will show how process philosophies and theologies cannot really account for the kind of radical novelty that self-transcending constructions can. |
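Since the anti-diagonal construction is the prototype Goldstein cites, a minimal sketch may help; the Python below is an illustrative rendering of Cantor’s argument, not anything from the paper, and the example enumeration is hypothetical.

    # Cantor's anti-diagonal construction: given any enumeration of infinite
    # binary sequences, the sequence whose n-th bit flips the n-th bit of the
    # n-th sequence differs from every enumerated sequence, and so lies
    # outside ("transcends") the enumeration.
    def anti_diagonal(enumeration, n):
        """enumeration(i, j) -> j-th bit of the i-th sequence.
        Returns the first n bits of a sequence absent from the enumeration."""
        return [1 - enumeration(i, i) for i in range(n)]

    # Hypothetical enumeration: sequence i has bit (i + j) % 2 at position j.
    # Every diagonal bit (i + i) % 2 is 0, so the anti-diagonal is all 1s.
    print(anti_diagonal(lambda i, j: (i + j) % 2, 8))  # -> [1, 1, 1, 1, 1, 1, 1, 1]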
Complementarity in Evolution by Dorothy Kurth Boberg |
Saturday 3:00PM |
In 1859, when the Origin of Species was published, the logic of science remained the deductive reasoning of Plato and the inductive reasoning of Aristotle. These logics remain the reasoning of much of science today. However, deductive logic requires an all-inclusive major premise, and students know that any test question beginning with the word “all” raises a red flag. Also, the minor premise must be a part of the major premise, and this may be problematical. Inductive logic also has a serious problem: how can it be known that all, or even enough, parts of a whole can lead to a proper hypothesis? These classical systems of monistic logic require either/or, true/false decisions, restricting thought to alternatives that may obscure both quantitative and qualitative fine distinctions. Other logics have been developed since then, and some of them may be more productive than the classical logics in understanding evolution issues. Charles Darwin said that the competition of natural selection was the “main means of evolution” and that cooperative processes were subordinate to, and part of, natural selection. This reasoning can be accommodated deductively, but it may not represent the logic needed today to understand the new facts about microorganisms and their origin, and my discovery of the role of viruses in evolution. Niels Bohr, in considering atomic processes, developed the concept of complementarity to define the relationships of elementary particles, and suggested that the concept of complementarity should be considered in other fields, especially psychology and biology. The complementarity of two or more concepts or facts involves a logic of complementarity that I have developed as “complementarity dynamics.” In using this reasoning, I propose that the competition of natural selection and the symbiotic or cooperative processes we see in nature are complementary processes. This challenges the Darwinian prevalence in evolution theory. When we consider that competition is complementary to cooperative processes, we can no longer accept that the aggression of males of many species leads to the most progeny, the long-accepted test of “survival of the fittest.” Unless the females cooperate with the aggressive test of maleness, and cooperate with the responsibility of nurturing the young, there may be no progeny at all for the most aggressive male. In fact, in many cases today among humans, the most nurturing males have more progeny. Even the concept that those having the most progeny survive is now being challenged by the reality of the finite earth. The overpopulation of the human species currently gives evidence that this is not the key to long-term survival of either our own species or the species biodiversity necessary for all life. Can it be that a new theory and a new logic of the complementarity of competitive and cooperative processes, in dynamic equilibrium, are required not only to understand evolution but also to promote long-term survival on this planet? |
Multi-Stage Evolution on Earth: Empirical Evidence for Entropy and Information Changes by Richard L. Coren, Emeritus Professor, Drexel University |
Saturday 3:30PM |
Verhulst’s equation, or the logistic equation, is a robust, widely used, phenomenological model of the growth of a system; its simplicity avoids many of the detailed complexities of most such descriptions. In this paper it is extended to describe growth through the emergence of several distinct stages, e.g., species changes. From the extended mathematical expression a distinctive relation is derived that serves as a test of the interrelatedness of the stages and of the continuity of the evolutionary parameters throughout. It is shown that a sequence of critical events, extending from the Big Bang, through biological evolution, through technological development, conforms to this description. This establishes the continuity of this entire evolution as stages of a related, underlying process. The extent of the changes involved, and the tremendous overall time scale, lead one to expect a relation of the evolutionary parameter to entropy. The distinctive, altering events and, in particular, the fact that the last few of them are inventions of mankind and therefore transparent in their nature, indicate that information is the entropy-related variable. The mechanism for this is discussed; it suggests an unrecognized feature of the Second Law of Thermodynamics. The historic time scale of appearance of the critical events leads to an estimate of when the next major transition will occur, but the form of that transition remains obscure. |
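For reference, Verhulst’s equation in its standard single-stage form (Coren’s multi-stage extension is not reproduced in the abstract) is

    \frac{dN}{dt} = r N \left(1 - \frac{N}{K}\right), \qquad N(t) = \frac{K}{1 + \frac{K - N_0}{N_0}\, e^{-r t}},

where N is the size of the growing quantity, r the growth rate, K the carrying capacity, and N_0 the initial value. The extension described above chains several such stages, each with its own parameters, and tests whether those parameters vary continuously across the stages.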
Keynotes for a new ecodynamic viewpoint as suggestions for environmental wisdom by Riccardo M. Pulselli, Massachusetts Institute of Technology and University of Siena, Italy |
Saturday 4:00PM |
The theory of evolutionary physics by Ilya Prigogine has changed expectations and perspectives in science. Thermodynamics and entropy are the tools necessary for studying the complexity of living systems and evolving ecosystems. Given that energy and mass are intrinsically conserved while entropy is intrinsically evolutionary, the question is how entropy can be calculated on the basis of energy and mass quantities. This question is still unanswered; all we can do is note that the ecodynamic viewpoint differs from that of classical physics and classical ecology. Because history and the succession of events are of scientific relevance, the concept of a function of state should be revised at a higher level of complexity. The issue is also that state functions do not work and cannot exist in an evolutionary network; from an evolutionary viewpoint it is necessary to use goal functions. Suggestions coming from such a discussion of entropy and living systems strikingly condition the human approach to those problems affecting the planet, in the pursuit of environmental wisdom. |
Record warfare in the global system and the next magnitude µR > 7.1 record-setting war: A preliminary analysis by Claudio Cioffi-Revilla, Ph.D., Professor of Computational Sciences and Director, Center for Social Complexity, George Mason University |
Saturday 4:10PM |
The long-term process of warfare involving great powers in the global system is well documented for the past five hundred years of history (Levy 1983; Small and Singer 1983), with complete data and no missing observations. This study demonstrates the existence of a previously undetected process with exponentially increasing record values in war fatalities, or constantly increasing record magnitudes for warfare produced by the great-power core of the global system. Thirteen wars since 1495 have set unprecedented battle-fatality records, not counting civilian and postwar fatalities. Based on these findings, the next record-setting war, which is now statistically overdue by several decades, is estimated to be of at least magnitude 7.35 (approximately 22.5 million battle fatalities), global in scale, and waged with weapons of mass destruction. Civilian and postwar casualties would significantly increase this estimate. |
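For orientation, the magnitude scale here is presumably Richardson’s (the µR of the title), in which a war’s magnitude is the base-ten logarithm of its fatalities; that is what makes the parenthetical figure follow:

    \mu_R = \log_{10}(\text{fatalities}), \qquad \mu_R = 7.35 \;\Rightarrow\; \text{fatalities} = 10^{7.35} \approx 2.24 \times 10^{7},

i.e., a magnitude-7.35 war corresponds to roughly 22.4 million battle deaths, consistent with the approximately 22.5 million cited above.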
A Common Basis for Abstraction, Computing with Words by John E. Gray, Code B-32, Naval Surface Warfare Center, Dahlgren Division |
Sunday 10:30AM |
Science has a unique requirement: the abstractions it uses must display some form of familiarity with reality if they are to live within the community of science. This has been noted by Feynman, who observed that imagination in science is quite different from that in other creative disciplines: “It is surprising that people do not believe that there is imagination in science. It is a very interesting kind of imagination, unlike that of the artist. The great difficulty is in trying to imagine something that you have never seen, that is consistent in every detail with what has already been seen, and that is different from what has been thought of; furthermore, it must be definite and not a vague proposition. That is indeed difficult.” Since the nature of scientific imagination is quite different from other forms of human imagination, is it reasonable to expect the abstractions that result from the successful exercise of that imagination to have common ground? One might argue that the common ground cannot be found in the realm of numbers but in a different arena: perceptions, which are expressed in terms of words. In order for the “word” metaphor for science to be a useful abstraction, we must consider both the limitations of what has gone before us and what awaits us. The use of words allows us to arrive at a common language for discussion; however, there remain problems with treating them as a universal language. The price to be paid for the use of words is that we can no longer rely on predicting the outcome of experiments by the use of numbers as the criterion of success or failure. The crispness of numbers for the outcome of experiments is not universal in science, contrary to the claims of many authors. “With all objects represented in a uniform way, computing reduces to just one fundamental operation: transforming one expression into another.” Abstraction begins and ends with “words” and the symbols that form them. Once that is realized, a scientific question can be reduced to a question about words and their allowable transformations. Expressing questions as computations can become as concrete or abstract as the objects drawn from “reality” that form the words we compute with. It is our thesis that a common basis for science should be to view all theories and experiments as questions posed in terms of words and the symbols that form them. Thus abstraction is first and foremost words made flesh, and the new basis for the common language of science. This new language comes from viewing all of science as words and questions about allowable transformations of words. Using this common language to form questions would allow all who participate in science to play the role of poets, with language games that would allow new forms to be created and cross-fertilizing interactions to form a common basis for discussion. |
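As a concrete, if modest, illustration of the quoted view that computing is the transformation of one expression into another, the Python sketch below rewrites symbolic expressions by rule; the tiny expression language and the differentiation rules are illustrative assumptions, not from the talk.

    # A minimal sketch of computing as expression transformation: symbolic
    # differentiation over a tiny language of numbers, variable names, and
    # ('+', a, b) / ('*', a, b) tuples.
    def diff(expr, var):
        """Transform `expr` into its derivative with respect to `var`."""
        if isinstance(expr, (int, float)):   # constants differentiate to 0
            return 0
        if isinstance(expr, str):            # a bare variable name
            return 1 if expr == var else 0
        op, a, b = expr
        if op == '+':                        # sum rule
            return ('+', diff(a, var), diff(b, var))
        if op == '*':                        # product rule
            return ('+', ('*', diff(a, var), b), ('*', a, diff(b, var)))
        raise ValueError(f"unknown operator {op!r}")

    # d/dx [x * (x + 3)] -> ('+', ('*', 1, ('+', 'x', 3)), ('*', 'x', ('+', 1, 0)))
    print(diff(('*', 'x', ('+', 'x', 3)), 'x'))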
Abstraction and Software Design by Alexander Shostko, Research Engineer, Simulation Concepts Inc. |
Sunday 11:00AM |
One of the definitions of abstraction is “the act of considering separately what is united in a complex object” [Webster Dictionary]. The goal of abstraction is to provide a better understanding of phenomena and make it appear less complex. It is paradoxical that the high-tech software field, which brought rapid advances to virtually every domain from manufacturing to finance, still relies mainly on manual labor. The explanation is that it takes considerable human intelligence and skills to overcome complexity. The phrase ‘paradigm in crisis’ was coined decades ago to indicate that software takes too long to develop, costs too much, and does not work very well once delivered. This paper tries to sketch the past evolution of software design from an abstraction and complexity point of view. |
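A small, hedged illustration of the definition above, in Python: the same intent expressed at two levels of abstraction, where the higher level “considers separately” the copying intent from the buffering details united in the lower one. The function names are illustrative, not from the paper.

    import shutil

    def copy_low_level(src, dst, bufsize=64 * 1024):
        # Every detail of the mechanism is visible: open modes, buffer size, loop.
        with open(src, 'rb') as fin, open(dst, 'wb') as fout:
            while chunk := fin.read(bufsize):
                fout.write(chunk)

    def copy_high_level(src, dst):
        # The abstraction hides the mechanism; only the intent remains.
        shutil.copyfile(src, dst)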
Evolved Introspective Abstraction by H T Goranson, Advanced Enterprise Research Office |
Sunday 11:30AM |
Problem solving, especially in science, is a matter of finding the correct abstractions for the problem, the so-called problem space. These abstractions, when found, are hosted in a formal framework, usually mathematical, with some intuition-friendly metaphors, some of which are implicit. One might remark that there are so many long-open grand challenges because our vocabulary of abstractions is so limited. Nowhere might this be more apparent than in our inability to reason well about introspective emergent behavior. One limit is the restriction of the intuitive metaphors employed (with resulting semantic confusion); another is the implicit default to set-theoretic abstractions. Probably a third is the unfriendliness of first-order logic in accommodating context and soft dynamics. Supposing that an expanded abstraction vocabulary would be useful in addressing these problems, we assume it to be based on category-theoretic mechanisms, because they allow an introspective awareness: abstraction mechanisms that can describe abstraction mechanisms. We also presume it to employ situation theory, as that provides a semantics for reasoning about semantics. This approach has been suggested elsewhere, most forcefully by Jon Barwise, who subsequently developed some useful results. What is missing is the appropriately novel, intuition-friendly metaphoric link. We suggest that society, as an evolutionary system, is evolving new methods of metaphoric abstraction quite apart from the community of theorists. We seek to identify these emergent notions of abstraction and relate them to category/situation formalisms. A program is underway that identifies a few common notions. One is folding, a metaphor of systems that describe systems and that also resemble the systems described; this is a broad concept, widely used in popular art, that has evolved a few rigid rules that are being surveyed. Another is embedded reflection, where the view of the system includes the viewer: more precisely, the agent of abstraction is explicitly referenced in the abstractions themselves. These two work together, as the meta-abstraction maintains a folded mechanism or narrative. This is not an ungrounded investigation, as there is a specific, well-formed problem: how can one reason about semantic conveyance in deeply introspective evolutionary systems such as process-driven (business) virtual enterprises in which each component acts selfishly, using some synthetic information-conveyance metrics; biological systems driven by molecular dynamics that depend in part on system context and unnatural information metrics; and issues related to emergent semantic hiding and forgetting of information, tasks now handled by cryptographic means that resort to a central authority. A first goal of this agenda is to develop semantic-conveyance metrics (characterizations of semantic distance) for use in semantically informed self-organizing systems. An interesting feature of this work is that it uses evolved abstractions to deal with evolutionary processes, deepening the introspection to the most basic level. |
The Evolution of Abstraction and the Abstraction of Evolution by Richard J. Khuri |
Sunday 2:00PM |
Abstract unavailable |
A Religious Implication of the Concept of Matter for Static and Evolutionary Thinkers by James F. Salmon, S.J., Baltimore, MD. |
Sunday 2:30PM |
The significance of abstraction seems obvious if one looks at interpretations of religious doctrines. This paper investigates the abstraction that forms ideas or concepts of matter, and its historical implications for the Christian doctrine of original sin. Abstraction is conceived here as “the formation of an idea, as of the qualities or properties of a thing, by mental separation from particular instances or material objects.” (Webster’s Dictionary) It is the thesis of this paper that the “mental separation” that forms the concept of matter has influenced theological interpretations of the biblical account of original sin. The paper follows briefly the career of the concept of matter, introduced first as a tool of speculative thought by Ionian physicists of the sixth century B.C. Abstraction permitted the development of metaphysical categories that eventually had a profound influence on interpretations of some Christian doctrines, including original sin as described in the Book of Genesis. A fascinating bifurcation of interpretations of the story between eastern and western theologians in the early Church will be proposed. It seems that these interpretations are related to abstraction concerning the respective concepts of matter. The seventeenth century was decisive, when the empirical sciences began to dissociate themselves from their parent natural philosophy. The role of abstraction is evident as important categories of the older natural philosophy were transformed by the new science. Examples of abstraction from “particular instances or material objects” will be cited. Up to the seventeenth century, religious thinkers in Christian churches generally continued to apply concepts of matter derived from natural philosophy to interpret theological doctrines, including original sin. The Roman Church abided by this policy into the twentieth century. However, after the Reformation, reform thinkers tended to disregard traditional philosophical categories and emphasized biblical sources and personal experience. Examples of “evolutionary thinking” before the late nineteenth century will be cited in the writings of certain Christian thinkers, although within the context of a static cosmology. Even today, in some Christian communities in the United States, Christian theology is based on the concept of a static cosmology as envisioned by biblical thinkers. The paper concludes with the proposal that contemporary evolutionary thinking is compatible with an interpretation of original sin found in the writings of certain eastern “Fathers of the Church.” It also concludes that this interpretation is compatible with a concept of matter proposed in the writings of Teilhard de Chardin. |
The Abstraction of Evolutionary Thinking in the Bicameral Mind of Self-Governance and Government: Discovering the Twelve Senses of Twisted Logic by William J. Wells, Ph.D. Student, University of Humanistics, The Netherlands |
Sunday 3:00PM |
The challenge of public discourse in a bicameral mind of governance can be improved through a commitment to mobility and accessibility, by learning the structure of nature’s evolving design. In the era of e-based commerce, the challenge will be to communicate effectively the electrical nature of life’s deep matter at scalar levels of existence. Public-policy awareness of the necessity of integrating green design with universal design in the planning, architectural, and engineering worlds will challenge the thought processes that “connect” the public to options for work and play. The political realities of a bicameral form of governance will be taxed to orchestrate and organize resources for commerce that affirms life. Phones, computer screens, and ever smaller bytes of information, while increasing the rhythmic pace of life’s thoughts and the opportunities for “discovery,” may hinder our comprehension of the universal expression of human form and function. Moments for reflection may arise from gentle inquiry into the senses of one’s surroundings. Expressions of self-governance may be a function of a rhythmic mental looping of abstraction with concreteness, in the idea of existential space as a becoming in the now. Mature democracies can affirm an emergent consciousness for disability and for complexity science. |
Evolutionary Thinking in Past Scientific Theories: A Logical Analysis by Antonino Drago, Dept. Phys. Sci., Univ. “Federico II”, Naples, Italy |
|
Abstractions lead us to shape ideas, about which our minds argue by means of logic. Evolutionary thinking occurs when these ideas are linked together not by mechanistic deductions but in a creative way. In this sense evolutionary thinking pushes us to shape a broader kind of logic. The phenomenon of a doubly negated statement whose corresponding positive statement lacks scientific evidence (=DNS) will be examined. It represents a failure of the double negation law; this law constitutes the borderline between classical logic and, broadly speaking, non-classical logic (in particular, intuitionistic logic). In fact, several scientific theories born in past times include DNSs in an essential way. In particular, quantum logic can be represented by means of DNSs inside intuitionistic logic. When DNSs pertain in an essential way to a theory, a deductive organization of the theory is no longer possible, as a comparative analysis of the several instances shows; rather, the theory poses a universal problem by means of a DNS, and then some doubly negated methodological principles (e.g., “It is impossible that a motion be without an end”) follow, in order to achieve a new scientific method capable of solving the problem at issue. This argument evolves through a cyclic pattern, according to the synthetic method as it was improved by L. Carnot. The crucial step in this pattern is an ad absurdum theorem (much as S. Carnot’s theorem is in thermodynamics). This theorem reaches evidence for a possible conclusion, still enunciated by means of a DNS. Then, by a move like Markov’s principle, this DNS is changed into a positive statement; it can now be put as a new hypothesis from which to develop a full deductive system. This move is illustrated best in Lobachevsky’s (maybe the first) presentation of a non-Euclidean geometry, but can be recognised also in S. Carnot’s thermodynamics, Avogadro’s atomic theory, and Einstein’s founding of special relativity. This pattern of arguing is examined by means of paraconsistent logic. Corresponding to the use in theoretical scientific research of, respectively, paraconsistent logic, intuitionistic logic, and classical logic for statements that are potentially principles for a theory, three kinds of principles are recognized: a guess, a methodological principle, and an axiom-principle. These differences are expressed in a lucid way by Einstein in his celebrated paper on special relativity: “We will raise the conjecture (the substance of which will be hereafter called the ‘[axiom-]principle of relativity’) to the state of a [methodological] postulate” |
Paper available, not scheduled for presentation |
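The logical borderline invoked in Drago’s abstract can be stated compactly; the formulation below is the standard one, supplied here for orientation rather than taken from the paper. Classically the double negation law holds in both directions, intuitionistically only one direction is derivable, and Markov’s principle recovers the other direction for decidable predicates:

    A \to \neg\neg A \quad (\text{intuitionistically valid}), \qquad \neg\neg A \to A \quad (\text{the double negation law; classical, rejected intuitionistically}),

    \big(\forall x\,(P(x) \lor \neg P(x))\big) \to \big(\neg\neg\,\exists x\,P(x) \to \exists x\,P(x)\big) \quad (\text{Markov's principle}).

A DNS in this sense is a statement asserted in the form \neg\neg A where A itself lacks evidence, so the final passage from \neg\neg A to A is precisely the Markov-style move the abstract describes.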
Emergence of Evolutionary Thinking as a Change of the Time Scale by M. Burgin, Department of Mathematics, University of California, Los Angeles |
|
Anthropological studies show that primitive tribes had a static world comprehension: the world was created, and since that time nothing has changed. It is easier to believe that nothing changes at all. However, this contradicts everyday experience. Day is changed by night, night is changed by day, and the whole process repeats all the time. The seasons (spring, summer, fall, and winter) likewise follow one another in an infinite cyclic sequence. All this induces the formation of a world comprehension based on cyclic time in social and individual mentality. Cyclic time, in which everything repeats itself, was a cornerstone of the world picture for millennia. Evolutionary thinking changes this world comprehension by breaking the temporal cycle. The new vision brings a time that has a very different nature. For instance, Bergson lays special emphasis on the distinction between the reversible time of physics, in which nothing new happens, and the irreversible time of evolution and biology, in which there is always something new. To explain the peculiarities of this process and its relation to physical time, we utilize the concept of the existential triad and the related world stratification (Burgin and Milov, 1999). The system theory of time (Burgin, 2002) provides means for understanding the geometry and topology of evolutionary time and its impact on human mentality. |
Paper available, not scheduled for presentation |
Double Symmetry and the Logic of Number Theory by Karl Javorszky, Vienna, Austria |
|
Abstract unavailable | Paper available, not scheduled for presentation |
WASHINGTON
SOCIETY FOR THE HISTORY OF MEDICINE
The Road to Eleusis Again by Dr. Alain Touwaide, National Museum of Natural History, Smithsonian Institution, Chair, History Committee, Washington Academy of Sciences. |
Sunday 2:30PM |
Ancient Greek culture claimed to be rational, a claim best symbolized by its most representative god, Apollo. Classical tradition accepted, confirmed, and reinforced this claim, making ancient Greece the paradigm of rationality. Recent studies, however, have challenged such a self-identification process, bringing to light phenomena of psychotropic drug consumption, magic, necromancy, and similar practices. Research has even come to the point of interpreting the most representative religious rites of Greece, the Eleusis mysteries, as a psychotropic phenomenon. A re-examination of the question is not only necessary but timely. Medical facts provide information not taken into account so far, facts that unequivocally show that the effects of psychotropic drug consumption were well known. This paper will examine the texts, showing that revisiting the Eleusis mysteries can still lead to interesting discoveries. |
Drug Smuggling: A Three Hundred Year History by Michael Harris, M.S., R.Ph., Historian and Curator, Drug Enforcement Administration Museum. |
Sunday 3:00PM |
The smuggling of drugs for healing, abuse, or profit has existed for many centuries. This paper will concentrate on the past three hundred years and will focus largely on the smuggling of drugs of abuse. In times of war, countries suffering a blockade must smuggle in medicines that they need but cannot produce themselves. During the American Revolutionary War, for example, the colonists needed to smuggle opium and quinine past the British blockade. The Confederate States also had to smuggle drugs past the Union blockade during the Civil War. At the beginning of the twentieth century, many countries began to ban the non-medicinal use and abuse of narcotic drugs. This restriction has led to narcotics trafficking worth over 400 billion dollars today. This paper will focus on how these drugs of abuse are smuggled. |
WASHINGTON STATISTICAL SOCIETY
Human Rights Issues Around the World: The Role of Good Data by Dr. Fritz Scheuren, Vice-President of Statistics, National Opinion Research Center and President-elect, American Statistical Association |
Sunday 10:30AM |
Dr. Scheuren will describe his work as a statistician investigating numerous human rights abuses, from an early study of Cambodian land mines to more recent conflicts in Guatemala, Kosovo, and Peru, regretfully predicting that there will be more such episodes in the future. Dr. Scheuren will detail the work that he and his colleagues performed for Peru’s Truth and Reconciliation Commission, which Peru’s new administration established in 2001 to investigate human rights abuses in the guerilla war that previous Peruvian governments had waged against Maoist and other rebel groups since the 1980s. Because the casualty figures were so politically charged, demonstrating the objectivity of the estimation methods to the Peruvian public was a critical component of the consulting project. The announcement of the Commission’s estimates in August 2003 was a major political event. The night before their release, women staged a candlelight vigil in Lima’s main square. Published in more than a dozen newspapers, the statistics showed that the number killed was twice what the previous official count had said it was. Dr. Scheuren will emphasize the importance of contributing technical skills to such projects in emerging democracies, where they are often in scarce supply. |
Future Thinking at the U.S. General Accounting Office by Dr. Donna Heivilin, U.S. General Accounting Office |
Saturday 2:00 PM |
Foresight has taken on a new emphasis at the U.S. General Accounting Office, a legislative branch agency supporting Congress in its legislative, oversight, and investigative roles. By playing a new early-warning role that alerts Congress and other decision makers to important trends, GAO can help the government avoid crises and catastrophic costs. Dr. Donna Heivilin, director of GAO’s Applied Research and Methods group, will describe GAO initiatives to conduct environmental scanning and to assess policy issues ranging from the nation’s long-term fiscal situation to the future of the workforce. A special consideration for the organization is how it can balance the examination of an uncertain future with its longstanding reputation for conducting work that is fact-based and objective. Dr. Heivilin will focus her remarks on her direction of a new methodological approach which seeks to ground this inherently risky work in GAO’s key values as an organization: accountability, integrity, and reliability. |
Future of U.S. Defense Planning by Dave Stein, Lt Col, USAFR (Ret) |
Saturday 2:25 PM |
Besides accomplishing more with fewer resources, the US military must ensure national security in an uncertain and rapidly changing geostrategic environment. Factors such as technology obsolescence, asymmetric-capable adversaries, and the advent of non-state actors magnify force structure planning challenges – especially in identifying the decisive force necessary to deter or defeat an unknown future threat. In addition, new modes of warfare including asymmetric warfare, information operations, and non-lethal warfare require new modes of thinking. With these new modes of warfare come new concepts and interpretations of deception, denial, surveillance, concealment, mass, rules-of-engagement, deterrence, and even “peacetime” as well as new requirements for targeting, battle damage assessment, and doctrine. For all of these reasons – and as evidenced all too well by the tragic events of 9/11 – incrementalism and extrapolation from the present are not the desired approaches to force structure planning or to defense technology investment planning. A more viable approach, summarized in this presentation, begins with a time warp to a future characterized by any of several alternative geostrategic worlds, themselves postulated on the basis of geostrategic planning space drivers and representing discontinuous jumps from the present. The methodology identifies the threats that these worlds present and postulates capabilities (weapon systems) needed to counter the threats. The end product is a list of the enabling technologies, prioritized according to their relative utilities across the hypothesized weapons systems and alternative future worlds. The methodology also provides further insight on long-range threats and needed capabilities. Additional payoffs are improved understanding among the warfighting, acquisition, and technology communities, together with an alternative to “peanut butter spread” / “salami slice” management of technology budget cuts. |
Future of Planetary Defense by Martin Schwab, Homeplanet Defense Institute |
Saturday 2:50 PM |
This presentation will offer for consideration the idea that existing terrestrial political insecurity can be lessened by constructing a planetary defense regime (PDR). This regime would simultaneously detect, probe, and prepare the best possible defenses for Earth and human interests in space against NEOs. The very acts of proposing, planning, and beginning the construction of a PDR with other countries by the United States might reduce the probability that current space assets will be exploited as weapons, whether by small or great powers. A conceptual diagram and world map will be used to present newly defined functions for effective planetary defense against NEOs. These include ground-based optical tracking stations (GBOTs), observe-and-respond constellations (ORCs), international launch infrastructure, countermeasures beyond low Earth orbit, and desirable political components of international joint command. The presentation will conclude by outlining current thinking on the advantages and disadvantages (both technical and political) of the following institutions in regard to planetary defense against NEOs: NORAD, NATO, NASA, the United Nations Office for Outer Space Affairs (Vienna), the Canadian Space Agency, the European Space Agency, the China National Space Administration, the Asian-Pacific Space Cooperation Organization (APSCO), and the International Space University. |
Future of Self Driving Automobiles by John F. Meagher, CIH – International Center for Environmental Technology (INTERCET Ltd.) |
Saturday 3:15 PM |
Discussion of self-driving automobiles has been in the public domain since the GM pavilion at the 1939 World’s Fair. This presentation will explore future developments and current trends indicating that automated self-driving automobiles may become a reality within the next decade or two, driven by factors at work now and by emergent societal needs. Examples will be presented of recent technological advances and experiments in on-board computing power, radar sensing, and robotics within automobiles, in combination with “smart highways,” allowing cars that satisfy the American preference for individual rather than mass or public transport. Social benefits will be discussed, including improved public health through reduced loss of life, injury, and serious disability; the limits of human attention span and of education for improving driving behavior; and an aging population in need of assistance from artificially intelligent driving. Environmentally, automated self-driving automobiles could be more fuel efficient and equipped with artificial intelligence to optimize fuel consumption, reducing the pollution that stems from human driving behavior and traffic congestion. Politically, the challenge of resistance from the U.S. public to robotic driving will be examined, along with lessons learned from government research efforts (e.g., the Department of Transportation’s Intelligent Vehicle Initiative and the adaptation of DARPA technology). Economically, the cost-benefits of automated self-driving automobiles will be discussed, along with their potential impacts, positive and negative, on the U.S. economy in terms of jobs, manufacturing, increased free time devoted to non-driving activities, spin-off industries, and insurance-related losses. |
Future of Education by Gary Marx, President, Center for Public Outreach, Vienna, Virginia |
Saturday 3:40 PM |
How can our education system connect to the seismic shifts taking place in society? Marx will suggest two approaches. He will also briefly review a few of the numerous trends that have direct implications for education at all levels. Those trends include the impact of technology on the pace of change, the move from information acquisition toward knowledge creation, and the possible demand for personalization, driven by the standards movement. Marx will also suggest that leaders, no matter how specialized, must also be generalists, constantly conceiving of the characteristics of the education system we need to prepare students for a multidisciplinary future. |
Future of Philanthropy in the U.S. by Natalie Ambrose, The Council on Foundations |
Saturday 4:05 PM |
At its most basic, philanthropy derives from the Greek for “love of humanity.” In modern times, it describes the process of sharing private resources for public benefit. Organized philanthropy has been a unique, sometimes lauded, sometimes controversial, part of the American landscape since the beginning of the 20th century. During the last 20 years, the number of philanthropic organizations and the variety of giving vehicles have burgeoned dramatically. Likewise, a significant support infrastructure (like where I work, the Council on Foundations) has emerged to meet their growing and specialized needs. And it has been predicted that over the next 45 years there will occur in the US a massive intergenerational transfer of wealth, $40 trillion or more. Of course, depending on fiscal and public policy, much of this wealth could be purposely directed toward philanthropy and the public good. Despite this growing sector and its involvement in key areas such as public education, science, health care, the environment, food, energy, arts and humanities, human services, and civil society, philanthropy (its sources, value, outcomes, potential, and threats) continues to be widely misunderstood among the American public, most influential figures, and policy makers. And organized philanthropy remains controversial and increasingly under threat from ideological influence, particularly as it impacts advocacy, public policy, and priorities. More recently, instances of illegal and unethical use of foundation resources by trustees or staff have been uncovered and publicized by the media, creating calls for greater regulatory scrutiny and increasing negative public perception. Frequently, foundations have been criticized for being slow to respond, for maintaining the status quo, or for reinforcing plutocracy. As a result, there is a growing trend to reimagine and rethink the whole philanthropic process in new ways: to make it more viable for twenty-first-century needs and for global as well as local and national benefit, and to leverage the strengths and experience of foundations (as learning centers, as “laboratories” bringing untested social experiments to scale, as aggressive creators of new orthodoxy) to be more responsive to the pressing social problems of our era. |
Future of Sex by Joseph F. Coates |
Saturday 4:30 PM |
Attendees will learn about the many social, economic, biological, pharmacological, psychological, health, familial, and cultural factors shaping attitudes, behaviors, manifestations, and activities tied to sex and sexuality. Among those who should attend are anyone who has a sex, is responsible for the health and education of young people, or is concerned about changing sexual behavior over one’s lifetime. |