Modeling Science, Technology & Innovation Conference | Washington D.C. | May 17-18, 2016
Tuesday, May 17th 2016 | 9:30 AM – 11:00 AM
Government and policy researchers and staff present computational models they have implemented to optimize internal processes and to improve agency decision making.
CEO, The Millennium Project
Jerome C. Glenn co-founded and directs The Millennium Project, a leading global participatory think tank, which has produced the State of the Future reports for the past 20 years, Futures Research Methodology 3.0, and the Global Futures Intelligence System. He invented the “Futures Wheel,” a futures assessment technique, and concepts such as conscious-technology, transinstitutions, tele-nations, management by understanding, feminine brain drain, just-in-time knowledge, nodes as a management concept for interconnecting global and local views and actions, and definitions of environmental security, collective intelligence, and scenarios.
He wrote about information warfare in the late 1980s in his book Future Mind, sent his first email in 1973, and was hired by the Quakers’ action arm to help organize the environmental movement in New England in 1971. In the mid-1980s he was instrumental in bringing X.25 packet switching to 29 developing countries, which was key to their later getting low-cost access to the Internet. More recently he led the design and implementation of collective intelligence systems for the Global Climate Change Situation Room in South Korea, the Prime Minister’s Office of Kuwait, and now the Global Futures Intelligence System and ECISIS for Egypt. Other current work includes: Future Work/Technology 2050; the EC’s 2050 scenarios on innovation, research, and higher education; and the public’s roles in preventing individuals from deploying future weapons of mass destruction.
He was instrumental in naming the first Space Shuttle the Enterprise and banning the first space weapon (FOBS) in SALT II. He has published over 150 future-oriented articles, spoken to over 300 organizations, and written several books (Future Mind, Linking the Future, and, as co-author, Space Trek); other research is available at www.millennium-project.org.
National Institutes of Health
NIH Experiments with IBM Watson
Abstract: NIH uses text mining technologies (https://report.nih.gov/rcdc/process.aspx) to produce its Categorical Spending reports (https://report.nih.gov/categorical_spending.aspx). Consequently, NIH is keenly interested in advances in text mining technologies that would improve its capabilities and their utility.
At its core, Watson is a robust implementation of a next generation technology for deriving facts and meanings from publications, data sets or other information sources. Watson “reads” documents and parses the text into keywords and sentence fragments in order to identify relationships among the different bits of information.
NIH has pursued a Watson “proof-of-concept” in collaboration with IBM, and the latest experiments with Watson will be discussed.
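The abstract describes deriving relationships by parsing documents into keywords and fragments. A minimal sketch of one such technique is keyword co-occurrence analysis: keyword pairs that repeatedly appear in the same documents become candidate relationships. The corpus, keyword list, and threshold below are illustrative assumptions, not part of NIH’s or IBM’s actual pipeline.

```python
import itertools
import re

# Toy corpus and keyword set -- illustrative assumptions only.
DOCS = [
    "Aspirin inhibits platelet aggregation and reduces stroke risk.",
    "Statins lower cholesterol and reduce stroke risk in trials.",
    "Aspirin and statins are studied together for stroke prevention.",
]
KEYWORDS = {"aspirin", "statins", "stroke", "cholesterol"}

def tokenize(text):
    """Lowercase word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def cooccurrence(docs, keywords):
    """Count how often each keyword pair appears in the same document."""
    counts = {}
    for doc in docs:
        present = sorted(keywords & set(tokenize(doc)))
        for a, b in itertools.combinations(present, 2):
            counts[(a, b)] = counts.get((a, b), 0) + 1
    return counts

pairs = cooccurrence(DOCS, KEYWORDS)
# Pairs seen in two or more documents become candidate relationships.
related = {p for p, n in pairs.items() if n >= 2}
```

Real systems add entity normalization, syntactic parsing, and statistical scoring on top of this idea, but the core step of surfacing relationships from shared textual context is the same.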
Bio: Richard Ikeda currently is the director of the Office of Research Information Systems (ORIS) in the NIH OD Office of Extramural Research. In this position, he oversees the operations of the NIH electronic Research Administration (including eRA’s IMPAC II and Commons systems), which support the mission-critical function of grants administration for NIH and other federal granting components. He also manages the ORIS Office of Data Quality, which is responsible for the integrity of the data stored in IMPAC II, and the Research, Condition, and Disease Categorization Program, which is responsible for categorizing NIH research activities. Prior to joining NIH in 1999 as an NIGMS program director, with a portfolio of research in the fields of enzymology and wound healing, Rick served on the faculty of the Department of Chemistry and Biochemistry at the Georgia Institute of Technology (Georgia Tech). He has a Ph.D. in Chemistry from the California Institute of Technology (Caltech).
National Academies of Sciences, Engineering, and Medicine
Systems Analysis for Global Health Planning
Abstract: Planning tools based on narrow efficiency metrics (e.g., cost-effectiveness) miss out on a number of important factors that often underpin final policy decisions. A comprehensive systems analysis approach is needed to improve our programmatic effectiveness in global health, especially with preparedness, response, and resilience as challenged by recent disease outbreaks. I will discuss a platform concept for strategic policy planning—expanding upon Strategic Multi-Attribute Ranking Tool for Vaccines (or SMART Vaccines), a decision support software based on multi-criteria systems analysis developed by the National Academies of Sciences, Engineering, and Medicine, now being enhanced into a web-based application for broad use by the U.S. Department of Health and Human Services. I will explore the broader potential of such a systems platform to consider and formally include many other factors affecting short and long-term planning and response (especially when disasters affect vulnerable communities), and demonstrate how a collaborative, transparent policy decision support system across stakeholders could be developed and deployed for public benefit.
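The multi-criteria systems analysis the abstract contrasts with narrow cost-effectiveness metrics can be sketched as a weighted-sum ranking over several attributes. The candidate names, criteria, scores (0–100), and weights below are illustrative assumptions, not SMART Vaccines data or its actual scoring model.

```python
# Minimal weighted-sum sketch of multi-criteria ranking.
# All names, scores, and weights are hypothetical.
CRITERIA_WEIGHTS = {"health_benefit": 0.5, "cost": 0.3, "feasibility": 0.2}

CANDIDATES = {
    "vaccine_A": {"health_benefit": 90, "cost": 40, "feasibility": 70},
    "vaccine_B": {"health_benefit": 60, "cost": 80, "feasibility": 90},
}

def score(attrs, weights):
    """Weighted sum of criterion scores; weights sum to 1."""
    return sum(weights[c] * attrs[c] for c in weights)

# Rank candidates from highest to lowest composite score.
ranking = sorted(
    CANDIDATES,
    key=lambda v: score(CANDIDATES[v], CRITERIA_WEIGHTS),
    reverse=True,
)
```

The point of such a platform is that stakeholders can adjust the weights transparently and see how priorities shift, rather than being bound to a single efficiency metric.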
Bio: Guru Madhavan, Ph.D., is a program director at the National Academies of Sciences, Engineering, and Medicine, where he has led the R&D of SMART Vaccines—a prioritization software tool to help reduce barriers for vaccine innovation. He serves as a technical adviser to the U.S. Department of Health and Human Services in the development of a fully web-based SMART Vaccines 2.0. Madhavan received his M.S. and Ph.D. in biomedical engineering, and an M.B.A., from the State University of New York. His professional experience includes working in the medical device industry as a research scientist developing cardiac surgical catheters for ablation therapy. He is a vice-president of IEEE-USA, and has received the Innovator Award and the Cecil Medal from the presidents of the National Academies. He has been named a distinguished young scientist by the World Economic Forum. Madhavan has co-edited six books, and is the author of Applied Minds: How Engineers Think (W.W. Norton).
Centers for Disease Control and Prevention
Using graphs and maps to aid public health decision making during an emergency response
Abstract: Since 2009, the CDC has participated in emergency responses ranging from small domestic outbreaks involving a few hundred people to international public health emergencies affecting millions of individuals around the world. Each response required unique skills from the CDC’s Modeling Task Force to address public health officials’ questions and to help them make informed decisions. This presentation will discuss the role of the Modeling Task Force in the CDC’s Incident Management structure and some of the models used to assist officials in making decisions about the potential size of the public health crisis, how effective interventions could be, and what resources are required.
Bio: Dr. Martin I. Meltzer is the Lead of the Health Economics and Modeling Unit (HEMU), and a Distinguished Consultant in the Division of Preparedness and Emerging Infections, CDC, in Atlanta, GA. He received his undergraduate degree from the University of Zimbabwe and his graduate degrees from Cornell University. He led the modeling teams supporting CDC’s response to the 2009 H1N1 influenza pandemic, including producing monthly estimates of cases, hospitalizations, and deaths, as well as estimating the impact of the vaccination program and the use of influenza anti-viral drugs. Other responses in which he led the modeling activities include estimating the residual risk associated with the 2012 contaminated steroid injectable products that caused fungal meningitis among patients; he also leads CDC’s 2014 West Africa Ebola Response Modeling Unit. Examples of his research include estimating the impact of influenza pandemics, modeling potential responses to smallpox as a bioterrorist weapon, and assessing the economics of controlling diseases such as rabies, dengue, hepatitis A, meningitis, Lyme, and malaria. He is an associate editor for Emerging Infectious Diseases. He also supervises a number of post-doctoral health economists at CDC.
Venkatachalam “Ram” Ramaswamy
National Oceanic and Atmospheric Administration
Prediction of Climate Extremes for Decision-making
Abstract: The mantle of understanding and predicting the state of weather and climate is a principal mandate of the National Oceanic and Atmospheric Administration. NOAA’s mission objectives are Science, Service, and Stewardship, with the responsibility of providing credible, trustworthy forecasts of the state of the weather and climate system for the nation on timescales ranging from daily to seasonal to decadal to centennial. NOAA carries out its mission through observations, and through scientific understanding and prediction using mathematical formulations of the processes and interactions occurring in the Earth system (atmosphere, oceans, land, and ice) that are solved on high-performance computers.
Before credible forecasts can be produced, the mathematical models have to engage in rigorous science that evaluates the theoretical and observational knowledge about the components of the system, the uncertainties in the science, and the range of solutions possible given the natural variations and the forced changes on the system. Ensemble solutions for the modeling are performed in which a wide variety of possible initial states of the system have to be accounted for in order to have realistic predictions with probabilistic outlooks. Even more challenging is the prediction of extremes (e.g., heat waves, excess or deficit rainfall, hurricanes) occurring in the days, seasons, and years ahead and that cause destruction of life and property. This challenge has to be addressed with increased rigor since numerous societal sectors need authoritative information to address population and economic risks.
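The ensemble approach described above can be sketched very simply: run the model from many perturbed initial states, then summarize the spread as a probabilistic outlook. The anomaly values and the extreme-event threshold below are toy assumptions for illustration, not NOAA model output.

```python
import statistics

# Toy ensemble of temperature anomaly forecasts (deg C) from perturbed
# initial states -- values are illustrative assumptions only.
ensemble = [1.2, 0.8, 2.1, 1.9, 0.5, 2.4, 1.1, 1.7]
THRESHOLD = 2.0  # anomaly defining a "heat extreme" in this toy example

# Central estimate: the ensemble mean.
mean_forecast = statistics.mean(ensemble)

# Probabilistic outlook: fraction of members exceeding the threshold.
p_extreme = sum(m > THRESHOLD for m in ensemble) / len(ensemble)
```

Operational systems use far larger ensembles and bias-corrected, calibrated probabilities, but the principle of converting member counts into an outlook (e.g., a 25% chance of an extreme) is the same.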
The task of advancing the science of predictions to produce high-quality, user-friendly data requires sustained expertise at an outstanding level. This, in turn, necessitates maintaining the highest standards through steady recruitment and retention of a creative, skilled workforce, and through advances in observational and computational infrastructure. Actionable information for practical decision-making also requires advances in technical skills to manipulate the increasingly vast amounts of climate data, and the concomitant need to improve visual algorithms for easy discernment of the significance of the science-based information.
The outcomes of weather and climate prediction science feed into national policy decisions, and also serve the nation in bilateral and multilateral exchanges. Thus, NOAA’s activities include dissemination to national and international bodies such as the National Climate Assessment; the Intergovernmental Panel on Climate Change; Federal, state, local, and tribal agencies; and sectoral bodies such as the Western Governors Association. The paramount need is actionable information that facilitates risk assessment by the various sectors and integrates climate with the other factors that must be weighed in sectoral decisions. The technology to propagate increasingly useful climate information, with quantified uncertainties, relies on feedback from stakeholders, e.g., estimates of the gains from the information received, with accompanying calibration in the management of expectations.
Bio: Venkatachalam (“Ram”) Ramaswamy has been Director of NOAA’s Geophysical Fluid Dynamics Laboratory (GFDL) since 2008. Ram received his undergraduate degree in Physics from Delhi University (India), and his Ph.D. in Atmospheric Sciences from the State University of New York at Albany. He was a Fellow in the Advanced Study Program at the National Center for Atmospheric Research. He joined GFDL in 1985, and was a Senior Scientist before becoming Director. His principal interests are numerical modeling of the global climate system and advancing the understanding of the past, present, and future states of climate, including weather extremes. He directs one of the world’s leading climate modeling centers, with the mission to develop mathematical models for predicting climate. Ram is a Fellow of the American Meteorological Society and the American Geophysical Union, a recipient of the Presidential Rank Award, and has served on the Intergovernmental Panel on Climate Change and the World Climate Research Program.