Modeling Science, Technology & Innovation Conference | Washington D.C. | May 17-18, 2016
Tuesday, May 17th 2016 | 9:30 AM – 11:00 AM
Case Studies
Government and policy researchers and staff present computational models they have implemented to optimize internal processes and to improve agency decision making.
Moderator
Jerome Glenn
CEO, The Millennium Project
He wrote about information warfare in the late 1980s in his book Future Mind, sent his first email in 1973, and was hired by the Quakers’ action arm to help organize the environmental movement in New England in 1971. In the mid-1980s he was instrumental in introducing X.25 packet switching in 29 developing countries, which was key to their later gaining low-cost access to the Internet. More recently he led the design and implementation of collective intelligence systems for the Global Climate Change Situation Room in South Korea, the Prime Minister’s Office of Kuwait, and now the Global Futures Intelligence System and ECISIS for Egypt. Other current work includes Future Work/Technology 2050; the EC’s 2050 scenarios on innovation, research, and higher education; and the public’s roles in preventing individuals from deploying future weapons of mass destruction.
He was instrumental in naming the first Space Shuttle the Enterprise and banning the first space weapon (FOBS) in SALT II. He has published over 150 future-oriented articles, spoken to over 300 organizations, and written several books, including Future Mind, Linking the Future, and (as co-author) Space Trek; other research is available at www.millennium-project.org.
Speakers
Richard Ikeda
National Institutes of Health
NIH Experiments with IBM Watson
Abstract: NIH uses text mining technologies (https://report.nih.gov/rcdc/process.aspx) to produce its Categorical Spending reports (https://report.nih.gov/categorical_spending.aspx). Consequently, NIH is keenly interested in advances in text mining technologies that would improve these capabilities and their utility.
At its core, Watson is a robust implementation of a next generation technology for deriving facts and meanings from publications, data sets or other information sources. Watson “reads” documents and parses the text into keywords and sentence fragments in order to identify relationships among the different bits of information.
NIH has pursued a Watson "proof-of-concept" in collaboration with IBM, and the latest experiments with Watson will be discussed.
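To illustrate the kind of relationship extraction the abstract describes, the minimal Python sketch below counts keyword co-occurrences within sentences. It is purely a toy illustration, not NIH's or IBM Watson's actual pipeline; the keyword list and the co-occurrence scoring are assumptions made for the example.

```python
import re
from collections import Counter
from itertools import combinations

# Hypothetical keyword list: a real system would draw terms from curated
# vocabularies (e.g., the RCDC thesaurus), not a hard-coded set.
KEYWORDS = {"influenza", "vaccine", "clinical", "trial", "genomics"}

def extract_relationships(document: str) -> Counter:
    """Count how often pairs of keywords co-occur in the same sentence.

    A naive stand-in for the relationship extraction the abstract describes:
    split text into sentences, pull out keywords, and record which keywords
    appear together.
    """
    pair_counts = Counter()
    for sentence in re.split(r"[.!?]+", document.lower()):
        found = sorted(KEYWORDS.intersection(re.findall(r"[a-z]+", sentence)))
        for pair in combinations(found, 2):
            pair_counts[pair] += 1
    return pair_counts

if __name__ == "__main__":
    text = ("The influenza vaccine entered a clinical trial. "
            "Genomics informed the vaccine design.")
    print(extract_relationships(text))
```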
Bio: Richard Ikeda is currently the director of the Office of Research Information Systems (ORIS) in the NIH OD Office of Extramural Research. In this position, he oversees the operations of the NIH electronic Research Administration (including eRA’s IMPAC II and Commons systems), which support the mission-critical function of grants administration for NIH and other federal granting components. He also manages the ORIS Office of Data Quality, which is responsible for the integrity of the data stored in IMPAC II, and the Research, Condition, and Disease Categorization Program, which is responsible for categorizing NIH research activities. Prior to joining NIH in 1999 as an NIGMS program director, with a portfolio of research in the fields of enzymology and wound healing, Rick served on the faculty of the Department of Chemistry and Biochemistry at the Georgia Institute of Technology (Georgia Tech). He has a Ph.D. in Chemistry from the California Institute of Technology (Caltech).
Guru Madhavan
National Academies of Sciences, Engineering, and Medicine
Systems Analysis for Global Health Planning
Bio: Guru Madhavan, Ph.D., is a program director at the National Academies of Sciences, Engineering, and Medicine, where he has led the R&D of SMART Vaccines, a prioritization software tool to help reduce barriers to vaccine innovation. He serves as a technical adviser to the U.S. Department of Health and Human Services in the development of a fully web-based SMART Vaccines 2.0. Madhavan received his M.S. and Ph.D. in biomedical engineering, and an M.B.A., from the State University of New York. His professional experience includes working in the medical device industry as a research scientist developing cardiac surgical catheters for ablation therapy. He is a vice president of IEEE-USA, and has received the Innovator Award and the Cecil Medal from the presidents of the National Academies. He has been named a distinguished young scientist by the World Economic Forum. Madhavan has co-edited six books and is the author of Applied Minds: How Engineers Think (W.W. Norton).
Martin Meltzer
Centers for Disease Control and Prevention
Using Graphs and Maps to Aid Public Health Decision Making During an Emergency Response
Bio: Dr. Martin I. Meltzer is the Lead of the Health Economics and Modeling Unit (HEMU) and a Distinguished Consultant in the Division of Preparedness and Emerging Infections, CDC, in Atlanta, GA. He received his undergraduate degree from the University of Zimbabwe and his graduate degrees from Cornell University. He led the modeling teams supporting CDC’s response to the 2009 H1N1 influenza pandemic, including producing monthly estimates of cases, hospitalizations, and deaths, as well as estimating the impact of the vaccination program and the use of influenza anti-viral drugs. He also led the modeling activities for other responses, including estimating the residual risk associated with the contaminated steroid injectable products that caused fungal meningitis among patients in 2012, and he leads CDC’s 2014 West Africa Ebola Response Modeling Unit. Examples of his research include estimating the impact of influenza pandemics, modeling potential responses to smallpox as a bioterrorist weapon, and assessing the economics of controlling diseases such as rabies, dengue, hepatitis A, meningitis, Lyme disease, and malaria. He is an associate editor for Emerging Infectious Diseases. He also supervises a number of post-doctoral health economists at CDC.
Venkatachalam “Ram” Ramaswamy
National Oceanic and Atmospheric Administration
Prediction of Climate Extremes for Decision-making
Abstract: Before credible forecasts can be produced, the mathematical models must be grounded in rigorous science that evaluates the theoretical and observational knowledge about the components of the system, the uncertainties in the science, and the range of solutions possible given the natural variations and the forced changes on the system. Ensemble simulations are performed in which a wide variety of possible initial states of the system are accounted for in order to produce realistic predictions with probabilistic outlooks. Even more challenging is the prediction of extremes (e.g., heat waves, excess or deficit rainfall, hurricanes) that occur in the days, seasons, and years ahead and cause destruction of life and property. This challenge has to be addressed with increased rigor, since numerous societal sectors need authoritative information to address population and economic risks.
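As a minimal sketch of the ensemble idea described above, and not a NOAA/GFDL model, the Python snippet below perturbs the initial state of a toy temperature-anomaly model many times and reports a probabilistic outlook for exceeding a heat-extreme threshold; all model equations and parameter values are illustrative assumptions.

```python
import random

def toy_forecast(initial_anomaly: float, days: int = 30) -> float:
    """Evolve a toy temperature anomaly with damping plus random forcing.

    This stands in for a full dynamical model purely to illustrate how an
    ensemble of initial states yields a probabilistic outlook.
    """
    anomaly = initial_anomaly
    for _ in range(days):
        anomaly = 0.9 * anomaly + random.gauss(0.0, 0.5)  # persistence + noise
    return anomaly

def ensemble_exceedance(observed_anomaly: float,
                        threshold: float = 2.0,
                        members: int = 1000) -> float:
    """Estimate the probability that the forecast anomaly exceeds `threshold`
    (degrees C), using an ensemble of perturbed initial conditions."""
    hits = 0
    for _ in range(members):
        # Perturb the observed state to represent initial-condition uncertainty.
        perturbed = observed_anomaly + random.gauss(0.0, 0.3)
        if toy_forecast(perturbed) > threshold:
            hits += 1
    return hits / members

if __name__ == "__main__":
    prob = ensemble_exceedance(observed_anomaly=1.5)
    print(f"Estimated probability of exceeding the heat threshold: {prob:.2f}")
```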
The task of advancing the science of predictions to produce high-quality, user-friendly data requires sustained expertise at an outstanding level. This, in turn, necessitates maintaining the highest standards through steady recruitment and retention of a creative, skilled workforce, and through advances in observational and computational infrastructure. Actionable information for practical decision-making also requires advances in technical skills to manipulate the increasingly vast amounts of climate data, along with improved visualization algorithms so that the significance of the science-based information can be easily discerned.
The outcomes of weather and climate prediction science feed into national policy decisions and also serve the nation in bilateral and multilateral exchanges. Thus, NOAA’s activities include dissemination to national and international bodies such as the National Climate Assessment, the Intergovernmental Panel on Climate Change, Federal, state, local, and tribal agencies, and cross-sector bodies such as the Western Governors’ Association. The paramount need is actionable information that facilitates risk assessment by the various sectors and integrates with the other factors, besides climate, that must be weighed in sectoral decisions. The technology to deliver increasingly useful climate information, with quantified uncertainties, relies on feedback from stakeholders, e.g., estimates of the gains from the information received, with accompanying calibration in the management of expectations.
Bio: Venkatachalam (“Ram”) Ramaswamy has been Director of NOAA’s Geophysical Fluid Dynamics Laboratory (GFDL) since 2008. Ram received his undergraduate degree in Physics from Delhi University (India) and his Ph.D. in Atmospheric Sciences from the State University of New York at Albany. He was a Fellow in the Advanced Study Program at the National Center for Atmospheric Research. He joined GFDL in 1985 and was a Senior Scientist before becoming Director. His principal interests are numerical modeling of the global climate system and advancing the understanding of the past, present, and future states of climate, including weather extremes. He directs one of the world’s leading climate modeling centers, with the mission to develop mathematical models for predicting climate. Ram is a Fellow of the American Meteorological Society and the American Geophysical Union, a recipient of the Presidential Rank Award, and has served on the Intergovernmental Panel on Climate Change and the World Climate Research Program.