
Sunday, July 5, 2015

COMPUTER MODELING THE U.S. ECONOMY

FROM:  NATIONAL SCIENCE FOUNDATION

Foreseeing US economic trends
Tim Kehoe's computer models gather international market data, predict impact of policy changes

Although economist Tim Kehoe's computer models are complex--analyzing numerous data sets related to the buying and selling of goods and services, trade and investment, saving and lending--the underlying premise of his research is simple.

He studies how people make economic decisions over time. Producers, for example, must look ahead and try to forecast prices, taking into account how consumer demand is going to change. At the same time, consumers making future buying choices must think about what might happen to their income during the same period.

Kehoe is developing computer models to predict the likely outcomes of those decisions.

"There is uncertainty about productivity and government policies, so many people uncertain about what's going to happen in the future make their plans contingent," says Kehoe, a professor of economics at the University of Minnesota and an adviser to the Federal Reserve Bank of Minneapolis. "If I am a firm and I want to sell my product in a foreign market, and I'm worried about what's going to happen to prices in that foreign market, I'll want to know how the price for those goods will translate into income in my home country."

Kehoe's research and teaching focus on the theory and application of general equilibrium models, which, in economics, attempt to explain the behavior of supply, demand and prices across several or many interacting markets, under the assumption that a set of prices exists that brings those markets into equilibrium.
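The flavor of a general equilibrium calculation can be conveyed with a toy example (a minimal sketch, not Kehoe's actual models): a pure-exchange economy with two goods and two consumers, where a solver searches for the relative price at which both markets clear. The endowments and preference parameters below are hypothetical, chosen only to illustrate the idea.

```python
# A minimal sketch of a two-good, two-consumer pure-exchange economy.
# Cobb-Douglas preferences: each consumer spends a fixed share alpha of
# income on good 1 and the rest on good 2.  All values are hypothetical.

# (endowment of good 1, endowment of good 2, expenditure share on good 1)
consumers = [
    (1.0, 4.0, 0.6),   # consumer A
    (3.0, 2.0, 0.3),   # consumer B
]

def excess_demand_good1(p):
    """Aggregate excess demand for good 1 at relative price p (good 2 is numeraire)."""
    total = 0.0
    for e1, e2, alpha in consumers:
        income = p * e1 + e2          # value of the consumer's endowment
        demand1 = alpha * income / p  # Cobb-Douglas demand for good 1
        total += demand1 - e1
    return total

# Bisection search for the market-clearing relative price.
lo, hi = 1e-6, 1e6
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if excess_demand_good1(mid) > 0:
        lo = mid      # price too low: demand exceeds supply
    else:
        hi = mid
print(f"equilibrium relative price of good 1: {0.5 * (lo + hi):.4f}")
```

By Walras' law, once the market for good 1 clears at this price, the market for good 2 clears as well; applied policy models extend the same logic to many goods, countries and time periods.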

These insights could prove especially valuable to policy makers and business leaders looking to foresee future U.S. economic trends.

Kehoe's work involves, among other things, creating models that predict the effects of trade liberalization on the allocation of resources across various sectors of the economy, in particular how this leads to a boom in foreign investment that could leave a country and its financial system vulnerable to a financial crisis.

He also develops models that analyze the impact of trade policies on the structure of industries, for example, the effects of the North American Free Trade Agreement, or NAFTA.

Kehoe consulted with the Spanish government in 1986 on the wisdom of joining the European Community; the Mexican government in 1994 on the impact of joining NAFTA; and the government of Panama in 1998 on the effects of unilateral foreign trade and investment reforms.

He has been the recipient of nine National Science Foundation (NSF) grants starting in 1982--totaling about $1.5 million. More recently, he received a prestigious fellowship from the John Simon Guggenheim Memorial Foundation, which annually supports a diverse group of scholars, artists and scientists chosen on the basis of prior achievement and exceptional promise.

Since the early 1990s, the United States has borrowed heavily from its trading partners, and "we wanted to look at what would happen if the foreigners save less and, therefore, lend less to the United States," he says. "Will it happen in an orderly way, or end in a crisis?"

He and his collaborators modeled U.S. borrowing resulting from a global savings glut--where foreigners sell goods and services to the United States, but prefer purchasing U.S. assets to purchasing U.S. goods and services--using four key data sets of the United States and its position in the world economy during a 20-year period beginning in 1992.

In the model, as in the data, the U.S. trade deficit first increases, then decreases; the U.S. real exchange rate first appreciates, then depreciates; a deficit in goods trade fuels the U.S. trade deficit, with a steady U.S. surplus in the service trade; and the fraction of U.S. labor dedicated to producing goods--agriculture, mining and manufacturing--falls throughout the period.

Using their models, he and his colleagues analyzed two possible ends to the saving glut: an orderly, gradual rebalancing and a disorderly, sudden stop in foreign lending as occurred in Mexico in 1995–96.
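The published model is far richer, but the contrast between the two scenarios can be sketched with a toy small open economy: a country that has been borrowing abroad must eventually run trade surpluses large enough to service its debt, and the adjustment in consumption is abrupt when foreign lending stops suddenly rather than tapering off. The output level, interest rate, debt stock and adjustment horizon below are illustrative assumptions, not the paper's calibration.

```python
# A toy small-open-economy sketch of the two scenarios: a gradual rebalancing
# versus a sudden stop in foreign lending.  All numbers are illustrative
# assumptions, not the calibration of Kehoe's published model.

Y = 100.0        # domestic output per year (held constant for simplicity)
r = 0.04         # world interest rate on external debt
D0 = 60.0        # initial stock of external debt
DEFICIT0 = 5.0   # initial trade deficit (imports exceed exports)
PHASE_OUT = 10   # years over which the gradual scenario closes the deficit

def trade_balance(t, debt, sudden_stop):
    stabilizing = r * debt                 # surplus that keeps debt from growing
    if sudden_stop or t >= PHASE_OUT:
        return stabilizing
    # gradual path: move linearly from the initial deficit to the stabilizing surplus
    return -DEFICIT0 + (t / PHASE_OUT) * (DEFICIT0 + stabilizing)

def simulate(sudden_stop, years=15):
    debt, path = D0, []
    for t in range(years):
        tb = trade_balance(t, debt, sudden_stop)
        consumption = Y - tb               # absorption = output minus net exports
        debt = (1 + r) * debt - tb         # external debt accumulation
        path.append((t, consumption, debt))
    return path

for label, stop in [("gradual rebalancing", False), ("sudden stop", True)]:
    first, last = simulate(stop)[0], simulate(stop)[-1]
    print(f"{label:20s}  year-0 consumption {first[1]:6.1f}   "
          f"year-14 debt {last[2]:6.1f}")
```

Under these made-up numbers, the sudden stop cuts consumption immediately, while the gradual path spreads the adjustment over a decade; that is the qualitative pattern the researchers describe.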

"We found that a sudden stop would be very disruptive for the U.S. economy in the short term, particularly for the construction industry, "he says. "In the long term, however, a sudden stop would have a surprisingly small impact."

"As the U.S. trade deficit becomes a surplus, gradually or suddenly, employment in goods production will not return to its level in the early 1990s because much of this surplus will be exports of services and because much of the decline in employment in goods production has been, and will be, due to faster productivity growth in goods than in services," he adds. "We will probably produce more services, such as managerial or design services," to make up for the trade deficit decline.

"The United States is the biggest service exporter in the world, and, as services become more and more expensive, the U.S. can sell services at a higher price and buy goods from other countries more cheaply," he says.

Models are not perfect and have been wrong in the past. Kehoe's ongoing goal is to change that.

"When economists built models of the impact of trade and investment liberalization in North America 20 years ago...some of our predictions were right, but some predictions were wrong," he says. "We want to understand why, and what we still need to learn about building models, and especially think about what we got wrong and what we need to do in the future."

-- Marlene Cimons, National Science Foundation
-- Maria C. Zacharias
Investigators
Timothy Kehoe
David Levine
Brig 'Chip' Elliott

Sunday, May 31, 2015

SCIENCE/MATH HAVE LIMITS PREDICTING NATURAL DISASTERS LIKE EARTHQUAKES

FROM:  NATIONAL SCIENCE FOUNDATION
Earthquakes expose limits of scientific predictions

But math and science are refining ways to predict, limit impact of disasters
In 2012, six Italian seismologists were sentenced to prison because they failed to predict the 2009 magnitude-6.3 L'Aquila earthquake.

To some, that may seem absurd, but it points to the faith so many have come to place in science's ability to predict and prevent tragedies. Experts had for decades predicted that Nepal would experience a massive earthquake, but they were unable to provide a more precise warning about the recent 7.8-magnitude quake that devastated the country. The Italian seismologists had similarly estimated earthquake probabilities but could not give an exact date.

Science and mathematics have not reached a point where they can forecast with certainty the exact time and specific severity of these cataclysmic events--and may never do so.

"The best we can do is make an assessment of there being a heightened risk in a certain geographic area over a certain window of time," said William Newman, a theoretical physicist at the University of California, Los Angeles, who has received funding from the National Sceince Foundation (NSF) for his work aimed at improving natural hazard predictions. "We can determine a sense of what is likely to occur, but we will never know exactly."

Newman has spent much of his 35-year career working in computational and applied mathematics, but he has also applied mathematical methods to natural hazards such as earthquakes and climate change.

These days, mathematicians seem to be able to model almost anything, but, as Newman points out, the devil is not only in the details but in creating models that can be used for accurate prediction. In the case of tectonic plates, the randomness of their interactions limits the certainty of predictions, and those predictions become less certain the further ahead they look. Just as a weather forecaster can be more confident about tomorrow's weather than about next month's, Newman believes earthquake prediction accuracy falls off as the forecast horizon lengthens.

"For mathematicians, three aspects come to mind," Newman said. "We like to think of the equations being well posed, well defined, and that we can run with them. In [Edward] Lorenz's case (whose model of turbulence celebrated its 50th anniversary recently), his equations about atmospheric behavior were, by and large, eminently reasonable. He supersimplified and saw that if he perturbed the initial conditions, after a certain amount of time, he could predict nothing."

Yes, you read that right: nothing.
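Lorenz's point can be reproduced numerically in a few lines (a minimal sketch, not the atmospheric model itself): integrate the classic Lorenz-63 equations from two nearly identical starting points and watch the trajectories separate until they share nothing. The step size and the size of the initial perturbation are arbitrary choices for illustration.

```python
# A minimal sketch of sensitive dependence on initial conditions in the
# Lorenz system (the "predict nothing" point above).  Parameters are the
# classic Lorenz-63 values; step size and perturbation are arbitrary.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt):
    """One fourth-order Runge-Kutta step of the Lorenz equations."""
    def add(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = lorenz(state)
    k2 = lorenz(add(state, k1, dt / 2))
    k3 = lorenz(add(state, k2, dt / 2))
    k4 = lorenz(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

dt = 0.01
a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # same state, perturbed by one part in 10^8

for step in range(1, 5001):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    if step % 1000 == 0:
        gap = sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
        print(f"t = {step * dt:5.1f}   separation = {gap:.2e}")
```

A difference of one part in a hundred million grows to the size of the attractor itself within a few dozen time units, which is the quantitative face of "he could predict nothing."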

The problem for mathematicians is that forecasting accuracy can only weaken as more variables cloud the equations and models they build. In the case of earthquakes, Newman says the prospects for good predictions are even more dismal than for atmospheric ones. Chaotic dynamics and complexity prevail.

In Los Angeles, where Newman lives, mathematicians and geophysicists have worked together and determined that sometime in the next 30 years, the area is likely to see a substantial earthquake due to its proximity to the San Andreas Fault. And as each year passes, the risk increases in this window of time. The mathematicians can only put so many pre-determined variables into their equations, including the patterns of tectonic plate changes and the environmental conditions that coincide with earthquake occurrences.
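The idea that the risk increases as each year passes can be made concrete with a toy hazard calculation. Seismologists often describe large-earthquake recurrence with a renewal model whose hazard rate grows as strain accumulates; the Weibull shape and recurrence time below are hypothetical stand-ins, not an actual San Andreas forecast.

```python
# A toy hazard calculation behind "the risk increases as each year passes."
# Recurrence is modeled as a renewal process with an increasing hazard rate.
# The Weibull parameters are illustrative assumptions, not a real forecast.

import math

SHAPE = 2.0     # Weibull shape > 1 means the hazard rate increases with time
SCALE = 150.0   # characteristic recurrence time, in years (hypothetical)

def cdf(t):
    """Probability that the next big quake occurs within t years of the last one."""
    return 1.0 - math.exp(-((t / SCALE) ** SHAPE))

def prob_in_window(elapsed, window=30.0):
    """P(quake in the next `window` years | `elapsed` years since the last one)."""
    return (cdf(elapsed + window) - cdf(elapsed)) / (1.0 - cdf(elapsed))

for elapsed in (0, 50, 100, 150):
    print(f"{elapsed:3d} years since last event -> "
          f"P(quake in next 30 yr) = {prob_in_window(elapsed):.2f}")
```

With these made-up parameters, the 30-year probability more than doubles between 50 and 150 years after the last event; under a memoryless (Poisson) model it would not grow at all, which is why the choice of recurrence model matters.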

"We have to go into this realizing there are bounds," Newman said. "We are looking at complex systems that can produce patterns we just don't understand."

Additionally, while the news focuses on an earthquake and its aftershocks, there are also "foreshocks." But recognizing a foreshock is impossible without seeing the seismic event that follows. So trying to formulate day-to-day seismologic predictions after any earthquake event can also be confounding.

Why even try to predict earthquakes?

One could easily conclude at this point that we should walk away from the issue, shaking our heads. But mathematicians, computer scientists, physicists, geologists, engineers and social scientists working together on this issue do provide value, each adding something that could improve the scientific community's understanding of this complex problem.

As instruments become increasingly refined and data proliferate around the world, scientists also gain a better understanding of the consequences of earthquakes.

"It is true that scientists know very little about earthquake predictions," said Junping Wang, program director in NSF's mathematics division. "But this is exactly why we need to support earthquake research. Researching is the only way we can ever hope to build models that help to improve earthquake prediction and build a resilient society."

As they conduct more research in seismology, scientists are able to gain more and better knowledge that can benefit local policymakers looking to enhance preparedness and emergency response to earthquakes and cascading disasters.

"There are still plenty of opportunities where scientific and mathematical research can improve our knowledge," Wang said. "Understanding why an earthquake happened and how it happened helps us build better models, even if they can't tell us a specific date and time. With increased knowledge comes better preparedness."

Earthquake advice from a mathematician

"We can only tell people that there is a certain risk in a certain window of time," Newman said. "Then it's a matter of preparedness."

He cites the example of the Northridge earthquake that rocked the UCLA Mathematical Sciences Building in 1994. Architects designed expansion joints in different sections of the building because they knew that, at some point, it would have to cope with the trauma of earthquakes. In that case, some of the offices went through an "unexpected expansion," but Newman notes that ultimately the repairs were "essentially cosmetic."

Newman, who carries the distinction of being a member of UCLA's mathematics, physics and geology departments, routinely takes students to the San Andreas Fault--and specifically Vazquez Rocks, a set of formations exposed by seismic activity--for their research. He emphasizes that to limit the fallout of earthquakes like the recent one in Nepal, policymaking that establishes building codes, along with individual preparedness, is essential.

"If you live here, you have to earthquake-proof your home and your business. You need to be able to take care of yourself," he said. "And then when an earthquake does occur, hopefully, it will just be an inconvenience."

-- Ivy F. Kupec,
Investigators
William Newman
Vladimir Keilis-Borok
Related Institutions/Organizations
University of California-Los Angeles

Thursday, October 2, 2014

COMPUTER SIMULATIONS, STATISTICAL TECHNIQUES INDICATE CALIFORNIA DROUGHT LIKELY LINKED TO CLIMATE CHANGE

FROM:  NATIONAL SCIENCE FOUNDATION 
Cause of California drought linked to climate change

Extreme atmospheric conditions responsible for the drought are more likely to occur under current global warming.

The atmospheric conditions associated with the unprecedented drought in California are very likely linked to human-caused climate change, researchers report.

Climate scientist Noah Diffenbaugh of Stanford University and colleagues used a novel combination of computer simulations and statistical techniques to show that a persistent region of high atmospheric pressure over the Pacific Ocean--one that diverted storms away from California--was much more likely to form in the presence of modern greenhouse gas concentrations.

The result, published today in the Bulletin of the American Meteorological Society, is one of the most comprehensive studies to investigate the link between climate change and California's ongoing drought.

"Our research finds that extreme atmospheric high pressure in this region--which is strongly linked to unusually low precipitation in California--is much more likely to occur today than prior to the emission of greenhouse gases that began during the Industrial Revolution in the 1800s," says Diffenbaugh.

The exceptional drought crippling California is by some measures the worst in state history.

Combined with unusually warm temperatures and stagnant air conditions, the lack of precipitation has triggered a dangerous increase in wildfires and incidents of air pollution across the state.

The water shortage could result in direct and indirect agricultural losses of at least $2.2 billion and lead to the loss of more than 17,000 seasonal and part-time jobs in 2014 alone.

Such effects have prompted a drought emergency in the state; the federal government has designated all 58 California counties as natural disaster areas.

"In the face of severe drought, decision-makers are facing tough choices about the allocation of water resources for urban, agricultural and other crucial needs," says Anjuli Bamzai, program director in the National Science Foundation's (NSF) Division of Atmospheric and Geospace Sciences, which funded the research.

"This study places the current drought in historical perspective and provides valuable scientific information for dealing with this grave situation. "

Scientists agree that the immediate cause of the drought is a particularly tenacious "blocking ridge" over the northeastern Pacific--popularly known as the Ridiculously Resilient Ridge, or "Triple R"--that prevented winter storms from reaching California during the 2013 and 2014 rainy seasons.

Blocking ridges are regions of high atmospheric pressure that disrupt typical wind patterns in the atmosphere.

"Winds respond to the spatial distribution of atmospheric pressure," says Daniel Swain of Stanford, lead author of the paper.

"We have seen this amazingly persistent region of high pressure over the northeastern Pacific for many months, which has substantially altered atmospheric flow and kept California largely dry."

The Triple R was exceptional for both its size and longevity.

While it dissipated briefly during the summer months of 2013, it returned by fall 2013 and persisted through much of the winter, California's wet season.

"At its peak in January 2014, the Triple R extended from the subtropical Pacific between California and Hawaii to the coast of the Arctic Ocean north of Alaska," says Swain, who coined the term "ridiculously resilient ridge" to highlight the persistent nature of the blocking ridge.

Like a large boulder that has tumbled into a narrow stream, the Triple R diverted the flow of high-speed air currents known as the jet stream far to the north, causing Pacific storms to bypass not only California, but also Oregon and Washington.

As a result, rain and snow that would normally fall on the West Coast were instead re-routed to Alaska and as far north as the Arctic Circle.

An important question for scientists and decision-makers has been whether human-caused climate change has influenced the conditions responsible for California's drought.

Given the important role of the Triple R, Diffenbaugh and colleagues set out to measure the probability of such extreme ridging events.

The team first assessed the rarity of the Triple R in the context of the 20th century historical record.

Analyzing the period since 1948, for which comprehensive atmospheric data are available, the researchers found that the persistence and intensity of the Triple R in 2013 were unrivaled by any previous event.

To more directly address the question of whether climate change played a role in the probability of the 2013 event, the team collaborated with scientist Bala Rajaratnam, also of Stanford.

Rajaratnam applied advanced statistical techniques to a large suite of climate model simulations.

Using the Triple R as a benchmark, Rajaratnam compared geopotential heights--an atmospheric property related to pressure--between two sets of climate model experiments.

One set mirrored the present climate, in which the atmosphere is growing increasingly warmer due to human emissions of carbon dioxide and other greenhouse gases.

In the other set of experiments, greenhouse gases were kept at levels similar to those that existed just prior to the Industrial Revolution.

The researchers found that the extreme heights of the Triple R in 2013 were at least three times as likely to occur in the present climate as in the preindustrial climate.
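The study's statistical machinery is more sophisticated than this, but the core comparison can be sketched as a risk ratio between two ensembles: count how often present-day simulations exceed the observed 2013 extreme, do the same for preindustrial simulations, and take the ratio, with a bootstrap to gauge sampling uncertainty. The synthetic Gaussian "height anomalies," threshold and ensemble sizes below are stand-ins for real climate-model output.

```python
# A minimal sketch of the ensemble comparison: the ratio of exceedance
# probabilities in present-day vs. preindustrial simulations.  The synthetic
# Gaussian "geopotential height anomalies" stand in for real model output;
# the threshold and sample sizes are arbitrary.

import random

random.seed(0)

# Hypothetical ensembles of seasonal-mean height anomalies (standardized units).
present       = [random.gauss(0.4, 1.0) for _ in range(2000)]   # warmed climate
preindustrial = [random.gauss(0.0, 1.0) for _ in range(2000)]

THRESHOLD = 2.0   # stand-in for the observed 2013 extreme

def exceedance(sample):
    return sum(x > THRESHOLD for x in sample) / len(sample)

ratio = exceedance(present) / exceedance(preindustrial)
print(f"risk ratio (present / preindustrial): {ratio:.1f}")

# Simple bootstrap to gauge sampling uncertainty in the ratio.
ratios = []
for _ in range(1000):
    p = [random.choice(present) for _ in range(len(present))]
    q = [random.choice(preindustrial) for _ in range(len(preindustrial))]
    if exceedance(q) > 0:
        ratios.append(exceedance(p) / exceedance(q))
ratios.sort()
print(f"bootstrap 90% interval: {ratios[len(ratios)//20]:.1f} "
      f"to {ratios[-len(ratios)//20]:.1f}")
```

The printed ratio plays the role of the "at least three times as likely" figure; the real analysis also has to account for model biases and the precise definition of the extreme event.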

They also found that such extreme values are consistently tied to unusually low precipitation in California, and to the formation of atmospheric ridges over the northeastern Pacific.

"We've demonstrated with high statistical confidence that large-scale atmospheric conditions similar to those of the Triple R are far more likely to occur now than in the climate before we emitted large amounts of greenhouse gases," Rajaratnam says.

"In using these advanced statistical techniques to combine climate observations with model simulations, we've been able to better understand the ongoing drought in California," Diffenbaugh adds.

"This isn't a projection of 100 years in the future. This is an event that is more extreme than any in the observed record, and our research suggests that global warming is playing a role right now."

The research was also supported by the National Institutes of Health. Rajaratnam was also supported in part by DARPA, the Air Force Office of Scientific Research and the UPS fund.

-NSF-
Media Contacts
Cheryl Dybas, NSF,

