
Monday, November 24, 2014

NSF TOUTS USE OF SUPERCOMPUTER AND RESOURCES TO HELP PLASMA DYNAMICS RESEARCH

FROM:  NATIONAL SCIENCE FOUNDATION 
A deep dive into plasma
Renowned physicist uses NSF-supported supercomputer and visualization resources to gain insight into plasma dynamics

Studying the intricacies and mysteries of the sun is physicist Wendell Horton's life's work. A widely known authority on plasma physics, Horton studies the high-temperature ionized gas of the sun, known as plasma, and his work consistently leads him around the world to a diverse range of high-impact projects.

Fusion energy is one key scientific challenge Horton is investigating, and one that has intrigued researchers for decades.

"Fusion energy involves the same thermonuclear reactions that take place on the sun," Horton said. "Fusing two isotopes of hydrogen to create helium releases a tremendous amount of energy--10 times greater than that of nuclear fission."

It's no secret that the demand for energy around the world is outpacing the supply. Fusion energy has tremendous potential. However, harnessing the power of the sun for this burgeoning energy source requires extensive work.

Through the Institute for Fusion Studies at The University of Texas at Austin, Horton collaborates with researchers at ITER, a fusion research facility in France, and at the National Institute for Fusion Science in Japan to address these challenges. At ITER, Horton is working with researchers to build the world's largest tokamak, the device leading the way toward producing fusion energy in the laboratory.

"Inside the tokamak, we inject 10 to 100 megawatts of power to recreate the conditions of burning hydrogen as it occurs in the sun," Horton said. "Our challenge is confining the plasma, since temperatures are up to 10 times hotter than the center of the sun inside the machine."

Perfecting the design of the tokamak is essential to producing fusion energy, and because the design is not yet fully developed, Horton runs simulations on the Stampede supercomputer at the Texas Advanced Computing Center (TACC) to model plasma flow and turbulence inside the device.

"Simulations give us information about plasma in three dimensions and in time, so that we are able to see details beyond what we would get with analytic theory and probes and high-tech diagnostic measurements," Horton said.

The simulations also give researchers a more holistic picture of what is needed to improve the tokamak design. Comparing simulations with fusion experiments in nuclear labs around the world helps Horton and other researchers move even closer to this breakthrough energy source.

Plasma in the ionosphere

Because the mathematical theories used to understand fusion reactions have numerous applications, Horton is also investigating space plasma physics, which has important implications in GPS communications.

GPS signaling, a complex form of communication, relies on signal transmission from satellites in space, through the ionosphere, to GPS devices located on Earth.

"The ionosphere is a layer of the atmosphere that is subject to solar radiation," Horton explained. "Due to the sun's high-energy solar radiation plasma wind, nitrogen and oxygen atoms are ionized, or stripped of their electrons, creating plasma gas."

These plasma structures can scatter signals sent between global navigation satellites and ground-based receivers, resulting in a "loss-of-lock" and large errors in the data used by navigation systems.
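For a rough sense of the scale of these errors, the first-order effect of the ionosphere's free electrons is a group delay that grows with the total electron content (TEC) along the signal path and falls with the square of the carrier frequency. The short C sketch below uses the standard 40.3*TEC/f^2 approximation with illustrative TEC values; it is a back-of-the-envelope aid, not part of Horton's turbulence models.

/* Back-of-the-envelope ionospheric range error for a GPS L1 signal.
 * The 40.3 * TEC / f^2 group-delay approximation is standard; the TEC
 * values are illustrative assumptions, not model output. */
#include <stdio.h>

static double range_error_m(double tec_el_per_m2, double freq_hz)
{
    return 40.3 * tec_el_per_m2 / (freq_hz * freq_hz);  /* meters of extra path */
}

int main(void)
{
    const double f_l1 = 1575.42e6;      /* GPS L1 carrier frequency, Hz */
    const double tec_quiet = 1.0e17;    /* ~10 TECU, calm ionosphere */
    const double tec_storm = 1.0e18;    /* ~100 TECU, disturbed ionosphere */

    printf("quiet ionosphere:     %.1f m range error\n", range_error_m(tec_quiet, f_l1));
    printf("disturbed ionosphere: %.1f m range error\n", range_error_m(tec_storm, f_l1));
    return 0;
}

On top of this mean delay, turbulent plasma structures produce rapid amplitude and phase fluctuations (scintillation), and it is those fluctuations that can cause a receiver to lose its lock on the signal.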

Most people who use GPS navigation have experienced "loss-of-lock," or moments when the system loses accuracy. Although this usually amounts to a minor inconvenience for the casual GPS user, it can be devastating for emergency response teams in disaster situations or where national security is at stake.

To better understand how plasma in the ionosphere scatters signals and affects GPS communications, Horton is using Stampede to model plasma turbulence as it occurs in the ionosphere. He is also sharing this knowledge with research institutions in the United States and abroad, including the UT Space and Geophysics Laboratory.

Seeing is believing

Although Horton is a long-time TACC partner and Stampede user, he only recently began using TACC's visualization resources to gain deeper insight into plasma dynamics.

"After partnering with TACC for nearly 10 years, Horton inquired about creating visualizations of his research," said Greg Foss, TACC Research Scientist Associate. "I teamed up with TACC research scientist, Anne Bowen, to develop visualizations from the myriad of data Horton accumulated on plasmas."

Since plasma behaves similarly inside of a fusion-generating tokamak and in the ionosphere, Foss and Bowen developed visualizations representing generalized plasma turbulence. The team used Maverick, TACC's interactive visualization and data analysis system, to create the visualizations, allowing Horton to see the full 3-D structure and dynamics of plasma for the first time in his 40-year career.

"It was very exciting and revealing to see how complex these plasma structures really are," said Horton. "I also began to appreciate how the measurements we get from laboratory diagnostics are not adequate enough to give us an understanding of the full three-dimensional plasma structure."

Word of the plasma visualizations soon spread, and Horton received requests from physics researchers in Brazil and at AMU in France to share the visualizations and create more. The visualizations were also presented at the XSEDE'14 Visualization Showcase and will be featured at the upcoming SC'14 conference.

Horton plans to continue working with Bowen and Foss to learn even more about these complex plasma structures and to share that knowledge nationally and internationally, proving that no matter your experience level, it's never too late to learn something new.

-- Makeda Easter, Texas Advanced Computing Center
-- Aaron Dubrow, NSF
Investigators
Wendell Horton
Daniel Stanzione
Related Institutions/Organizations
Texas Advanced Computing Center
University of Texas at Austin

Tuesday, July 22, 2014

RESEARCH USING SUPERCOMPUTER COULD LINK GENES TO TRAITS AND DISEASES

FROM:  NATIONAL SCIENCE FOUNDATION 
"Bottom-up" proteomics

NSF-funded supercomputer helps researchers interpret genomes
Tandem protein mass spectrometry is one of the most widely used methods in proteomics, the large-scale study of proteins, particularly their structures and functions.

Researchers in the Marcotte group at the University of Texas at Austin are using the Stampede supercomputer to develop and test computer algorithms that let them more accurately and efficiently interpret proteomics mass spectrometry data.

The researchers are midway through a project that analyzes the largest animal proteomics dataset ever collected (data equivalent to roughly half of all currently existing shotgun proteomics data in the public domain). These samples span protein extracts from a wide variety of tissues and cell types sampled across the animal tree of life.

The analyses consume considerable computing cycles and require the use of Stampede's large memory nodes, but they allow the group to reconstruct the 'wiring diagrams' of cells by learning how all of the proteins encoded by a genome are associated into functional pathways, systems, and networks. Such models let scientists better define the functions of genes, and link genes to traits and diseases.

"Researchers would usually analyze these sorts of datasets one at a time," Edward Marcotte said. "TACC let us scale this to thousands."

Sunday, July 13, 2014

SUPERCOMPUTER USED FOR MATERIALS SCIENCE RESEARCH

FROM:  NATIONAL SCIENCE FOUNDATION 
Helping ideas gel

NSF-supported Stampede supercomputer powers innovations in materials science
Cornell University researchers are using the Stampede supercomputer at the Texas Advanced Computing Center to help explain a nanoscale mystery: How can a colloidal gel--a smart material with promise in biomedicine--maintain its stability?

Colloidal gels consist of microscopic particles suspended in a solvent; the particles link into networks that support their own weight under gravity. For this reason, these soft solids are an emerging class of smart materials, with applications such as injectable pharmaceuticals and artificial tissue scaffolds. However, they are also beset by stability problems.

Using Stampede, researchers conducted the largest and longest simulation of a colloidal gel ever recorded. Their simulations helped answer several questions, including: What are the concentration and structure of the network strands? How does the gel restructure itself over time? And how does its structure affect a gel's mechanical properties?
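For readers unfamiliar with this kind of computation, the sketch below shows the simplest possible version of the underlying technique: an overdamped Brownian-dynamics update in which each colloidal particle settles under gravity while being kicked by thermal noise. All parameter values are assumed for illustration; the published simulations were far richer, including interparticle attractions and vastly more particles and time steps.

/* Toy overdamped Brownian-dynamics sketch for colloids settling under gravity.
 * Illustrative only; not the published simulation method or parameters. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define N_PARTICLES 1000
#define N_STEPS     10000

static double gauss(void)            /* Box-Muller standard normal */
{
    const double TWO_PI = 6.283185307179586;
    double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);
    double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
    return sqrt(-2.0 * log(u1)) * cos(TWO_PI * u2);
}

int main(void)
{
    static double z[N_PARTICLES];    /* track only the vertical coordinate */
    const double D = 1.0e-13;        /* diffusivity, m^2/s (assumed) */
    const double v_settle = -1.0e-8; /* gravitational settling speed, m/s (assumed) */
    const double dt = 1.0e-4;        /* time step, s */

    for (int s = 0; s < N_STEPS; s++)
        for (int i = 0; i < N_PARTICLES; i++)
            z[i] += v_settle * dt + sqrt(2.0 * D * dt) * gauss();

    printf("final height of particle 0: %g m\n", z[0]);
    return 0;
}

Capturing how a space-spanning particle network forms and then slowly restructures requires adding attractive interparticle forces and running many millions of such steps for very large numbers of particles, which is why the full study needed a supercomputer.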

"We have been absolutely happy with our entire experience on Stampede," said Roseanna N. Zia, an assistant professor of chemical and biomolecular engineering at Cornell. "The support of the review panel in granting such a large series of requests was just fantastic. In addition, the help desk has been consistently outstanding, and our XSEDE [Extreme Science and Engineering Discovery Environment] Campus Champion was a huge help in getting started. There is no way this study could have taken place without XSEDE's computational support."

XSEDE is the most advanced and robust collection of integrated advanced digital resources and services in the world. Thousands of scientists use XSEDE each year to interactively share computing resources, data and expertise. The five-year, $121-million project is supported by the National Science Foundation.

Zia presented the results on her colloidal gel research at the Society of Rheology annual meeting in October 2013. Her work was featured on the cover of the January 2014 Rheology Bulletin, and she was asked to contribute to an invitation-only special issue of the Journal of Rheology in 2014.

-- Aaron Dubrow, NSF
Investigators
Roseanna Zia
Daniel Stanzione

Sunday, June 29, 2014

NSF-FUNDED SUPERCOMPUTER DOES WHAT LAB EXPERIMENTS CAN'T

FROM:  NATIONAL SCIENCE FOUNDATION 
A high-performance first year for Stampede
NSF-funded supercomputer enables discoveries throughout science and engineering

Sometimes, the laboratory just won't cut it.

After all, you can't recreate an exploding star, manipulate quarks or forecast the climate in the lab. In cases like these, scientists rely on supercomputing simulations to capture the physical reality of these phenomena--minus the extraordinary cost, dangerous temperatures or millennium-long wait times.

When faced with a problem that can't be tackled in the lab, researchers at universities and labs across the United States set up virtual models, determine the initial conditions for their simulations--the weather in advance of an impending storm, the configuration of a drug molecule binding to HIV, the dynamics of a distant dying star--and press compute.

And then they wait as the Stampede supercomputer in Austin, Texas, crunches the complex mathematics that underlies the problems they are trying to solve.

By harnessing thousands of computer processors, Stampede returns results within minutes, hours or just a few days (compared to the months and years without the use of supercomputers), helping to answer science's--and society's--toughest questions.

Stampede is one of the most powerful supercomputers in the U.S. for open research, and currently ranks as the seventh most powerful in the world, according to the November 2013 TOP500 List. Able to perform nearly 10 quadrillion floating-point operations per second (nearly 10 petaflops), Stampede is the most capable of the high-performance computing, visualization and data analysis resources within the National Science Foundation's (NSF) Extreme Science and Engineering Discovery Environment (XSEDE).

Stampede went into operation at the Texas Advanced Computing Center (TACC) in January 2013. The system is a cornerstone of NSF's investment in an integrated advanced cyberinfrastructure, which allows America's scientists and engineers to access cutting-edge computational resources, data and expertise to further their research across scientific disciplines.

At any given moment, Stampede is running hundreds of separate applications simultaneously. Approximately 3,400 researchers computed on the system in its first year, working on 1,700 distinct projects. The researchers came from 350 different institutions and their work spanned a range of scientific disciplines from chemistry to economics to artificial intelligence.

These researchers apply to use Stampede through the XSEDE project. Their intended use of Stampede is assessed by a peer review committee that allocates time on the system. Once approved, researchers are provided access to Stampede free of charge and tap into an ecosystem of experts, software, storage, visualization and data analysis resources that make Stampede one of the most productive, comprehensive research environments in the world. Training and educational opportunities are also available to help scientists use Stampede effectively.

"It was a fantastic first year for Stampede and we're really proud of what the system has accomplished," said Dan Stanzione, acting director of TACC. "When we put Stampede together, we were looking for a general purpose architecture that would support everyone in the scientific community. With the achievements of its first year, we showed that was possible."

Helping today, preparing for tomorrow

When NSF released its solicitation for proposals for a new supercomputer to be deployed in 2013, the agency was looking for a system that could support the day-to-day needs of a growing community of computational scientists, while also pushing the field forward by incorporating new and emerging technologies.

"The model that TACC used, incorporating an experimental component embedded in a state-of-the-art usable system, is a very innovative choice and just right for the NSF community of researchers who are focused on both today's and tomorrow's scientific discoveries," said Irene Qualters, division director for Advanced Cyberinfrastructure at NSF. "The results that researchers have achieved in Stampede's first year are a testimony to the system design and its appropriateness for the community."

"We wanted to put an innovative twist on our system and look at the next generation of capabilities," said TACC's Dan Stanzione. "What we came up with is a hybrid system that includes traditional Intel Xeon E5 processors and also has an Intel Xeon Phi card on every node on the system, and a few of them with two.

The Intel Xeon Phi (also known as the "many integrated core," or MIC, coprocessor) squeezes 60 or more processor cores onto a single card. In that respect, it is similar to GPUs (graphics processing units), which have been used for several years to aid parallel processing in high-performance computing systems, as well as to speed up graphics and gaming in home computers. The advantage of the Xeon Phi is its ability to perform calculations quickly while consuming less energy.
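A concrete, if simplified, picture of how codes used the coprocessor in this era: Intel's compilers provided an offload pragma that shipped a parallel region from the host Xeon to the Phi card, where OpenMP spread the work over its many cores. The fragment below is a generic sketch of that programming model (it requires the Intel compilers of that period), not code from any particular Stampede application.

/* Generic sketch of the Intel offload model for Xeon Phi (MIC) coprocessors.
 * Illustrative only; builds with the Intel compilers of that era. */
#include <stdio.h>

#define N 1000000

int main(void)
{
    static float a[N], b[N], c[N];

    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Copy a and b to the coprocessor, run the loop there with OpenMP,
     * and copy c back to the host when the region finishes. */
    #pragma offload target(mic) in(a, b : length(N)) out(c : length(N))
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        c[i] = a[i] + b[i];

    printf("c[42] = %f\n", c[42]);
    return 0;
}

The card could also run codes natively; either way, the key is exposing enough parallelism to keep its roughly 60 simple cores and their wide vector units busy.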

"The Xeon Phi is Intel's approach to changing these power and performance curves by giving us simpler cores with a simpler architecture but a lot more of them in the same size package," Stanzione said

As advanced computing systems grow more powerful, they also consume more energy--a situation that can be addressed by simpler, multicore chips. The Xeon Phi and other comparable technologies are believed to be critical to the effort to advance the field and develop future large-scale supercomputers.

"The exciting part is that MIC and GPU foreshadow what will be on the CPU in the future," Stanzione said. "The work that scientists are putting in now to optimize codes for these processors will pay off. It's not whether you should adopt them; it's whether you want to get a jump on the future. "

Though Xeon Phi adoption on Stampede started slowly, it now represents 10-20 percent of the usage of the system. Among the projects that have taken advantage of the Xeon Phi co-processor are efforts to develop new flu vaccines, simulations of the nucleus of the atom relevant to particle physics and a growing amount of weather forecasting.

Built to handle big data

The power of Stampede reaches beyond computational modeling and simulation. The system's diverse resources can also be used to explore research in fields too complex to describe with equations, such as genomics, neuroscience and the humanities. Stampede's extreme scale and unique technologies enable researchers to process massive quantities of measured data with modern analysis techniques and reach previously unachievable conclusions.

Stampede provides four capabilities that most data problems take advantage of. Leveraging 14 petabytes of high-speed internal storage, users can process massive amounts of independent data on multiple processors at once, reducing the time needed for data analysis or computation.

Researchers can use many data analysis packages optimized to run on Stampede by TACC staff to statistically or visually analyze their results. Staff also collaborates with researchers to improve their software and make it run more efficiently in a high-performance environment.

Data is rich and complex. When the individual data computations become so large that Stampede's primary computing resources cannot handle the load, the system provides users with 16 compute nodes with one terabyte of memory each. This enables researchers to perform complex data analyses using Stampede's diverse and highly flexible computing engine.

Once data has been parsed and analyzed, GPUs can be used remotely to explore data interactively without having to move large amounts of information to less-powerful research computers.

"The Stampede environment provides data researchers with a single system that can easily overcome most of the technological hurdles they face today, allowing them to focus purely on discovering results from their data-driven research," said Niall Gaffney, TACC director of Data Intensive Computing.

Since it was deployed, Stampede has been in high demand. Ninety percent of the compute time on the system goes to researchers with grants from NSF or other federal agencies; the other 10 percent goes to industry partners and discretionary programs.

"The system is utilized all the time--24/7/365," Stanzione said. "We're getting proposals requesting 500 percent of our time. The demand exceeds time allocated by 5-1. The community is hungry to compute."

Stampede will operate through 2017 and will be infused with second generation Intel Xeon Phi cards in 2015.

With a resource like Stampede in the community's hands, great discoveries await.

"Stampede's performance really helped push our simulations to the limit," said Caltech astrophysicist Christian Ott who used the system to study supernovae. "Our research would have been practically impossible without Stampede."

-- Aaron Dubrow, NSF
Investigators
Daniel Stanzione
William Barth
Tommy Minyard
Niall Gaffney
Fuqing Zhang
Roseanna Zia
Christian Ott
Edward Marcotte

Thursday, February 20, 2014

SUPERCOMPUTER SIMULATIONS RECREATE X-RAYS FROM THE REGION AROUND A BLACK HOLE

FROM:  NATIONAL SCIENCE FOUNDATION 
Let there be light
Simulations on NSF-supported supercomputer re-create X-rays emerging from the neighborhood of black holes
February 18, 2014

Black holes may be dark, but the areas around them definitely are not. These dense, spinning behemoths twist up gas and matter just outside their event horizon, and generate heat and energy that gets radiated, in part, as light. And when black holes merge, they produce a bright intergalactic burst that may act as a beacon for their collision.

Astrophysicists became deeply interested in black holes in the 1960s, but the idea of their event horizon was first intimated in a paper by Karl Schwarzschild published after Einstein introduced general relativity in 1915.

Knowledge about black holes--these still-unseen objects--has grown tremendously in recent years. Part of this growth comes from researchers' ability to use detailed numerical models and powerful supercomputers to simulate the complex dynamics near a black hole. This is no trivial matter. Warped spacetime, gas pressure, ionizing radiation, magnetized plasma--the list of phenomena that must be included in an accurate simulation goes on and on.

"It's not something that you want to do with a paper and pencil," said Scott Noble, an astrophysicist at the Rochester Institute of Technology (RIT).

Working with Jeremy Schnittman of Goddard Space Flight Center and Julian Krolik of Johns Hopkins University, Noble and his colleagues created a new tool that predicts the light that an accreting black hole would produce. They did so by modeling how photons hit gas particles in the disk around the black hole (also known as an accretion disk), generating light--specifically light in the X-ray spectrum--and producing signals detected with today's most powerful telescopes.

In their June 2013 paper in the Astrophysical Journal, the researchers presented the results of a new global radiation transport code coupled to a relativistic simulation of an accreting, non-rotating black hole. For the first time, they were able to re-create and explain nearly all the components seen in the X-ray spectra of stellar-mass black holes.

The ability to generate realistic light signals from a black hole simulation is a first and brings with it the possibility of explaining a whole host of observations taken with multiple X-ray satellites during the past 40 years.

"We felt excited and also incredibly lucky, like we'd turned up ten heads in a row," Noble said. "The simulations are very challenging and if you don't get it just right, it won't give you an accurate answer. This was the first time that people have put all of the pieces together from first principles in such a thorough way."

The simulations are the combined results of two computational codes. One, Harm3d, re-creates the three-dimensional dynamics of a black hole accreting gas, including its magnetohydrodynamics (MHD), which charts the interplay of electrically conducting fluids like plasmas and a powerful magnetic field.

"The magnetic field is important in the area outside the black hole because it whips the gas around and can dictate its dynamics," Noble said. "Also, the movement of the field can lead to it kinking and trigger a reconnection event that produces an explosive burst of energy, turning magnetic field energy into heat."

Though the MHD forces are critical near the black hole, it is the X-rays these forces generate that can be observed. The second component, a radiative transport code called Pandurata, simulates what real photons do.

"They bounce around inside the gas, they reflect off the disk's surface, and their wavelengths change along the way," he explained. "Eventually, they reach some distant light collector--a numerically approximated observer--which provides the predicted light output of our simulation."

The researchers' simulations were run on the Ranger supercomputer at the Texas Advanced Computing Center, built with support from the National Science Foundation, which also funded the group's research.

The simulations were the highest resolution thin disk simulations ever performed, with the most points and the smallest length-scales for numerical cells, allowing the researchers to resolve very small features. Varying only the rate at which the black holes accrete gas, they were able to reproduce the wide range of X-ray states seen in observations of most galactic black hole sources.

With each passing year, the significance of black holes--and their role in shaping the cosmos--grows.

Nearly every good-sized galaxy has a supermassive black hole at its center, said Julian Krolik, a professor of physics and astronomy at Johns Hopkins University. For periods of a few to tens of millions of years at a time, these black holes accrete incredible amounts of gas, ultimately releasing huge amounts of energy--as much as a hundred times the power output of all the stars in the black hole's host galaxy put together.

"Some of that energy can travel out into their surrounding galaxies as ionizing light or fast-moving jets of ionized gas," Krolik continued. "As a result, so much heat can be deposited in the gas orbiting around in those galaxies that it dramatically alters the way they make new stars. It's widely thought that processes like this are largely responsible for regulating how many stars big galaxies hold."

In this way black holes may act as cosmic regulators--all the more reason to use numerical simulations to uncover further clues about how black holes interact with gas, stars and other supermassive black holes.

Said Noble: "To see that it works and reproduces the observational data when the observational data is so complicated...it's really remarkable."

-- Aaron Dubrow, NSF
Investigators
Scott Noble
Julian Krolik
Jeremy Schnittman
John Boisseau
Karl Schulz
Omar Ghattas
Tommy Minyard
Yosef Zlochower
Manuela Campanelli
Related Institutions/Organizations
Rochester Institute of Technology
Johns Hopkins University
Goddard Space Flight Center
University of Texas at Austin
