
21st-century Earth science is computer intensive and data driven

The satellite and supercomputer are the rock hammer and compass of modern geoscientists, whose research runs the gamut from climate change projections and earthquake simulations to the optimization of energy resources.

Greg Beroza showing off earthquake data. (Photo credit: Stacy Geiken)

While there are geologists who specialize in traditional fieldwork, today’s Earth scientists (of which geologists, or geoscientists, are a subset) are a varied bunch whose scientific scope encompasses everything from predicting and mitigating the effects of human-caused climate change to designing defenses against natural disasters and creating a path toward a renewable energy future.

The methods employed by 21st-century Earth scientists are no less diverse: Nowadays, the scientists in Stanford’s School of Earth, Energy & Environmental Sciences (Stanford Earth) are as likely to be found in front of an electronic screen, analyzing torrents of remote-sensing data with algorithms or simulating nature with computer models, as they are to be drilling ice cores in Antarctica or gathering soil samples from mountains in Mongolia.

“The satellite and supercomputer are the rock hammer and compass of the modern geoscientist,” says Dustin Schroeder, an assistant professor of geophysics at Stanford Earth.

Advanced computing techniques have completely transformed the Earth sciences, affecting not only how the field’s scientists and students acquire their data, but also how that data is processed, modeled, and analyzed. “We finally have the ability to explore Earth in its real complexity,” says Eric Dunham, an associate professor of geophysics.

Dunham’s lab specializes in developing equations to describe physical processes such as seismic wave propagation through the Earth and solving them with the help of powerful computers. “We either make a prediction about how a system behaves, or we are motivated by some real-world observation and then try to tweak our computer simulations to better match the data. In the process, we learn something about Earth,” Dunham says.
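Dunham’s research codes are far more sophisticated, but the general recipe (write down a governing equation, discretize it, then step it forward in time on a computer) can be sketched with a toy finite-difference solver for the one-dimensional wave equation. The Python below is purely illustrative and is not the lab’s code; all of the parameter values are arbitrary.

```python
import numpy as np

# Toy illustration: second-order finite differences for the 1D scalar wave
# equation u_tt = c^2 u_xx with fixed (zero) ends.
nx, nt = 201, 500            # grid points in space, number of time steps
c, dx = 3000.0, 10.0         # wave speed (m/s) and grid spacing (m)
dt = 0.5 * dx / c            # time step chosen to satisfy the CFL stability limit

x = np.linspace(0.0, (nx - 1) * dx, nx)
u_prev = np.exp(-((x - x.mean()) / 50.0) ** 2)   # Gaussian initial pulse
u_curr = u_prev.copy()                           # zero initial velocity

r2 = (c * dt / dx) ** 2
for _ in range(nt):
    u_next = np.zeros_like(u_curr)
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    u_prev, u_curr = u_curr, u_next              # advance one time step

print("peak amplitude after propagation:", round(float(u_curr.max()), 3))
```

Production seismic simulations of the kind Dunham describes work on the same principle, but in three dimensions, with realistic Earth structure, and across thousands of processors.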


Jenny Suckale’s research also revolves around translating natural phenomena into equations that can be manipulated and modeled to yield hidden truths about the world. Her lab is currently trying to explain puzzling satellite imagery that shows Antarctica is melting from the inside out. “Ice from the continent’s interior forms streams that flow out to sea,” says Suckale, an assistant professor of geophysics. “These ice streams are almost like arteries that help transport ice from the center of Antarctica to the coast very rapidly. It’s like a bleeding heart in some sense.”

Suckale hypothesizes that the unusual melting being observed is due to weaknesses along the sides, or margins, of some of those arteries. “It’s not the same story everywhere,” Suckale says. “The margins are strong in some places and very weak in others. Weak margins allow the ice to flow faster. They’re like handrails on a water slide — if the handrails are very slippery, they’re not going to do anything for you.”

To better understand how margin strength affects ice stream flow in Antarctica, Suckale reduces the problem to a well-known equation for describing fluid flow, the Navier-Stokes equation, which she then solves with the help of computers. “There’s a lot of subtle complexity to this equation,” Suckale says. “It describes pretty much everything that flows. It describes why planes fly, why fish can swim, why volcanoes erupt, why tsunamis charge up onto shore. It describes sand storms in Africa and, as we’re showing, ice motion in Antarctica.”
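For reference, the incompressible Navier-Stokes equations couple a momentum balance with mass conservation; here u is the velocity field, p the pressure, ρ the density, μ the viscosity, and f body forces such as gravity. (Ice-sheet models typically use a simplified, very viscous limit of these equations, since ice flows far too slowly for inertia to matter.)

```latex
% Incompressible Navier-Stokes: momentum balance and mass conservation
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0
```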

Influencing next-generation hardware

According to Hamdi Tchelepi, co-director of Stanford’s Center for Computational Earth & Environmental Science (CEES), Earth scientists’ use of computing resources trails only that of the defense industry and the internet sector, and their computing needs can influence the designs of next-generation hardware. “Earth science is about geology, but it’s also about the flow of air, water, oil, gas, and heat. That’s a lot of physics that requires extensive computing resources to model,” says Tchelepi, who is also a Stanford Earth professor of energy resources engineering.

Tchelepi’s colleague Greg Beroza coordinates a research effort that uses supercomputers at the National Science Foundation and the Department of Energy to simulate ground motion for major earthquakes in Southern California. “Seismologists use their very largest computers at capacity,” says geophysicist Beroza, who is the Wayne Loel Professor at Stanford. “We require all of that computing power to carry out the large-scale three-dimensional wave propagation simulations that we’re using to push forward seismic hazard analysis.”

Climate scientist Noah Diffenbaugh’s research involves simulating natural processes on a global scale. His group uses supercomputers to model the complex interactions between air, water, and land that control Earth’s climate, and his team’s research has shown that soaring atmospheric CO2 levels are boosting the likelihood and severity of extreme weather events such as droughts, monsoons, and hurricanes. “It would be great if we could put the climate system in a lab and manipulate it in a controlled way. Unfortunately, we can’t do that with the entire planet,” says Diffenbaugh, a professor of Earth system science. “What we can do with the models is simulate the climate that we have now, and also simulate the climate from the preindustrial era to remove the influence of humans from the atmosphere.”
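The group’s actual analyses rest on large model ensembles and careful statistics, but the core counterfactual logic can be sketched by comparing how often an extreme threshold is exceeded in a simulated world with human influence versus one without. The numbers below are made-up placeholders, not real model output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up stand-ins for climate model output: annual temperature anomalies (degC)
# from a "preindustrial" ensemble and a "present-day" ensemble.
preindustrial = rng.normal(loc=0.0, scale=1.0, size=10_000)
present_day = rng.normal(loc=1.0, scale=1.1, size=10_000)

threshold = 2.5  # an arbitrary "extreme heat" threshold, for illustration only

# Probability of exceeding the threshold in each simulated world, and the ratio,
# i.e. how much more likely the extreme becomes once human influence is included.
p_pre = (preindustrial > threshold).mean()
p_now = (present_day > threshold).mean()
print(f"P(extreme), preindustrial: {p_pre:.4f}")
print(f"P(extreme), present-day:   {p_now:.4f}")
print(f"probability ratio:         {p_now / p_pre:.1f}x")
```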

Making sense of the data deluge

Today’s Earth scientists must grapple with a deluge of information streaming in from myriad sources, including satellites, aircraft-based sensors, and ground-based arrays. In order to weave all of these complex data streams together to answer important scientific questions and inform policymakers, Stanford Earth researchers draw upon and develop expertise in digital analytics techniques across engineering, computer sciences, and mathematics.


When Beroza was inspired by the popular song-matching app Shazam to create an algorithm that could quickly scan large ground motion databases for microearthquakes with similar shake signatures, or “waveforms,” he assembled a team of programmers, computer scientists, and mathematicians. “In the past decade or so, one of the major trends in seismology has been the use of waveform similarity to find weakly recorded earthquakes,” Beroza says. “We’ve learned amazing things from doing that.”
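The group’s production tools are built for far larger data volumes, but the underlying notion of waveform similarity can be illustrated with a sliding normalized cross-correlation between a short template and a continuous record. The signals below are synthetic placeholders, not real seismic data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: a short "template" waveform and a longer continuous
# record containing a buried, weaker copy of it plus noise.
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
record = rng.normal(scale=0.5, size=5000)
record[3000:3100] += 0.8 * template        # hide a weak repeat of the "event"

def normalized_cross_correlation(data, tmpl):
    """Slide tmpl along data and return the correlation coefficient at each lag."""
    n = len(tmpl)
    tmpl = (tmpl - tmpl.mean()) / tmpl.std()
    scores = np.empty(len(data) - n + 1)
    for i in range(len(scores)):
        window = data[i:i + n]
        window = (window - window.mean()) / window.std()
        scores[i] = np.dot(window, tmpl) / n
    return scores

cc = normalized_cross_correlation(record, template)
print("best match at sample", int(cc.argmax()), "with correlation", round(float(cc.max()), 2))
```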

While microquakes don’t threaten buildings or people, monitoring them could help scientists predict how frequently, and where, larger quakes are likely to occur. Beroza’s group initially planned to tackle the problem in a brute-force fashion by using high-performance supercomputers to compare thousands of waveforms simultaneously. “But we soon realized that even they wouldn’t be able to handle the amount of data we wanted to process,” says Ossian O’Reilly, a Stanford Earth postdoctoral researcher who specializes in numerical analysis and high-performance computing for geophysical applications.

“So we started learning about the ingenious algorithms devised by the computer science community for solving related problems,” O’Reilly says. “What excited me most was the process of discovering, tweaking, and applying these algorithms to our own particular problem.”
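One family of such algorithms is locality-sensitive hashing, which compresses each waveform into a short binary fingerprint so that similar waveforms tend to land in the same hash bucket; only those candidates then need a detailed comparison. The sketch below is a toy version of the idea, not the group’s actual implementation.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(2)

# Toy dataset: 1,000 random "waveforms," with two near-duplicates of waveform 0.
n_samples, length = 1000, 128
waveforms = rng.normal(size=(n_samples, length))
waveforms[1] = waveforms[0] + 0.02 * rng.normal(size=length)
waveforms[2] = waveforms[0] + 0.02 * rng.normal(size=length)

def fingerprint(waveform, projections):
    """Binary fingerprint: the signs of a few random projections of the waveform."""
    return tuple((projections @ waveform > 0).astype(int))

# Several independent hash tables make it very likely that near-duplicates
# collide in at least one of them.
n_tables, n_bits = 4, 8
tables = []
for _ in range(n_tables):
    projections = rng.normal(size=(n_bits, length))
    buckets = defaultdict(list)
    for idx, w in enumerate(waveforms):
        buckets[fingerprint(w, projections)].append(idx)
    tables.append((projections, buckets))

# Candidate matches for waveform 0: everything sharing a bucket with it in any
# table.  Only these few candidates, not all 1,000 waveforms, get the expensive
# detailed comparison.
candidates = set()
for projections, buckets in tables:
    candidates.update(buckets[fingerprint(waveforms[0], projections)])
print("candidate matches for waveform 0:", sorted(candidates))
```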

O’Reilly is a postdoc in Dunham’s lab, which consists of two different, but complementary, types of scientists. “We have this great synergy in the group between people who are more interested in applications and solving real Earth science problems, and the people who are more interested in the numerical algorithms and writing code,” Dunham says. “When you put those two groups together, you can make a lot of progress.”

To attract more students like O’Reilly, Stanford Earth recently launched a big data introductory course called Know Your Planet: Big Earth that teaches students how to harness the power of massive sets of information from archives, satellites, smartphones, and remote-sensing devices to help solve environmental and sustainability challenges. “These are data analytical skills in high demand across many disciplines and jobs today,” says Margot Gerritsen, an associate professor of energy resources engineering and one of the course’s co-lecturers.

Invested in cutting-edge computing

To keep up with the ever-increasing technology demands of its researchers, Stanford Earth has invested heavily in high-performance computing (HPC) resources that are many times more powerful than personal desktop or laptop machines.

Whereas a typical laptop might contain a single central processing unit (CPU) with two or four “cores,” an HPC cluster can collectively house thousands of CPU cores that can be programmed to run in parallel, vastly increasing their computing power. While the university maintains a central HPC cluster, dubbed “Sherlock,” as a general resource for Stanford researchers, Stanford Earth faculty also have access to a separate HPC cluster as part of CEES. “Stanford Earth stands out as one of the more computationally intensive schools at Stanford,” says John Freshwaters, Stanford Earth’s chief information officer, “and we fine-tune networking, computer architectures, data access, and algorithm design so that everything works efficiently together.”
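Real cluster jobs typically span many nodes using tools such as MPI, but even on a single machine the payoff of parallelism is easy to demonstrate. The generic Python sketch below, which is not specific to Sherlock or CEES, uses the standard multiprocessing module to spread independent tasks across all available cores.

```python
from multiprocessing import Pool, cpu_count
import math

def simulate_one(seed):
    """Stand-in for one independent model run or data-processing task."""
    total = 0.0
    for i in range(1, 200_000):
        total += math.sin(seed * i) / i
    return total

if __name__ == "__main__":
    tasks = range(32)                            # e.g. 32 independent parameter sets
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(simulate_one, tasks)  # tasks run in parallel across cores
    print(f"finished {len(results)} runs using {cpu_count()} cores")
```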

Early adopters

Earth scientists have always kept a close eye on advances coming out of the computer and electronics industries, and they tend to be enthusiastic early adopters of new technologies that are applicable to their research. For example, Schroeder’s lab has taken the big data approach that tech giants such as Facebook, Google, and IBM employ to work with large and ever-growing volumes of data and applied it to the field of glaciology. His team works with ice-penetrating radar data gathered from aerial surveys over Antarctica and Greenland and from satellites orbiting icy moons such as Jupiter’s Europa.

Collecting radar sounding data from these places is extremely expensive and logistically difficult, so scientists want to extract as much information as possible out of the data. “What is unique about our approach is that we are not just interested in the images produced and the interpretations they enable, but also in the data itself,” Schroeder says. “We work on the instruments that make the measurement; the experiments which collect the data; the signal-processing algorithms that make it usable; and the analysis approaches that enable geophysical inferences about the ice sheet.”
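One standard step in radar sounding is pulse compression: the long transmitted chirp is matched-filtered against the received echo so that its energy collapses into a sharp peak at the reflector’s delay. The synthetic sketch below illustrates the idea; it is not Schroeder’s processing code, and every parameter is arbitrary.

```python
import numpy as np

# Synthetic pulse compression for a radar sounder (illustrative only).
fs = 50e6                                   # sample rate (Hz)
duration = 10e-6                            # chirp length (s)
t = np.arange(0, duration, 1 / fs)
f0, f1 = 2e6, 8e6                           # chirp sweeps from f0 to f1 (Hz)
chirp = np.cos(2 * np.pi * (f0 * t + (f1 - f0) / (2 * duration) * t**2))

# Received trace: the chirp delayed (an echo from a subsurface reflector),
# attenuated, and buried in noise.
rng = np.random.default_rng(3)
trace = rng.normal(scale=0.5, size=4000)
delay = 1500
trace[delay:delay + len(chirp)] += 0.3 * chirp

# Matched filter: correlate the trace with the transmitted chirp.  The echo
# collapses into a sharp peak at the delay sample.
compressed = np.correlate(trace, chirp, mode="valid")
print("echo detected at sample", int(np.abs(compressed).argmax()))
```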

Schroeder’s chosen approach requires a tremendous amount of computational resources, but he says it enables his team to extract a much richer array of information from the data than traditional radio glaciological data analysis techniques. “I also happen to think that it’s a lot more fun and interesting,” Schroeder says.

Stanford Earth scientists David Lobell and Marshall Burke recently applied machine learning — the science of designing computer algorithms that learn from data — to high-resolution satellite imagery in order to estimate crop yields and measure the poverty levels of African villages from space. “What I’m excited about is that this satellite imagery and this new source of big data allow us unparalleled opportunities to learn about parts of the world where we otherwise have very little data,” says Burke, an assistant professor of Earth system science and a fellow at Stanford’s Center on Food Security and the Environment. “Our hope is that this new data source will allow us to make breakthroughs that help alleviate global poverty.”
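In broad strokes, approaches like this pair features extracted from satellite imagery with ground-truth measurements from field surveys. A pared-down, hypothetical version of the final modeling step might fit a regularized regression from image-derived features to a measured outcome, as in the sketch below (the data are random placeholders, and scikit-learn is assumed to be installed).

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Hypothetical stand-ins: feature vectors derived from satellite images of 500
# villages (e.g. outputs of a pretrained image model) and a measured outcome
# such as crop yield or an asset-based wealth index from household surveys.
n_villages, n_features = 500, 64
X = rng.normal(size=(n_villages, n_features))
true_weights = rng.normal(size=n_features)
y = X @ true_weights + rng.normal(scale=2.0, size=n_villages)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = Ridge(alpha=1.0).fit(X_train, y_train)   # regularized linear regression
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```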

Lobell, a professor of Earth system science, looks forward to a future where data will be cheaper and even more plentiful. “In the past, if you wanted to monitor a small farm from space, you needed a school bus-sized satellite. But over time, the satellites have gotten smaller and smaller, so from a school bus size, they’ve shrunk to the size of a refrigerator, and now some of the new CubeSats are the size of shoe boxes,” he says. “As we can train our algorithms to use cheaper data like pictures from cellphones, CubeSats, and drones, we can generate a lot more data to do science with.”

The speed at which that data can be shuttled around is also skyrocketing, and that will have major consequences for Earth scientists, says Stace Maples, manager of Stanford’s Geospatial Center, which provides data visualization and spatial analysis support for many Stanford Earth faculty. “As the time it takes imagery to move from acquisition to analysis platforms like Google’s Earth Engine shrinks from weeks to days to hours, that information becomes ‘actionable intelligence,’” Maples says, “capable of being used to stop illegal activities such as fishing in restricted areas, or clearcutting rainforest for farming and ranching.”

Yet, despite the growing importance of technology in the Earth sciences, Beroza thinks there will always be a place for traditional “boots-on-the-ground” fieldwork — albeit with some high-tech upgrades. “Nowadays, laser-ranging LIDAR [light detection and ranging] is used to see through vegetation to map landscapes and earthquake faults, and to find places where real focused fieldwork will be most fruitful,” Beroza says. “So there is a really healthy synergy between high technology and traditional techniques.”
