Can Science Outwit Storms Like Katrina?

2007.05.29. 12:03 oliverhannak

Stand atop any levee in the New Orleans area, and one question will offer itself, unbidden, to the mind: Is this pile of dirt tall enough to stand up to the next storm?

The answer is complex, and a wary city has been waiting to hear it. After the New Orleans hurricane protection system failed under the onslaught of Hurricanes Katrina and Rita, the Army Corps of Engineers rethought the way it assesses hurricane risk. It devised new, flexible computer models and ran countless simulations on Defense Department supercomputers to help it understand what kind of storms the region can expect, how the current protection system might perform against them, and what defenses will be needed in the future.

Skeptics say the corps has bitten off more than its supercomputers can chew. And in fact, the effort to produce what the corps calls its risk and reliability report has long passed its original deadline of June 1, 2006. Last week, its publication was delayed yet again, into mid-June.

If all goes as planned, the color-coded maps and tables in the report will also help residents know whether their living rooms are likely to be wet or dry in the storms to come — and even whether they want to commit to staying in their city or pull up stakes.

Ed Link, director of the official corps investigation into the levee failures and a professor of engineering at the University of Maryland, said that while “everyone’s frustrated with how long it’s taken, especially us,” the agency would deliver the report only “when we have confidence that it’s the right information.”

“Misinformation is a whole lot worse than no information,” Dr. Link said.

The new methods employed by the corps have already been adopted by the other government agencies most interested in hurricanes and flooding: the National Oceanic and Atmospheric Administration and the Federal Emergency Management Agency, which runs the National Flood Insurance Program and will use the data in its flood maps.

It is the first time all three agencies have agreed on a common method of assessing such risks for the Gulf Coast. Eventually, Dr. Link suggested, it may be used by all of them nationwide.

The corps has all but completed the initial work of patching the damage done to the network of levees, floodwalls, gates and pumps. But as some work continues, there has been a lull in moving on to the next major step: raising the level of protection to meet the challenge of a 100-year storm, the kind of hurricane that might have a 1-in-100 chance of occurring in any given year.
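A quick aside on that arithmetic: a 100-year storm is not a storm that arrives once a century on schedule. With a 1-in-100 chance each year, the odds of seeing at least one such storm accumulate quickly over a longer horizon. A minimal sketch in Python, using only the 1-in-100 figure above (the horizons chosen are illustrative):

    # Toy arithmetic only: a "100-year storm" has a 1-in-100 chance of
    # occurring in any given year, so the chance of seeing at least one
    # grows with the horizon.
    p_annual = 1 / 100
    for years in (10, 30, 50):
        p_at_least_one = 1 - (1 - p_annual) ** years
        print(f"over {years} years: {p_at_least_one:.0%} chance of at least one such storm")

Over a 30-year mortgage that works out to roughly a one-in-four chance, which is one reason the report's color-coded maps matter to residents deciding whether to stay.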

The new report will provide estimates of the kinds of storms that could be expected at intervals of up to 500 years, and of the damage and flooding they can be expected to produce. It will also help in developing proposals for protection systems against the strongest storms. The report is an important prelude to the design process.

Walter O. Baumy Jr., the chief of the engineering division for the New Orleans district of the corps, said the agency had not been sitting around waiting for the report, but had been designing structures for the next round of building based on the best estimates available. If necessary, he went on, the designs will be altered “when those numbers get finalized.”

Col. Jeffrey A. Bedey, commander of the corps’ Hurricane Protection Office in New Orleans, said the extra time had allowed the corps to learn from the mistakes of the past. “There’s a tremendous opportunity,” he said, “to show this isn’t the same old corps.”

No one disputes that the old way of doing business did not work. The New Orleans levees and floodwalls were built to withstand a hypothetical storm called the standard project hurricane, a model developed with the Weather Bureau beginning in 1959 and based largely on data drawn from previous storms.

The standard project hurricane was a hypothetical construct that may have been the state of the art at the time, but is “very simplistic” by today’s standards, Dr. Link said. The old model was limited by the shortage of data on older storms and is essentially a static set of values, Dr. Link said, adding, “You pull a hurricane out of a box and you stick it down at landfall.”

That did not show the complex behavior of a real storm, which produces surge and waves that have profound effects on coastal areas in the days before it actually hits land.

After Hurricane Katrina, an outside panel from the American Society of Civil Engineers pushed the corps to come up with a new system based less on history than on the broader range of statistical probabilities — the kind of tools commonly used today to determine, for example, how a building will fare in an earthquake.

“Without a statistically valid approach” to evaluating risk that goes beyond the old methods, the panel wrote to the corps in March 2006, “no rational hurricane protection system can result.” The chairman, David E. Daniel, president of the University of Texas at Dallas, said the effort “may well represent the most complex risk analysis ever undertaken for a major metropolitan area.”

The work has been done under the supervision of Donald T. Resio at the corps’ Engineer Research and Development Center in Vicksburg, Miss. Dr. Resio, a computer scientist with a sunny, professional air, works in the ephemeral world of code and screens in a place that has its roots in soil, steel and concrete.

He had been working on long-term risk assessment projects with NOAA and FEMA and academic and industry experts when the storms came, and he approached the leadership of the corps and offered to adapt the work his team had been doing to the urgent task at hand. “I went to headquarters,” he recalled, “and said, ‘If you want this done right, this is how it can be done.’ ”

He recalled saying something else as well: “I told them, ‘It’s difficult; it’s tedious.’ ”

Since then, he said, “we’ve been burning 80-, 100-hour weeks for so long we’ve lost track of them.”

The model uses chains of computer programs, each feeding its output into the next, drawing on data from historical storms and hypothetical future storms to set up ranges of intensity, size, path, forward speed and other variables. These factors, blended together, produce data on the kinds of wind, storm surge and waves that can be expected to strike the shore, and how much rain can be expected. All of those elements of the model are then applied to digital re-creations of the local geography and the man-made structures of the region.
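The article does not, of course, publish the corps' code, but the chained structure it describes can be sketched in miniature: draw synthetic storms from parameter ranges, push each one through simplified surge and wave stages, and count how often a given levee height is exceeded. Everything below, from the function names to the numbers, is an invented toy for illustration, not the corps' actual model:

    import random

    def sample_storm():
        # Draw one synthetic storm from illustrative parameter ranges.
        return {
            "central_pressure_mb": random.uniform(900, 980),  # lower = stronger
            "radius_km": random.uniform(20, 120),             # size matters, not just intensity
            "track_offset_km": random.uniform(-150, 150),     # landfall distance from the city
        }

    def surge_model(storm):
        # Toy surge stage: grows with intensity and size, decays as the
        # track moves away from the point of interest.
        intensity = (980 - storm["central_pressure_mb"]) / 10.0
        size = storm["radius_km"] / 40.0
        proximity = max(0.0, 1.0 - abs(storm["track_offset_km"]) / 200.0)
        return (intensity + size) * proximity  # feet, in toy units

    def wave_model(surge_ft):
        # Toy wave stage: waves riding on the surge add a fraction of its height.
        return 0.25 * surge_ft

    levee_height_ft = 10.0
    n_storms = 100_000
    overtopped = 0
    for _ in range(n_storms):
        storm = sample_storm()
        surge = surge_model(storm)
        if surge + wave_model(surge) > levee_height_ft:
            overtopped += 1

    print(f"synthetic storms overtopping a {levee_height_ft} ft levee: {overtopped / n_storms:.2%}")

The real system chains many more stages, wind fields, rainfall and interior drainage among them, which is why the corps needed Defense Department supercomputers rather than a laptop loop like this one.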

Now, Dr. Resio said, “we have come miles this year in our understanding of hurricane behavior and hurricane probability in the Gulf of Mexico.”

Dr. Link said the group had found, for example, that storms in the Gulf do not behave randomly and come to shore anywhere, but are likeliest to follow certain sets of paths — with New Orleans getting more than a random share of storms. And though much hurricane analysis relies on the handy Saffir-Simpson scale of storm strength, the size of a storm can be as important as its intensity in producing surge and waves. Despite its destructiveness, Katrina was just a Category 3 storm when it hit Mississippi, with winds that hit New Orleans only in the range of Category 1 or 2, but because it was more powerful when it was farther out in the Gulf, it generated a battering surge near 30 feet in some areas.

Most important, Dr. Link said, the new method lays the groundwork for finding the best way to protect a coast, and moves away from the old habit of evaluating projects largely on the basis of cost-benefit analysis. He added that the model was flexible enough to incorporate new information in the future, as global climate change affects hurricanes.

The biggest remaining question about the effort, however, is whether it will be accurate. Much of the delay has come down to attempts to reduce the levels of uncertainty throughout the series of calculations.
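One concrete layer of that uncertainty is sampling error: a rare exceedance probability estimated by simulation converges only as the square root of the number of runs. A hedged illustration, assuming a true annual exceedance probability of 1 percent:

    import math

    # Illustrative only: the 95% confidence interval of a Monte Carlo
    # estimate of a 1% probability narrows slowly as runs are added.
    p_true = 0.01
    for n in (1_000, 10_000, 100_000, 1_000_000):
        stderr = math.sqrt(p_true * (1 - p_true) / n)
        print(f"n={n:>9,}: 1.00% +/- {100 * 1.96 * stderr:.2f} percentage points")

And sampling error is only one layer; the calculations must also carry the uncertainty in the storm statistics and in the physical models themselves.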

Robert S. Young, a coastal geologist at Western Carolina University, said the project, while worthy, would not succeed because there is still precious little data on the behavior of storms in the Gulf. Given the variability of nature, he said, devising a predictive model “is nearly impossible.”

“All of the modeling and prediction they’re doing is just guesswork,” he went on. “I’m not sure that is better than nothing.”

Orrin H. Pilkey, an emeritus professor of geology at Duke University, said “undue confidence in these models” could lead to a false sense of security about the hurricane protection system.

Dr. Link acknowledges that it is hard to come up with meaningful data but says it is important to try.

“You can wring your hands and say, ‘Woe is me!,’ ” he said, or “use the best projections that you can and try to inform yourself.”

The dispute is familiar to anyone who works in the world of complex computer modeling. Francine Berman, director of the San Diego Supercomputer Center, said that to provide truly useful information that can be validated, the data must be rich, the model accurate and the computer powerful.

“At the end of the day, a bad model is just interesting math,” she said. “These complex models are incredibly hard to get right. The first time out, it’s very unlikely that you’re going to get an accurate enough model, but you have to start with the best representational model you can come up with and iteratively improve it over years, decades — even centuries.”
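The iterative loop Dr. Berman describes can be pictured as a hindcast comparison: run the model on a past storm, measure its error against surveyed high-water marks, and feed the result back into the next calibration. The figures below are invented placeholders, not Katrina data:

    # Hypothetical validation step: compare hindcast surge to observed
    # high-water marks at the same sites and compute the RMS error.
    observed_ft = [12.1, 15.4, 9.8, 17.2]
    hindcast_ft = [11.0, 16.1, 10.5, 15.9]

    rmse = (sum((o - h) ** 2 for o, h in zip(observed_ft, hindcast_ft))
            / len(observed_ft)) ** 0.5
    print(f"hindcast RMSE: {rmse:.2f} ft")  # drives the next model revision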

An enormous amount of the recent work of the risk and reliability team has been translating the data into a form that the average person can understand. Donald E. Powell, President Bush’s coordinator for Gulf Coast rebuilding, said in an interview that he tells corps officials, “Put it in language my mama can understand.”

But the ultimate message is likely to be that despite the billions of dollars being spent to improve hurricane protection for New Orleans, it remains a city in the cross hairs for dangerous storms. Dr. Daniel, the University of Texas at Dallas president, said, “It doesn’t take a sophisticated risk analysis tool to say it’s a risky place.”
