Wednesday, November 12, 2008

Little Osage Creek

My main work for over a month has been the flood study for the headwaters of Little Osage Creek in Centerton, Arkansas. I'm way behind on the project and soon to be over budget (in part because the budget also covers a flood study of the headwaters of McKisic Creek, yet it is already 80 percent consumed). I let this project sit almost a year, then got a promise from another department in the company to piggy-back it on one of their projects along the West Branch of Little Osage Creek. Somehow they didn't do it, and now say they never claimed they intended to. So I had to take it back into my workload, dust off my "tools", and get the job done.

As of today, I am really, really close. I have completed a complicated computer model for calculating the flood flows. This model includes 18 detention ponds, some 21 subbasins, and a number of stream runs (called "reaches" in the vernacular). I have run this model for the 100-year storm and gotten believable results, in line with what I expected. Next, in a separate program, I merged the existing FEMA stream geometry model with the CEI stream geometry model (the one my project was supposed to piggy-back on), filled in the 600-foot gap between them, corrected a considerable number of errors in both, added missing culverts, and added cross-sections that were left out of the FEMA model (omissions that should have kept that model from ever being approved). Finally, today, I ran the stream geometry model successfully with the flood flows from the other model. I have an answer!
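For anyone curious what the flood-flow model is actually doing with those detention ponds: each pond is routed by a mass balance, where the change in storage over a time step equals inflow minus outflow, and the outflow depends on how full the pond is. The sketch below (in Python, with every number invented purely for illustration) shows the idea; HEC-HMS uses the more careful storage-indication routing rather than this crude step-by-step balance, but the principle is the same.

```python
# Toy level-pool routing of one detention pond. All numbers are hypothetical,
# chosen only to show how a pond knocks the peak off an inflow hydrograph.
import numpy as np

# Hypothetical storage-outflow relationship for the pond's outlet structure
storage = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 40.0])      # acre-feet
outflow = np.array([0.0, 15.0, 40.0, 90.0, 160.0, 250.0])  # cfs

# Hypothetical triangular inflow hydrograph on 15-minute steps
dt_hr = 0.25
inflow = np.concatenate([np.linspace(0, 300, 8), np.linspace(300, 0, 16)])  # cfs

ACFT_PER_CFS_HR = 3600.0 / 43560.0   # one cfs flowing for one hour, in acre-feet

S, O, peak_out = 0.0, 0.0, 0.0
for I in inflow:
    # mass balance: storage change = (inflow - outflow) * time step
    S = max(S + (I - O) * dt_hr * ACFT_PER_CFS_HR, 0.0)
    O = float(np.interp(S, storage, outflow))   # outflow for the current storage
    peak_out = max(peak_out, O)

print(f"Peak inflow {max(inflow):.0f} cfs comes out of the pond at roughly {peak_out:.0f} cfs")
```

Multiply that by 18 ponds and 21 subbasins, link the pieces together with routing reaches, and you have roughly the shape of the flood-flow model.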

I still have some tweaking to do. The results (i.e. the flood elevation, which drives the spread of the flood waters) are not as favorable as I would like, so I have to look at the merged geometry some more and see if anything in it is too conservative. If so, I might be able to reduce both the elevation and the spread. If not, the City will just have to live with the calculated flood elevation and spread.

When I started the study I had some goals in mind for improving the existing flood maps, which contained obvious errors along with some things that had changed drastically from the previous flood map and may or may not have been errors. One of the obvious errors will be gone on the new map; the other looks like it will remain. That's not a final answer, but it's the most likely one.

Why am I writing this? Today there has been an emotional release with the success of these models. I still have much to do on the project, including model tweaks, calculating flows for other storms, writing a technical report, making a presentation before an unhappy city council, putting an electronic and print submittal together for FEMA, and then working through the (probable) six month FEMA approval process. But with the successful model having all missing links plugged, no obvious errors, and believable results, the release is at hand.

4 comments:

Anonymous said...

Dave, how do you verify the models? Or more specifically, how do you calculate the margin of error? People tend to look for a final number and ignore the uncertainties. The report on the bridge in MN that collapsed a couple of years ago cited undersized gusset plates and overloading with new construction materials as leading to the disaster. Surely this bridge was modeled, but nobody seems to have paid attention to the results. There appears to have been no effort to quantify or at least publicize the structural limitations.

So, final question: how do you communicate the risks and probabilities to people who only want to know, "Will the countryside flood while I'm still in office?"

-Gary

David A. Todd said...

Gary:

I never did see the final report on the Minnesota bridge, though I thought a large part of the failure was maintenance.

We are not able to verify the model through calibration. This is much too small a project for that (only 3,800 feet of urban stream, though most of the western tributary area is undeveloped). Verifying that an accurate model has been put into the computer is a matter of checking upon checking upon checking: making sure I have no typos; making sure I have accurately stated the length of the creek channel, left overbank, and right overbank; making sure I have used parameters that accurately define the drag of the flood plain against free flow; making sure that each culvert is accurately defined by survey (or occasionally by approximate measurements). It is then a matter of trusting the developers of the computer program to have put together mathematical routines that will give an accurate flood elevation. We really have no other way of verifying it short of field calibration, and when you are modeling the 100-year storm, field calibration is hard to come by.
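To give a feel for why those floodplain drag parameters matter so much, here is a toy normal-depth calculation with Manning's equation for a made-up rectangular channel. HEC-RAS actually does something more elaborate (a standard-step energy balance across the surveyed cross-sections), but the sensitivity to the roughness value is the same idea: get the roughness wrong and the computed flood elevation moves by feet.

```python
# Toy illustration of roughness sensitivity using Manning's equation.
# Channel width, slope, and flow are all hypothetical.

def normal_depth(Q, b, S, n, tol=1e-4):
    """Depth y (ft) at which Manning's equation gives the target flow Q (cfs)."""
    def manning_q(y):
        area = b * y                      # flow area, sq ft
        radius = area / (b + 2.0 * y)     # hydraulic radius, ft
        return (1.49 / n) * area * radius ** (2.0 / 3.0) * S ** 0.5
    lo, hi = 0.01, 100.0
    while hi - lo > tol:                  # bisection: manning_q increases with depth
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if manning_q(mid) < Q else (lo, mid)
    return 0.5 * (lo + hi)

Q, b, S = 2500.0, 80.0, 0.002    # flow (cfs), bottom width (ft), slope (ft/ft) -- invented
for n in (0.035, 0.050, 0.070):  # clean channel versus brushy, overgrown floodplain
    print(f"n = {n:.3f}  ->  depth of flow about {normal_depth(Q, b, S, n):.1f} ft")
```

A couple of feet of water surface can be the difference between a house being in or out of the mapped floodplain, which is why so much of the checking is about those parameters.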

I will prepare a submittal to FEMA, asking them to change the flood map according to my analysis. In addition, I will prepare an engineering report presenting my findings and making recommendations to the City. One huge uncertainty for the future is how that western basin will develop. The City needs to make sure it develops in a way such that downstream flooding will not get worse. I'm going to recommend some ordinances they could enact to help the situation.

Anonymous said...

Certainly, the model software must be tested when originally designed. Or is it like the general circulation models in climate science where they fudge poorly understood components with best guesses? In other words, do they actually run water through a section of river or a physical model (like a wave tank in ocean engineering) to measure the accuracy of the computer model?
-Gary

David A. Todd said...

Gary:

The two computer programs I'm using are from the US Army Corps of Engineers, and I trust a lot of what they have done on riverine studies, especially at the Hydrologic Engineering Center (HEC). The two programs are HEC-HMS and HEC-RAS. At the Corps' Vicksburg, Mississippi facility they have large buildings with physical models to run water through and check calculations against. But they have also done modeling based on the Mississippi River and other rivers with gauging stations. Of course, what these physical models do is give us a better handle on the parameters that have to be entered into the computer.

For example, for a given rainfall, how much water runs off rather than soaks in? Based on a lot of empirical data, we have published parameters keyed to the type of soil, the type of surface cover, and even, to some extent, the nature of the storm. For a given waterway, how much friction do the surface features generate? Some of that has come from the Corps, but a lot more from university studies over the last 100 years.
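The runoff side is a good example of what those published parameters look like. One of the loss methods HEC-HMS offers is the SCS curve number method, where a single tabulated number (the curve number, CN) stands in for the soil group and land cover. A toy sketch, with the rainfall depth and the CN values invented for illustration:

```python
# SCS curve number runoff, in inches, for a given storm rainfall depth.
# The CN values below are only illustrative; real values come from published tables.

def runoff_inches(P, CN):
    """Direct runoff (inches) from P inches of rain on ground with curve number CN."""
    S = 1000.0 / CN - 10.0   # potential maximum retention, inches
    Ia = 0.2 * S             # initial abstraction (interception, depression storage)
    return 0.0 if P <= Ia else (P - Ia) ** 2 / ((P - Ia) + S)

P = 7.0  # hypothetical design-storm rainfall depth, inches
for label, CN in [("undeveloped, wooded", 55), ("residential lots", 80), ("paved commercial", 95)]:
    print(f"{label:20s} CN = {CN}:  about {runoff_inches(P, CN):.1f} of the {P:.0f} inches runs off")
```

That spread between the undeveloped and paved cases is exactly why how the western basin develops matters so much for downstream flooding.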

So I have good confidence in the computer models and the published parameters. In my own ability to apply those models properly and select the right parameters, I have a little less confidence.

Dave