
Archive for the ‘USGS gage data’ Category

Any flood modeler is well aware of the challenges that this particular peril presents, for several reasons.  Ivan Maddox lays this out quite well in his Risks of Hazard blog, where he discusses “The Ingredients of a Good Flood Model.”  The National Research Council raised similar concerns about the need for high-quality elevation data in their 2009 Mapping the Zone report, which is available as a free PDF download here.

In addition to the challenge of obtaining high-quality elevation data, there is also the question of which discharge to model.  To the extent a flood model is based on a historical analysis of annual peaks and the methods put forth in either Bulletin 17B (four pieces of software that incorporate this method are currently available from four different agencies) or 17C, significant uncertainty eventually arises as one moves further along the curve toward higher-magnitude, lower-frequency events. Another approach to flood modeling is a Monte Carlo analysis with thousands of modeled events, but given how common an annual peak analysis is, I am going to stick with that type of analysis here.
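For readers who want to see the bones of that annual peak approach, here is a minimal sketch in Python of the log-Pearson Type III fit that underlies Bulletin 17B/17C. It is deliberately simplified: no Expected Moments Algorithm, no regional skew weighting, no low-outlier screening, and the peak discharges are made up for illustration rather than taken from any gage.

```python
import numpy as np
from scipy import stats

# Hypothetical annual peak discharges in cfs -- illustrative only.
peaks = np.array([41200, 55800, 38900, 72400, 61000, 47500,
                  90300, 33800, 58200, 66100, 52900, 44700])

log_q = np.log10(peaks)
mean = log_q.mean()
std = log_q.std(ddof=1)
skew = stats.skew(log_q, bias=False)  # station skew of the logs

def lp3_quantile(aep):
    """Discharge (cfs) for a given annual exceedance probability (AEP)."""
    # The standardized Pearson III deviate serves as the frequency factor K.
    k = stats.pearson3.ppf(1.0 - aep, skew)
    return 10 ** (mean + k * std)

for aep in (0.50, 0.10, 0.02, 0.01):
    print(f"AEP {aep:.0%} (~{1/aep:.0f}-yr): {lp3_quantile(aep):,.0f} cfs")
```

With only a dozen synthetic peaks, the 1% estimate this produces is, of course, exactly the kind of highly uncertain number the rest of this post is about.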

The graph below is taken directly from the macro-enabled Excel spreadsheet that the Natural Resources Conservation Service provides on this page. Upon downloading the file and opening the ‘freq curve’ tab, a user will see this graph for the Yellowstone River near Sidney, MT. With over 100 years of annual peak discharge records, it’s a pretty robust empirical set of observations.  The orange rectangle is what I refer to as the ‘box of uncertainty’, and in this case, it is the box for the 1% annual chance flood.

It has been pointed out that communicating about floods and especially flood frequency to the general public is challenging for agencies, disaster planning teams and hydrologists.  The NRCS recognizes this, and to their credit, they put exceedance probabilities on the x-axis at the bottom of the graph and they also include some flood frequency recurrence intervals (RI) in years on the top of the graph, which I clipped out.

The key point here is to look at the uncertainty that surrounds a given exceedance probability, which is what the orange rectangle highlights for the 1% exceedance probability event.  For starters, if one holds to the 1% event, the typical application in my experience is to read the value off the blue line and report 158,000 cfs.  In other words, a colloquial interpretation would be to say, “the 100-year flood is 158,000 cfs.”  A more nuanced, and admittedly more challenging, way to communicate the analysis would be to acknowledge the uncertainty and state that, staying within the 5th and 95th percentile estimates, the 1% exceedance flood could be as low as 140,000 cfs and as high as 184,000 cfs.

In addition, if we hold to the 158,000 cfs estimate, one should also recognize that, within the 5th and 95th percentile estimates, this discharge could have an exceedance probability as high as 3% (~33-yr RI) or as low as 0.3% (~333-yr RI).
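The recurrence-interval equivalents in that sentence are just reciprocals of the annual exceedance probabilities; here is a quick sketch using the Sidney numbers quoted above:

```python
# Recurrence interval in years is the reciprocal of the annual
# exceedance probability (AEP).
def recurrence_interval(aep):
    return 1.0 / aep

# 1% AEP estimate and its 5th/95th percentile limits read off the
# NRCS frequency curve discussed above (cfs).
q_1pct = {"best estimate": 158_000, "lower": 140_000, "upper": 184_000}
print(q_1pct)

# Holding the 158,000 cfs value fixed, its AEP could range from
# roughly 3% down to roughly 0.3% within the same confidence band.
for aep in (0.03, 0.01, 0.003):
    print(f"AEP {aep:.1%} -> ~{recurrence_interval(aep):.0f}-yr RI")
```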

The air photo below is a screenshot from FEMA’s National Flood Hazard Layer (NFHL) Viewer. The USGS Sidney, MT gage is just to the northeast of the bridge crossing; flow is from the bottom to the top of the image. The blue-green shading represents the “1% Annual Chance Flood Hazard.” If I happened to own the property with the red roof near the bottom of the image, then given all the modeling uncertainty, and assuming the road just to the west of the property doesn’t provide any flood protection, purchasing flood insurance would certainly seem to be in my best interest.

Communicating flood risk is and will continue to be a challenge.  I certainly prefer exceedance probability to recurrence intervals with years as the unit (e.g. the 50 or 100 year flood). In closing and keeping Ivan’s excellent piece in mind, another ingredient of a good flood model would be a well written flood model report explaining some of the uncertainty in the model results.

 


Read Full Post »

I don’t use this blog too much for advocacy, but on March 1, 2016 the EPA and USGS released a draft report for which public input is being solicited. Have a read and let your scientific expertise on ecology, hydrology and anthropogenic alterations to flow be heard:

http://www.regulations.gov/#!docketDetail;D=EPA-HQ-OW-2015-0335

Read Full Post »

Well, it’s September and it appears as though this update was released back in March.  In any event, the latest version of PeakFQ is now 7.1 and it is available for download here:

http://water.usgs.gov/software/PeakFQ/

Be sure to read the version history, which discusses the new functionality in the program.

You might also want to read the Subcommittee on Hydrology’s Hydrologic Frequency Analysis Work Group piece on determining flood frequency using the Expected Moments Algorithm here:

http://acwi.gov/hydrology/Frequency/b17_swfaq/EMAFAQ.html

Read Full Post »

I live in Massachusetts and am really not terribly knowledgeable about Florida surface water hydrology. I do know that it is a flat state, that it has karst geology, and that groundwater plays an important role.  I lived in Northern California for a number of years and spent a good bit of time reviewing USGS gage data. The coastal terrain tended to be steep, and storms off the Pacific were capable of dropping significant amounts of rain.  The first time I saw a gage’s stage reading increase by over 15 feet in less than a day I was rather surprised, but over time I became more comfortable seeing such rapid stage increases.

Florida recently experienced some staggering amounts of rainfall. I heard reports of 22 to 26 inches of rain falling. As such, I had to go to the USGS NWIS Florida site to get a handle on how the rivers were responding.  As of this writing (May 1st, 2014) a number of gages are currently coded in black (which the USGS labels as ‘High’). The Shoal River response caught my eye, as it reminds me of how a Humboldt County, California gage might look after a Pacific storm hits.

The Shoal River near Mossy Head (drainage area 123 mi2) was running at 348 cfs (2.8 csm) the morning of April 29 and peaked during the late afternoon on April 30 at 7580 cfs (61.6 csm).  You can also see that the USGS sent hydrographers to the gage just prior to the peak. Well done USGS, and I hope that life and limb were not risked to obtain these data.
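For readers unfamiliar with ‘csm’, it is simply discharge divided by drainage area (cfs per square mile); a quick check of the numbers above:

```python
# Unit discharge ("csm") = discharge / drainage area, in cfs per square mile.
# Numbers are the Shoal River near Mossy Head figures quoted above.
drainage_area_mi2 = 123.0
q_apr29_cfs = 348.0      # morning of April 29
q_peak_cfs = 7_580.0     # late afternoon of April 30

print(f"April 29 baseline: {q_apr29_cfs / drainage_area_mi2:.1f} csm")  # ~2.8
print(f"April 30 peak:     {q_peak_cfs / drainage_area_mi2:.1f} csm")   # ~61.6
print(f"Rise: {q_peak_cfs / q_apr29_cfs:.0f}x in roughly a day and a half")
```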

348 cfs to 7580 cfs in a little over a day and a half


The stage change in 39 hours was just shy of 14 feet!

Nearly 14 feet of stage change in a day and a half

Average stage increase on the rising limb on April 30 was 8.4 in/hour

As impressive as this storm was, the historic data at this site indicate five events that were larger than 8,000 cfs. Even though roughly two feet of rain fell, other storms and antecedent conditions in the past have led to even greater storm runoff.

April 30th flood will be the new flood of record

The data indicate five previous floods that were larger

As fascinating as I find these data, this is the classic case of what I refer to as the “hydrologist’s dilemma”.  We find these rare events exciting and interesting, yet at the same time, many people are suffering and experiencing a life-changing natural disaster.  It goes without saying that my thoughts and prayers go out to the people in Florida who are now facing the challenge of a post-flood situation.  May your neighbors, places of worship, elected officials, local businesses and insurance companies all be a source of inspiration, and may you be a more flood-resilient community in the end.

Read Full Post »

The USGS has reviewed several gages throughout the state and has now developed regional bankfull geometry curves.  For those of us waiting for these data so that we can put them to use on projects, this is a great development.  The study does have a few geographic limitations, so take note of those caveats before putting the equations to use.
http://pubs.usgs.gov/sir/2013/5155/

 

Read Full Post »

I’ve read quite a few reports and postings about the devastation that Tropical Storm Irene brought to Vermont. It was with great pleasure that I just ran across this page:

http://action.clf.org/site/PageNavigator/hurricane_irene_anniversary.html

Being an alumnus of Middlebury College, I was quite interested to hear about how Otter Creek responded to the storm. When I heard about the wetlands storing so much flood water and that the discharge at Middlebury was actually less than the discharge in Rutland, I was a bit taken aback. I had to check out the gage data based on what Mike Kline pointed out in the video.

Figure 1. Mean daily flow for the Otter Creek at Rutland and at Middlebury

Figure 2. Unit Discharge (cubic feet per second per square mile) for the Otter Creek at Rutland and at Middlebury.

Both Figures 1 and 2 tell an extraordinary story, as was pointed out in the CLF video. On August 29, the folks in Rutland were seeing the river at 13,500 cfs, whereas in Middlebury the river was at 3,700 cfs. The next day, the Otter Creek was already receding in Rutland, whereas in Middlebury the river was still slowly rising as the wetlands between the two sites slowly released the large volume of water they had stored. The Otter Creek at Middlebury didn’t peak until September 2, three days after the storm! So while Irene had moved up into Canada and the weather in Middlebury was turning sunny, the river was still rising due to all the stored water slowly being released.

Figure 2 tells the same story as Figure 1; the only difference is that the data have been divided by each gage’s drainage area. The unit runoff in Rutland peaks at 44 csm, which, for this part of Vermont, is a large amount of runoff per unit of area. That the Otter Creek only gets up to 9.7 csm at Middlebury is simply another way of recognizing how much water was indeed stored in those wetlands between the two gages. That those wetlands shaved off roughly 34 csm of runoff should make every resident living within the Otter Creek floodplain extremely grateful for their presence. Intellectually, I’ve always known that wetlands can store floodwaters, but to see an effect this dramatic is something I find rather astonishing.

If all those wetlands didn’t exist, one could make some very basic assumptions and ‘play’ with the data to get a rough idea of how bad things could have been in Middlebury and points downstream. The drainage area for the Rutland gage is 307 mi2 and the drainage area for the Middlebury gage is 628 mi2, so the drainage area for the Otter Creek at Middlebury is essentially twice that of Rutland. If we simply take the Rutland mean daily flows and multiply them by two, we get an idea of how bad the devastation could have been further downstream.
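Here is a minimal sketch of that drainage-area scaling. Only the two drainage areas and the 13,500 cfs August 29 value come from the gage records discussed above; the other daily values are placeholders for illustration, not the actual Rutland record.

```python
# Scale Rutland mean daily flows by the drainage-area ratio to build a
# synthetic "no wetland storage" hydrograph for Middlebury.
rutland_da_mi2 = 307.0
middlebury_da_mi2 = 628.0
area_ratio = middlebury_da_mi2 / rutland_da_mi2   # ~2.05, essentially 2x

# Placeholder Rutland mean daily flows (cfs) around the Irene peak; only
# the 13,500 cfs value on Aug 29 is taken from the record cited above.
rutland_daily_cfs = [2_100, 13_500, 9_800, 5_600, 3_900]

synthetic_middlebury_cfs = [round(q * area_ratio) for q in rutland_daily_cfs]
print(synthetic_middlebury_cfs)
# The Aug 29 value scales to roughly 27,600 cfs -- well "north of 20,000 cfs"
# had the wetlands not attenuated the flood wave.
```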

Figure 3. Synthetic hydrograph for the Otter Creek at Middlebury vs actual flows at Middlebury.

The synthetic hydrograph in Figure 3 is clearly a worst-case, ‘what if’ scenario. It assumes that the Otter Creek watershed at Middlebury generates exactly the same unit runoff as the watershed does at Rutland, which we know is clearly not the case. What the synthetic hydrograph points out, however, is that had those wetlands not been storing so much water (in other words, had those wetlands been drained and developed), it’s conceivable that downtown Middlebury and points downstream could have seen discharges north of 20,000 cfs, which, as Louis Porter points out in the video, could conceivably destroy a bridge, and certainly many more homes would have been flooded out.

Figure 4. Google Earth screenshot of some of the floodplain wetlands along Otter Creek.

That such large tracts of land have been protected and are allowed to flood so frequently (Figure 4) is a testament to the land protection that has occurred in Vermont. Hopefully this story can be used to promote the state’s Fluvial Erosion Hazard Program and various land protection groups can continue to protect the swamps and wetlands within the Otter Creek floodplain.

h/t: Laura Wildman, ASCE Restoration TC Group on LinkedIn

Read Full Post »

Since about 1997 or so, I’ve been looking at USGS daily flow data.  I also started using ESRI’s GIS software around that time.  Although it was released last December, I found out about this tool only a few days ago. Now that I’ve given it a spin, I am very impressed.  ESRI’s latest version of ArcGIS (version 10) and the USGS NWIS Snapshot tool are perfect companions.

Here’s a link to the 2 page summary of the USGS tool: http://pubs.usgs.gov/fs/2011/3141/

This link will take you to the download and software support page: http://txpub.usgs.gov/snapshot/default.aspx

Assuming you have ArcGIS Desktop version 10, installation is pretty straightforward.  The only additional data a user would likely need is a shapefile or feature class for a watershed area of interest. The documentation walks a user through the process of searching for surface water sites and obtaining the mean daily flow values.  Once those data have been obtained, it’s just a matter of personal preference how a user wants to use them: ArcGIS has some built-in graphing options, the data could be imported into Excel, or a user could use another piece of graphing software.

The Snapshot tool is one of the easiest-to-learn tools I’ve seen for obtaining mean daily flow data from USGS gages.  In under ten minutes, I was able to create hydrographs of five stations that have collected flow data within the Ipswich River watershed.  Personally, I think that’s pretty amazing.
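For anyone without an ArcGIS license, the same mean daily flow values can be pulled straight from the USGS NWIS daily-values web service. Here is a minimal sketch; the site number and date range are placeholders, and the JSON parsing assumes the WaterML-style layout the service returns at the time of writing.

```python
import requests

# Pull mean daily discharge (parameter 00060, statistic 00003 = daily mean)
# from the USGS NWIS daily-values service. Site and dates are placeholders.
params = {
    "format": "json",
    "sites": "01101500",        # placeholder gage of interest
    "parameterCd": "00060",     # discharge, cubic feet per second
    "statCd": "00003",          # daily mean
    "startDT": "2011-01-01",
    "endDT": "2011-12-31",
}
resp = requests.get("https://waterservices.usgs.gov/nwis/dv/", params=params, timeout=30)
resp.raise_for_status()

# WaterML-style JSON: value -> timeSeries -> values -> value (list of points).
points = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
dates = [p["dateTime"][:10] for p in points]
flows = [float(p["value"]) for p in points]

print(f"{len(flows)} daily values; max mean daily flow = {max(flows):,.0f} cfs")
```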

CUAHSI has developed HydroGET (http://his.cuahsi.org/hydroget.html) and HydroExcel (http://his.cuahsi.org/hydroexcel.html), which in many ways will ultimately do the same thing for a user as far as obtaining mean daily flow values from the USGS is concerned.  HydroExcel can search web services other than just the USGS, and HydroGET will search for precipitation stations. Nevertheless, I think the NWIS Snapshot tool has the best user interface and is the easiest to execute.

The Snapshot tool, along with CUAHSI’s HydroDesktop (http://hydrodesktop.codeplex.com/), allows hydrologists to discover water resource data sets rather quickly using GIS software.  Generating graphs with these tools is also a fairly quick affair.  Whether you are working on a pre-project proposal or are in the depths of a project, these two pieces of software should prove to be rather handy, time-saving tools.

Read Full Post »

