
Archive for the ‘software’ Category

Any flood modeler is well aware of the challenges that this particular peril presents, for several reasons.  Ivan Maddox lays this out quite well in his Risks of Hazard blog, where he discusses “The Ingredients of a Good Flood Model.”  The National Research Council raised similar concerns about the need for high quality elevation data in their 2009 Mapping the Zone report, which is available as a free PDF download here.

In addition to the challenges of obtaining high quality elevation data, there is also the question of which discharge to model.  To the extent a flood model is based on a historical analysis of annual peaks and the methods put forth in either Bulletin 17B (four pieces of software that incorporate this method are currently available from four different agencies) or 17C, significant uncertainty eventually arises as one moves further along the curve toward higher magnitude, lower frequency events. Another approach to flood modeling is a Monte Carlo analysis with thousands of modeled events, but given how common annual peak analysis is, I am going to stick with that type of analysis here.
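The annual-peak approach can be sketched in a few lines of code. The following is a simplified illustration only, not the full Bulletin 17B/17C procedure: it fits a log-Pearson Type III distribution using station moments alone, with no low-outlier screening, no regional skew weighting, and no Expected Moments Algorithm, and the peak record is synthetic.

```python
# Simplified Bulletin 17B-style flood frequency sketch: fit a
# log-Pearson Type III distribution to annual peak discharges.
# NOTE: illustration only -- the real procedure adds low-outlier
# screening, regional skew weighting, and (in 17C) the Expected
# Moments Algorithm.  The record below is synthetic, not real data.
import numpy as np
from scipy import stats

def lp3_quantile(annual_peaks_cfs, aep):
    """Discharge (cfs) with annual exceedance probability `aep`."""
    x = np.log10(np.asarray(annual_peaks_cfs, dtype=float))
    mean, std = x.mean(), x.std(ddof=1)
    skew = stats.skew(x, bias=False)            # station skew only
    dist = stats.pearson3(skew, loc=mean, scale=std)
    return 10 ** dist.ppf(1.0 - aep)            # non-exceedance quantile

# Illustrative synthetic 100-year record of annual peaks
rng = np.random.default_rng(42)
peaks = rng.lognormal(mean=11.0, sigma=0.35, size=100)

q100 = lp3_quantile(peaks, 0.01)   # the "100-year" flood
q2 = lp3_quantile(peaks, 0.50)     # the median annual flood
```

As the post notes, the further out along the curve the quantile lies (the smaller `aep` is), the fewer observations constrain it and the wider the uncertainty becomes.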

The graph below is taken directly from the macro-enabled Excel spreadsheet that the Natural Resources Conservation Service (NRCS) provides on this page. Upon downloading the file and opening the ‘freq curve’ tab, a user will see this graph for the Yellowstone River near Sidney, MT. With over 100 years of annual peak discharge, it’s a pretty robust empirical set of observations.  The orange rectangle is what I refer to as the ‘box of uncertainty’; in this case, it is the box for the 1% annual chance flood.

It has been pointed out that communicating about floods and especially flood frequency to the general public is challenging for agencies, disaster planning teams and hydrologists.  The NRCS recognizes this, and to their credit, they put exceedance probabilities on the x-axis at the bottom of the graph and they also include some flood frequency recurrence intervals (RI) in years on the top of the graph, which I clipped out.

The key point here is to look at the uncertainty that surrounds a given exceedance probability, which is what the orange rectangle highlights for the 1% exceedance probability event.  For starters, if one holds to the 1% event, in my experience the typical application is to read the value off the blue line and say that the value is 158,000 cfs.  In other words, a colloquial interpretation would be to say, “the 100 year flood is 158,000 cfs.”  A more nuanced, and admittedly more challenging, way to communicate the analysis would be to acknowledge the uncertainty and state that, staying within the 5th and 95th percentile estimates, the 1% exceedance flood could be as low as 140,000 cfs and as high as 184,000 cfs.

In addition, if we hold to the 158,000 cfs estimate, one should also recognize that, within the 5th and 95th percentile estimates, this discharge could have an exceedance probability of as much as 3% (e.g. a ~33 yr RI) or as little as 0.3% (e.g. a ~333 yr RI).
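The recurrence intervals quoted above follow directly from the exceedance probabilities, since a recurrence interval in years is just the reciprocal of the annual exceedance probability:

```python
# Recurrence interval (years) is the reciprocal of the annual
# exceedance probability: RI = 1 / p.  The same discharge, moved
# across the uncertainty band, maps onto a wide range of RIs.
def recurrence_interval(aep):
    """Recurrence interval in years for annual exceedance probability."""
    return 1.0 / aep

print(recurrence_interval(0.01))    # 100.0  -> the "100-year" flood
print(recurrence_interval(0.03))    # ~33.3  -> upper bound above
print(recurrence_interval(0.003))   # ~333.3 -> lower bound above
```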

The air photo below is a screenshot from FEMA’s National Flood Hazard Layer (NFHL) Viewer. The USGS Sidney, MT gage is just to the northeast of the bridge crossing; flow is from the bottom to top. The blue-green shading represents the “1% Annual Chance Flood Hazard.” If I happened to own the property with the red roof near the bottom of the image, given all the modeling uncertainty and assuming the road just to the west of the property doesn’t provide any flood protection, it seems like purchasing flood insurance would certainly be in my best interest.
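One way to make the insurance argument concrete is to look at the chance of seeing at least one 1% annual chance flood over a multi-year horizon. Assuming independent years (a standard simplification), the math is a one-liner:

```python
# Probability of at least one exceedance of the p-annual-chance flood
# over an n-year horizon, assuming years are independent:
#   P = 1 - (1 - p)**n
def exceedance_over_horizon(p, n):
    """Chance of at least one exceedance in n years."""
    return 1.0 - (1.0 - p) ** n

# Over a 30-year mortgage, the "1% annual chance" flood occurs at
# least once with probability of roughly 26%.
print(round(exceedance_over_horizon(0.01, 30), 3))
```

Roughly a one-in-four chance over a typical mortgage is a far more persuasive framing for that homeowner than “the 100 year flood.”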

Communicating flood risk is and will continue to be a challenge.  I certainly prefer exceedance probability to recurrence intervals with years as the unit (e.g. the 50 or 100 year flood). In closing and keeping Ivan’s excellent piece in mind, another ingredient of a good flood model would be a well written flood model report explaining some of the uncertainty in the model results.



Well, it’s September and it appears as though this update was released back in March.  In any event, the latest version of PeakFQ is now 7.1 and it is available for download here:

http://water.usgs.gov/software/PeakFQ/

Be sure to read the version history, which discusses the new functionality in the program.

You might also want to read the Subcommittee on Hydrology’s Hydrologic Frequency Analysis Work Group piece on determining flood frequency using the Expected Moments Algorithm here:

http://acwi.gov/hydrology/Frequency/b17_swfaq/EMAFAQ.html


Every year, ESRI hosts its annual user conference in San Diego.  For several years now, on the day before the big conference opens, the hydro/water resources folks at ESRI have hosted a one-day seminar.  All the videos from that one-day event can be viewed here:

http://video.arcgis.com/series/36/hydro

Updates to ArcHydro, GeoRAS and GeoHMS are provided.

Thank you ESRI!


Earlier this year, the USGS released a web mapping service called Streamer.

http://nationalatlas.gov/streamer/Streamer/streamer.html

It’s a fairly straightforward tool.  A user can trace a channel either upstream or downstream. From there, users can get a summary or detailed report with some basic information, such as how many states the river flows through or how many USGS gages are on the trace.
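The idea behind a trace like Streamer’s can be sketched with a toy flow network in which each reach stores a pointer to its downstream neighbor, much like the flow tables underlying the National Hydrography Dataset. The network and reach names below are invented for illustration; this is not how Streamer is actually implemented.

```python
# Toy sketch of an upstream/downstream trace over a flow network.
# Each reach points to its single downstream neighbor (None = outlet).
# The network below is invented for illustration.
downstream = {
    "A": "C", "B": "C", "C": "D", "D": None,
}

def trace_downstream(reach):
    """Follow downstream pointers from `reach` to the outlet."""
    path = [reach]
    while downstream.get(path[-1]):
        path.append(downstream[path[-1]])
    return path

def trace_upstream(reach):
    """All reaches whose downstream path passes through `reach`."""
    return sorted(r for r in downstream if reach in trace_downstream(r))

print(trace_downstream("A"))   # ['A', 'C', 'D']
print(trace_upstream("C"))     # ['A', 'B', 'C']
```

A downstream trace is a single walk to the outlet; an upstream trace gathers every tributary path, which is why upstream traces of large rivers return so many reaches.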

A bit more background on the tool is available here: http://nhd.usgs.gov/newsletters/News_July_13.pdf


Dr. Andrew Simon pulls together an excellent set of ideas in this presentation:

http://www.eng.buffalo.edu/glp/events/summer2008/week1/full/16-ResponseAndRestorationImplications.pdf

For several years, I’ve struggled with the idea of a reference stream. He lays out some of the problems with the approach.  I like how he introduces the idea of a hydrologic floodplain and a topographic floodplain.  I also like how he highlights the notion that “bankfull” discharge applies to a stable channel.

One of the best questions he asks is: “How does the channel respond?” Answer: “It depends.”  The figure below was pulled from Janet Hooke’s 2003 Geomorphology article titled “Coarse sediment connectivity in river channel systems: a conceptual framework and methodology.”  I think the image does a good job of supporting Dr. Simon’s question about how a channel would respond. Clearly, the spatial variability that all rivers have dictates that a thorough inspection of a site and its context within a watershed is warranted.

[Figure: spatial variability of erosion and deposition in a channel system, after Hooke (2003)]

I also always like a presentation that goes back and explicitly states first principles in geomorphology:

Applied (Driving) Forces vs. Resisting Forces.

I was first exposed to this idea as an undergraduate at Middlebury College in the early 1990s thanks to my advisor Jack Schmidt, and it is as true today as it ever was.
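The driving-versus-resisting framing has a classic quantitative instance: boundary shear stress from the depth-slope product compared against a critical shear stress for grain entrainment. The sketch below uses a Shields-type threshold; the depth, slope, grain size, and Shields parameter values are illustrative assumptions, not numbers from Dr. Simon’s talk.

```python
# Driving vs. resisting forces, made concrete: boundary shear stress
# (depth-slope product, wide-channel approximation) vs. a Shields-type
# critical shear stress for grain motion.  All inputs are illustrative.
RHO_W = 1000.0   # water density, kg/m^3
RHO_S = 2650.0   # sediment density, kg/m^3
G = 9.81         # gravitational acceleration, m/s^2

def boundary_shear(depth_m, slope):
    """Driving stress: tau = rho_w * g * h * S (Pa)."""
    return RHO_W * G * depth_m * slope

def critical_shear(d50_m, shields=0.045):
    """Resisting threshold: tau_c = theta_c * (rho_s - rho_w) * g * d50 (Pa)."""
    return shields * (RHO_S - RHO_W) * G * d50_m

tau = boundary_shear(depth_m=1.5, slope=0.002)   # ~29 Pa
tau_c = critical_shear(d50_m=0.032)              # ~23 Pa for coarse gravel
print("mobile" if tau > tau_c else "stable")
```

When the driving stress exceeds the resisting threshold the bed is mobile and adjustment (erosion, widening, incision) is on the table; when it doesn’t, the channel tends to hold its form.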

I like the way Dr. Simon thinks and presents his ideas. Keeping these ideas in mind the next time a restoration project comes along would be excellent, especially at the early stages, so that all parties can better understand the river adjustment dynamics at a project site.


The World Water Online group on ArcGIS.com has added the NRCS SSURGO soil survey database to its group of maps.  You can read about it here:

http://blogs.esri.com/esri/arcgis/2013/04/04/u-s-soils-data-added-to-world-water-online/

One of the features that I really enjoy about ArcGIS Online is how well integrated it is with ArcGIS Desktop.  Any map I’m developing with shapefiles and feature classes stored on the local network can easily include a layer hosted on ArcGIS Online.

Now when working on a restoration site, access to soil data is literally just a few clicks of the mouse away.


I’m an active volunteer for the Ipswich River Watershed Association (IRWA). One of their programs is a volunteer monitoring program that collects temperature and dissolved oxygen data at over thirty sites throughout the watershed once a month.  Now that several years of data have been collected, a nice data set has been developed.  I also follow some of the developments that CUAHSI HIS has undertaken, namely HydroDesktop and the WaterML standard.  In the past month, the monitoring data have been put onto CUAHSI’s servers, and now when one searches for either dissolved oxygen or temperature in the Ipswich River watershed using HydroDesktop, all the monitoring locations show up and the data can be downloaded.  I think that’s a great development for the program.  Way to go, IRWA!
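For anyone curious what consuming WaterML looks like, here is a minimal sketch of reading a WaterML 1.1 time series, the format CUAHSI HIS services return. The XML fragment is hand-made for illustration (it is not an actual IRWA response, and real responses carry much more metadata), and I am assuming the standard WaterML 1.1 namespace URI.

```python
# Minimal sketch of parsing a WaterML 1.1 time series with the
# standard library.  The fragment below is invented for illustration;
# real CUAHSI HIS responses include site, variable, and QC metadata.
import xml.etree.ElementTree as ET

WML = """<timeSeriesResponse xmlns="http://www.cuahsi.org/waterML/1.1/">
  <timeSeries>
    <values>
      <value dateTime="2013-06-01T08:00:00">8.4</value>
      <value dateTime="2013-07-01T08:00:00">6.9</value>
    </values>
  </timeSeries>
</timeSeriesResponse>"""

NS = {"wml": "http://www.cuahsi.org/waterML/1.1/"}  # assumed WaterML 1.1 ns
root = ET.fromstring(WML)
series = [(v.get("dateTime"), float(v.text))
          for v in root.findall(".//wml:value", NS)]
print(series)
```

Tools like HydroDesktop do exactly this kind of unpacking behind the scenes, which is why data published to CUAHSI’s servers become instantly searchable and downloadable.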

There are several professors at universities around the country who are active with CUAHSI HIS, CUAHSI has offices in Medford, MA and Washington, D.C., and the company Kisters is also active in the CUAHSI HIS community.  The point being: if you are aware of some water quality monitoring being collected on your stream or river of interest, there are plenty of resources available to help you get your data published and made available to a wider audience.


