
Archive for the ‘Excel tools’ Category

ESRI has created its own social network for users of its software, called GeoNet. Back in March, ESRI produced a webinar titled “Community Maps for Hydrology,” which can be viewed here. The entire video is 53 minutes; viewers may want to fast-forward to about the 22-minute mark, where the speaker starts to discuss and demonstrate the watershed delineation and downstream trace tools.  If you have points on a stream in a watershed, it is a fairly straightforward affair to delineate a watershed in either ArcMap or ArcGIS Online.  The appeal of doing this workflow in ArcMap is that if the delineated watershed is in a geodatabase, it is easy to get an area straight from the attribute table.   The video also discusses maps that provide real-time stream gage data and how a hydrograph can be plotted just below the map.

ESRI has also updated Maps for Office to version 3.  The interface is a bit more modern. As for new functionality, I think one of the biggest additions is that users can display more than one map on a given tab.  ESRI’s blog post on the update is available here.



Pre- and post-project photographic monitoring has been around for a while.  I’ve seen some drab appendix tables that list the lat/long and the azimuth for where each photograph was taken, and I’ve seen the photographs themselves jammed into an appendix as well. Why not take a more modern approach using the features of ArcGIS.com and photo hosting services such as Flickr or Picasa Web? Assuming you have your lat/long, azimuth, and photo filename in Excel, you can make a pretty powerful online map that shows where each photo was taken, the direction it faces, and the photo itself, all in a map that can be panned and zoomed and, presumably, displayed over a recent aerial photo basemap.
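As a rough sketch of the data-prep step (my own example, not from the ESRI posts linked below; the file name, column names, and photo-hosting URL are all assumptions), the Excel sheet can be exported to a CSV that ArcGIS.com will accept as a point layer:

# Sketch: turn an Excel sheet of photo points into a CSV for ArcGIS.com.
# Assumed columns: Lat, Long, Azimuth, PhotoFile (hypothetical names).
import pandas as pd

pts = pd.read_excel("photo_points.xlsx")      # hypothetical file name
base = "https://example-photos.com/"          # stand-in for a Flickr/Picasa Web URL
pts["PhotoURL"] = base + pts["PhotoFile"]

# ArcGIS.com recognizes Latitude/Longitude columns when a CSV is added as a layer,
# and the Azimuth field can drive symbol rotation in the web map.
pts.rename(columns={"Lat": "Latitude", "Long": "Longitude"}, inplace=True)
pts.to_csv("photo_points_for_arcgis.csv", index=False)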

Here are three ESRI blog posts that can help make such a map:

How to configure pop-ups:
http://blogs.esri.com/esri/arcgis/2013/07/25/custom-attribute-display-pop-ups/

Adding photos from Flickr, Picasa Web, or Windows Live:
http://blogs.esri.com/esri/gisedcom/2011/02/25/map-your-photos-with-arcgis-explorer-online-automating-the-process/

Rotating arrows to show azimuth:
http://blogs.esri.com/esri/arcgis/2014/01/06/rotating-layer-symbols/

If you have an ArcGIS.com organizational subscription, then you can take the map you’ve made, with all your georeferenced photos, configured pop-ups, and arrows showing the photo azimuths, and put that dynamic map into a PowerPoint presentation. Pretty cool!


Dr. Andrew Simon pulls together an excellent set of ideas in this presentation:

http://www.eng.buffalo.edu/glp/events/summer2008/week1/full/16-ResponseAndRestorationImplications.pdf

For several years, I’ve struggled with the idea of a reference stream. He lays out some of the problems with the approach.  I like how he introduces the idea of a hydrologic floodplain and a topographic floodplain.  I also like how he highlights the notion that “bankfull” discharge applies to a stable channel.

One of the best questions he asks is: “How does the channel respond?” Answer: “It depends.”  The figure below was pulled from Janet Hooke’s 2003 Geomorphology article titled “Coarse sediment connectivity in river channel systems: a conceptual framework and methodology.”  I think the image does a good job of supporting Dr. Simon’s question about how a channel would respond. Clearly, the spatial variability that all rivers have dictates that a thorough inspection of a site and its context within a watershed is warranted.

[Figure: spatial variability of erosion and deposition, from Hooke (2003)]

I also always like a presentation that goes back and explicitly states first principles in geomorphology:

Applied (Driving) Forces vs. Resisting Forces.

I was first exposed to this idea as an undergraduate at Middlebury College in the early 1990s, thanks to my advisor Jack Schmidt, and it is as true today as it ever was.
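As a back-of-the-envelope illustration of that force balance (my own sketch, not from Dr. Simon’s presentation; all of the numbers are example values), boundary shear stress can be compared against a critical shear stress for a given grain size:

# Driving force: mean boundary shear stress, tau = gamma * R * S.
# Resisting force: critical shear stress from a Shields-type criterion,
# tau_c = theta_c * (gamma_s - gamma) * D.  All inputs are example values.
gamma = 9810.0            # unit weight of water, N/m^3
gamma_s = 2.65 * gamma    # unit weight of quartz sediment, N/m^3
theta_c = 0.045           # dimensionless critical Shields parameter (typical for gravel)

R = 0.8                   # hydraulic radius, m
S = 0.002                 # energy slope
D = 0.032                 # median grain size, m (32 mm gravel)

tau = gamma * R * S                        # applied (driving) stress, Pa
tau_c = theta_c * (gamma_s - gamma) * D    # resisting (critical) stress, Pa
print(f"tau = {tau:.1f} Pa, tau_c = {tau_c:.1f} Pa,",
      "transport likely" if tau > tau_c else "bed likely stable")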

I like the way Dr. Simon thinks and presents his ideas. Keeping these ideas in mind the next time a restoration project comes along would be excellent, especially at the early stages, so that all parties can better understand the river adjustment dynamics at a project site.


I have reviewed and analyzed USGS mean daily flow data for several years, and I have settled into a bit of a routine once I save the text file from the NWIS website for a given gage.  This file (https://www.box.com/s/9x7y9rpi6lpnhis3kny6) shows the steps I take to generate a flow duration curve for the Connecticut River at Montague City (#01170500).  This gage happens to have a very nice 108+ year record; clearly this is on the high end for most gages.  One of the first steps I take at this stage is to obtain the data based on the water year (WY), so I enter 1904-10-01 for the begin date and 2012-09-30 for the end date.  I always save the data as ‘Tab-separated’ and make sure that the file has a .txt extension.
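For readers who prefer to script this step, here is a minimal sketch of reading the saved tab-separated NWIS file with pandas (the file name is made up, and the discharge column name is an assumption; NWIS headers vary by site):

import pandas as pd

# Read the tab-separated NWIS daily-values file saved from the website.
# NWIS RDB files start with '#' comment lines and include a field-format row
# right after the header, which is dropped below.
df = pd.read_csv("01170500_daily.txt", sep="\t", comment="#")
df = df[df["agency_cd"] == "USGS"].copy()      # drop the RDB format row
df["datetime"] = pd.to_datetime(df["datetime"])

# The discharge column name varies by site; here it is assumed to end in
# '_00060_00003' (daily mean discharge, cfs).
q_col = [c for c in df.columns if c.endswith("_00060_00003")][0]
df["Q"] = pd.to_numeric(df[q_col], errors="coerce")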

Once I have the text file, I import it into Excel; the tab-separated format makes the import process a breeze.  The next step is to simply copy the date and discharge data and paste them into a separate tab.  You should then be able to run the macro called Year_Month_Date, which calculates the year, month, and date for each flow value.  Once that is done, I can generate a pivot table using Year for the columns and Month followed by Date for the rows.   Since I’m using 24-hour mean daily data, I can use the sum function for the flow data, but if I had 15-minute or hourly data, I would use the average of discharge.  Once this table is made, I copy and paste it into yet another tab.  If the data are pasted into cell A1, the delete-month macro should quickly delete the twelve extra rows that were generated in the pivot table.

It is at this stage that I think the fun analysis can begin.   The PercentileCharts tab shows two different flow duration curves, one for the April 1 to May 30 flows and another for the June 1 to July 31 flows.  I use the ‘Percentile’ function to generate the data.  When using this function, remember that for the percent of time a flow is equaled or exceeded, you have to use 1 minus the percent of interest for the data to plot correctly.  These dates are somewhat arbitrary, but if you are engaged in, say, a fish study, you can tailor the dates to a specific life stage (e.g., juveniles, adults, migration seasons).  If you know that certain decades or time periods were wetter or drier than others, you can modify your array selection for certain years.
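The percentile logic can also be scripted; a minimal sketch, building on the dataframe from the previous snippet and using the same seasonal windows and the same 1-minus-exceedance trick:

import numpy as np

# Percent-of-time-equaled-or-exceeded values for the curve.
exceed = np.array([1, 5, 10, 25, 50, 75, 90, 95, 99]) / 100.0

def duration_curve(flows, exceed):
    # For an exceedance probability p, take the (1 - p) quantile of the flows.
    return np.quantile(flows.dropna(), 1.0 - exceed)

spring = df[(df["datetime"].dt.month == 4) |
            ((df["datetime"].dt.month == 5) & (df["datetime"].dt.day <= 30))]
early_summer = df[df["datetime"].dt.month.isin([6, 7])]

q_spring = duration_curve(spring["Q"], exceed)
q_summer = duration_curve(early_summer["Q"], exceed)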

While I don’t generate a hydrograph based on percentiles very often, I thought I would include how the percentile function can be used to create an annual hydrograph for a percent exceedance of interest.  It’s simply another way to explore and think about the data.  I plotted the data on a log scale to spread out the low-flow data, which are a bit difficult to discern on a linear scale.
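A short sketch of that percentile-based annual hydrograph, again building on the dataframe above, grouped by day of year and plotted on a log axis:

import matplotlib.pyplot as plt

# For each day of the year, compute selected percentiles across all years.
doy = df.groupby(df["datetime"].dt.dayofyear)["Q"]
pctl = doy.quantile([0.10, 0.50, 0.90]).unstack()

ax = pctl.plot(logy=True)
ax.set_xlabel("Day of year")
ax.set_ylabel("Discharge (cfs)")
plt.show()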

All of the steps mentioned above can be done in a matter of minutes.  The internet and Excel make these analyses rather quick and straightforward.  Comments on these techniques or your own methodology are welcome!


This is not exactly hot off the press, but there is a wealth of good information here.  The state of Montana held a training back in December 2009 and has posted what looks like essentially all of the training material that David Williams developed for the course.

http://www.dnrc.mt.gov/wrd/water_op/floodplain/streambank_course/default.asp

Be sure to scroll through the whole list of resources and check out the spreadsheets at the bottom of the page.


I haven’t written anything to date for this analysis.  I’ve simply manipulated data from the USGS Suspended Sediment Database, summarized them in a spreadsheet available from the Ohio DNR, and made a map.  The watersheds delineated in the map were made using the EPA Watershed Delineation add-on in HydroDesktop.  The one data point that jumps out at me is the 1982 annual load for the Salmon River: 1,158 tons/mi2/yr, 96% of which occurred on June 5th and 6th. (Granted, the record that year only had 245 days of data, and no data were collected on June 4th.) In any event, the number is still quite large and puts it in league with some of the larger annual loads I saw while working in coastal northern California, an area known for its high sediment loads.  The question that makes me curious is how the channel responded to this flood in early June 1982.  I have to assume that the massive load of sediment was deposited somewhere, but I also have to assume that much of the bed and banks were scoured. The Google Earth imagery goes back to 1990, and it suggests a fairly stable, single-thread channel that hasn’t migrated much in the past two decades.  The vast majority of the floodplain appears forested, so my assumption is that much of the suspended sediment was deposited on the floodplain on the receding limb of the storm event or was simply carried downstream to the Connecticut River.
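For illustration only, the arithmetic behind that kind of number can be sketched as follows (the file, its columns, and the drainage area are all placeholder assumptions, not the actual USGS data):

import pandas as pd

# Daily suspended-sediment loads (tons/day) for one water year.
# 'salmon_river_1982_ssl.csv' and its columns are hypothetical.
ssl = pd.read_csv("salmon_river_1982_ssl.csv", parse_dates=["date"])
drainage_area_mi2 = 100.0   # example value; use the gauge's published drainage area

annual_load = ssl["load_tons"].sum()
unit_load = annual_load / drainage_area_mi2            # tons/mi2 for the year
top2_fraction = ssl["load_tons"].nlargest(2).sum() / annual_load

print(f"{unit_load:.0f} tons/mi2; largest two days = {top2_fraction:.0%} of annual load")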

(Circling back to the Allen Brook analysis, these results from CT gauges make me question all the more the estimated sediment loads from the SWAT model for subbasins 1, 3, 4, and 22, which supposedly were generating annual loads in excess of 5,000 tons per square mile.  As a consulting colleague of mine in California once said to me, “Beware the uncalibrated model.”)

The other bit of insight I gleaned from these data is how episodic the pulses of suspended sediment are in New England.  Granted, the data are limited: only three gauges had five or more years of nearly complete records (360+ days of data in a given water year).  Nevertheless, it still suggests to me that most rivers are relatively calm, not transporting an inordinate amount of suspended sediment, and then WHAM!, a large and rare event occurs that can deliver a substantial amount of sediment, well outside the range of normal annual load variability (very roughly speaking, 20 to 100 tons/mi2/yr).

I also used the Effective Discharge from Suspended Sediment spreadsheet available from the Ohio DNR here.  Unfortunately, I’ve never quite gotten the gist of the spreadsheet, despite looking at it and reviewing the hidden Calculation Table tab.  In any event, I’ve plugged in the data for the eight CT gauges.  The hyperlink in the first column should take you to the USGS data, and the hyperlink on the right should take you to the spreadsheet I’ve saved on Box.net.  Each spreadsheet is just under 6 MB and is saved in .xlsm format to save space and to keep the macros that Dan Mecklenburg at the Ohio DNR developed functional.

Station Number (USGS Link)    Station Name (Effective Discharge Spreadsheet Link)

01125415    Muddy Brook At Childs Hill Rd Nr Woodstock, CT
01127500    Yantic R At Yantic, CT
01184100    Stony Bk Nr West Suffield, CT
01184500    Scantic R At Broad Brook, CT
01192883    Coginchaug River At Middlefield, CT
01193500    Salmon R Nr East Hampton, CT
01199000    Housatonic R At Falls Village, CT
01200500    Housatonic R At Gaylordsville, CT

It is at this point that I’m seeking some help.  If anyone has some insight into how to reasonably estimate effective discharge for any of these gauges based on the available data, I would love for you to contact me.  Please feel free to use the comment section.
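In the meantime, for anyone comparing notes, here is a rough sketch of the generic magnitude-frequency approach to effective discharge; this is not necessarily what the Ohio DNR spreadsheet does, and the rating-curve coefficients and input data are placeholders. The idea is to bin the daily discharges, multiply each bin's frequency by a sediment load estimated from a rating curve, and take the discharge bin where that product peaks.

import numpy as np

def effective_discharge(daily_q, a, b, n_bins=25):
    """Magnitude-frequency sketch: Q bin whose (frequency x sediment load) is largest.
    daily_q : array of mean daily discharges (cfs)
    a, b    : coefficients of a power-law sediment rating curve, Qs = a * Q**b
    """
    daily_q = np.asarray(daily_q, dtype=float)
    bins = np.linspace(daily_q.min(), daily_q.max(), n_bins + 1)
    counts, _ = np.histogram(daily_q, bins=bins)
    mid = 0.5 * (bins[:-1] + bins[1:])    # representative Q for each bin
    load = a * mid**b                     # sediment load at the bin midpoint
    product = counts * load               # frequency x magnitude
    return mid[np.argmax(product)]

# Hypothetical usage with placeholder rating-curve coefficients:
# q = pd.read_csv("daily_q.csv")["Q"]
# print(effective_discharge(q, a=1e-4, b=2.2))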


Back in 2004 I was living in California, but I knew I wanted to move to the east coast. Consequently, I started scouring the internet for various hydrologic and geomorphic datasets in New England. I ran across a TMDL for Allen Brook in Williston, VT that was listed as impaired for sediment. “Perfect!” I thought. I took the data from that report and tried to build upon the analysis a bit.

Now, back in 2004, the internet and aerial photography weren’t quite what they are today. At the time, Microsoft offered a service called TerraServer. As a user, I could download, one by one, little black-and-white air photos, and I then had to use a photo editor to stitch them together. The process was a bit tedious and time-consuming.  Since this was merely a hobby for me, I limited the number of photos that I stitched together.

The take-home message for me was that the SWAT model could produce some very high loads, and in my opinion, loads that seemed unrealistic when converted into tons/mi2.  Parts of northern California are known for their unconsolidated, highly erodible bedrock, steep slopes, intense rain, and earthquakes.  These factors are all missing in Williston, VT, yet some of the subbasins in the TMDL analysis were indicating loads in excess of 5,000 tons/mi2.

Allen Brook mainstem airphoto.ppt

While searching for geomorphic data, I also ran across the Ohio DNR website, which was sharing its cross-section template developed by Dan Mecklenburg.  The template was a perfect fit for the data that were collected as part of the TMDL.  I plugged in all the data and developed the spreadsheet below.  The one step I never got around to was developing a nice summary table of all the data produced in this spreadsheet.

Allen Brook cross sections.xls

Finally, while it really wasn’t much of an analysis, since Allen Brook has no gauge, I put in a bit of effort to find some nearby gauges that could serve as a surrogate.   Below is the brief write-up I did for that.

Allen Brook surrogate gauge analysis.doc
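(For what it’s worth, one common way to transfer flows from a surrogate gauge is the drainage-area-ratio method, sketched below; the write-up above may well use a different approach, and the areas and flow are placeholder values.)

# Drainage-area-ratio method: scale flows at the gauged site by the ratio
# of drainage areas (optionally with an exponent near 1).
def scale_flows(q_gauged, area_ungauged_mi2, area_gauged_mi2, exponent=1.0):
    return q_gauged * (area_ungauged_mi2 / area_gauged_mi2) ** exponent

# Placeholder example: the areas and the 120 cfs flow are illustrative only.
q_allen = scale_flows(q_gauged=120.0, area_ungauged_mi2=10.0, area_gauged_mi2=40.0)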


