1. Imagine you bought solar PV arrays today, sold the power generated to your local utility, and used the proceeds to add new solar arrays each year.  What generating capacity would you expect in 100 years?  What would be the impact of uncertainties in inflation and power generation?  I've been obsessed with this idea for a while, so I developed a new Shiny app, available here, to help play around with these questions.


    There is a link in the app to obtain the R script.  Since this is not my area, I am hoping that people with domain-specific knowledge will tinker and improve it.
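    For anyone who wants to tinker before downloading the script, here is a minimal sketch of the kind of compounding calculation described above, written as a plain R loop.  The cost, price, and generation numbers are placeholders of my own, not the app's defaults.

        # Sketch of the reinvestment loop; all numeric assumptions are placeholders
        set.seed(1)
        n_iter  <- 100    # Monte Carlo iterations
        n_years <- 100    # simulation horizon

        capacity_paths <- replicate(n_iter, {
          capacity  <- 1        # starting capacity, kW
          cost_kw   <- 3000     # installed cost, $/kW (placeholder)
          price_kwh <- 0.10     # starting electricity price, $/kWh (placeholder)
          cap <- numeric(n_years)
          for (yr in seq_len(n_years)) {
            gen_kwh   <- capacity * runif(1, 1200, 1600)  # annual kWh per kW, uniform
            inflation <- runif(1, 0.00, 0.04)             # annual inflation, uniform
            price_kwh <- price_kwh * (1 + inflation)
            cost_kw   <- cost_kw * (1 + inflation)
            capacity  <- capacity + (gen_kwh * price_kwh) / cost_kw  # reinvest proceeds
            cap[yr]   <- capacity
          }
          cap
        })

        summary(capacity_paths[n_years, ])  # capacity after 100 years, across iterations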

    Opportunities for Improvement

    There are several areas where I think improvement is possible.

    - The app assumes a uniform distribution (defined by the user) of annual power generation over the 100 years.  Users can estimate localized power generation ranges using the PVWatts Calculator.  However, as the NREL PV efficiency chart shows, higher-efficiency systems are in the pipeline, so some sort of forecast trajectory seems like a possible improvement.  Similar trajectories could be applied to inflation rates too.
    - The app only uses 100 Monte Carlo iterations, which is a very small number, but the app also uses looping, and loops are slow.
    - I ignored costs associated with real estate.  My assumption is that brownfield sites close to the grid would be available and suitable, but explicit real estate costs could be included.

  2. WASP and other flow and water quality models ask users to input multiplier and exponent values relating velocity and depth to discharge. These relationships take the form of V = aQ^b and depth = cQ^d, where the values a, b, c, and d describe the curves that approximate paired observations from other sources. When paired field measurements are lacking, Manning's equation provides an estimate of open channel flow based on channel characteristics.



    This app takes Manning's equation input variables and fits nonlinear least squares parameters to estimate a, b, c, and d over 20 equal increments of depth from 0.05 meters to the user-specified maximum. The computations assume a symmetrical trapezoidal channel with continuously sloping sides.
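    A minimal sketch of that fitting step, assuming SI units and placeholder channel dimensions (the app's own script may organize the computation differently):

        # Placeholder channel inputs (SI units)
        bw    <- 10      # bottom width, m
        z     <- 2       # side slope, horizontal run per unit rise
        n_man <- 0.035   # Manning's roughness coefficient
        S     <- 0.001   # channel slope, m/m
        y_max <- 3       # user-specified maximum depth, m

        y <- seq(0.05, y_max, length.out = 20)   # 20 depth increments
        A <- y * (bw + z * y)                    # flow area of the trapezoidal section
        P <- bw + 2 * y * sqrt(1 + z^2)          # wetted perimeter
        R <- A / P                               # hydraulic radius
        V <- (1 / n_man) * R^(2/3) * sqrt(S)     # Manning's equation (SI)
        Q <- V * A                               # discharge

        # Log-log regressions provide starting values; nls() refines the power-curve fits
        sv <- unname(coef(lm(log(V) ~ log(Q))))
        sy <- unname(coef(lm(log(y) ~ log(Q))))
        fit_v <- nls(V ~ a * Q^b, start = list(a = exp(sv[1]), b = sv[2]))
        fit_d <- nls(y ~ c * Q^d, start = list(c = exp(sy[1]), d = sy[2]))
        coef(fit_v)   # a and b
        coef(fit_d)   # c and d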

    The app is available here, including a link to download the code and a separate link to download a batch version of the same tool.

  3. In 2014, EPA documented the relative lack of nutrient data from wastewater treatment plant effluents, even though development of surface water quality standards for nitrogen and phosphorus has been a stated priority for more than a decade.

    A new Shiny app lets users explore effluent nutrient concentrations from an existing data set by wastewater treatment plant type and by nutrient of interest.  When the number of available observations is sufficiently high, the app plots the IQR of the data by month to show seasonal effects in treatment efficiency.  The app also presents the results in tabular form so users can incorporate them into water quality models or watershed planning.
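    As a rough illustration of the monthly IQR summary, here is a sketch using fake example data; the column names and values are my own stand-ins, not the app's actual data structure.

        library(dplyr)
        library(ggplot2)

        # Fake example data standing in for the real effluent data set
        set.seed(7)
        effluent <- data.frame(
          date = sample(seq(as.Date("2010-01-01"), as.Date("2014-12-31"), by = "day"), 500),
          conc = rlnorm(500, meanlog = 0, sdlog = 0.5)   # effluent concentration, mg/L
        )

        # Monthly 25th, 50th, and 75th percentiles plus sample size
        monthly <- effluent %>%
          mutate(month = factor(months(date), levels = month.name)) %>%
          group_by(month) %>%
          summarise(p25 = quantile(conc, 0.25),
                    p50 = median(conc),
                    p75 = quantile(conc, 0.75),
                    n   = n())

        ggplot(monthly, aes(x = month, y = p50)) +
          geom_pointrange(aes(ymin = p25, ymax = p75)) +
          labs(x = NULL, y = "Effluent concentration (mg/L)")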

    You can see the app here.  There is also a companion report with more information about data processing and results.

  4. Monte Carlo analysis is a great way to explore the impact of input variable uncertainty on the results of engineering equations, and with vector variables and distribution and sampling functions at its core, R is a natural platform for this analysis.

    During a recent rainy vacation, I built a Shiny app that applies Monte Carlo analysis to Manning's equation for open channel flow.  You can play with the app here.  The slider bars define the upper and lower limits for input variables such as depth and Manning's roughness coefficient, and the Shiny app computes the resulting discharge (flow) distributions on the fly, displayed via histogram and box plot.  The app uses uniform distributions for input variables, but the script could easily be modified to incorporate other distributions.  I was especially impressed with the speed with which R and Shiny recomputed the distributions in reaction to changed input variables.  For me this project was an update of an older effort using spreadsheet add-ins.  While I will always have a warm place in my heart for Crystal Ball, R's natural fit to Monte Carlo analysis and unlimited plotting capabilities have me excited to do more.
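    Stripped of the Shiny interface, the core calculation looks something like the sketch below.  The input ranges are placeholders, and the channel is simplified to a rectangular section.

        # Monte Carlo sampling of Manning's equation inputs (uniform distributions)
        set.seed(42)
        n_sim <- 10000

        depth <- runif(n_sim, 0.5, 1.5)       # flow depth, m
        width <- runif(n_sim, 8, 12)          # channel width, m (rectangular simplification)
        n_man <- runif(n_sim, 0.03, 0.05)     # Manning's roughness coefficient
        slope <- runif(n_sim, 0.0005, 0.002)  # channel slope, m/m

        A <- width * depth                    # flow area
        R <- A / (width + 2 * depth)          # hydraulic radius
        Q <- (1 / n_man) * A * R^(2/3) * sqrt(slope)   # discharge, m^3/s

        hist(Q, breaks = 50, main = "Simulated discharge", xlab = "Q (m^3/s)")
        boxplot(Q, horizontal = TRUE, xlab = "Q (m^3/s)")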

  5. For as long as I have been working with water data, I have wanted to construct a line graph superimposed on a box and whisker plot, where the boxes show the distribution of values and the line shows some current condition.  One of my favorite things about ggplot2 is that it allows users to construct complex combinations of graphs in ways that make sense.

    This plot shows daily mean flow values in the Schuylkill River (the blue line) against box plots of the rolling 20-year set of daily mean values.  In this instance, flows in the lower part of the river are higher than the 75th percentile flow value, while flows in the upper part of the river are within the interquartile range.  Each point (and box plot) represents data from a USGS gage retrieved using the dataRetrieval library.  The vertical step increases in the blue line are inputs from tributary gages.  This particular image was developed as part of an automated dashboard hosted by the Delaware River Basin Commission.  R scripts, run in batch mode, are called by Windows Task Scheduler.  The scripts run overnight every night to retrieve data, develop new graphs, and upload the graphs to the dashboard web page, providing a near real-time visualization of river conditions.
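    The layering itself is straightforward in ggplot2.  Here is a sketch with synthetic data standing in for the gage records; the dashboard scripts retrieve their data with dataRetrieval, as noted above.

        library(ggplot2)

        # Synthetic stand-in data: long-term daily flows at points along the river,
        # plus a "current" daily mean flow at each point
        set.seed(3)
        river_mile <- seq(10, 100, by = 10)
        historical <- do.call(rbind, lapply(river_mile, function(rm)
          data.frame(river_mile = rm,
                     flow = rlnorm(2000, meanlog = log(rm * 10), sdlog = 0.6))))
        current <- data.frame(river_mile = river_mile, flow = river_mile * 10 * 1.6)

        # Box plots of the historical distribution with the current condition on top
        ggplot() +
          geom_boxplot(data = historical, aes(x = factor(river_mile), y = flow)) +
          geom_line(data = current,
                    aes(x = factor(river_mile), y = flow, group = 1),
                    colour = "blue", linewidth = 1) +
          scale_y_log10() +
          labs(x = "River mile", y = "Daily mean flow (cfs)")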

  6. High winds in the Delaware Estuary region caused a "blowout" tide in early April 2016, where observed water surface elevations were much lower than those predicted via harmonic constituents.  Extreme low blowout tides can hamper navigation due to insufficient depth.

    This animated graph was created in R using data obtained from the NOAA PORTS system for the Delaware Estuary via the NOAA API, and assembled with the animation library.
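    The general mechanics look something like the sketch below, with synthetic data standing in for the PORTS observations and predictions.  Note that saveGIF() requires ImageMagick (or a similar converter) to be installed.

        library(animation)

        # Toy observed vs. predicted water levels along the estuary;
        # the real data came from the NOAA PORTS stations
        km    <- seq(0, 200, by = 20)     # distance upstream, km
        times <- seq(0, 48, by = 0.5)     # hours

        saveGIF({
          for (t in times) {
            predicted <- sin(2 * pi * (t - km / 30) / 12.42)   # toy tidal wave
            observed  <- predicted - 0.5 * (t > 24)            # toy "blowout" offset
            plot(km, predicted, type = "l", lty = 2, ylim = c(-2, 2),
                 xlab = "Distance upstream (km)",
                 ylab = "Water surface elevation (m)",
                 main = sprintf("Hour %.1f", t))
            lines(km, observed, col = "blue")
            legend("topright", c("Predicted", "Observed"), lty = c(2, 1),
                   col = c("black", "blue"), bty = "n")
          }
        }, movie.name = "blowout_tide.gif", interval = 0.1)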

  7. The Delaware River experienced some high flow in late February 2016, providing an opportunity for an interesting animated graph of river response.


    This plot was developed using data from the USGS NWIS system for gauges on the Delaware River, retrieved with USGS's excellent dataRetrieval library for R and animated with the equally useful animation library.  The plot shows the discharge per drainage area (cfs/square mile) responding to rainfall.  Tributaries cause unequal response initially, but then the entire system settles into a flow wave rolling downstream.  Most gauges report discharge directly, but discharge was estimated for a few stage-only gauges using local stage-discharge relationships.  The concept here is the same as older posts looking at the impacts of Hurricane Irene and Superstorm Sandy, but using R to generate an easily portable GIF file.
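    The retrieval step looks roughly like the sketch below.  The gauge numbers are just examples, and the drainage areas come from dataRetrieval's site metadata; the actual script for this post may differ.

        library(dataRetrieval)

        gauges <- c("01438500", "01463500")   # example Delaware River gauge IDs
        flow <- readNWISdv(siteNumbers = gauges,
                           parameterCd = "00060",      # daily mean discharge, cfs
                           startDate = "2016-02-20",
                           endDate   = "2016-03-05")

        # Drainage areas (square miles) come with the site metadata
        sites <- readNWISsite(gauges)
        flow <- merge(flow, sites[, c("site_no", "drain_area_va")], by = "site_no")
        flow$q_per_da <- flow$X_00060_00003 / flow$drain_area_va   # cfs per square mile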

  8. In this earlier post, I analyzed tidal water surface elevation data from the NOAA PORTS system from both the Delaware Estuary and the Chesapeake Bay, showing how the two systems react very differently to the tidal forcing at their Atlantic Ocean boundaries.  Animated plots may be even more effective at demonstrating this difference in response.  In the Delaware, the tidal fluctuation is amplified as it is translated upstream, but in the Chesapeake it is dampened.



    On January 23rd, you can see the influence of the surge from winter storm Jonas.

    This graph was created in R using the animation library.

  9. In a previous post I showed an animated age structure diagram depicting output from a simple population model written in Excel.  Here is another version of that model written in R.  One of the things I like about the R version is that I can insert the animated .gif files directly into presentations without having to link to a video hosting site.

    In class, I change the fertility and death rates to demonstrate their impacts on the population.
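    A toy version of that kind of cohort model, animated with the animation library, might look like this (the fertility and survival rates below are placeholders, not the classroom values):

        library(animation)

        # Toy cohort model; saveGIF() needs ImageMagick installed
        ages     <- 0:80
        fert     <- ifelse(ages >= 15 & ages <= 45, 0.035, 0)   # births per person-year
        survival <- 1 - (0.002 + 0.0005 * ages)                 # annual survival by age

        saveGIF({
          pop <- rep(1000, length(ages))                        # starting age structure
          for (yr in 1:50) {
            births <- sum(pop * fert)
            pop    <- c(births, head(pop * survival, -1))       # age each cohort one year
            barplot(pop, horiz = TRUE, xlim = c(0, 2000),
                    xlab = "Population", ylab = "Age (0 at bottom)",
                    main = paste("Year", yr))
          }
        }, movie.name = "age_structure.gif", interval = 0.3)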

    Interestingly, I found a link on the FlowingData blog that implements a similar idea.

  10. After I completed the animated tidal water surface elevation plots for the Delaware Estuary, I looked for other systems with a good set of tidal observation stations.  The Lower Columbia River near Portland, Oregon fit the bill.  Using the NOAA PORTS stations, I set up near real-time animated plots of the Columbia.  Throughout December 2015, this plot showed some interesting results.

    At the beginning of December, the observed and predicted water surface elevations were in pretty good agreement.  But with high flows upstream, the observed water surface elevations rose well above the predictions, especially upstream, and remained high until January 2016.  One of the observation stations went offline temporarily around December 19th.

    The plot above was created in R (using RStudio) with the animation library.
