Fort Collins Science Center


SNTEMP (In)Frequently Asked Questions:
Application Issues


Q43. I am in the process of writing up a couple of reports that involve SSTEMP model results. I thought I should do a couple of things that I would like your or your co-workers' advice on. I have constructed my models by picking one of the hottest days in the summer and then adjusting wind and Manning's n, within reason, to obtain a best fit for that one day.

1) To "validate" the model from this single day, I plug in humidity, air temperature and flow from other days in the summer (with similar Langleys, day-length etc) and see if errors are similar to the output from the first chosen day. So far, my errors are all similar and generally less than 1 degree F. Is this a reasonable and important exercise? Would you suggest a better (more rigorous/acceptable to peer review) way?

2) I would also like to conduct a sensitivity analysis, similar to page 11 of Information Report 13, but I can't imagine how to do it in a manageable way. It seems a daunting task to vary everything simultaneously with SSTEMP. What is the easiest way to do this? Do you guys have a separate module for doing this?

A43. (1) Your approach to validation is quite reasonable. However, if you do have multiple days of measurements, then SNTEMP would be better to use than SSTEMP.
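The day-to-day error comparison described above can be summarized with simple statistics. A minimal sketch in Python, where the predicted/observed pairs are made-up stand-ins for real SSTEMP output and field measurements:

```python
import math

# Hypothetical predicted vs. observed mean daily temperatures (deg F)
# for several validation days; real values would come from SSTEMP runs
# and thermograph records.
predicted = [71.2, 73.5, 69.8, 72.1]
observed  = [70.6, 74.1, 69.2, 72.9]

errors = [p - o for p, o in zip(predicted, observed)]
mean_error = sum(errors) / len(errors)                      # bias
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))  # spread

print(f"mean error = {mean_error:+.2f} F, RMSE = {rmse:.2f} F")
```

If the bias and RMSE on these held-out days stay close to those from the calibration day (here, well under 1 degree F), that supports the validation argument made above.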

A43. (2) The new SSTEMP for Windows can do a single-parameter sensitivity analysis. This is not the same as the multiple-parameter analysis that was done for IP#13, but it's probably just about as good. For the information paper, for each parameter, all other parameters were varied simultaneously so that the quartiles could be developed. [Added 12/2001]
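The single-parameter (one-at-a-time) analysis mentioned above can be sketched generically. Here `toy_model` is a hypothetical linearized stand-in, not the real SSTEMP physics; in a real study each evaluation would be an SSTEMP run:

```python
# One-at-a-time sensitivity sketch: perturb each input by +/-10% and
# record the change in predicted temperature.
def toy_model(params):
    # Hypothetical linearized response (NOT the actual SSTEMP equations).
    return (20.0 + 0.30 * params["air_temp"] - 0.05 * params["flow"]
            + 2.0 * params["width"])

base = {"air_temp": 25.0, "flow": 30.0, "width": 4.0}
base_out = toy_model(base)

sensitivity = {}
for name, value in base.items():
    hi = dict(base, **{name: value * 1.10})   # +10% on this input only
    lo = dict(base, **{name: value * 0.90})   # -10% on this input only
    sensitivity[name] = toy_model(hi) - toy_model(lo)

# Rank inputs by the size of their effect on the output.
for name, delta in sorted(sensitivity.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:>8}: {delta:+.2f} deg change over +/-10%")
```

This is cheaper than the IP#13 approach (varying everything simultaneously to build quartiles) but, as noted above, usually identifies the same dominant parameters.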


Q89. A consultant has been hired for an application I must review. What are the main data that I should ask him for when checking his assumptions? I have an idea but I need to call him and am still a little unsure of how these models work and how it comes together.

A89. I would ask for copies of field notes, a list of assumptions, and data sources. If this person has used SSTEMP, I'd ask for a copy of the automated sensitivity analysis. If you might be running the model yourself, then certainly ask for the data sets themselves. [Added 12/2001]


Q90. A consultant developed the temperature model from data collected for this year only. But this has been the driest year we have had since 1937. Is it appropriate to use data from this year, when this area had 13 to 15 days of 95 degrees maximum air temperature with daily averages of 80 to 86 degrees? We have also had humidity of between 50 and 80%. Is it acceptable to plug this information into the models, or would you consider this to be an aberrant year? Last year was also quite aberrant for this area due to the amount of rainfall and the unseasonably cool temperatures. It would be interesting to see what kind of similarity between 1998 and 1999 we had.

A90. It depends on your objectives. This could be a blessing in the sense that it might represent a "worst case" year. Sometimes a drought is worst case in flow only – the temperatures may not be the hottest, or vice versa. Regardless, the model is a physical process model and should work well under a variety of circumstances IF properly calibrated. The further one gets from the data set used for calibration, however, the more doubt there is. [Added 12/2001]


Q91. Given that the model was calibrated for one set of circumstances, is it appropriate for me to plug in an average or mean air and a maximum temperature that occurred this year without changing any other parameters? I was thinking that only things that might change would be ground water flow and the temperature changes during this drought.

A91. Yes, you may change any parameters you wish. What you are doing, in effect, is asking "What would the downstream water temperature be IF these were the hydrologic, meteorologic, and geometric conditions that exist in the stream for a particular time of the year?" If that question makes sense for your application, then the model should work reasonably well, given the considerations I mentioned previously. [Added 12/2001]


Q96. A hypothetical:
A run-of-river hydro project.
Low flow is 10 cubic meters/sec
High temperatures are already a problem (20-25 C already occurs)
The project includes a weir that will raise WSEL 3.5 meters, and back water up for about 2 km.
Exchange rate in the impoundment is about 4 hrs at low flow
Banks are fairly steep: about 45 degrees.

Now the question: Will the creation of this impoundment *necessarily* aggravate high temperature problems? The idea has been advanced that "the impoundment will obviously cause additional heating".

What do you think? I am of the opinion that modeling should be able to predict fairly accurately whether the impoundment will heat/cool/do nothing, but significant heating is not a foregone conclusion. However, I have no large database at my fingertips to back this up.

A96. I have been asked this question before and I respond in typical fashion -- it depends. It depends on the relative increase in width compared to depth for the length involved, on whether the impoundment is well mixed under all hydrologic and meteorologic conditions (and whether it is top or bottom release). I don't necessarily fully trust SNTEMP to deal effectively with run-of-river impoundments, but there are bound to be some models that at least attempt to do this -- CE-QUAL-W2 for one. I suspect that under worst-case conditions, most shallow impoundments will increase temperature of the release/spill, but one cannot tell without further study. [Added 6/2002]
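For scenarios like the one in the question, a back-of-the-envelope residence-time check is a useful first step before reaching for a model. In this sketch the 40 m effective width and the wedge-shaped storage geometry are assumptions for illustration, not data from the question:

```python
# Rough consistency check of the stated ~4-hour exchange time, treating
# the backwater as a triangular wedge: 2 km long, 3.5 m rise at the
# weir (so mean added depth ~1.75 m), assumed 40 m effective width.
flow = 10.0                    # m^3/s (stated low flow)
length = 2000.0                # m (stated backwater length)
mean_added_depth = 3.5 / 2.0   # m, average depth of the wedge
width = 40.0                   # m, ASSUMED effective top width

volume = length * width * mean_added_depth   # added storage, m^3
residence_hr = volume / flow / 3600.0        # hours at low flow

print(f"added volume ~ {volume:,.0f} m^3, residence ~ {residence_hr:.1f} h")
```

A residence time of a few hours at low flow is short enough that, as the answer suggests, the outcome hinges on the width-versus-depth change and on mixing, not on long reservoir-style retention.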


Q97. I was asked to use SSTEMP, or some other simple model, in a screening exercise to determine whether a small "headpond" on a proposed hydro project might contribute to heating. If a 3-meter weir is put in place, backing up the water for a kilometer or so, what can one do with SSTEMP to assess this other than change the width term? And what kind of confidence can be put on the result? Would SNTEMP do a little or a lot better?

Assumptions: We have good air and water temperatures with existing conditions, and stratification in the impoundment seems unlikely. The widths and depths of the impoundment can be predicted accurately.

A97. This is not an uncommon screening request. Unfortunately, I have no knowledge of how well the model actually works in this sort of situation. The width is the obvious variable, but Manning's n is also severely affected. You probably have a better handle on Manning's n than I do, or could estimate the velocities (and therefore the resistance). The kicker is whether there will be any stratification (or really any non-longitudinal variation) that would violate the assumptions about thorough mixing. I know one consultant who ran into some trouble on a project with large ponding where it seemed that the water was not thoroughly mixed. It seemed to take a couple of days for the inflow to make it through to the dam -- thus you may need to think carefully about the time step.

I guess the bottom line is that the more this pond is like a wide spot in the river, the more you can model it with SSTEMP (SNTEMP would be the same). The more it is a reservoir, the more you need another model like CE-QUAL-W2. [Added 6/2002]


Q98. Has anyone tried to apply SSTEMP to small flow-through ponds? I.e., assume it's a widening of the stream. Is there other (simple) software that you know of that would be more appropriate? These ponds are less than 0.5 ha with only approx. 1 cfs.

A98. Yes, there have been many applications similar to what you describe, but I'd have to be honest and say that I don't know how well they have worked. The problem for a situation like you describe is stratification. If you have a small flow coming into a "big deep" pond, you are likely to have stratification such that the outflow is simply at or near equilibrium temperature. Both SNTEMP and SSTEMP believe that water in a stream channel is fully mixed. When this is not true, the model is inconsistent with reality and should not be applied.

SNTEMP has a feature to simulate equilibrium release from a reservoir, and this might work for you. I'm not sure. I am unaware of any other software for your problem, other than regression models. [Added 6/2002]


Q99. We are trying to determine the thermal impacts of changing 300 feet of streamside vegetation from forested to shrubs. How accurate is SSTEMP for short stream segments?

A99. I have no good metrics for this. There is some scant evidence, and theoretical rationale for believing, that the model does best when the reach length approximates the daily (or single time step) travel time, i.e., water entering the top exits the bottom in a single time step -- no more and no less. Yet we violate this rule all the time, especially on the short end, and it doesn't seem to matter much. Probably the bigger problem is in estimating maximum daily water temperatures (as opposed to means) for a variety of reasons -- see the documentation for SSTEMP. On the plus side, you likely won't see much change in 300 ft anyway unless it is a really small stream. [Added 12/2001]


Q100. I'm not sure I was very articulate the other day in asking you my question. Let me use recent temperature study as an example. The data input matrix was as follows:

Dataset              Hydrology                Air temp                Meteorology    All else

Historical datasets
  1995               1995 flows/water temps   1995 temp               1995 data      constant
  1996               1996 flows/water temps   1996 temp               1996 data      constant

Synthetic datasets
  Medians            POR medians              2 sets (normal & 85th)  norm/average   constant
  Absolute minima    monthly minima           2 sets (normal & 85th)  norm/average   constant
  Constant flow      80, 100, etc. cfs        2 sets (normal & 85th)  norm/average   constant

We used 1995 and 1996 measured water temperatures to validate the model and the statistics looked excellent. My question is:

a) In varying both the flows and meteorology in the synthetic sets, can we still feel reasonably confident about the model's predictive ability? You are changing several factors. Does the nature of the model take that into account?
b) If yes, have you ever evaluated how many factors can be changed without hurting the model's predictive capabilities (i.e.: Suppose you also had clear cutting occurring and needed to alter the shade file as well?). Would it be preferable to look at the factors separately as the alternatives become more complex? I would assume that the farther you stray from your validation sets, the more risky the conclusions.

As for the user's manual and documentation, I've not run across a passage that specifically discusses the overall strategy of employing historical versus synthetic datasets. Unless I've missed something, the procedure is discussed from the operational perspective of running the model, but not more generally.

A100. These are good questions. I suppose you are anticipating a legal challenge (?).

The general answer is that since SNTEMP is a physically-based model, it is likely that the model will produce mean daily water temperature predictions in the range of reliability found during your historical calibrations. When I say physically-based, I am contrasting with a statistically-based approach typically using some form of multiple regression. As you know, regressions, especially "elaborate" regressions, often fail when queried beyond the range for which they have been "trained".

This model (SNTEMP) has been validated many times (though few good examples have been published, unfortunately). By this I mean that the model has been calibrated using one data set and then run against a data set of "real" values not used in the calibration. Most often what people find is that the goodness-of-fit statistics are almost as good as those found during the initial calibration, sometimes better, sometimes worse. This is especially true if little or no calibration was done in the first place. One trap people can fall into is "over calibrating" their model by minutely tweaking the calibration variables. This can backfire in that the model may not do as well under altered conditions. The fact that SNTEMP works so well uncalibrated in many different areas of the country is testament to its wide applicability.

Rarely does one have a long data set to use for calibration. One or two years (or seasons) of data are usually the case. And Murphy's Law "dictates" that the conditions you use for calibration are always unlike the conditions you really want to evaluate. For example, you collect data for high flow conditions, but want to make predictions for low. It will always be true that the further your synthetic conditions are from your measured conditions, the more you would want to verify that the model works under those altered conditions. The more extreme the difference in conditions, hydrologically or meteorologically, the more you would want to be cautious about the results. Nonetheless, being physically-based, SNTEMP can be widely trusted, especially from the meteorological side of the equation. I can get nervous on the hydrologic side for two reasons: (1) stream width can be a very sensitive variable; it is important to estimate any change accurately. (2) Large flows can mask effects of groundwater accretions that have a more pronounced thermal effect at low flow.
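The groundwater point above -- that large flows mask accretion effects which dominate at low flow -- follows directly from a flow-weighted mixing calculation. A minimal sketch with assumed flows and temperatures:

```python
# Flow-weighted mixing: the blended temperature after a groundwater
# accretion joins the stream.  All numbers here are assumed for
# illustration, not from any particular study.
def mix(q_stream, t_stream, q_gw, t_gw):
    """Blended temperature of stream flow plus groundwater accretion."""
    return (q_stream * t_stream + q_gw * t_gw) / (q_stream + q_gw)

t_stream, t_gw, q_gw = 22.0, 10.0, 1.0   # deg C, deg C, and cfs of accretion
for q in (5.0, 50.0):                    # low flow vs. high flow, cfs
    print(f"stream flow {q:>4.0f} cfs: mixed temp {mix(q, t_stream, q_gw, t_gw):.2f} C")
```

The same 1 cfs of cool groundwater pulls a 5 cfs stream down by about 2 degrees but barely moves a 50 cfs stream -- which is why a model calibrated at high flow can misbehave when used to predict low-flow conditions.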

Your synthetic datasets are all in the realm of what has actually happened. That's good. Using medians or normals should not be cause for alarm. It would be the minima that may be less likely to produce reliable results. It is important to tell people that these are model estimates only and not precise values. You can safely say that the model may be trusted (within the bounds of your error statistics), but it will always be wise to double-check the model (and perhaps recalibrate) when more data are available to test it under conditions like those you want to evaluate. I am not concerned that you varied more than one variable at a time. Again, the wide applicability of SNTEMP is testament to its flexibility in this regard.

I have never formally evaluated how well the model would do if several factors were changed at once, but as stated above, I do not consider this a problem. Just running the model for a whole year introduces large changes in hydrology and meteorology for most rivers and the model still does fine, albeit with some seasonal biases reflected in the output. [Added 6/2002]


Q101. We have been having an internal debate here at our office. Streams tend to show a natural longitudinal increase in water temperature, and a trend line could conceivably be constructed for fully canopied streams. The graph would show that, upon reentry into undisturbed conditions, water temperatures return to the trend line. Some in the office believe, however, that there is nothing to remove heat from the disturbed reach; the new trend line would be parallel to, but remain elevated above, the undisturbed trend line.

What have your modeling and empirical experiences led you to believe?

A101. Will you let me sit on the fence on this one? No, I didn't think you would, so let me explain.

Most of my experience is in the modeling arena, so I tend to think about virtual streams and therefore what I say should be suspect from the start. My thinking is also constrained by scale, dealing with somewhat larger streams and perhaps longer distances than you may generally be considering. By larger, I mean streams greater than 15 cfs base flow and more often quite a bit bigger. For streams at least this size, some small-scale physical processes may be less important. I'll get back to those in a moment.

I have looked a bit at this "recovery zone" issue and my conclusion was that the "answer" to your question was somewhere in the middle. That is, depending on the relative equilibrium temperatures involved, once you add heat to a stream, that "excess" heat can remain with the stream for a "considerable" distance. The stream will recover, but it depends on how much heat is truly excess, or perhaps more precisely, on the degree of excess heat with respect to the "normal" ambient temperature. At least theoretically, remember, the rate of heating and cooling depends on the amount of deviation from equilibrium, but since equilibrium is really only a theoretical limit, and it is asymptotic, you never really reach the asymptote.
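The asymptotic recovery described above can be sketched as a first-order decay of excess heat toward equilibrium. The rate constant and temperatures below are purely illustrative, not fitted to any stream:

```python
import math

# Excess heat decays exponentially toward the equilibrium temperature
# but never quite reaches it -- the asymptote described in the text.
T_eq = 18.0   # equilibrium temperature, deg C (assumed)
T0 = 24.0     # temperature leaving the disturbed reach, deg C (assumed)
k = 0.15      # first-order recovery rate, per km (assumed)

temps = {x: T_eq + (T0 - T_eq) * math.exp(-k * x) for x in (0, 5, 10, 20, 40)}
for x_km, T in temps.items():
    print(f"{x_km:>3} km downstream: {T:.2f} C")
```

With these numbers the excess heat is nearly, but never exactly, gone 40 km downstream -- consistent with the "returns to the trend line but is never exactly the same" conclusion below.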

I have also looked at an even broader basin-scale model that more properly takes into account the gradient in ambient meteorology with elevation. My conclusion from this work is the same, namely that heat added can persist far downstream, but with very small differences, making it hard to measure (see below). And I must admit that this still isn't a good test of even a virtual stream in that I did not examine a situation where the stream may go through alternating thermal zones, sometimes cooling and sometimes warming. I find it hard to believe that the conclusion would be different, but I haven't "tested" that anyway, so I remain noncommittal.

But what about small forest streams, often non-fish-bearing? I am going to plead ignorance rather than go too far out on a limb. It appears that there are many elements of the physical processes that are not now being adequately represented. Here I am referring to the interchange of water with the streambed and the alluvial aquifer (hyporheic zone), which has not gotten the attention it may deserve. These processes may come close to fully buffering excess heat, so close in fact that we may have great difficulty measuring differences and proving or disproving exactly what is going on. Berman and Quinn (http://yosemite.epa.gov/R10/WATER.NSF/1887fc8b0c8f2aee8825648f00528583/ed79b85ef5810df0882568c400739cc2?OpenDocument) have some provocative things to say about these processes on large and small scales, but unfortunately offer little in the way of hard data to back them up.

Enough rambling. My general conclusion is that there can be a cumulative effect in the sense of additive heat below a disturbance, but that effect is not cumulative if cumulative is defined as fully persistent. Back radiation is always at work, day and night, seeking equilibrium (removing heat if "necessary"), and it does a good job. The line you describe on the graph tends to return to the trend line, and may approach it quite closely, but never be exactly the same as if the entire stream had been "undisturbed." Scale is very important. Does that help? [Added 6/2002]


Q102. Do I need headwater nodes for my tribs if I know the flow and temp at the junction?

A102. No, you can treat the tributaries as point sources rather than tributaries if you wish. This has some merit in terms of being simpler for a reviewer to understand, i.e., you didn't really model the tributary, you just "interjected" its flow and temperature. The alternative is to keep the tributary, but give it a length of 1 meter and supply the flow and temperature at its H node. This may have the advantage of being closer to how you have set up the hydrology node file; I don't know. Really, either alternative should work fine.

One clarification: I assume you mean you know the flow and temp just ABOVE the junction. [Added 6/2002]


Q103. I am currently conducting a small habitat assessment/instream flow project. This river currently sustains a native brook trout population. However, portions of the river are under stress created by surface-water withdrawals for golf courses and turf farms, and also by increasing ground water withdrawals. I installed ten Temperature-Tidbit data loggers in the river this summer to try to get a handle on the temperature regimes of the mainstem and tributaries, and on the effect of several small impoundments on stream temperature. Ultimately, I am interested in trying to determine the effect of the withdrawals on stream temperature, and whether temperature may become a limiting factor for the brook trout before low flow does. Hopefully, the training course will give me some insight on these issues. If you can provide any references on similar studies it would be greatly appreciated.

A103. Your project sounds very interesting. I have been interested for some time in thermal effects due to urbanization, but have never been actively involved in that work. A year or so ago, I attended a presentation by M. Kieser with RRS Greiner Woodward Clyde, who was doing some work on thermal enrichment, largely from stormwater drainage (hot parking lots and roofs, that sort of thing). They had carefully monitored adjacent cold-water streams that showed interesting thermal behavior (spikes). Their work was aimed toward mitigation of these spikes. They cited a "classic" report by Galli:

Galli, J. 1990. Thermal impacts associated with urbanization and stormwater management best management practices. Metropolitan Washington Council of Governments, 777 N. Capital Street, Washington, DC. [Added 6/2002]


Q161. I originally ran my data through for the X. River for year 2002 - a low flow year (we are modeling for low flows). I want to run some alternative flow scenarios to see what flows could be recommended to lower temperatures in the river so as to keep them below 31°C.

I thought that I could do this by altering my 3 Validation nodes by adding 2, 4, 6, 8, and 10 cubic meters per second (cms) to the original data and re-running it through SNTEMP. This produced nice maximum daily results showing that when flows increased, maximum daily temperatures decreased, which is what I had expected. However, the average temperatures increased as flows increased.

The only explanation I can come up with is that because there is now increased flow, the water is becoming well mixed and causing the average to increase.  Does this sound like a logical explanation to you, or is there something funny going on with my data?

Attached are two graphs, one showing the maximum and one showing the average values with the altered flows.  If you could give me your opinion on this matter it would be greatly appreciated!

I also just realized as I am typing this that as I changed the flows at my V nodes, the temperatures remain the same as in the original data... this can't be the correct method, can it?  If it is not, can you recommend how to simulate increased flows to predict maximum and mean daily temperatures?

A161. The only reason I can think of at the moment that average temperature would increase with adding water (and I assume you mean adding water at the upstream end?) is if the incoming water were generally warmer than at the terminus.  Under these conditions, adding more warm water would increase the average temperature.  But doing so should also increase the maximum, at least under most circumstances.  I think I could imagine a case where the average would warm and the maximum continue to cool, but I haven't tried to duplicate that.  My supposition is that your situation is close to equilibrium and odd things can happen under those circumstances.  I think if I were you that I'd try to more or less duplicate the situation in SSTEMP just to confirm it one way or the other. I do not think your explanation is correct because the stream is always well mixed in SNTEMP by definition.

The utility TDELTAQ is meant to be used to adjust your hydrology data file in the manner you suggest.  That is, you tell the program what sort of changes (plus or minus) to make at some node and it does all the tedious changes for you, making sure that mass balance is maintained, which I hope you did.  You might read through the TDELTAQ documentation and make one of the changes you need just to double-check that what you did is correct and thorough.
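The mass-balance bookkeeping that TDELTAQ automates can be illustrated generically. This sketch is not TDELTAQ's actual code or file format; the node names and flows are invented:

```python
# Mass-balance sketch: a flow added at one node must carry through to
# every node downstream of it, otherwise flow appears or vanishes
# mid-network.  Node names and flows are made up for illustration.
nodes = ["H1", "V1", "V2", "T"]                          # upstream to downstream
flows = {"H1": 8.0, "V1": 9.5, "V2": 10.2, "T": 11.0}    # cms

def add_flow(flows, nodes, at_node, delta):
    """Return a new flow set with delta (cms) added at at_node and
    propagated to all downstream nodes, preserving mass balance."""
    out = dict(flows)
    start = nodes.index(at_node)
    for n in nodes[start:]:
        out[n] += delta
    return out

scenario = add_flow(flows, nodes, "V1", 2.0)
print(scenario)   # H1 unchanged; V1, V2, and T each gain 2 cms
```

Changing only the V-node flows without carrying the change downstream, as in the question, breaks exactly this balance -- which is why checking one node by hand against the utility's output is worthwhile.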

Making the change in discharge at your V nodes is no problem as long as you had no missing temperature (or flow) data.  However, the "proper" way to do this is to actually add synthetic years of data to your time series files.  In this way, the validation statistics continue to be meaningful and you need not supply temperatures at V nodes for your "what if" simulations.  This is a concept that is difficult to get across, but take a look at the data set that accompanies SNTEMP to see how it is done.


Q162. I’m involved in yet another SNTEMP study.  At least, I think I will be using SNTEMP.  My question is whether the model will work well on very small streams.  This is a tributary of the X. River, where summertime flows are 1 cfs or less.  The tributaries of the tributary are of course even smaller.  Do you have any experience with SNTEMP in such small streams?  The issues in the study include flow changes from better land use, and riparian restoration.

A162. Given your previous experience, I doubt whether you will find that SNTEMP works as well for really small streams as it does for larger ones.  It is not that the physics are not the same, but rather that small streams are so quickly responsive to small changes in most anything, including stream geometry (width especially), hydrology (variable flows, intergravel flows), and meteorology (everything).  Because these streams can be so inconstant, calibration statistics are likely to be more variable.  You will likely find that some monitoring stations may calibrate poorly where others come out well.

The following reference found evidence that the streamflow varied throughout the day, likely due to diel evapotranspiration:
Mattax, B.L., and T.M. Quigley. 1989. Validation and sensitivity analysis of the stream network temperature model on small watersheds in Northeast Oregon. Pages 391-400 in W.W. Woessner and D.F. Potts, eds. Proceedings of the Symposium on Headwaters Hydrology. American Water Resources Association, Bethesda, MD 20814-2192.

The Forest Service developed a model called Temp-84 (or was it 86?) for very small streams, especially where they thought that solar radiation was being absorbed by the streambed (since the water was generally less than 10 cm deep) and that caused some modeling trouble.


Q163. I recently downloaded a water temperature modeling paper you prepared that focused on uncertainty. I have a question for you, not on the paper, but on water temperature modeling. Are you aware of any studies where water temperatures were forecast several days in advance at locations in a river? Was the model successfully able to do that? I have not seen such studies, and was wondering if you had.

A163. Basically, I think the issue boils down to forecasting the appropriate meteorological parameters in advance and then running the water temperature model.  As we get better and better weather forecasts, this will become more and more accurate, but some element of uncertainty will remain.  Obviously, if your targets were weekly rather than daily, the targets become somewhat easier to predict simply because of the averaging period.

One extreme case might arise when reservoir releases essentially dominate downstream temperatures but at some distance downstream, i.e., where travel time is the main variable.  In this case, it would be easy to figure out what to do.  But rarely would this extreme dominance exist.  Instead, this problem would grade into some balance between reservoir releases (and maybe dam release temperature control) and meteorology.  I know that the Bureau of Reclamation in Sacramento is faced with this situation for compliance points on the Sacramento River.  They are, I believe, attempting to incorporate weather forecasts with real-time stream temperature feedback and basically using human guesswork to try to 'optimally' meet the criteria.  I believe you could contact Paul Fujitani in their Central Valley Operations office for more details.  One reference I have on this class of problems is
Bravo, H.R.  1993.  State space model for river temperature prediction.  Water Resources Research, 29(5):1457-1466. 
In addition, there is a PG&E report that looked into this:
Railsback, S.  1997.  Design guidance for short-term control of flow releases for temperature management, Final Report 009-97.10, San Ramon, CA.
and
Gu, R., S. McCutcheon, and C. Chen.  1999.  Development of weather-dependent flow requirements for river temperature control. Environmental Management 24(4):529-540.

Short of something like this, what is most often discussed is taking a probability-based approach to forecasting.  The idea would be to generate exceedance probabilities based on time of year or knowing the recent history of temperatures, or both.  I have occasionally thought that developing a neural network model might be an appropriate technique for this sort of application.  As with the other techniques, it would still be based on recent history, but could also have 'fuzzy' inputs for the upcoming weather forecast.  One interesting twist was an approach taken by David Neumann (Master's thesis, An operations model for temperature management of the Truckee River above Reno, Nevada, University of Colorado, 2001).  Basically, he looked at the trade-off between the level of certainty required and the amount of water spent in compliance.  I believe this has now been published.


Q164. We are interested in comparing the temperature data collected and the modeling results to historical conditions (before project construction). Flows in the two bypassed reaches have been modeled empirically relating rainfall to runoff. We do not have historical instream temperature data or widths, but we do have access to historical meteorological data. Would we be able to use SNTEMP to gather some knowledge about temperature in the bypassed reaches and below the project tailrace without having validation points and just using some average widths?

A164. Absolutely.  The more assumptions you make, the thinner the ice, but there is no fundamental problem in what you describe -- thus the benefit of mechanistic models.  Perhaps your biggest unknown will be inflow temperatures.


Q165. I am wondering if the SSTEMP model does not work well for large rivers.  I have modeled nine segments and have low percent error for all except a segment on the mainstem river.  The sensitivity analysis indicates air temperature is having the greatest impact on the estimate, but I deployed an air thermograph at the downstream site and have actual measured data for air temperature.  The predicted maximum in the model seems quite high.

A165. On the contrary, these models all tend to work best on larger rivers where the flow volume adds considerable "thermal inertia" or buffering to quick changes that often occur on small streams.  I'm not sure I'd really class your situation into "large" because I might draw the line at 100 cfs or so, but then that's arbitrary.

I took a careful look at your parameters.  There are two that jump out at me.  First is the effective stream width of 67 feet for a flow of 66 cfs.  This seems quite wide to me, but admittedly is not all that sensitive in your case.  Second is the segment length.  Are there really no significant tributaries in that distance, potentially bringing in cooler water?

Assuming all is well with the input data, what could be wrong?  I have but two thoughts.  Though I don't know the setting for this stream, I find myself wondering if there is much sub-surface flow through gravel/cobble.  If there were, and if this particular day was abnormally warm in terms of air temperature, the subsurface flow could be keeping the stream cooler than the model believes.  Second, it's just hard to believe that, in this case, we have water coming in at 19.5°C and leaving at 19.3°C when the air temperature is 25°C and the sun is essentially at full intensity.  Your predicted mean equilibrium is 28°C, which sounds about right.


Q166. I have calibrated the SSTEMP model for a Rosgen C4/E4 stream – Y. Creek in New Mexico (with 1% error for maximum temperature [modeled = 78.83, actual = 79.3]; 7 percent error for mean temperature [modeled = 66.74, measured = 64.4]; and 14% error for minimum temperature [modeled = 54.64, measured = 51.8]) and am now adjusting the shade parameter and Width's A term to try to lower the maximum temperature to the water quality standard of 68°F.  The shade is currently measured at 4.5 percent for this assessment unit (see photo).  To our knowledge, the area is not currently used for cattle grazing but is managed for elk.  The sensitivity analysis indicates that mean daily air temperature has the greatest influence on the modeled temperatures.  So, adjusting shade and Width's A terms do not make much difference in the maximum temperature (unless I raise the shade to 50%, which is not reasonable based on the setting of this stream) - and, of course, I cannot alter daily air temperature.  It seems that in most cases, the daily air temperature has the greatest influence on the results.  But, for the purpose of writing TMDLs, I need to adjust values for features (like shade and Width's-A) that can be altered, so that best management practices can be implemented and the Water Quality Criterion achieved.  I am struggling with finding reasonable ways to lower the stream temperature and am wondering if you have any suggestions.  I have attached my calibration run to this e-mail.  Do you think SSTEMP is OK to use on streams like this?

A166. I am so glad that you included the photo in your e-mail.  Why?  Because I can safely say that I have had absolutely no experience with a stream this small.  I certainly have heard anecdotal stories of the effects of cumulative grazing, and have seen dramatic slide shows given by Don Chapman and Bill Platts -- you know the classic fence-down-the-middle photos, one side grazed and looking about like your photo and the other side ungrazed with abundant willows, sedges, instream cover, etc., presumably with good quality shade, narrower widths, and so forth.  I also hear that where willows are abundant, beaver may have been far more abundant, doing their normal dam building, raising the groundwater level and introducing pools.  This may have tended to moderate water temperatures somewhat, or at least provided some refuge from otherwise high stream temperatures.  The list of effects given in one Platts workbook I have is:
1.  Shear or sloughing of stream bank by hoof or head action
2.  Water, ice, and wind erosion of exposed stream bank and channel soils because of loss of vegetative cover
3.  Elimination or loss of stream bank vegetation
4.  Reduction of the quality and quantity of stream-bank undercuts
5.  Increasing stream bank angle (laying back of stream banks), which increases water width and decreases stream depth
6.  Drainage of wet meadows to facilitate grazing access
7.  Changes in plant species composition (e.g., brush to grass to forbs)
8.  Reduction of overhanging vegetation
9.  Decrease in plant vigor

Are there any 'pristine' areas that one could compare with?  Also remember that there is no guarantee that these streams were 'ever' that much cooler than they are now, or would be even under full restoration potential.

Do I think that SSTEMP is a good model for streams that small?  The Forest Service once developed a model explicitly for small (forested) streams.  Their contention was that small streams had significant interactions with rocky substrates not captured by other models at the time (~1984).  This included how solar radiation penetrated the 10 cm or so through the water column, heating the substrate during the day and then having the substrate warm the stream during the night.  However, I am unaware that anyone now uses that model.  Other 'newer generation' models, perhaps including HeatSource in Oregon, appear to have some facility to better incorporate substrate/hyporheic interactions, though I do not believe they do so in the same way that the Forest Service did.

Some of the difference in error you reported between mean and maximum daily temperatures may be due to some of these inaccuracies.  However, I have not heard you say that SSTEMP was too inaccurate for your use.  In fact, I suppose I'm pleasantly surprised that it did as well as it did.  The smaller the stream the less thermal mass there is and the more erratic the thermal behavior.  I believe SSTEMP should adequately capture the magnitude of effect expected from a variety of remedial actions, such as those you described (width, vegetative shade, etc.), but it would probably be best to express the differences as change from the baseline rather than absolute temperature predictions.


Q167. I've been poring through the literature and think I know the answer to my question but wanted to make sure before moving on. Both the SNTEMP and SSTEMP models are steady-state models, correct? In other words, I cannot provide unit value (say 5-minute) discharge data for a month as an input file and have these models compute downstream temperatures at the same interval. The reason I ask is because our cooperators are interested in understanding thermal load from a developed area to a stream during storm events. My understanding is that I have to plug in a single headwater discharge value, groundwater discharge value, air temperature, etc. for each stream segment and the model will spit out a mean, maximum, and minimum downstream temperature.

A167. You are absolutely correct.  The theoretical basis for the model is steady state only.  All input values represent mean daily conditions.  However, we have used these models to 'bound' the possibilities.  For example, when working with hydropeaking, you could simulate both the low and high flow conditions and know that the 'truth' is somewhere in between. 

If you really need to model a storm surge, you may be better off using something like CE-QUAL-W2.  But even though that model is fully hydrodynamic, I'm not sure how it may deal with all the features that may be associated with true storm surges.  It has a tendency to become unstable when there are significant changes in forcing conditions.


Q168. I have another question about temperature and comparing a control station with several downstream stations on a particular stream here in Kentucky.  We have 5 stations with one control station above all possible impacts to water temperature.
 
Our task is to find a suitable model or statistical method to compare these downstream stations with the control and to provide us with a good information base to determine temperature changes at the downstream stations just using temperature without involving prediction of land use changes. We are also trying to find a defensible permit criterion for standard deviation for these temperature changes that would show the company that they are affecting the stream by increasing water temperatures.

I am not sure an email can adequately express what we need for this permit.  We have 3 years' worth of data for all 6 stations.  The temperature, conductivity, and dissolved oxygen were all measured on either 15- or 30-minute intervals during the day from May through October.

I hope you have something at your finger tips that we can use to analyze this data and possibly a defensible methodology for showing changes in the stream temperature. 
Follow-up - I think the approach that you mentioned in #1 and #2 below will probably be the slant we will take on this subject matter.  The stream here in KY in question is the same one (Y. Creek) that I emailed a year or so ago about SSTEMP and the conditions that can be manipulated to see the worst-case scenario.  We are trying to see if a 10% violation of a daily maximum temperature over a week's period will actually work.  I have thought about using a standard deviation allowing so many degrees above the daily maximum.

I have three years of twice-per-hour data from May 1st through October 30th. We are trying to apply background water quality issues to this stream.  A coal company is placing a hollow fill and a pond that I feel will ultimately change the background water temperature.  Since our standards say that water temperature shall not be increased through human activities above the normal seasonal temperatures, we are trying to determine when/if the company will cause an increase in water temperature. We have 6 stations that are also taking twice-weekly specific conductance measurements to apply to the aquatic community.  We have found here in KY that specific conductance will cause loss of taxa richness of aquatic organisms.  We are still looking at the numbers, but it seems that if specific conductance climbs above 500 µmhos/cm (µS/cm), several mayfly taxa will start to decline in the biological component of the stream.

Again, you have been a great help and have substantiated my position that violations based solely on the daily maximum over a week’s period without the benefit of a standard deviation would be open to adjudication.

A168. About the only thing that occurs to me beyond taking simple averages of temperature (or DO), whether mean daily or maximum daily, would be something akin to the various statistical values calculated by the so-called Indicators of Hydrologic Alteration.  Though ostensibly for comparing with- and without project hydrology, I have occasionally used this software for water temperature.  It produces a lot of output, and sorting out the wheat from the chaff can be somewhat of a challenge.  Jim Henriksen here at the USGS has been working to put this toolkit (and more) into a user-friendly software package, and I forwarded your e-mail to him, but have not heard back from him as to whether he thinks it appropriate.  Undoubtedly there would need to be some data reduction from your 15- or 30-minute sampling interval to compute mean and peak daily temperatures or DO depressions.
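The data reduction mentioned above (collapsing 15- or 30-minute readings into mean and peak daily values) is straightforward; here is a minimal Python sketch, assuming timestamped (time, temperature) pairs -- the function name and example data are illustrative, not part of any USGS tool:

```python
from collections import defaultdict
from datetime import datetime

def daily_stats(readings):
    """Reduce (timestamp, temperature) pairs taken at 15- or 30-minute
    intervals to per-day (mean, maximum) values."""
    by_day = defaultdict(list)
    for ts, temp in readings:
        by_day[ts.date()].append(temp)
    return {day: (sum(v) / len(v), max(v)) for day, v in sorted(by_day.items())}

# Example with three half-hourly readings on one day:
readings = [
    (datetime(2006, 7, 1, 12, 0), 21.0),
    (datetime(2006, 7, 1, 12, 30), 23.0),
    (datetime(2006, 7, 1, 13, 0), 22.0),
]
stats = daily_stats(readings)   # mean 22.0, max 23.0 for July 1
```

The same reduction works for DO depressions by substituting `min(v)` for `max(v)`.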

One way to beef up the (apparent) strength of the analysis is to compute some measure of deviation so that you are not simply talking about simple means.  This can get tricky though if someone might argue that the distributions are not statistically different.  Carefully consider the pros and cons and be prepared for this issue.

Some other thoughts would be to carefully consider metrics other than direct water temperature (or DO).  Options might include:
            1.  Degree-days.  We have used this as an integrator of acute or chronic stress by setting an appropriate threshold and calculating the cumulative exposure over that threshold for a specified time window.  For example, number of degree-days over 20°C during the summer.  Note that this could also be degree-hours to try to capture peak exposures -- sort of the reverse philosophy for DO.
            2.  Season length.  Similar to the above, but defining the length of time fish (or the aquatic system) are exposed to acute or chronic stressful temperatures.  This could be simply the number of days that a threshold temperature is exceeded, or it could be the length of the season (i.e., from the first day > threshold to the last day > threshold).
            3.  Finally, we sometimes compute just the length of river with suitable (or unsuitable) temperatures.  Though this may not be appropriate (or even possible) in your case, I mention it just in case it triggers a thought on your end.
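The first two metrics above are easy to compute once daily means are in hand. A short Python sketch, with the 20°C threshold and the temperature series purely illustrative:

```python
def degree_days_over(daily_means, threshold=20.0):
    """Cumulative degree-days above a stress threshold: only the
    exceedance on days over the threshold accumulates."""
    return sum(t - threshold for t in daily_means if t > threshold)

def season_length(daily_means, threshold=20.0):
    """Days from the first to the last exceedance of the threshold,
    inclusive; 0 if the threshold is never exceeded."""
    hot = [i for i, t in enumerate(daily_means) if t > threshold]
    return (hot[-1] - hot[0] + 1) if hot else 0

temps = [18.5, 21.0, 23.5, 19.0, 22.0]
dd = degree_days_over(temps)     # (21-20) + (23.5-20) + (22-20) = 6.5
season = season_length(temps)    # first exceedance day 2, last day 5 -> 4 days
```

Swapping daily means for hourly values in `degree_days_over` (and dividing by 24) gives the degree-hour variant mentioned above.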


Q169. I also wanted to ask a question about obtaining actual field measurements of stream temperature. I am about to install several temperature loggers in our study stream so we have some verification points down the road. It occurred to me that stream temperature could vary quite a bit depending on where you install your probes. In your experience, where is the best place to collect stream temperature? I will try to find well mixed areas of the stream segment but what about depth or vertical and horizontal location? 

A second question refers to proper channel mixing when a tributary joins the mainstem. Is there a rule of thumb for how far downstream one must go after a tributary confluence in order to obtain proper mixing? We have the good fortune of a well funded project so I'm going to be installing 25+ validation nodes in the model. I just want to make sure my validation data are accurate.

A169. Yes, temperatures can vary depending on probe location.  The goal of course then becomes trying to measure temperatures that are truly representative of flowing water at the longitudinal distance you are interested in.  Most guidelines are common sense kinds of things.  Install probe in a well-mixed spot, deep enough not to be exposed to air if the flows drop, yet not in a location that might get scoured out if flows peak.  Sometimes probes get buried even if they don't actually get washed away.  If possible, you want to protect them from vandalism, so disguising them as trash can work well.  I recommend placement out of direct solar exposure.  This can often mean putting them in a perforated plastic pipe of some sort.  This can also help keep the probe from being in direct contact with the streambed.  You also don't want to put the probe in a stratified pool. 

I have no specific thoughts on "horizontal" placement except to avoid placement in "abnormal" surroundings, e.g., springs or seeps, upwelling, flows percolating through gravels, just downstream from a tributary (as you mentioned), etc.  Because temperatures can vary, and probes are relatively cheap, I have recommended placing two probes fairly close to one another.  Not only does this add insurance, but assuming both are retrieved, you can get some idea of the "inherent" variability.  At a minimum, I recommend verification of the probes before and after deployment with an ASTM thermometer (ASTM 63C works best).
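One simple way to summarize the "inherent" variability from a pair of co-located probes is the mean absolute difference between their matched readings. A hedged Python sketch -- the function and numbers are illustrative only:

```python
def probe_agreement(series_a, series_b):
    """Mean absolute difference between two co-located probes, as a
    rough combined measure of local variability and probe error."""
    diffs = [abs(a - b) for a, b in zip(series_a, series_b)]
    return sum(diffs) / len(diffs)

probe_a = [14.2, 15.1, 16.0, 15.4]   # degrees C from probe 1
probe_b = [14.0, 15.3, 15.8, 15.6]   # degrees C from probe 2
probe_agreement(probe_a, probe_b)    # 0.2 degrees C
```

If this value is much larger than the probes' stated accuracy, the site may not be as well mixed as it appears.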

Mixing zones can be longer (or shorter) than they might at first appear.  The shallower and/or more turbulent the stream, the shorter the mixing zone.  One approximation for rivers and streams in which the depth is less than one tenth of the width is (Milhous, pers. comm., adapted from Ruthven 1971):

                              
             0.085 W²
    L  =  --------------
           D^(5/6) × n

where  L = mixing length (feet) that assures the variation in temperature
           across a section arising from a point-source interjection does
           not exceed 10%
       W = average stream width (feet)
       D = average stream depth (feet)
       n = average Manning's n (roughness)
For example, suppose you had a river 100 feet wide, 5 feet deep, with an n value of 0.06.  The length to near complete mixing would be:
                                                                         
           0.085 × 100²          850
    L  =  ---------------  =  -------------  =  3705 feet
           5^(5/6) × 0.06      3.82 × 0.06

Most mixing lengths, of course, are far shorter.  Some people say to go three or more channel widths downstream.  Having a live probe (i.e., one with immediate readout) is a good way to test this in the field.
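For convenience, the approximation can be wrapped in a short function. This Python sketch simply restates the formula given above:

```python
def mixing_length(width_ft, depth_ft, mannings_n):
    """Approximate distance (feet) to near-complete lateral mixing below
    a point source (Milhous, pers. comm., adapted from Ruthven 1971).
    Intended for streams where depth is less than about 1/10 of width."""
    return 0.085 * width_ft ** 2 / (depth_ft ** (5.0 / 6.0) * mannings_n)

mixing_length(100, 5, 0.06)   # ~3705 feet, matching the worked example
```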

Also see http://www.odf.state.or.us/DIVISIONS/protection/forest_practices/fpmp/Projects/Stream_Temperature/STTempProfromWQG.PDF

Document everything!


Q170. Off the top of you head, what processes would make a stream stay at a fairly uniform temperature?  I can get the mean temperature OK but SSTEMP still gives a wider range than the data suggest and I'm not sure just what to look at for keeping temperature uniform.

A170. I will forward a correspondence trail that addresses this issue to some degree.  Assuming that the meteorological variables really do represent conditions near the stream and are not buffered somehow by riparian vegetation, I strongly suspect hyporheic interactions, but cannot really prove it.  Other folks are hammering around the edges of this issue.  For example, see

http://water.usgs.gov/pubs/fs/2004/3010/
and
http://pubs.nrc-cnrc.gc.ca/cgi-bin/rp/rp2_abst_e?cjfas_f04-040_61_ns_nf_cjfas

Ronan, A.D., D.E. Prudic, C.E. Thodal, and J. Constantz. 1998. Field study and simulation of diurnal temperature effects on infiltration and variably saturated flow beneath an ephemeral stream. Water Resources Research 34(9):2137-2153.

Constantz, J., C.L. Thomas, and G. Zellweger. 1994. Influence of diurnal variations in stream temperature on streamflow loss and groundwater recharge. Water Resources Research 30:3253-3264.

Constantz, J. 1998. Interaction between stream temperature, streamflow, and groundwater exchanges in alpine streams. Water Resources Research 34(7):1609-1615.


Q171. A question came up in a planning meeting that I couldn't accurately answer. The question was if a small community expands impervious surfaces by 30% in 10 years, what would happen to the average daily temperature in a nearby trout stream. My answer was that groundwater recharge could lessen and surface runoff could increase. Going back to the model I don't see a way you can differentiate groundwater flow (lateral accretion) from surface runoff in a stream.  Is this model only applicable to base flow conditions?

A171. Great question!  In many ways this question is outside the scope of SNTEMP per se, but here is what I can offer.

First let's do the "easy" part.  SNTEMP really only simulates open channel heat flux.  Therefore it will know nothing, and can do nothing specific about overland (or even underground) heat flux processes.  But you certainly can separate lateral flows (accretions) to the stream into two components, groundwater accretions (typically coming in at ground temperature, but you can change that) and return flows (but these must be point sources, though you could have lots of them.)  When one specifies one or more return flows, you do have the option of setting a flag that tells the model to estimate the temperature of that return as either equilibrium temperature, ground temperature, or some weighted average.  See Theurer page III-69 for how to do this.  I have not used this option in a very long time.  You may need to experiment.

Back to "normal" lateral accretions, since you do have the option of specifying accretion temperatures in the hydrology data file, perhaps this is the mechanism for your question and I may have already said too much.  Regardless, now comes the hard part.  What assumption should you make about altered groundwater (or return) temperatures in an urbanizing environment?  Frankly, I will plead ignorant.  It is a long and difficult process to change true ground temperatures, especially at depth.  But interflows I know little about.  If you have not already done so, I'd certainly go to the literature, but not many years ago, this was quite scarce. See

John Galli.  1990.  Thermal Impacts associated with urbanization and stormwater management best management practices. Metropolitan Washington Council of Governments, 777 N. Capital Street, Washington, DC
Brown, L.R., R.M. Hughes, R. Gray, and M.R. Meador, eds. 2005. Effects of Urbanization on Aquatic Ecosystems.  American Fisheries Society. 420 pp.
Krause, C.W., B. Lockard, T.J. Newcomb, D. Kibbler, V. Lohani, and D.J. Orth. 2004. Predicting influences of urban development on thermal habitat in a warm water stream.  J. Am. Wat. Res. Assn. 40(6):1645-1658.
Roa-Espinosa, A., T.B. Wilson, J.M. Norman, and K. Johnson.  No date.  Predicting the impact of urban development on stream temperature using a thermal urban runoff model (TURM). Available on the Internet at http://www.epa.gov/owow/nps/natlstormwater03/31Roa.pdf.
as examples.

Beyond that, you will become the expert!


Q172. I understand when describing the geometry and shade parameters for each node you are averaging the conditions found between the current and next downstream node (except for latitude and elevation which are actually at the node location). So when SNTEMP kicks out a temperature, is it giving me the temperature at the downstream edge of a segment? Here's an example:

Stream is represented by H ----C1-----C2-----E (oversimplified, I know). I would describe geometries and shading between H and the C1 under the H node. Then I would describe the same between C1 and C2 under the C1 node. When I run SNTEMP it will give me a temperature at each node. So is the temperature at H really at the H node or is it at the C1 node? Common sense tells me it’s the average between the two. I ask because I have real temperature data at nearly each node. When comparing to SNTEMP I want to make sure I pair up the right values.

A172. It is true that the reach geometry is between nodes, just like you described.  But the model output for each node (e.g., E) is for that specific point along the stream.  That is, when the model prints the temperature at the C1 node, it is not an average up- or down-stream, it is at that point.  You can verify this at H (or S, P, R, and V) nodes because what you put in for those locations is what you get out.  I risk confusing you because P and R nodes are not stream temperatures, but rather inputs.


Q173. I have been working with your SSTEMP model for the last month and am very pleased with the ease of use and the results.  I am modeling the X. River up here in Z.  The X. River has only one major tributary (Y. Creek) that contributes more than 10% of the flow (80%).  I have broken this 55-mile reach up into two segments for the purposes of this model:  from the base of the mountains to Y. Creek, then from Y. Creek to the state line.  In 2004 we spread out 8 instream temperature loggers throughout this reach in addition to placing three air temperature loggers at the beginning, middle, and end of the reach.  There are also three USGS stations: at the base of the mountains, upstream of the Y. Creek confluence, and at the state line.  In addition to the above data we also have several years' worth of bug and habitat data throughout the reach.

The initial purpose of this temperature monitoring was to determine at what point the X. River turns into a warm water fishery.  However, after finding numerous historical accounts (1805-1877) of the presence of cutthroat trout in the lower segments of the X. River, I thought perhaps I could ask "why is this lower segment of the X. River a warm water fishery?"  That is when I found the SSTEMP model.  Obviously the over-appropriation of water from the T. is one of the major reasons for the increased water temperatures, and when dealing with any agriculturally related impairment you need to have the best science possible.  That is why I want to make darn sure what I am doing is precise and correct, and you may expect me to be bugging you over the next month (if that is OK with you).

That is a brief history of what I am working with.  Now here are some questions that I have for you:

a) Is the SSTEMP model appropriate for what I am trying to do?  I have not worked with the SNTEMP model simply because it makes me nervous.

b) Can you use a width's A term from any period of time or does it need to be specific for that dates you are trying to model?

c) How would you suggest calibrating the SSTEMP model?  My outflow temperatures are, on average, coming out within a degree of the predicted values.  Is this acceptable?

d) Should small irrigation dams be considered in this model?

e) The SSTEMP model is overestimating temperatures between the segments - why?  I am assuming I am not accounting for something.

A173. Here are some responses to your questions:
a) Is the SSTEMP model appropriate for what I am trying to do?  That depends.  As stated in the SSTEMP documentation, when the number of segments (2) times the number of time steps (?) becomes large, SNTEMP can quickly become the model of choice.  But for just scoping out the main temperature drivers, SSTEMP can be fine.

b) Can you use a width's A term from any period of time or does it need to be specific to the dates you are trying to model?  The name of the game in modeling is to have parameters that are as representative as possible of the situation you are modeling.  This applies to the width as much as anything: obviously, if you are trying to understand the past (pre-grazing, for example), then you would want some estimate of what widths actually were in the past.  Or if you were trying to forecast how temperatures might change under a restoration project, ditto.  Or maybe your question has to do with widths that are specific to the flows modeled.  For this you really should estimate both the A and B terms of the width equation.
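Given paired width and flow measurements, the A and B terms of a width-versus-flow relation of the form W = A·Q^B can be estimated by linear least squares on the log-transformed data. A minimal Python sketch (the function name and synthetic data are illustrative, not part of SSTEMP):

```python
import math

def fit_width_terms(flows, widths):
    """Fit W = A * Q**B by ordinary least squares on log W vs. log Q."""
    xs = [math.log(q) for q in flows]
    ys = [math.log(w) for w in widths]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope of the log-log line is B; intercept gives log A.
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = math.exp(mean_y - b * mean_x)
    return a, b

# Synthetic check: widths generated with A=10, B=0.25 are recovered.
flows = [50.0, 100.0, 200.0, 400.0]
widths = [10.0 * q ** 0.25 for q in flows]
a_term, b_term = fit_width_terms(flows, widths)   # ~10.0 and ~0.25
```

With real field data the fit will not be exact, of course; the residuals give a sense of how stable the width relation is across flows.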

c) How would you suggest calibrating the SSTEMP model?  Depends on what you are finding.  Most calibration of SSTEMP is either using air temperature (if there is a reasonable rationale) or some of the less representative meteorological parameters like wind speed.  Alternately, some use shade if that is a poor estimate.  But frankly, I'd want to know more about what you find under different dates, different flows, etc., before making any single, simple recommendation.

My outflow temperatures are, on average, coming out within a degree of the predicted values.  Is this acceptable?  That depends on your goal.  If you just want to estimate rough relative changes in water temperature, this may be fine.  If you are going to be challenged in court, you may need to get a consultant/expert to make a more elaborate calibration/validation.  Again, I think I'd want to look at a variety of results.  Here again, this is where SNTEMP can excel, because ultimately it makes the model easier, and more justifiable, to calibrate -- or not to calibrate, as the case may be.

d) Should small irrigation dams be considered in this model? In general, if the dam creates only a run of the river impoundment that can be characterized as a wide spot in the river, I think you'd be safe.  However if the impoundment stratifies, then SSTEMP (or SNTEMP) cannot technically be applied.  You may still be able to get useful information, but someone could too easily discredit what you do.

e) The SSTEMP model is overestimating temperatures between the segments - why?  I am not exactly sure what you mean by this.  Do you mean that the overestimate is always downstream of segment 1?  Basically, you should be asking yourself why you are gaining too much heat.  The answers may be all over the board: widths too high, shade too low, groundwater too warm, etc.  What justifiable information can you bring to bear to reduce the uncertainty in your parameters?


Q174. As I'm looking at my wonderful SNTEMP graphs of observed vs. predicted temperatures, the idea of storm flow vs. base flow popped into my head. Am I correct to assume that SNTEMP should really be calibrated only to base flow periods since those are considered steady-state? My observed daily mean discharge values at various points along my stream include overland flow during storm events. This likely has some warming effect on stream temperature and will be recorded by my temperature sensors. However, this increase in flow, as far as SNTEMP is concerned, is really an increase in base flow. Therefore you would likely be underestimating stream temperatures during storm events since that extra water would have an assumed groundwater temperature tagged to it.

If what I've said is true, my next course of action is to plug in only base flow days from my monitored data and have the model interpolate for the missing values then calibrate to only those base flow days. Other reports I've read don't make any mention of this. Am I off base here?
Later Reply - I'm underestimating for all periods in all reaches. I went through and deleted the days where there was overland flow in the stream. Fortunately, the year we collected discharge data was a dry year so I didn't have to delete that much.  I'm underestimating anywhere from 0.5 degrees to 4 degrees depending on the reach (closer on the upstream reaches and further off on the downstream reaches). One game I've been playing is to set the ground water temperature from mean annual air temperature to a mean monthly soil temperature. It gets me closer on the downstream reaches but my upstream are overestimated. The stream I'm modeling has a complex network of groundwater springs, most of them are in the headwaters. My next plan is to vary the groundwater temperatures by reach, i.e. colder at the headwaters and warmer downstream.

A174. As usual, or almost always, it depends on your objectives.  If you want the model to perform best for base flows, it should be calibrated (if necessary) for those conditions.  If on the other hand, you want the model to work best for overland flow conditions (which it is admittedly not too well suited to do) you should target those conditions.  It might of course be possible to do both if you can come up with some decent accretion temperature estimates.

Do you in fact find that you are underestimating temperatures during storm events?  I know I had some trouble with that on the X River some time ago.

[Updated 5/2007]

URL: http://www.fort.usgs.gov/products/Publications/4037/faq_application.asp