Temperature (and humidity) measurement
- 1 Introduction
- 2 Thermoscope
- 3 Thermometer (introducing scales)
- 4 Temperature Scales
- 5 Instruments
- 6 Sensor Positioning
- 7 How Cumulus software handles Temperature and Humidity
Introduction
Temperature observations can range from a single instrument recording the outside temperature in the shade to several measurements (e.g. in a standard screen, just above grassland or bare ground (soil temperature), in a building, and/or with a wet bulb).
Humidity quantifies the water vapour in the air. A relative humidity of 100% (full saturation) indicates the air is holding all the moisture it can hold at its current temperature. If the temperature falls and the air cannot hold as much water, fog (cold air just above the ground) or dew (cold at ground level) results. For a lower initial relative humidity, the temperature can fall further, to the dew point temperature, before full saturation is reached. Generally the temperature falls as you go higher, so there is a level at which relative humidity has risen sufficiently to encourage condensation; Cumulus uses this approach to find the theoretical minimum height of cumulus cloud.
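The dew point and cloud-base estimate mentioned above can be illustrated with a short sketch. This uses the Magnus approximation for dew point and the common 125-metres-per-degree rule of thumb for cloud base; these are standard textbook formulations, not necessarily the exact formulae Cumulus uses (Cumulus's dew point comes from the Davis calculation, as noted later).

```python
import math

def dew_point_c(temp_c, rel_humidity):
    """Dew point (degC) via the Magnus approximation (one common
    formulation; Cumulus itself uses the Davis calculation)."""
    a, b = 17.62, 243.12  # Magnus coefficients, valid roughly -45..60 degC
    gamma = math.log(rel_humidity / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

def cloud_base_m(temp_c, dew_point):
    """Theoretical cumulus cloud base: roughly 125 m of lift per degC
    of temperature/dew-point spread (a standard rule of thumb)."""
    return max(0.0, 125.0 * (temp_c - dew_point))

td = dew_point_c(20.0, 60.0)
print(round(td, 1))                    # about 12 degC
print(round(cloud_base_m(20.0, td)))   # roughly 1000 m
```

At 100% relative humidity the dew point equals the air temperature and the estimated cloud base is zero, matching the saturation description above.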
Thermoscope
It is not known who first progressed from a subjective (hot or cold) assessment of the weather to quantifying it by inventing an instrument to measure it. The first such instrument, the thermoscope, indicated how hot or cold it was on a qualitative basis and was in use around twenty centuries ago.
Thermometer (introducing scales)
The first 'published' temperature scale was the Rømer scale, named after Ole Rømer, born in Denmark, who also originated measuring temperature by observing the expansion and contraction of a liquid. A thermometer is an instrument for measuring temperature, calibrated against a standard scale that is the same for everyone.
Dry and Wet Bulb
A Dry and Wet bulb thermometer is actually a pair of thermometers.
The bulb of a glass thermometer has a thin wall and, being spherical, exposes the most liquid volume over a small circumference to the effects of the heat energy causing expansion, making for a sensitive detector.
Such an exposed bulb is called a dry bulb thermometer when paired with another thermometer whose bulb is wrapped in a muslin tube with a wick dipped in water, forming the wet bulb temperature sensor. Evaporation of moisture from the muslin cools the bulb, so the temperature reported is that for saturated air (100 per cent relative humidity). The difference between the two temperatures allows calculation of the humidity, and this was the standard observing method in the days of manual weather stations.
This is the same cooling effect you experience when going into a thick fog, or when you perspire a lot on exposed skin.
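The humidity calculation from the two readings can be sketched with the standard psychrometric equation; this is textbook meteorology, not anything specific to Cumulus, and the psychrometer coefficient is an assumption (it varies with how well the wet bulb is ventilated).

```python
import math

def sat_vp(temp_c):
    """Saturation vapour pressure (hPa), Magnus approximation."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def rel_humidity(dry_c, wet_c, pressure_hpa=1013.25):
    """Relative humidity (%) from dry- and wet-bulb readings using the
    psychrometric equation. The coefficient 6.6e-4 is typical for an
    aspirated psychrometer; screen instruments use a slightly larger value."""
    vapour_pressure = sat_vp(wet_c) - 6.6e-4 * pressure_hpa * (dry_c - wet_c)
    return 100.0 * vapour_pressure / sat_vp(dry_c)

print(round(rel_humidity(20.0, 15.0)))  # roughly 59%
```

When the two bulbs read the same temperature there is no evaporative cooling, so the air is saturated and the result is 100%.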
James Six and Diurnal Variation measurements
A thermometer can be modified to hold a register of an extreme temperature (i.e. to allow that extreme to be read at a time that suits the observer). Such an instrument is called a Six's thermometer after the Briton James Six, who invented the maximum and minimum registering thermometer still used today.
Colourless alcohol on the minimum-reading side of a U-shaped glass tube expands and contracts with temperature. A second liquid (traditionally mercury, but now a coloured liquid that does not mix with the alcohol) pushes a slider along as the alcohol contracts at lower temperatures; when the alcohol expands again as it warms, it passes beside the slider, leaving the lowest temperature registered. Gravity, shaking, or a magnet is used to move the slider back against the coloured liquid to reset the Six's thermometer.
On the maximum-reading side of the U-tube there is a vacuum between the coloured liquid and the end of the tube; the slider there registers the highest temperature and is reset in the same way as the minimum register. Resetting once a day at a standard time captures the diurnal temperature observations. Whilst mercury was very good at pushing the sliders at its two ends, replacement liquids are unreliable, so the ban on the use of mercury has effectively ended the use of the James Six design in meteorology.
An alternative way of achieving registration (preserving the value) is to have a narrow-bore tube (minimum volume affected by expansion/contraction) with a capillary constriction separating off the bulb (which still contracts but becomes isolated from the narrow-bore tube, which therefore continues to display the highest temperature). This design was used for traditional clinical body temperature measurement, also largely abandoned following the mercury ban.
Temperature Scales
Temperature scales use a unit called degrees, apparently because some early thermometer designs were based on circular tubes containing liquid with 360 equally spaced markings on them, a rough way of reporting relative 'temperatures' before any formal scales based on calibrated points were introduced.
Daniel Gabriel Fahrenheit (24 May 1686 to 16 September 1736) visited Ole Rømer in 1708. In 1709 Fahrenheit originated the alcohol-in-glass thermometer, and in 1724 he developed the scale named after him, based on three fixed points (a slightly modified version of this scale, still called Fahrenheit, remains in use by some people in the UK and is the official scale in a few countries such as the USA).
Rankine is the name given to a scale starting at absolute zero (similar to Kelvin, but based on scale divisions that match Fahrenheit ones).
The history of Celsius is confusing.
INTERNATIONAL STANDARD: The international standard scale for temperature measurements in the weather context is Celsius (after the Swedish astronomer Anders Celsius, 1701 to 1744), defined internationally by two points: absolute zero (-273.15 degrees Celsius, the start of the Kelvin temperature scale, whose scale divisions match Celsius) and the triple point of purified water (0.01 degrees Celsius).
ANDERS CELSIUS SCALE: Anders defined 0 on his scale as the boiling point of water and 100 as the freezing point of water. Carolus Linnaeus (1707 to 1778) reversed that scale in 1744 (after the death of Anders Celsius), giving the order we know now. Various other people are also credited with developing what was known under various names and became known in the 19th century as the Centigrade scale.
CENTIGRADE: Zero Centigrade is the freezing point of water, and 100 units above it is water's boiling point at a pressure of one standard atmosphere. Jean-Pierre Christin independently developed a scale in 1743 with zero for water's freezing point and 100 for its boiling point, and in May of that year he published the design of a thermometer made by a craftsman in Lyon using this scale. The temperature scale 'in 100 steps', in Latin 'centum gradus' (anglicised as Centigrade), officially ceased to exist in 1948, when the unit became degrees Celsius. Older people still sometimes use the term Centigrade in the temperature measuring context, except in the Spanish- and French-speaking world, where grad applies to angular measurement (a centigrade there is equivalent to 0.009 angular degrees, angular degrees being the unit in which 360 is one rotation).
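As a quick reference, the relationships between the scales discussed above (Celsius, Fahrenheit, Kelvin, Rankine) reduce to simple conversions:

```python
def c_to_f(c):
    """Celsius to Fahrenheit: 100 C steps span 180 F steps, offset 32."""
    return c * 9.0 / 5.0 + 32.0

def c_to_k(c):
    """Celsius to Kelvin: same-sized divisions, shifted to absolute zero."""
    return c + 273.15

def c_to_r(c):
    """Celsius to Rankine: absolute scale in Fahrenheit-sized divisions."""
    return (c + 273.15) * 9.0 / 5.0

print(c_to_f(100.0))   # 212.0, the Fahrenheit boiling point of water
print(c_to_k(-273.15)) # 0.0, absolute zero starts the Kelvin scale
```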
Instruments
For manned weather stations, mercury-in-glass thermometers were traditionally used, although the risks associated with mercury mean that, in the last decade, most glass thermometers sold use other liquids (and body or room temperature is measured by non-glass alternatives such as heat-sensitive chemical layers or electrical instruments).
For automatic weather stations, electronic options include thermocouples (two dissimilar conductors joined so that a temperature difference produces a small electrical signal, a rather insensitive, low-accuracy option) or thermistors and resistance temperature detectors (these use ceramic/polymer and metal materials, respectively, because accurate measurement of resistance in an electrical circuit is easy).
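As an illustration of how a thermistor resistance becomes a temperature, the sketch below uses the Beta equation (a common simplification of the fuller Steinhart-Hart model); the part values are typical assumptions for a generic 10 kilohm NTC sensor, not taken from any particular station.

```python
import math

def ntc_temp_c(resistance_ohm, r0=10000.0, t0_c=25.0, beta=3950.0):
    """Convert NTC thermistor resistance to temperature (degC) using the
    Beta equation. r0 is the nominal resistance at t0_c; beta is the
    material constant. These defaults are assumed typical values."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(resistance_ohm / r0) / beta
    return 1.0 / inv_t - 273.15

print(round(ntc_temp_c(10000.0), 2))  # 25.0 at the nominal resistance
```

For an NTC ("negative temperature coefficient") part, resistance falls as temperature rises, which the logarithm term captures.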
(Pyrometers, which measure thermal radiation, are used to measure high temperatures, but this is more applicable to a factory than a weather station; infra-red detectors can be used to measure the temperature radiated from buildings or land, or for sensing people.)
A hygrometer measures relative humidity. An inaccurate device with a pointer moving over a dial uses a coil made from metal and paper that curls and uncurls with moisture; a similar device uses a hair under tension, making a man or woman swing out through the doors of a 'weather house'. Electronic instruments measure either the electrical resistance or the dielectric constant of conductive polymers and calculate the relative humidity by circuitry.
Sensor Positioning
For the main temperature measurement, a weather station should use a site well sheltered from the sun and rain, but totally exposed to the wind. Basically you want the ambient air temperature, so there must not be any direct sunlight, nor nearby walls that could affect the reading by radiated heat.
The World Meteorological Organization sets the official guidance for positioning, but it is really directed at rural locations that permit measurement far from local effects such as buildings, vegetation and sloping ground. For a more urban location, academics suggest calculating the mean roughness height (i.e. the average height of buildings, trees, fences and other nearby objects) and multiplying that figure by one and a half to get the required minimum distance between your measuring location and any object that could cause 'roughness'.
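The suggested rule reduces to simple arithmetic; a minimal sketch (the obstacle heights are invented examples, and the factor of 1.5 is the one quoted above):

```python
def min_clearance_m(obstacle_heights_m, factor=1.5):
    """Minimum distance (m) from the sensor to any 'roughness' object:
    the mean obstacle height multiplied by 1.5, per the rule above."""
    mean_height = sum(obstacle_heights_m) / len(obstacle_heights_m)
    return factor * mean_height

# Hypothetical surroundings: house (8 m), trees (6 m), fence (2 m)
print(min_clearance_m([8.0, 6.0, 2.0]))  # about 8 m of clearance needed
```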
As a rule of thumb, a temperature sensor at normal Stevenson screen height (1.25 metres) is said to be affected by anything within up to 15 metres of it. For a higher-mounted sensor there is an elliptical, cone-shaped zone of influence, with a radius of 300 metres horizontally for a sensor at 3 metres height. The widest part of the ellipse is upwind of the sensor.
In practice, any temperature sensor mounted at a height of one and a half times the height of the roughness objects (buildings, trees, etc.) is expected to be largely unaffected by those objects, so putting a tall pole above your house is deemed okay. A more advanced model considers the density of the roughness and, for lots of tightly packed buildings (or trees), uses a bigger clearance (height or horizontal separation). (See the Royal Meteorological Society website in the UK, the BOM temperature measurement advice in Australia, or the equivalent of a meteorological bureau for your country.)
How Cumulus software handles Temperature and Humidity
The type of weather station used determines which parameters are reported by that station and which need to be calculated by Cumulus. If Cumulus 1 uses temperature in a calculation of parameters such as dew point or apparent temperature, then for most stations it applies any calibration settings to the temperature read from the station before using it in that calculation. Similarly, if humidity is read from the station, the calibrated value is used in subsequent calculations. See the support forum for further information; for example, for Instromet stations, Cumulus 1 calculates the relative humidity from the reported wet-bulb and dry-bulb temperatures, and the uncalibrated temperature and (calculated) humidity values are then used to calculate dew point.
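The order of operations described above (calibrate first, then derive) might be sketched as follows; the function names and the simple linear calibration are assumptions for illustration, not actual Cumulus code.

```python
import math

def calibrate(raw, offset=0.0, multiplier=1.0):
    """Apply a simple linear calibration (assumed form) to a raw reading."""
    return raw * multiplier + offset

def derived_dew_point(temp_c, rel_humidity):
    """Stand-in for the real dew-point routine (Magnus approximation here;
    Cumulus uses the Davis calculation)."""
    gamma = math.log(rel_humidity / 100.0) + 17.62 * temp_c / (243.12 + temp_c)
    return 243.12 * gamma / (17.62 - gamma)

raw_temp, raw_hum = 19.6, 62.0
temp = calibrate(raw_temp, offset=0.4)   # calibration applied first...
hum = calibrate(raw_hum, offset=-2.0)
dew = derived_dew_point(temp, hum)       # ...then used in the derivation
```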
At the logging interval you set, Cumulus logs outdoor temperature, dew-point, and relative humidity in separate files for each month. Optionally, measurements from extra sensors can be logged in extra monthly files. Optionally, indoor measurements can be logged in a special ongoing log.
Cumulus Calculated Parameters
See external Wiki or other sections of this Wiki for specific information on the values that Cumulus calculates:
The dewpoint calculation uses a third-party library which implements the Davis dewpoint calculation.
Sfws 00:34, 29 November 2012 (UTC)