Water: When Things Go Wrong

At a Glance

  • Laboratory procedures often require large volumes of water
  • A supply of purified water is essential, but there are many risks for contamination, which could stop a lab service completely
  • Sources of contamination include the water supply, the purification unit and the equipment
  • It’s important that laboratories have good lines of communication with purification system suppliers, estates/maintenance teams, and understand what to look out for when things go wrong to minimize disruption

Water is the single most important reagent used by those of us who work in laboratories, but it’s often taken for granted. It is only when the supply is interrupted or compromised that its value is appreciated.

The technical aspects of water quality are well defined (1, 2), and whilst most lab scientists won’t have an in-depth knowledge of these details, the impact of system failure is immediately evident and can stop a service completely. So it goes without saying that the quality, purity and reliability of the water supply are critical to running any laboratory.

The world’s most common reagent

In clinical chemistry laboratories, the majority of water is used in high-throughput chemistry analyzers; a medium- to large-sized laboratory, for example, may have a peak hourly demand of 50 to 100 liters. Lower volumes are needed for more specialized analytical techniques, such as tandem mass spectrometry or atomic absorption spectroscopy (AAS). Needless to say, any failure in quality – in most cases, contamination – could be a burden on time and resources, and could cause delays in the delivery of clinically important results.

Water contamination in a routine clinical biochemistry service can be introduced at any one of the following three stages: upstream of your purification unit; within the unit itself; post-purification (see Infographic). Being aware of what can go wrong with your supply – and how to address it – can help to keep your results accurate and your workflow smooth.

Pre-purification pitfalls

In my experience, the supply to the unit (either through a direct mains feed or via a cold storage tank) is the most common source of water contamination. Fortunately, interruptions at this stage are automatically evident, either through an alarm mechanism within the purification unit or through pressure indicators.

Any scheduled (pipe maintenance) or unscheduled (leaks or bursts) interruptions upstream of the unit can also interfere with the supply. Knowing when planned work will take place, and recommending that it happens when laboratory workload is low, will certainly keep disruptions (and therefore possible quality compromise) to a minimum. So it’s important to maintain a good working relationship with the estates/maintenance department within your hospital or institute, and to keep lines of communication open to minimize the impact of impromptu interruptions; if laboratories are alerted to them early, contingency plans – such as prioritizing urgent samples only – can be put in place.

When supply comes through a cold storage tank, ironically, the cleaning of the tank itself can present a contamination source; specifically, cleaning agents, some of which are chlorine-based, can enter the unit and damage components. It’s important to be aware of when cleaning is scheduled, so that the supply can be diverted while it takes place.

More substantial pipe damage and repair can result in damage to the unit consumables. In our laboratory, for instance, we have seen significant silt deposits in our supply following a major mains repair. Our water remained pure and the unit was able to remove the deposits, but the filters quickly became overloaded and needed changing several times.

It’s also important to recognize the impact of geographical and seasonal variation on the quality of your supply. Limestone areas produce what’s known as “hard water”, that’s to say water containing the divalent ions calcium and magnesium. Where this is problematic, discussions with suppliers can help to optimize your unit design, for example, by incorporating softeners (such as sodium salts) into the process. The impact of seasonal variation is usually more apparent in the summer months, when supplies are generally lower and impurity content consequently higher. As a precaution, I’d advise that laboratories keep a larger stock of replacement parts at this time of year, as they may need to be replaced more often than during the rest of the year.
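The numbers behind “hardness” are simple, and can be useful when discussing softener options with a supplier. The sketch below is my own illustration rather than part of the article; the conversion factors are the standard molar-mass ratios, the classification bands are the commonly quoted approximate values, and the input figures are purely illustrative.

    # Illustrative sketch: estimate total water hardness from measured calcium and
    # magnesium concentrations, expressed as mg/L CaCO3 equivalent. The multipliers
    # are the molar-mass ratios of CaCO3 to Ca (2.497) and to Mg (4.118).

    def total_hardness_mg_caco3(calcium_mg_per_l: float, magnesium_mg_per_l: float) -> float:
        """Return total hardness as mg/L CaCO3 equivalent."""
        return 2.497 * calcium_mg_per_l + 4.118 * magnesium_mg_per_l

    def classify_hardness(hardness: float) -> str:
        """Approximate classification bands, as commonly quoted."""
        if hardness < 60:
            return "soft"
        if hardness < 120:
            return "moderately hard"
        if hardness < 180:
            return "hard"
        return "very hard"

    if __name__ == "__main__":
        # Illustrative values only, not measurements from any particular supply.
        hardness = total_hardness_mg_caco3(calcium_mg_per_l=95.0, magnesium_mg_per_l=12.0)
        print(f"{hardness:.0f} mg/L as CaCO3 -> {classify_hardness(hardness)}")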

Purification unit problems

Units vary between manufacturers and with laboratory requirements. As a general rule, the complexity of a unit increases in line with the water quality grade that it produces. As with all pieces of laboratory equipment, these systems occasionally fail, so where an uninterrupted supply is required, you may need two or more units.

In many situations, the unit will monitor the quality of the water it produces by tracking its resistivity, but this is only one indicator. It is important to assess overall quality using a variety of analytical parameters, for example, the absorbance characteristics of blanks in spectrophotometry and baseline changes in chromatographic techniques.
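As a simple illustration of what tracking resistivity can mean in practice, the sketch below (my own example, not vendor software; the acceptance limit and the readings are illustrative and should be set by local policy) flags readings that fall below a chosen limit. Type I water is typically expected to read close to 18.2 MΩ·cm at 25 °C.

    # Minimal sketch: flag purified-water resistivity readings below an acceptance limit.
    # The limit is illustrative; set it according to local policy and the water grade required.

    RESISTIVITY_LIMIT_MOHM_CM = 17.0  # illustrative acceptance limit in megohm-cm

    def check_resistivity(readings_mohm_cm: list[float],
                          limit: float = RESISTIVITY_LIMIT_MOHM_CM) -> list[int]:
        """Return the indices of readings that fall below the acceptance limit."""
        return [i for i, r in enumerate(readings_mohm_cm) if r < limit]

    if __name__ == "__main__":
        hourly_readings = [18.2, 18.1, 18.2, 16.4, 15.9]  # illustrative values only
        failures = check_resistivity(hourly_readings)
        if failures:
            print(f"Resistivity below limit at readings {failures}: investigate supply and consumables")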

Figure 1. An upward shift in calcium calibration curves over time, due to incomplete removal of calcium impurities from the water supply. Calibration 1 shows the normal absorbance pattern; in calibration 2, the background of increased calcium raises the absorbance of both the blank and the calibrant.

Within the standard repertoire of investigations carried out on a clinical chemistry analyzer, certain tests are more susceptible to deteriorating water quality. Calcium and magnesium analyses are two examples that are often compromised, and this may be evident from the calibration data. Some analyzers hold previous data within their software, which allows serial calibrations to be compared graphically for changes over time. Figure 1 shows a shift in calcium calibration curves, where the removal of calcium impurities has been incomplete. Both blank and calibrant absorbance should remain relatively constant, but in the example given, both shift upwards, which should alert operators to water quality problems.
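Where calibration data can be exported, the comparison shown in Figure 1 reduces to a very simple check. The sketch below is my own illustration (the data structure, threshold and values are hypothetical, not any analyzer’s software): a concurrent upward shift in both blank and calibrant absorbance is flagged as a possible water quality problem.

    # Sketch of the serial-calibration comparison described above: flag a calibration
    # where both the blank and the calibrant absorbance have risen relative to the
    # previous calibration by more than a chosen threshold (in absorbance units).

    def flag_upward_shift(previous: dict, current: dict, threshold: float = 0.010) -> bool:
        """Return True if blank AND calibrant absorbance both rise by more than `threshold`."""
        blank_shift = current["blank_abs"] - previous["blank_abs"]
        calibrant_shift = current["calibrant_abs"] - previous["calibrant_abs"]
        return blank_shift > threshold and calibrant_shift > threshold

    if __name__ == "__main__":
        calibration_1 = {"blank_abs": 0.052, "calibrant_abs": 0.410}  # illustrative values
        calibration_2 = {"blank_abs": 0.078, "calibrant_abs": 0.441}
        if flag_upward_shift(calibration_1, calibration_2):
            print("Blank and calibrant absorbance have both shifted upwards: check water quality")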

Equally, changes to blank readings might indicate that a unit is unable to remove organic impurities. In chemistry methods, this is typically most apparent where the reaction is monitored in the UV region of the spectrum (on a standard clinical chemistry analyzer, this will be at 340 nm, in methods that utilize the transition of NAD to NADH). Specific method parameters may show flags where baseline absorbance deviates by more than a defined threshold.

In AAS, water contamination issues may also cause blank readings to deviate from what is expected, and this should be monitored. For chromatographic methods, the baseline chromatograms may show characteristic changes as impurities increase. Monitoring these parameters as a method of quality control should be standard practice.
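The same idea can be applied as a running check on blank readings, whichever technique they come from. The sketch below is again my own illustration, with hypothetical numbers; control limits should be set locally. It compares each new blank against the mean and standard deviation of a recent window of readings and flags outliers for investigation.

    # Minimal sketch of routine blank monitoring: flag a new blank reading that lies
    # outside mean +/- n_sd standard deviations of a window of recent readings.

    from statistics import mean, stdev

    def blank_out_of_control(history: list[float], new_reading: float, n_sd: float = 3.0) -> bool:
        """Return True if the new blank reading falls outside the control limits."""
        if len(history) < 2:
            return False  # not enough data to establish control limits
        centre, spread = mean(history), stdev(history)
        return abs(new_reading - centre) > n_sd * spread

    if __name__ == "__main__":
        recent_blanks = [0.051, 0.049, 0.052, 0.050, 0.048]  # illustrative absorbance values
        if blank_out_of_control(recent_blanks, new_reading=0.081):
            print("Blank reading outside control limits: investigate water purity")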

Post-purification glitches

Occasionally, problems arise after the purified water is produced. Most commonly, small water bottles used to hold supplies of water locally to an analytical area can become contaminated with bacteria. This is most easily avoided by using single-use containers, or by using a container only for short periods of time to prevent bacterial accumulation. However, a more complex version of this problem can occur in the tubing that carries water to the analyzers. Careful, periodic cleaning, carried out as per the instrument manufacturer’s guidance, can help to prevent this problem. Again, this kind of contamination may appear in results; many contaminants cause spectral changes in the UV region – I have observed this problem in both ALT and AST enzyme tests, where the NADH:NAD transition is monitored.

Infographic. Stages in the supply of purified water to the laboratory, and common sources of contamination. Introducing contaminants at any stage of the process can affect the resulting water purity, which in turn can affect the operation of lab equipment and therefore test results. Identifying what can go wrong, and how, is crucial to preventing and resolving contamination issues.

The impact of poor quality water

In a routine clinical chemistry laboratory, failure to produce good quality water will typically lead to a requirement to repeat tests, as quality control procedures are usually able to detect the problem. However, this means greater use of resources – both staff time and reagents. In some cases it may delay the availability of results and, in the worst-case scenario, the patient may need to have a repeat sample taken. The simplest example might occur where bacterial contamination in a secondary aliquot of purified water leads to a high absorbance during instrument calibration; even if spotted early, this will require a repeat of the calibration and delay processing. A more complex pattern of contamination occurs when there are intermittent water quality issues. In my experience, plasma calcium analysis has often been the test most visibly affected by poor water quality, possibly because our laboratory is in a hard water area.

In a research laboratory, the effects of poor quality water are different in nature but equally damaging to analysis. For example, the binding of antibodies during well coating of an ELISA can be affected, leading to sub-optimal assay conditions and reduced assay sensitivity.

Avoiding problems

My key recommendations for ensuring a high quality, uninterrupted supply of water to your laboratory are as follows:

  1. Work with the suppliers of your purification system to ensure it is installed and configured in the best way for the conditions in your laboratory.
  2. Work closely with the estates/maintenance team and ensure good communication.
  3. Understand which type of supply system you have, what the most common problems are, and how to go about fixing them.
  4. In the event of a problem that cannot be fixed by staff on-site, make sure the supplier of your unit is contractually obligated to respond to your issues within a short period of time.

Access to purified water may seem like a simple requirement, but if it fails, it is one that could have far-reaching consequences. It’s important for anyone working in a lab to think about how the water is supplied, and to ask yourself this: do I have the right measures in place to monitor the quality of water? And how do I plan to respond to any supply-related issues if, and when, they occur?


  1. International Organization for Standardization, Specification for Water for Laboratory Use, ISO 3696 (1995).
  2. ASTM (American Society for Testing and Materials), Standard Specification for Reagent Water, ASTM D1193-06 (2011).
About the Author
Tim James

Tim James is head biomedical scientist of clinical biochemistry at Oxford University Hospitals NHS Trust.
