Quality control and water purity go hand in hand

Quality control is essential to pharmaceutical production. When you grab an off-the-shelf painkiller, or try a new prescribed treatment, you trust that the medicine is not only effective, but also safe. And that’s a fair expectation.

Before any medicine reaches a pharmacy, it must pass stringent quality control tests in line with legislation from governing bodies, such as the Medicines and Healthcare products Regulatory Agency (MHRA). Raw materials – including active pharmaceutical ingredients and excipients – are rigorously tested, both during manufacturing and in the final product prior to distribution.

1. We can’t get by without water

Water plays an important role in many quality control tests – such as sampling, filtration and culturing – as well as being necessary for buffer preparation and washing/rinsing equipment. As a powerful solvent, water can be easily contaminated by dissolved impurities if it is not managed well. In addition, water drawn directly from the mains supply is likely to contain dissolved ions, organic compounds, gases and micro-organisms, the latter having the potential to proliferate in water treatment, storage and distribution equipment.

2. Distorted data

The widespread use of water in quality control, and its vulnerability to contamination, make managing purity a high priority. Impure water can damage laboratory equipment and, more importantly, have a detrimental impact on analytical processes, undermining the accuracy and validity of results. For example, dissolved ions can affect routine molecular biology techniques – such as the polymerase chain reaction (PCR) – and micro-organisms can interfere with sensitive cell culture assays and biochemical reactions. Consequently, using contaminated laboratory water can jeopardise regulatory approval of a therapeutic drug, and have far-reaching knock-on effects across a pharma company.

3. Purity matters

Quality control labs can tackle this challenge in two ways. Firstly, regular water analysis combined with a validated and maintained water system will ensure the reliable production of water of an appropriate quality. Secondly, it is important to supply the appropriate grade of water to minimise contamination during analysis. For example, ultrapure water (Type I+ water) must be used for highly sensitive endotoxin analysis, liquid chromatography and cell cultures, while apyrogenic water (Type II+ or Type II water) can be used for general chemistry and microbiological analysis.

What next?

Hopefully, this post has highlighted the importance of managing water purity throughout the manufacturing and quality control process. If you want to know how to select the right water purity for your lab, read our guide or contact us to discuss your lab water needs, and we’ll be happy to help.

We deliver laboratory water purification systems and solutions for research and science in pharmaceutical environments.

Want to learn more on this topic? Take a look at our Pure LabWater guide, an essential resource for anyone who uses pure water or wishes to discover more.

Watch our webinar: How to ensure water quality doesn't compromise your analytical results

It's a great training and development opportunity, offering insight into the technologies and best practices that can be used to mitigate risk, as well as a look at our work in action through a case study on the University of Venice.
