On another forum, nitrate levels are being discussed. One member, a Canadian, says that the maximum allowed nitrate in drinking water there is 10ppm, and that the scale used by water companies is different from the one our home test kits use. He says you need to multiply the water company's nitrate figure by 4.43 to convert it to the same scale as our test kits.
[Nitrate is NO3; it contains 1 atom of nitrogen and 3 atoms of oxygen. The American water companies measure only the weight of the nitrogen part of the nitrate, while home testers measure the weight of the nitrogen AND the oxygen, i.e. the weight of the whole nitrate. The 4.43 is simply the ratio of those two weights.]
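For anyone who wants to check where the 4.43 comes from, here is a quick sketch of the arithmetic (the helper name `n_to_no3` is just mine, not anything official):

```python
# Derive the nitrate-as-nitrogen -> whole-nitrate conversion factor
# from standard atomic masses (g/mol): N = 14.007, O = 15.999.
N = 14.007
O = 15.999

no3 = N + 3 * O       # molar mass of the whole nitrate ion, ~62.0
factor = no3 / N      # ~4.427, commonly rounded to 4.43

def n_to_no3(ppm_as_n):
    """Convert a nitrate-as-nitrogen reading to whole-nitrate ppm."""
    return ppm_as_n * factor

print(round(factor, 2))          # the 4.43 multiplier
print(round(n_to_no3(2.2), 1))   # a 2.2 ppm-as-N reading in whole-nitrate units
```

So 4.43 is just "weight of the whole nitrate ion divided by weight of its nitrogen atom", nothing mysterious.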
In the UK, the DWI (Drinking Water Inspectorate) says the maximum permitted level is 50ppm, which is a lot higher than the American maximum and sounds suspiciously like their 10ppm multiplied by that 4.43. In other words, it sounds as though UK water companies use the same units as our home testers.
My tapwater tests at lower than 5ppm with my API tester. My water company says that out of 8 tests, the minimum was 1.3, the maximum 3.6 and the mean 2.2. Multiplying 2.2 by 4.43 gives 9.75. Given the inaccuracies of home testers, my 'under 5' could in fact be 9.75.
In order to prove which units UK water companies use, I need someone who has tapwater with nitrate near the upper limit to test their nitrate and look up what their water company says it is. With higher numbers, multiplying by 4.43 would make a huge difference.
For example, if the water company were using the American units, 30ppm on a water company website would correspond to a reading of about 133ppm with a home tester, and it should be easy to tell the difference between 30 and 133 even allowing for home kits being inaccurate. But if the water company were using the same units as the home tester, the two figures should be similar.
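The two possibilities can be laid out side by side like this (the 30ppm figure is just an illustration, not anyone's real reading):

```python
# What a home tester should show for a given water-company figure,
# under each of the two hypotheses about the company's units.
FACTOR = 4.43  # whole-nitrate weight / nitrogen-only weight

company_reading = 30.0  # ppm, illustrative value near the UK limit

# Hypothesis 1: company reports nitrate-as-nitrogen (American style)
expected_if_as_n = company_reading * FACTOR   # roughly 133 ppm on a home kit

# Hypothesis 2: company reports whole nitrate (same units as home kits)
expected_if_whole = company_reading           # roughly 30 ppm on a home kit

print(round(expected_if_as_n, 1), expected_if_whole)
```

Even a rough home kit should have no trouble telling 30 from 133, which is why a volunteer with high-nitrate tapwater would settle the question.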
Anyone..........? ? ? ?