Radon is widely recognised as one of the most serious indoor air pollutants. It is present in the earth’s atmosphere all around the world and is often found close to human dwellings. When it collects in high concentrations in a home or commercial building and makes its way into the human body, it poses a potentially serious health hazard. Fortunately, we now have reliable techniques for measuring radon levels in buildings, as well as effective mitigation techniques for reducing concentrations to levels low enough to avoid adverse health effects.

But how did we come to recognise radon as a hazard before there were detectors to measure it, and how were the mitigation strategies used today established? To answer these questions, here is a brief timeline tracing the history of radon’s discovery.

In 1789, the German chemist Martin Heinrich Klaproth discovers uranium in pitchblende, an ore then used for colouring wood, pottery and glass, mined in the Ore Mountains along the present-day border between Germany and the Czech Republic. Miners working underground to extract the uranium-laden pitchblende begin to die mysteriously of a “mountain sickness” that causes tumours in the lungs.

In 1900, the German chemist Friedrich Ernst Dorn discovers radon gas while studying radium. The element was first known as “radium emanation” and later as “niton”, from the Latin word “nitens”, meaning “shining”; it was officially named “radon” in 1923.

In 1932, The American Journal of Cancer publishes an article linking radon exposure to lung cancer, drawing on documented exposure among workers in mines as well as other, unrelated studies.

In 1984, Stanley Watras, an engineer at the Limerick nuclear power plant in Pennsylvania, sets off the plant’s radiation monitors on his way in to work. The contamination is traced not to the plant, which had not yet been fuelled, but to extremely high radon concentrations in his own home.