As time goes on, it's becoming harder for any rational person to believe that man-made global warming is a real threat. The foundation of this notion is built on data that climate scientists from all over the world have collected. However, it's been shown time and time again that this data is often flawed and manipulated. It's not easy to take climate change seriously when the supposed experts are cooking the books.
And that's exactly what appears to be going on. Just last month, a research report that was peer reviewed by MIT and the EPA showed that the Global Average Surface Temperature (GAST) data produced by NASA was being "adjusted." Just about every time a new version of GAST was introduced, the adjustments to the data showed a steeper warming trend. How convenient.
But that is hardly the only example. Recently, Australia's Bureau of Meteorology was caught tampering with temperature data. According to the Daily Caller:
Meteorologist Lance Pidgeon watched the 13 degrees Fahrenheit Goulburn recording from July 2 disappear from the bureau’s website. The temperature readings fluctuated briefly and then disappeared from the government’s website.
"The temperature dropped to minus 10 (14 degrees Fahrenheit), stayed there for some time and then it changed to minus 10.4 (13.3 degrees Fahrenheit) and then it disappeared," Pidgeon said, adding that he notified scientist Jennifer Marohasy about the problem, who then brought the readings to the attention of the bureau.
As for why these low temperatures were erased from the record and adjusted, it turns out that the Bureau of Meteorology has an automated quality control system in place to remove "false" data. There are apparently limits built into the system that prevent it from recording extreme temperatures. In this case, it conveniently removed record low temperatures that were measured in July, which, as you might expect, is being blamed on faulty equipment.
The bureau's Chief Executive, Andrew Johnson, told Australian Environment Minister Josh Frydenberg that the failure to record the low temperatures at Goulburn in early July was due to faulty equipment. A similar failure wiped out a reading of 13 degrees Fahrenheit at Thredbo Top on July 16, even though temperatures at that station have been recorded as low as 5.54 degrees Fahrenheit.
So basically, this system is supposed to weed out extreme temperature shifts that are abnormal and not representative of overall trends. That makes sense. You wouldn't want rare anomalies to skew the data. But in this case, the system essentially deleted a cold temperature reading that wasn't as extreme as another reading that was eventually kept.
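To see why that outcome is odd, consider how a simple automated filter of the kind described would normally work. The bureau has not published its actual thresholds or logic, so the cutoff value and function below are purely illustrative assumptions, not the bureau's real system:

```python
# Illustrative sketch of a fixed lower-limit quality-control filter.
# LOWER_LIMIT_F is a made-up cutoff for demonstration only; the Bureau of
# Meteorology's real thresholds and per-station rules are not public.
LOWER_LIMIT_F = 10.0  # hypothetical minimum "plausible" reading, in Fahrenheit

def quality_control(readings_f):
    """Keep only readings at or above the fixed lower limit."""
    return [r for r in readings_f if r >= LOWER_LIMIT_F]

# A fixed cutoff treats every reading the same way: it cannot reject a
# 13 F value while keeping a more extreme 5.54 F value.
print(quality_control([13.0, 14.0, 5.54]))  # drops only the 5.54 F reading
```

Assuming a single fixed cutoff like this, the filter is consistent by construction: any reading it rejects must be more extreme than every reading it keeps, which is exactly the opposite of what happened here.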
It erased one supposed anomaly and recorded another that was even more anomalous. Ask yourself: does that sound like the kind of mistake a computer would make, or does it sound like the data was tampered with? It's no wonder the scientists who caught this "mistake" aren't buying any excuses.
Marohasy, for her part, told reporters that Johnson's claims are nearly impossible to believe, given the screenshots showing the very low temperatures before they were "quality assured" out. It could take several weeks before the equipment is tested, reviewed and ready for service, Johnson said.
At this point, anyone who doesn't at least question the established dogma regarding climate change should have their head examined.