By William C. Amalu, DC, DIACT (B), FIACT
Introduction
The first recorded use of thermobiological diagnostics can be found in the writings of Hippocrates around 480 B.C.[1]. A mud slurry spread over the patient was observed for areas that dried first, which were thought to indicate underlying organ pathology. Since then, continued research and clinical observation have shown that certain body temperatures are indeed indicative of normal and abnormal physiologic processes. In the 1950s, military research into infrared monitoring systems for nighttime troop movements ushered in a new era in thermal diagnostics. The first use of diagnostic thermography came in 1957, when R. Lawson discovered that the skin temperature over a cancer in the breast was higher than that of normal tissue[2].