The Birth of Medical Diagnosis

From tasting urine to microscopy to molecular testing, the sophistication of diagnostic techniques has come a long way and continues to develop at breakneck speed. The history of the laboratory is the story of medicine's evolution from empirical to experimental techniques and proves that the clinical lab is the true source of medical authority. Part 1 in a 3-part series.

Three distinct periods in the history of medicine are associated with three different places and therefore different methods of determining a diagnosis: From the Middle Ages to the 18th century, bedside medicine was prevalent; then between 1794 and 1848 came hospital medicine; and from that time forward, laboratory medicine has served as medicine's lodestar.

The laboratory's contribution to modern medicine has only recently been recognized by historians as something more than the addition of another resource to medical science; it is now appreciated as the seat of medicine, where clinicians account for what they observe in their patients.

The first medical diagnoses made by humans were based on what ancient physicians could observe with their eyes and ears, which sometimes also included the examination of human specimens. The ancient Greeks attributed all disease to disorders of bodily fluids called humors, and during the late medieval period, doctors routinely performed uroscopy. Later, the microscope revealed not only the cellular structure of human tissue, but also the organisms that cause disease. More sophisticated diagnostic tools and techniques, such as the thermometer for measuring temperature and the stethoscope for measuring heart rate, were not in widespread use until the end of the 19th century. The clinical laboratory would not become a standard fixture of medicine until the beginning of the 20th century. This 3-part article reviews the history and development of diagnostic methods from ancient to modern times as well as the evolution of the clinical laboratory from the late 19th century to the present.

Ancient diagnostic methods

In ancient Egypt and Mesopotamia, the earliest physicians made diagnoses and recommended treatments based primarily on observation of clinical symptoms. Palpation and auscultation were also used. Physicians were able to describe dysfunctions of the digestive tract, heart and circulation, the liver and spleen, and menstrual disturbances; unfortunately, this empiric medicine was reserved for royalty and the wealthy.

Other less-than-scientific methods of diagnosis used in treating the middle and lower classes included divination through ritual sacrifice to predict the outcome of illness. Usually a sheep would be killed before the statue of a god, and its liver examined for malformations or peculiarities; the shape of the lobes and the orientation of the common duct were then used to predict the fate of the patient.
Ancient physicians also began the practice of examining patient specimens. The oldest known test on body fluids was done on urine in ancient times (before 400 BC). Urine was poured on the ground and observed to see whether it attracted insects. If it did, patients were diagnosed with boils.
The ancient Greeks also saw the value in examining body fluids to predict disease. Around 400 BC, Hippocrates promoted the use of the mind and senses as diagnostic tools, a principle that played a large part in his reputation as the "Father of Medicine." The central Hippocratic doctrine of humoral pathology attributed all disease to disorders of fluids of the body. To obtain a clear picture of disease, Hippocrates advocated a diagnostic protocol that included tasting the patient's urine, listening to the lungs, and observing skin color and other outward appearances. Beyond that, the physician was to understand the patient as an individual. Hippocrates related the appearance of bubbles on the surface of urine specimens to kidney disease and chronic illness. He also related certain urine sediments and blood and pus in urine to disease. The first description of hematuria, or the presence of blood in urine, by Rufus of Ephesus surfaced at around AD 50 and was attributed to the failure of the kidneys to function properly in filtering the blood.
Later (c. AD 180), Galen (AD 131-201), who is recognized as the founder of experimental physiology, created a system of pathology that combined Hippocrates' humoral theories with the Pythagorean theory, which held that the four elements (earth, air, fire, and water) corresponded to various combinations of the physiologic qualities of dry, cold, hot, and moist.
These combinations of physiologic characteristics corresponded roughly to the four humors of the human body:
hot + moist = blood;
hot + dry = yellow bile;
cold + moist = phlegm; and
cold + dry = black bile.

Galen was known for explaining everything in light of his theory; he had an explanation for everything. He also described diabetes as "diarrhea of urine" and noted the normal relationship between fluid intake and urine volume. His unwavering belief in his own infallibility appealed to complacency and reverence for authority. That dogmatism essentially brought innovation and discovery in European medicine to a standstill for nearly 14 centuries. Anything relating to anatomy, physiology, and disease was simply referred back to Galen as the final authority from whom there could be no appeal.

Middle Ages

In medieval Europe, early Christians believed that disease was either punishment for sin or the result of witchcraft or possession. Diagnosis was superfluous. The basic therapy was prayer, penitence, and invocation of saints. Lay medicine based diagnosis on symptoms, examination, pulse, palpation, percussion, and inspection of excreta and sometimes semen. Diagnosis by "water casting" (uroscopy) was practiced, and the urine flask became the emblem of medieval medicine. By AD 900, Isaac Judaeus, a Jewish physician and philosopher, had devised guidelines for the use of urine as a diagnostic aid; and under the Jerusalem Code of 1090, failure to examine the urine exposed a physician to public beatings. Patients carried their urine to physicians in decorative flasks cradled in wicker baskets, and because urine could be shipped, diagnosis at long distance was common. The first book detailing the color, density, quality, and sediment found in urine was written around this time as well. By around AD 1300, uroscopy had become so widespread that it was nearly universal in European medicine.
Medieval medicine also included interpretation of dreams in its diagnostic repertoire. Repeated dreams of floods indicated "an excess of humors that required evacuation"; and dreams of flight signified "excessive evaporation of humors."

Seventeenth century

The medical advances of the 17th century consisted mostly of descriptive works of bodily structure and function that laid the groundwork for diagnostic and therapeutic discoveries that followed. The status of medicine was helped along by the introduction of the scientific society in Italy and by the advent of periodical literature.
Considered the most momentous event in medical history since Galen's time, the discovery of the circulation of blood by William Harvey (1578-1657) marked the beginning of a period of mechanical explanations for a variety of functions and processes including digestion, metabolism, respiration, and pregnancy. The English scientist proved through vivisection, ligation, and perfusion that the heart acts as a muscular pump propelling the blood throughout the body in a continuous cycle.

The invention of the microscope opened the door to the invisible world just as Galileo's telescope had revealed a vast astronomy. The earliest microscopist was a Jesuit priest, Athanasius Kircher (1602-1680) of Fulda (Germany), who was probably the first to use the microscope to investigate the causes of disease. His experiments showed how maggots and other living creatures developed in decaying matter. Kircher's writings included an observation that the blood of patients with the plague contained "worms"; however, what he thought to be organisms were probably pus cells and red blood corpuscles, because he could not have observed Bacillus pestis with a 32-power microscope. Robert Hooke (1635-1703) later used the microscope to document the existence of "little boxes," or cells, in vegetables and inspired the works of later histologists; but one of the greatest contributions to medical science came from the Italian microscopist Marcello Malpighi (1628-1694). Malpighi, who is described as the founder of histology, served as physician to Pope Innocent XII and was famous for his investigations of the embryology of the chick and the histology and physiology of the glands and viscera. His work in embryology describes the minutiae of the aortic arches, the head fold, the neural groove, and the cerebral and optic vesicles.

Uroscopy was still in widespread use and had gained popularity as a method to diagnose "chlorosis" in love-sick young women, and sometimes to test for chastity. Other methods of urinalysis had their roots in the 17th century as well. The gravimetric analysis of urine was introduced by the Belgian mystic Jean Baptiste van Helmont (1577-1644). Van Helmont weighed a number of 24-hour specimens, but was unable to draw any valuable conclusions from his measurements. It wasn't until the late 17th century, when Frederik Dekkers of Leiden, Netherlands, observed in 1694 that urine containing protein would form a precipitate when boiled with acetic acid, that urinalysis became more scientific and more valuable. The best qualitative analysis of urine at the time was pioneered by Thomas Willis (1621-1675), an English physician and proponent of chemistry.

He was the first to notice the characteristic sweet taste of diabetic urine, which established the principle for the differential diagnosis of diabetes mellitus and diabetes insipidus.
Experiments with blood transfusion were also getting underway with the help of a physiologist from Cornwall, England, named Richard Lower (1631-1691). Lower was the first to perform direct transfusion of blood from one animal to another. Other medical innovations of the time included the intravenous injection of drugs and the first attempts to use pulse rate and temperature as indicators of health status.

Eighteenth century

The 18th century is regarded as the Golden Age of both the successful practitioner and the successful quack. The use of phrenology (the study of the shape of the skull to predict mental faculties and character), magnets, and various powders and potions for the treatment of illness was among the more popular scams. The advancement of medicine during this time was more theoretical than practical. Internal medicine was improved by new textbooks that catalogued and described many new forms of disease, as well as by the introduction of new drugs, such as digitalis and opium. The state of hospitals in the 18th century, however, was alarming by today's standards. Recovery from surgical operations was rare because of septicemia. The concept of antisepsis had not yet been discovered, and hospitals were notorious for filth and disease well into the 19th century.

One notable event, a forerunner of the modern laboratory measurement of prothrombin time, partial thromboplastin time, and other coagulation tests, was the discovery of the cause of coagulation. An English physiologist, William Hewson (1739-1774) of Hexham, Northumberland, England, showed that when the coagulation of the blood is delayed, a coagulable plasma can be separated from the corpuscles and skimmed off the surface.

Hewson found that plasma contains an insoluble substance that can be precipitated and removed from plasma at a temperature slightly higher than 50°C. Hewson deduced that coagulation was the formation in the plasma of a substance he called "coagulable lymph," which is now known as fibrinogen. A later discovery that fibrinogen is a plasma protein and that in coagulation it is converted into fibrin attests to the importance of Hewson's work.

The clinical diagnostic methods of percussion, temperature, heart rate, and blood pressure measurements were further refined, and there were some remarkable attempts to employ precision instruments in diagnosis.

Leopold Auenbrugger (1722-1809) was the first to use percussion of the chest in diagnosis, in 1754 in Vienna. This method involved striking the patient's chest while the patient held his or her breath. Auenbrugger proposed that the chest of a healthy person sounds like a cloth-covered drum. A student of Auenbrugger's, Jean Nicolas Corvisart, a French physician at La Charite in Paris, pioneered the accurate diagnosis of heart and lung diseases using Auenbrugger's chest-thumping technique. Corvisart's translation of Auenbrugger's treatise on percussion, "New Invention to Detect by Percussion Hidden Diseases in the Chest," popularized the practice of thumping on a patient's chest. The resulting sounds differ when the lungs contain lesions or fluids from those heard in healthy people. This observation was validated by postmortem examination.

James Currie (1756-1805), a Scot, was the first to use cold baths in treatment of typhoid fever; and by monitoring the patient's temperature using a thermometer, he was able to adjust the temperature and frequency of the baths to treat individual patients. It took another hundred years, however, before thermometry became a recognized feature in clinical diagnosis.

In 1707, Sir John Floyer (1649-1734) of Staffordshire, England, introduced the concept of measuring pulse rate by timing pulse beats with a watch. He counted the beats per minute and tabulated the results, but his work was ignored because of a healthy skepticism toward the old Galenic doctrine that there was a special pulse for every disease.

The groundbreaking work for measuring blood pressure was done by Stephen Hales (1677-1761), an English clergyman. By fastening a long glass tube inside a horse's artery, Hales devised the first manometer, or tonometer, which he used to make quantitative estimates of the blood pressure, the capacity of the heart, and the velocity of the blood current. Hales' work was the precursor for the development of the sphygmomanometer used today to measure arterial blood pressure.

Additional advances in urinalysis occurred with J.W. Tichy's observations of sediment in the urine of febrile patients (1774); Matthew Dobson's proof that the sweetness of the urine and blood serum in diabetes is caused by sugar (1776); and the development of the yeast test for sugar in diabetic urine by Francis Home (1780).

Table 1: Evolution of diagnostic tests as documented in textbooks of laboratory medicine

1892 Sir William Osler. Textbook of Medicine
Hemoglobin estimation, red and white blood cell counts, malaria parasite identification, simple urinalysis, examination of sputum for tuberculosis.

1898 Sir William Osler. Textbook of Medicine
Blood culture, agglutination test for typhoid fever, isolation of Klebs-Loffler bacillus with virulence tests in diphtheria, lumbar puncture, examination of cerebrospinal fluid in suspected meningitis, amino aciduria in liver disease.

1901 Sir William Osler. Textbook of Medicine
Isolation of typhoid bacilli from urine and the clotting time in hemophilia.

1912 Sir William Osler. Textbook of Medicine
Tissue examination for spirochetes in syphilitic lesions, the Wassermann test, osmotic fragility tests, a crude form of the glucose tolerance test.

1914 P.N. Patton. Title Unknown.
Blood counts and examination of stained smears, agglutination reactions, the Wassermann test, parasitology, blood cultures, spectroscopic examination, visual detection of bilirubinemia, Gmelin tests, Garrod technique for uric acid, alkalinity of blood, bacteriology (basic staining and culture methods in use today), urinalysis (pus, red blood cells, albumin, sugar), test meals in use, fecal examinations for fat and stercobilin, histology (frozen section and paraffin embedding).

Nineteenth century

The emergence of sophisticated diagnostic techniques and the laboratories that housed them coincides roughly with the worldwide political, industrial, and philosophical revolutions of the 19th century, which transformed societies dominated by religion and aristocracy into those dominated by the industrial, commercial and professional classes.

In the decades after the Civil War, American laboratories and their champions were met with a vehement skepticism of science, which was viewed by some as an oppressive tool of capitalist values. The lay public, as well as many practitioners, saw the grounding of medicine in the laboratory as a removal of medical knowledge from the realm of common experience and as a threat to empiricism. Many American physicians who went abroad to Germany and France for supplementary training came back to impart the ideals of European medicine, as well as those of higher education for its own sake, to an American society that found these beliefs threatening. The lab itself was not seen as a threat; rather, the claims it made to authority over medical practice were assailed. The empiricists and the "speculative medical experimentalists" were for the most part divided along generational lines. The older empiricists had a stake in continuing their careers independent of a medical infrastructure or system, while the younger physicians aspired to careers in academic medical centers patterned after German institutions. The younger, more energetic ranks had to first lobby for such facilities to be built, and as older doctors retired from teaching posts and turned over editorship of scientific journals, this opposition to the lab faded.

Medical historians also note that the 19th century was one in which therapeutics lagged behind, and have called it an era of public health. New discoveries in bacteriology allowed for water treatment and pasteurization of milk, which significantly decreased mortality rates. In addition, the advent of antiseptic surgery in the 19th century reduced the mortality from injuries and operations and increased the range of surgical work. Medical practitioners relied, for a time, more on increased hygiene and less on drugs. Advances in public and personal hygiene had dramatically improved the practice of medicine; predictions were even made that the pharmacopoeia of the time would eventually be reduced to a small fraction of its size.

At the beginning of the century, physicians depended primarily on patients' accounts of symptoms and superficial observation to make diagnoses; manual examination remained relatively unimportant. By the 1850s, a series of new instruments, including the stethoscope, ophthalmoscope, and laryngoscope, began to expand the physician's sensory powers in clinical examination. These instruments helped doctors to move away from a reliance on the patients' experience of illness and gain a more detached relationship with the appearance and sounds of the patient's body to make diagnoses.

Another wave of diagnostic tools, including the microscope and the X-ray, chemical and bacteriological tests, and machines that generated data on patients' physiological conditions, such as the spirometer and the electrocardiogram, produced data seemingly independent of the physician's as well as the patient's subjective judgment. These developments had uncertain implications for professional autonomy: They further reduced dependence on the patient, but they increased dependence on capital equipment and formal organization of medical practice.

These detached technologies added a highly persuasive rhetoric to the authority of medicine. They also made it possible to move part of the diagnostic process behind the scenes and away from the patient, where several physicians could have simultaneous access to the evidence. The stethoscope, for example, could only be used by one person at a time, but lab tests and X-rays enabled several doctors to view and discuss the results. This team approach to diagnosis strengthened the medical community's claim to objective judgment. Equipped with methods for measuring, quantifying, and qualifying, doctors could begin to set standards of human physiology, evaluate deviations, and classify individuals.

Microscopy. Many scientists were making great strides in bacteriology, microbiology, and histology as well. Improvements in the microscope allowed further exploration of the cellular and microbial worlds in the 19th century. Johannes Evangelista Purkinje was an important Bohemian pioneer of the use of the microscope. In 1823, he was appointed professor of physiology at the University of Breslau and a year later, he started a physiologic laboratory in his own house. Purkinje's work includes descriptions of the germinal vesicle in the embryo, description and naming of protoplasm, discovery of the sudoriferous glands of the skin and their excretory ducts, and numerous descriptions of brain, nerve, and muscle cells.

When John Snow studied the great London cholera outbreak in 1854, he brought it under control by tracing it to the Broad Street Pump and eliminating access to this source of contaminated water. Snow's work foreshadowed some of the earliest successful applications of laboratory methods to public hygiene, which came in the 1860s and '70s with the breakthroughs in bacteriology made by Louis Pasteur and Robert Koch. Louis Pasteur (1822-1895) discovered the anaerobic character of the bacteria of butyric fermentation and introduced the concepts of aerobic and anaerobic bacteria around the year 1861. About the same time, he discovered that the pellicle that is necessary in the formation of vinegar from wine consisted of a rod-like microorganism, Mycoderma aceti. In 1867, the wine industry of France reported a significant gain in revenue because of Pasteur's discovery that the spoiling of wine by microorganisms can be prevented by partial heat sterilization (pasteurization) at a temperature of 55-60°C without any alteration of the taste. Later (c. 1878), an accident brought about the discovery of preventive inoculation with the weakened form of a virus. While Pasteur was away on vacation, virulent cultures of chicken cholera became inactive; when injected into animals, they were found to act as preventive vaccines against subsequent injection of a live organism.

The attenuated virus could be carried through several generations and still maintain its immunizing property. In 1881, Pasteur produced a vaccine against anthrax and lowered the mortality rate to 1% in sheep and to 0.34% in cattle. In 1888, the Pasteur Institute was opened, and here Pasteur worked with several proteges for the rest of his life.

A contemporary of Pasteur's, Robert Koch (1843-1910), began a brilliant career and a series of discoveries with his report in 1876 on the complete life history and sporulation of the anthrax bacillus. His culture methods were confirmed by Pasteur; and in 1877, Koch published his methods of fixing and drying bacterial films on coverslips, of staining them with Weigert's aniline dyes, of staining flagella, and of photographing bacteria for identification and comparison. The following year, he published a memoir on the etiology of traumatic infectious disease in which the bacteria of 6 different kinds of surgical infection are described, with the pathological findings of each microorganism breeding true through many generations in vitro or in animals. In 1881, he developed a method of obtaining pure cultures of organisms by spreading liquid gelatin with meat infusion on glass plates, forming a solid coagulum. Koch also played a role in perfecting the method of steam sterilization. The year after that, he discovered the tubercle bacillus by other special culture and staining methods and formulated a rule for determining the specificity of disease-causing organisms.

The rule, called Koch's postulates or Koch's law, stipulated that the specificity of a pathogenic microorganism could only be established if: (1) it is present in all cases of the disease, (2) inoculations of its pure cultures produce disease in animals, (3) from these cultures it can again be obtained, and (4) then it can again be propagated in pure cultures.

In 1883, Koch discovered the cholera vibrio and recognized its routes of transmission as drinking water, food, and clothing. In 1893, he wrote an important paper on waterborne epidemics showing how they could be prevented by proper filtration. Finally, in 1905, Koch received the Nobel Prize.

These two bacteriologists isolated the organisms responsible for major infectious diseases and led public health officials to make more focused efforts against specific diseases. Sand filtration of the water supply was introduced in the 1890s and proved to be effective in preventing typhoid. Regulation of the milk supply also cut infant mortality dramatically. Antiseptic surgery, which reduced the mortality from injuries and operations and increased the range of surgical work, represented another successful application of the work of these two scientists.

The emergence of quantitative diagnosis and the hospital laboratory: By the mid-1890s, lab tests had been introduced to detect tuberculosis, cholera, typhoid, and diphtheria, but cures for these diseases would not come until later. Physicians also began to study pulse, blood pressure, body temperature, and other physiological indicators, even though simple, practical instruments to measure these signs were not developed until the end of the century.

The use of precise measurements in diagnosis became standard in medicine in the early 1900s. Standardized eye tests, weight and height tables, and IQ tests were all part of a movement to identify statistical norms of human physiology and behavior.

The first hospital lab in Britain was set up at Guy's Hospital, which was organized into clinical wards. Two of these wards were designated for medical student rotations and had a small laboratory attached for clinical work. By 1890, most laboratory procedures in the U.S. were performed by the physician with a microscope in his home or office. In 1898, Sir William Osler, a Canadian physician, professor, and one of the first well-known authors in the clinical laboratory literature, established ward laboratories at Johns Hopkins Hospital, where routine tests were carried out by attending physicians, and more complex procedures and research problems were referred to the pathology laboratory.

An increasing number of useful laboratory tests were discovered in the second half of the 1800s, and by the turn of the century, specific chemical and bacteriological tests for disease emerged rapidly. In the 1880s, the organisms responsible for tuberculosis, cholera, typhoid, and diphtheria were isolated, and by the mid-1890s, lab tests had been introduced to detect these diseases. The spirochete that causes syphilis was identified in 1905; the Wassermann test for syphilis was introduced in 1906. Advances in the analysis of urine and blood gave physicians additional diagnostic tools. These innovations were the result of progress in basic science that made it possible to duplicate successful applications more rapidly than ever before. The earlier advances in immunization, such as smallpox vaccination, had been purely empirical discoveries and were not quickly repeated.

Microbiology for the first time enabled physicians to link disease-causing organisms, symptoms, and lesions systematically. The principles that Pasteur demonstrated in the development of anthrax and rabies vaccines now provided a rational basis for developing vaccines against typhoid, cholera, and plague.

Surgery.

Surgery enjoyed tremendous gains in the late 1800s. Before anesthesia, surgery required brute force and speed because it was important to get in and out of the body as quickly as possible. After William T. G. Morton's (1819-1868) demonstration of ether at the Massachusetts General Hospital in 1846, use of anesthesia allowed for slower and more careful operations.

Joseph Lister's (1827-1912) methods of antisepsis using carbolic acid, first published in 1867, became general practice around 1880 and improved the previously grim mortality rates for all types of surgery. Before antiseptic techniques, the mortality rate for amputations was about 40%, and surgeons were reluctant to penetrate the major bodily cavities, doing so only as a last resort. After surgeons were able to master the tedious procedures demanded in antisepsis, they began to explore the abdomen, chest, and skull and developed special techniques for each area. The sophistication and success of surgery blossomed in the 1890s and early 1900s, spurred on by the discovery of X-rays in 1895. Surgeons were able to operate earlier and more often for a variety of ills, including appendicitis, gall bladder disease, and stomach ulcers. The growth in surgical work provided a means for expanding profit in hospital care as well.

Hematology.

In the later part of the century, several discoveries emerged as the foundation of hematologic methods. In 1877, K. Vierordt introduced coagulation time as an index of blood coagulative power. Sir Almroth Edward Wright, an Irish professor of pathology in Dublin, was the first to observe the role of calcium salts in the coagulation of blood. He also devised a coagulometer to estimate coagulation time. In 1879, Paul Ehrlich (1854-1915), a German cellular pathologist and chemist, was enamored with dyes and developed many methods of drying and fixing blood smears using heat. Ehrlich also discovered mast cells and saw their granulations using a basic aniline stain. His classification of white blood cells into different morphological types (neutrophilic, basophilic, and oxyphilic) paved the way for identifying many diseases of the blood. Ehrlich also contributed to microbiology the discovery of methylene blue as a bacterial stain.

Brief History of the Hospital

The earliest hospitals in pre-industrial societies were charitable institutions used for tending the sick as opposed to medical institutions that provided for their cure. Medieval hospitals were operated by religious or knightly orders in individual communities. The facility was essentially a religious house in which the nursing personnel had united as a vocational community under a religious rule.

In colonial America, almshouses were the first institutions to provide care for the sick. These facilities also had a communal character in that they provided a substitute residence for people who were homeless, poor, or sick. Founded as early as the 17th century in America, almshouses offered sanctuary to all kinds of dependent people from the elderly to the orphaned, the insane, the ill, and the debilitated. In a number of large cities, hospitals evolved from almshouses: The Philadelphia Almshouse became Philadelphia General Hospital; the New York Almshouse became Manhattan's Bellevue Hospital; and the Baltimore County Almshouse became part of the Baltimore City Hospitals.

The next rendition of the hospital was a facility that served the sick but limited its services to the poor. Not until the 19th century did hospitals serving all classes emerge. In 1752, the Pennsylvania Hospital in Philadelphia became the first permanent general hospital in America founded especially for care of the sick. New York Hospital was chartered in 1771, but wasn't opened for another 20 years, and the Massachusetts General Hospital opened in Boston in 1821. These institutions were termed "voluntary" because they were financed with donations, rather than with taxes.

In Europe, hospitals figured prominently in medical education and research, but they were mostly ignored in America until the founding of Johns Hopkins. Between 1870 and 1910, however, hospitals began to play this part in the U.S. as well. Before 1900, the hospital offered no special advantages over the home in terms of surgery. The infections that periodically swept through the wards made physicians cautious about sending patients there. Antiseptic techniques were for a short time adapted for use in patients' homes, but "kitchen surgery" became more inconvenient for patient and surgeon alike as procedures became more demanding and more people moved into apartments. As antiseptic techniques improved and the stigma of disease in the hospital died out, surgery was brought back into the hospital.

Laboratory science and professional certification in the 20th century

The history of the laboratory continues: Advances made in the lab eradicate life-threatening illnesses, and laboratorians establish their own identities and societies.
At the beginning of the 20th century, therapeutic agents were still relatively few, and many common diseases that are easily cured today were still considered life-threatening. As improvements were made in diagnostic techniques and new drugs were discovered, the laboratory galvanized the authority of medicine by endowing it with the ability to identify and cure disease. Clinical labs began to evolve into permanent institutions within U.S. hospitals as new diagnostic tools were derived from advances in physics. These included radioactive isotopes, electrophoresis, microspectrophotometry, the electroencephalogram, and the electromyogram. Other techniques such as ventriculography, intracardiac catheterization, and tomography greatly extended the physician's understanding of body function.

In 1840, the only laboratory the average European physician was likely to have used was that of a pharmacist; but by 1900, a host of laboratory types had emerged, including physiological laboratories, pharmaceutical and pharmacologic laboratories, as well as forensic, public health, and microbiological laboratories.

The lab, in one form or another, became an "obligatory passage point" for researchers who wanted to make new discoveries.

Microbiology

Developments in microbiology attested to the link between the diagnosis and treatment of disease. The arrival of antibiotics and sulfonamides was especially important in curing previously fatal diseases.

The accidental discovery of penicillin by Sir Alexander Fleming (1881-1955) in 1928 was paramount in initiating the antibiotic era. The Scottish scientist had been studying the natural antibacterial action of the blood and antibacterial substances that would not be toxic to animals.

While working on the influenza virus, he observed a mold that had accidentally developed on a staphylococcus culture plate. Around the mold was a bacteria-free circle. Fleming experimented with the mold and found that it could prevent growth of staphylococcus even when diluted 800 times.

Later, Gerhard Johannes Paul Domagk (1895-1964), a German anatomic pathologist and bacteriologist, discovered that a red dye called prontosil rubrum protected laboratory animals from lethal doses of staphylococcus and hemolytic streptococci. Prontosil was a derivative of sulphanilamide. Domagk was not convinced the substance would be equally effective in humans, but when his daughter became very sick with a streptococcal infection he gave her a dose of prontosil in desperation.

She made a complete recovery, but these results were not divulged until 1935, when other clinicians had tested the new drug on patients. Domagk's discovery of the antibacterial action of the sulfonamides gave medicine and surgery a new weapon against many infectious diseases.

Clinical chemistry

There were many outstanding biochemists of the time. One who conferred a repertoire of tests to the laboratory was Otto Folin, a Swedish-born professor of biological chemistry at Harvard (1907). Between 1904 and 1922, Folin developed quantitative analytical methods for several urine analytes, including urea, ammonia, creatinine, uric acid, total nitrogen, phosphorus, chloride, total sulfate, and acidity. He also attempted to measure blood ammonia and introduced Jaffe's alkaline picrate method for creatinine. Folin showed the effect of uricosuric drugs on blood uric acid levels in gout, introduced the colorimetric method for measuring epinephrine, and published the first normal values for uric acid, nonprotein nitrogen (NPN), and protein in blood. Folin is also responsible for establishing the relationship of uric acid, NPN, and blood urea nitrogen to renal function. The Folin-Ciocalteu reagent, among others developed by Folin, is still used today for protein determinations.

Blood banking

New discoveries about the biochemical nature of blood made possible the transfusion of blood between humans, which greatly advanced the success rate of surgery. By the early 1940s, blood banking was established in the U.S.
In 1900, the Viennese pathologist Karl Landsteiner (1868-1943) discovered the concept of the human blood types and, the following year, described the ABO blood group. Accounts of previously unsuccessful blood transfusions from animals to humans reported that the foreign blood corpuscles were clumped and broken up in the human blood vessels, thus liberating hemoglobin. Landsteiner reported a similar reaction in transfusion of blood from human to human. Shock, jaundice, and hemoglobinuria accompanied these early blood transfusions. After Landsteiner's classification of blood types into the well-known A, B, AB, and O groups in 1909, the catastrophes of earlier blood transfusions were eliminated by transfusing blood only between individuals of the same blood group. Later, Landsteiner studied bleeding in newborns and contributed to the discovery of the Rh factor, which relates human blood to the blood of the rhesus monkey.
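The same-group rule described above can be expressed as a simple check. The sketch below is illustrative only: it encodes the early same-group practice the text describes, not the fuller donor-compatibility matrix and Rh typing used in modern blood banking, and the function name is invented for the example.

```python
# Minimal sketch of the early transfusion rule described above: donate only
# within the same ABO group. Modern blood banking uses a fuller compatibility
# matrix (e.g., group O red cells for other groups) plus Rh and other antigens.

ABO_GROUPS = {"A", "B", "AB", "O"}

def same_group_transfusion_ok(donor: str, recipient: str) -> bool:
    """Return True only when donor and recipient share the same ABO group."""
    if donor not in ABO_GROUPS or recipient not in ABO_GROUPS:
        raise ValueError("unknown ABO group")
    return donor == recipient

print(same_group_transfusion_ok("A", "A"))   # True: same group
print(same_group_transfusion_ok("O", "AB"))  # False under the early same-group rule
```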
Another icon of modern blood banking is Charles Drew, MD (1904-1950), an African-American physician from Washington, DC. Early in 1940, the American Red Cross and the Blood Transfusion Betterment Association of New York began a project to collect blood for shipment to the British Isles. Eight New York City hospitals collected blood for what became known as the Plasma for Britain Project. During this project, Drew successfully used the laboratory experiments and blood research done by others to mass produce plasma. Drew heard that the British had successfully modified a cream separator to separate plasma from the red cells in blood, so he ordered 2 of the machines and constructed similar equipment to produce clear plasma on a large scale. Drew became a leading authority on mass transfusions and blood processing methods, and was later asked by the American Red Cross and the U.S. government to establish a program in the U.S. modeled on the Plasma for Britain Project.

The quality movement

The 20th century marks the beginning of a quality movement in hospitals and laboratories that began with physicians and healthcare workers. As part of that movement, those who ran hospitals began to appreciate the skills that clinical chemists could bring to the hospital laboratory. In the early part of the century, many hospitals began reorganizing their laboratories so that they were headed by biochemists. Professional organizations emerged as self-regulating groups that helped ensure the skills and knowledge of laboratory professionals would pass the scrutiny of the hospitals that employed them. These professional organizations also served their members by lobbying in Washington for advantageous legislation.

The American College of Surgeons conducted the first inspections of hospitals in 1918. Initially, the inspections were based on a single page of standards, including a requirement for an adequately staffed and equipped laboratory. Surveyors inspected 671 hospitals of 100 beds or more, and of these, 81 passed initially. One hundred more passed reinspection after improvements were made.

That same year brought the first call for a method of certifying technologists on a national scale by John Kolmer, who published "The Demand for and Training of Laboratory Technicians," which included a description of the first formal training course in medical technology. Also during that year, the Pennsylvania State Legislature passed a law requiring all hospitals and institutions, particularly those receiving state aid, to install and equip an adequate laboratory and to employ a laboratory technician on a full-time basis.
By 1920, clinical laboratories in large hospitals were distinct administrative units of service directed by a chief physician. They usually consisted of 4 or 5 divisions, including biochemistry, clinical pathology, bacteriology, serology, immunology, and radiology, each staffed by trained, often salaried, professionals. An American Medical Association survey later showed that 48% of U.S. hospitals had clinical laboratories by 1923, and another survey in 1925 showed that 14% of all U.S. clinical laboratories were commercial or reference laboratories by that date. In spite of possible economies of scale, reference labs performed only a small share of tests over the next several decades.

The American College of Surgeons figured prominently in ensuring that hospital laboratories remained under the control of pathologists by promulgating certification standards that required hospitals to have a laboratory with a pathologist in charge. Because pathologists had a monopoly on laboratory tests in the hospital, these labs became extremely lucrative as the number of available tests increased.

Certification of lab professionals

Physicians in the clinical lab have always played a large role in the status of other lab professionals. Until the last 20 to 30 years, physicians have managed to resist corporate domination throughout the history of medicine. Doctors were motivated not only to preserve their autonomy, but also to prevent third parties from making a profit that might otherwise go to the doctor. In 1934, the AMA stated in a section of its code of ethics that profit from medical work "is beneath the dignity of professional practice, is unfair competition with the profession at large, is harmful alike to the profession of medicine and the welfare of the people, and is against sound public policy." This is not to say that the AMA didn't want physicians to make profits for themselves; only that they should not become part of a larger organization whose function it was to make money. Whether the motivation for this policy was capitalistic or humanitarian is still the subject of debate. This policy helped physicians establish a medical infrastructure that allowed them to delegate to other healthcare professionals work that was repetitive and time-consuming.

To maintain their autonomy, physicians needed technical assistants to help them use hospitals and laboratories without being employees of these facilities. The allied health professional began to emerge in the first 30 years of the century with the encouragement of the doctors who needed them. Doctors needed technical assistants who were competent enough to work in their absence yet not threaten their authority.

These professionals were developed by physicians in 2 ways: (1) the encouragement of a kind of responsible professionalism among the higher ranks of subordinate healthcare workers, and (2) the employment of women in these auxiliary roles who could be professionally trained but would not challenge the authority or economic position of the doctor.

Because clinical pathologists were striving for professional recognition among other physicians, the American Society for Clinical Pathology was founded in 1922. Among the Society's objectives were the goals of maintaining the status of clinical pathologists as well as "encouraging a closer cooperation between the practitioner and the clinical pathologist." By 1928, when the ASCP established its Board of Registry (BOR), 80% of the initial group of 350 applicants for ASCP medical technologist (MT) certification were women. The code of ethics for technicians and technologists was and continues to be that these professionals agree to work under the supervision of a physician, refrain from making written or oral diagnoses, and refrain from advising physicians on treatment options without the supervision of a physician or pathologist.

Meanwhile, other groups of nonphysician clinical laboratory scientists were striving for professional recognition of their own. The American Society of Medical Technologists, now known as the American Society for Clinical Laboratory Science, was originally formed as a subgroup of the ASCP. Through the ASCP, pathologists prevented nonphysician MTs from becoming an autonomous profession. ASMT established committees to serve the needs of its members and implemented a process to certify MTs who had acquired specialized laboratory expertise. Before 1940, the ASCP's BOR was restricted to pathologists who were ASCP members, but by 1949, the ASCP had amended its bylaws to allow 3 ASMT members on the Board with full voting rights. Ten years later, a fourth ASMT member was added, but ASCP maintained control with 6 of its own members on the Board. The ASMT continued its pursuit of independence with its own mission and objectives, and in 1947 held its first annual meeting independent of the ASCP. ASMT changed its own bylaws to permit individuals with master's and doctoral degrees to join ASMT without ASCP BOR certification. Between the end of World War II and 1962, ASMT began to reassess its views on personnel licensure and regarded it as a positive step toward professional recognition. In the late 1950s, MTs sought governmental recognition of their educational qualifications through personnel licensure laws and position reclassification in the Civil Service and armed forces.

Editor's Note: By the end of the first half of the 20th century, laboratory medicine had earned professional legitimacy through contributions to diagnosing disease and discovering drugs to treat formerly life-threatening illnesses.

Professional societies emerged to develop professional identity and to provide educational support. In part 3 of this series, we'll finish our coverage of professional associations by reviewing the establishment of the American Medical Technologists and look at the history of U.S. national healthcare coverage, Medicare and Medicaid, and the myriad regulations faced by modern laboratories.

Timeline

1911 Oskar Heimstadt invents the fluorescent microscope
1912 American College of Surgeons is chartered in Illinois.
1913 DD Van Slyke is appointed chemist at Rockefeller Hospital Laboratory; American Association of Immunologists is founded.
1916 KMG Siegbahn develops x-ray spectroscopy
1918 N. Wales and EJ Copeland develop the electric refrigerator
1919 FW Aston develops the mass spectrograph
1920 First clinical laboratory method for serum phosphorus is established; the use of venipuncture for diagnostic testing becomes widespread; Meyers establishes the University of Iowa center for training clinical chemists, primarily for hospital positions; Conference of Public Health Laboratories is founded.
1921 First clinical laboratory method for serum magnesium is introduced; The Denver Society of Clinical Pathologists, precursor of ASCP, is founded.
1922 ASCP is founded in St. Louis
1925 American Type Culture Collection is founded
1926 Arne Tiselius develops moving boundary electrophoresis of proteins; Theodore Svedberg determines the molecular weight of hemoglobin by ultracentrifugation; ASCP appoints a Committee on the Registration of Laboratory Technicians to define and classify medical technicians
1928 GN Papanicolaou first reports the ability to recognize cancer in vaginal smears, thus beginning clinical cytology; FA Paneth founds radiochemistry
1929 Otto Folin introduces the use of the light filter in colorimetry; Fahraeus develops the ESR as an index of severity of disease; Knoll and Ruska invent the electron microscope; ASCP establishes the Board of Registry for certifying MTs; Mayo Clinic has 21 laboratories by this date.
1930 Kay develops the first clinical laboratory method for alkaline phosphatase, thus beginning clinical enzymology; refractometry is first used in clinical labs for the determination of protein in urine; ASCP issues its first MT certification to PH Adams from Ft. Wayne, IN; Beckman Instruments is founded.
1932 Cherry and Crandall develop the method for serum lipase activity; American Society for Clinical Laboratory Technicians, precursor of ASMT, is founded.
1934 Commercial development of the electron microscope takes place
1935 Beckman Instruments introduces the first pH meter; ASCP BOR first requires a college degree for MT certification
1937 First hospital-based blood bank is established at Cook County Hospital, Chicago, IL; ASCP and its BOR officially oppose state licensure of MTs
1938 Somogyi develops 2 major methods for serum and urine amylase activity; Gutman develops the first assay for acid phosphatase
1939 Conway and Cook develop the first method for blood ammonia; American Medical Technologists is founded.
1940 Visual colorimeters begin to be replaced by photoelectric colorimeters in clinical labs; RCA demonstrates the first commercial electron microscope
1941 GN Papanicolaou and HF Traut prove the diagnostic usefulness of vaginal smears in cervical cancer; Martin and Synge separate amino acids and peptides by chromatography
1943 Penicillin is successfully used in therapy
1944 Sunderman applies refractometry of proteins in the clinical lab.
1945 Borgstrom develops the whole blood clotting time test; itemized charges for hospital services are begun
1946 The Vacutainer evacuated serum collection tube is introduced by Becton Dickinson Co; Arne Tiselius separates proteins by chromatography; College of American Pathologists is founded.
1947 Edwin Land develops the Polaroid Camera; American Association of Blood Banks is founded
1948 American Association of Clinical Chemists is founded
1950 Yalow and Berson develop radioimmunoassay; Levey and Jennings adapt the Shewhart QC chart for use in clinical laboratories; Histochemical Society is founded.
1952 MD Poulik invents immunoelectrophoresis
1954 Kuby develops the method for serum creatine phosphokinase activity; Walsh develops the atomic absorption spectrophotometer
1955 Wroblewski and LaDue develop the method for serum lactate dehydrogenase; Karmen develops the method for serum aspartate aminotransferase; Skeggs develops the concept of "continuous flow analysis" in connection with treatment of renal disease; Severo Ochoa synthesizes RNA
1956 Wroblewski and LaDue develop the method for serum alanine aminotransferase activity, called "serum glutamic-pyruvic transaminase," and recognize its greater specificity for liver disease compared with that of AST; Edwards proposes prenatal screening for genetic disease
1957 Van Handel and Zilversmit develop a direct method for determination of triglycerides.
1959 The first clinical chemistry analyzer, the single-channel Auto-Analyzer, is introduced by Technicon Corp.; Technicon first applies flame photometry to automated methods
1960 Methods for CPK isoenzymes are developed; the first method for GGT in serum is developed; Perkin-Elmer Corp. introduces atomic absorption spectrophotometry for the determination of calcium and magnesium; the laser is developed; Feichtmeier invents the mechanical pipettor (Auto-Dilutor).
1961 Becton Dickinson Co. introduces disposable hypodermic syringe and needle
1962 Siegelman develops a method for glutamic dehydrogenase; IBM introduces disk storage for computers; International Society for Clinical Laboratory Technology is founded.
1965 Scanning electron microscope is developed; the US enacts Medicare and Medicaid
1966 Medicare/Medicaid officially goes into effect
1967 GO Abelev shows that alpha-fetoprotein is elevated in serum of patients with testicular teratocarcinoma; MetPath Laboratories is founded; US enacts the Clinical Laboratory Improvement Act (CLIA '67).
1968 The first random-access analyzer is introduced by DuPont (the ACA); the 1% Medicare allowance for unidentified costs is reduced to zero; Canada enacts the Federal Medical Care Act, creating a single-payer national health program.

Part 3: Medicare, government regulation, and competency certification

At first, Medicare was found money for healthcare providers in the U.S.; but the program's vulnerability soon became apparent, spawning a decades-long government effort to regulate providers who participated in the program.

Much of the history of the clinical lab during the last 30 years can be described as a reaction to the development of Medicare and Medicaid, as well as to the regulatory bodies that were established to oversee the administration of these programs. As medicine became more sophisticated, more and more healthcare services were developed.

Predictably, demand for services rose as well, and with it came the American democratic view that physician services should be available to all. Medicare and Medicaid were a boon to healthcare providers. As the programs' costs increased, loopholes for excessive reimbursement were closed, but resourceful providers continued to find new ones. What followed was a decades-long period of government regulation to control costs and ensure quality of healthcare services that continues today.

The Medicare/Medicaid bonanza

Legislators made several attempts at establishing a national health insurance plan in the U.S. as early as 1914, but the political interests of trade unions, employers, and physicians defeated such legislation in the first half of the century. Like the many proposals for national health insurance that preceded them, Medicare and Medicaid were actually responses to public concern for greater access to medical services.

When the Medicare bill was first proposed, the American Medical Association opposed a government insurance plan, calling it a threat to the doctor-patient relationship. Strategists for the bill proposed excluding physicians' services and claimed to provide broader benefits. Polls also indicated that the majority of those questioned thought Medicare should cover doctors' bills as well. The programs that are now Medicare and Medicaid are the result of strategists putting back the provision for physicians' bills. Part A Medicare benefits cover hospital, nursing home, and other institutional healthcare expenses and are paid for by Social Security; Part B consists of government-subsidized voluntary insurance to cover physicians' bills and other non-physician services, including laboratory tests; and Medicaid provides expanded assistance to the states for medical care for the poor. President Johnson signed the programs into law July 30, 1965. Some doctors protested, but they eventually figured out that Medicare was a bonanza.

Part of the reason Medicare gained the acceptance of doctors and hospitals was the establishment of a buffer zone between the providers and the federal bureaucracy. Under Part A of Medicare, the law allowed groups of hospitals, nursing homes, and other facilities the option of nominating fiscal intermediaries instead of dealing directly with the Social Security Administration. These intermediaries provide reimbursements, consulting, and auditing services. As expected, the majority of hospitals and other institutions nominated Blue Cross. Under Part B, the secretary of the Department of Health, Education, and Welfare was to choose private insurance agents called carriers to serve the same function in a geographical area. The majority of these carriers were Blue Shield plans. This meant that the administration of Medicare was conducted by private insurance systems originally established to suit provider interests, and the government handed over direct control of the program and its costs.

Finally, Medicare made its way into law because it paid hospitals according to their costs rather than a schedule of negotiated rates. The rules for calculating these costs were quite favorable to hospitals and included costs for depreciation on assets. This allowed the hospital with the newest, most expensive facilities to garner the most capital through reimbursement. Some observers have said that Medicare officials understood these drawbacks but accepted them because they feared a hospital boycott. By the 1970s, huge increases in costs caused a reversal in prevailing assumptions about the need to expand medical care. The need became one of curbing medicine's apparently insatiable appetite for resources.

In its first year, Medicare cost $4.7 billion for 19 million people and represented less than 1% of the federal budget. By 1985, Medicare costs had risen an average of 17% per year, much faster than the average yearly increase in national healthcare costs.

The need for regulation

Soon after Medicare and Medicaid went into effect, the U.S. government became aware of the programs' vulnerability to fraud and abuse. It behooved the government to see that money was not being siphoned off through overcharging for services and that the quality of services financed with tax dollars was up to snuff.

The U.S. had established minimal quality requirements for clinical laboratories engaged in interstate commerce to participate in Medicare; but these requirements, collectively known as the Clinical Laboratory Improvement Act of 1967 (CLIA '67), covered only those labs doing business across state lines, only a fraction of all U.S. clinical labs. The need to regulate all labs performing tests on human specimens became apparent to lawmakers; and throughout the '70s, amendments to CLIA '67 were proposed to stiffen personnel requirements as well as mandate inspections to certify that lab facilities met some minimum standards for accuracy and quality control.

CLIA took longer to revise, but by 1972, 100 new amendments to the Social Security Act had been made, including changes to Medicare law. These new Amendments established professional standards review organizations (PSROs), groups assembled by HEW to review the medical necessity, appropriateness, and quality of services paid for with Medicare and Medicaid funds. When final rules for implementing the Amendments were enacted in 1978, they included a list of 12 lab tests to be reimbursed at the lowest price at which similar tests were available in a geographic region. The price to be paid for the 12 tests, which represented 50-60% of all reimbursable lab services, was set at the 25th percentile for a given region. In other words, at least 1 in 4 labs in a region would have to be charging at or below this definition of the "lowest price."
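To make the arithmetic of that pricing rule concrete, here is a minimal sketch in Python, using invented charge figures, of how a regional "lowest price" pegged to the 25th percentile could be computed. The data, function name, and rounding choice are illustrative assumptions, not details taken from the 1978 rules.

import math

# Hypothetical charges (in dollars) billed by labs in one region for the
# same test; these figures are invented purely for illustration.
regional_charges = [4.00, 4.50, 5.00, 5.25, 6.00, 6.75, 7.00, 9.50]

def lowest_price_25th_percentile(charges):
    """Return the smallest charge such that at least a quarter of the
    region's labs charge that amount or less."""
    ordered = sorted(charges)
    # Number of labs that must be charging at or below the cutoff.
    k = max(1, math.ceil(0.25 * len(ordered)))
    return ordered[k - 1]

print(lowest_price_25th_percentile(regional_charges))  # prints 4.5 with the sample data

With these sample figures the computed price is $4.50, a level that exactly 2 of the 8 hypothetical labs already charge or undercut, satisfying the at-least-1-in-4 condition described above.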
Quality control, proficiency testing, and personnel competency rules were also implemented. Medical technologists were required to have a bachelor's degree or the equivalent; those without one would have to pass a proficiency examination administered by HEW to qualify as "certified" to work in a lab that received Medicare reimbursement. The HEW exam was the closest the U.S. has come to uniform national standards for testing personnel. Even CLIA '88, which actually lowered education requirements for personnel performing high-complexity tests, leaves considerable room for interpretation of standards for testing personnel.

The requirements for taking the HEW exam were a high-school diploma and 4 years of laboratory experience. At that time, HEW accepted no private certification in lieu of passing the HEW exam. The test included 4 sections, on hematology, blood banking, clinical chemistry, and microbiology, and the scores were not encouraging. The examination was given 7 times between 1975 and 1987 to more than 65,000 individuals, a group representing 35-40% of all MTs (with and without the bachelor's degree). Of that group, approximately 31,000, less than half, passed the test.

The beef over personnel competency.
Personnel competency standards for lab professionals have been and continue to be controversial. Issues of control by pathologists or by independent clinical laboratory scientists, as well as issues of race and socioeconomic status, have figured prominently in disputes over standards for laboratory personnel.

Groups that advocated for advancing the professional status of clinical laboratory scientists favored a nationally administered test because it would validate their view that clinical laboratory science should be a profession independent of physician control, but others were threatened by the possibility of being legislated out of a job. Labs that received Medicare funds were required to certify the competency of all testing personnel to be eligible to participate in the program, so technologists had an incentive to meet HEW's requirements to make themselves attractive to current and potential employers.

One court case highlights a longstanding contention by some certifying organizations that education cannot be the only criterion for assessing competency and granting certification. In 1975, a U.S. District Court heard arguments that requirements that only college graduates could take civil service examinations for medical technologist certification were unfair. Margaret Townsend, a Nassau County, NY, blood bank medical technologist, decided to sue the county's Civil Service Commission after being denied a chance to take the exam in 1973.

Townsend, who had trained several younger MTs with bachelor's degrees, successfully proved that the Commission's requirements for medical technologists had a racially disproportionate impact and that college graduation did not have a meaningful relationship to the job in question. Moreover, acceptable college programs to train individuals for laboratory work did not yet exist.

The case ultimately proved that the law would have to be changed to accommodate laboratorians like Townsend who lacked the college degree but had the necessary experience.

Private quality initiatives.

Around the same time CLIA '67 was enacted, a group of clinicians and laboratory scientists calling themselves the National Committee for Clinical Laboratory Standards (NCCLS) was meeting to discuss ways of improving patient services. The group sought to develop a consensus process for standardizing laboratory test methods. The Committee's first manual of operating procedures detailed a proposed mechanism for developing and reaching consensus on standards. The consensus structure comprised committees for the various lab disciplines and subcommittees for their subdivisions.

The consensus process consisted of a system of development, evaluation, and scrutiny at multiple levels, as well as provisions to give consideration to dissenting opinions at all stages of review of proposed guidelines. The American National Standards Institute accredited NCCLS in 1977, and NCCLS subsequently became the home of the National Reference System for the Clinical Laboratory, a collection of broadly understood reference systems intended to improve comparability of test results. NCCLS standards are voluntary, but they are widely recognized as best laboratory practices.

No teeth in self-regulation

Self-regulation had been the method of choice for labs that wanted to prove their reliability by following NCCLS standards or by voluntarily submitting to private-sector quality assurance programs, such as the College of American Pathologists' accreditation program; but legislators criticized the voluntary system as having no repercussions for labs grinding out low-quality or unreliable work.

CLIA '88

After years of attempts to update CLIA '67 to include all labs in the U.S., one bill finally made the quantum leap to legislation when President Ronald Reagan signed the Clinical Laboratory Improvement Amendments of 1988 (CLIA '88) into law. The new regulations were to take effect January 11, 1991, but practical problems revealed by the lab industry in meeting the new regulations delayed their implementation until 1992. Under the Amendments, all labs are required to have a certificate issued by the Department of Health and Human Services (HHS). HHS certifies only those labs that have adequate quality assurance and quality control programs in place and that successfully pass proficiency tests. The Amendments also classified all tests into 3 levels: waived (tests of low complexity that require no oversight), moderate complexity, and high complexity. Requirements for testing personnel at each level are also outlined.

The physician lobby had intercepted early attempts to pass revisions to CLIA '67 because of its stake in physician office labs (POLs). To get the law passed, it was revised to include less stringent requirements for POLs performing low-complexity tests; such tests qualified for waiver of CLIA '88 requirements. CLIA '88 expanded coverage from the few thousand interstate labs of CLIA '67 to virtually every clinical laboratory in the country, including POLs, and has had a profound effect on nearly every aspect of laboratory operation in the U.S.

Time Line

1969 High performance liquid chromatography becomes widely applied in analytical chemistry; because of changes in reimbursement policy, hospital-based pathologists begin founding independent regional laboratories; Roche Biomedical Laboratories, Inc. is founded; MLO is launched in July as a bimonthly magazine.
1970 Monarch Marking and Plessey Telecom introduce the bar code; U.S. OSHA is founded.
1971 Savory develops the serum albumin assay on the Technicon Auto-Analyzer; alpha fetoprotein (AFP) assay is commercialized by Abbott Laboratories, Inc.; American Association of Clinical Laboratory Supervisors and Administrators, precursor to the Clinical Laboratory Management Association, is founded; Nichols Institute, Inc. is founded.

1972 American Association of Pathologists' Assistants is founded.
1973 Westgard introduces the Westgard control rules into clinical laboratory quality control; the U.S. Centers for Disease Control is founded; the National Accrediting Agency for Clinical Laboratory Sciences is founded.
1975 The laser cell sorter is developed; Roche Diagnostics first commercializes the carcinoembryonic antigen (CEA) assay; the Association of Cytogenetic Technologists is founded; a "malpractice crisis" exists in the U.S. as physicians are sued in record numbers.
1976 The first automated radioimmunoassay is introduced by Micromedic Corp.; at least 1 gene is assigned to each of the 24 human chromosomes by this date.
1977 The Health Care Financing Administration (HCFA) is founded; the U.S. enacts the Medicare/Medicaid Fraud and Abuse Amendments, under which discounts for lab work are not prohibited if properly disclosed.
1978 Final rules implementing the 1972 Medicare Amendments are enacted; the FBI's Operation Labscam identifies doctors, hospitals, and clinics soliciting kickbacks as a precondition to doing business with labs.
1979 M.C. Wang introduces prostate-specific antigen (PSA) as a serum tumor marker; R. Naito develops an artificial blood substitute; F. Mikkers, F. Everaerts, and T. Verheggen develop capillary zone electrophoresis (CZE); the Clinical Laboratory Management Association is founded.
1981 Colcher introduces the CA-72 serum tumor marker primarily for colorectal cancer; H. Koprowski introduces CA-19-9 as a serum tumor marker primarily for pancreatic cancer; R.C. Bast introduces CA-125 as a serum tumor marker primarily for ovarian cancer.
1982 SmithKline Laboratories, Inc. acquires Biosciences.
1983 HCFA implements its Prospective Payment System using diagnosis-related groups (DRGs) as a basis for hospital reimbursement; Hybritech, Inc. commercializes the PSA assay; Centocor, Inc. commercializes the CA-19-9 assay; Cambridge Life Sciences, Inc. introduces biosensors; L. Lindholm introduces CA-50 as a serum tumor marker primarily for colorectal cancer; the American Association of Preferred Provider Organizations is founded; the U.S. enacts the Social Security Amendments of 1983.
1984 The CA-50 assay is commercialized by Sterna Diagnostics of Sweden; Genentech, Inc. produces genetically engineered clotting factor VIII; DNA fingerprinting is developed; the U.S. enacts the Deficit Reduction Act of 1984.
1985 R. Tobias introduces CA-15-3 as a serum tumor marker primarily for breast cancer; K. Mullis et al. invent the technique of polymerase chain reaction, the first gene amplification technology; CA-125 is commercialized by Centocor, Inc.; Beecham Pharmaceuticals, PLC, acquires SmithKline Laboratories to form SmithKline Beecham Clinical Laboratories, PLC; the U.S. enacts the Balanced Budget and Emergency Deficit Control Act (Gramm-Rudman-Hollings bill).
1986 CA-72 is commercialized by Centocor, Inc.; the Joint Commission on Accreditation of Hospitals expands its accreditation activities beyond acute care hospitals and changes its name to the Joint Commission on Accreditation of Healthcare Organizations.
1987 K.R. Bray introduces CA-549 as a serum tumor marker primarily for breast cancer; S. Fukuta introduces CA-195 as a serum tumor marker primarily for colorectal cancer; by this date at least 1,215 expressed genes are assigned to specific chromosomes.
1988 Hybritech commercializes CA-195; the U.S. enacts the Clinical Laboratory Improvement Amendments of 1988.
1989 Beckman Instruments and Applied Biosystems, Inc. commercialize the first CZE apparatuses; Allied Clinical Laboratories, Inc. is founded.
1991 By special act of Congress, the Veterans Administration is exempted from the provisions of CLIA '88.
1992 Final regulations implementing CLIA '88 take effect; National Health Laboratories, Inc. agrees to refund $110.4 million to the Civilian Health and Medical Program of the Uniformed Services (CHAMPUS), Medicare, and Medicaid as a settlement in the largest medical fraud case in U.S. history; the Stark physician self-referral ban goes into effect.
1993 E. Koh, R. Ito, and M. Bissell introduce the first commercial method using CZE, for urine vitamin C.
1994 Regionalization of different types of lab services into cooperative networks of labs emerges as a trend in changing laboratory structure.
1995 National Labor Relations Board rules that medical technologists are professional employees; NHL merges with Roche Biomedical, creating LabCorp.
1996 HCFA introduces the Alternate Quality Assessment Survey, which allows certain labs to fill out a form for certification; a trend emerges in the sale of hospital labs to large commercial labs.
1997 Consolidated laboratory networks emerge as a trend in cost-cutting in the U.S.; the FBI accuses Columbia/HCA of engaging in a "systemic corporate scheme" to defraud Medicare; JCAHO recognizes COLA accreditation; HHS publishes its model compliance plan.
1998 FDA approves Dako's immunohistochemical assay, HercepTest, for detection of the HER2 protein, the target of trastuzumab (Herceptin), a genetically engineered treatment for metastatic breast cancer.
1999 Continued deciphering of the human genetic code promises to dramatically expand the menu of diagnostic and prognostic tests; Quest Diagnostics acquires SmithKline Beecham Clinical Labs.
