Amboceptor

Microbiological, virological, bacteriological, immunological, medical, epidemiological, historical, anecdotal


Sure, put tuberculin in everyone’s eyes.

If you’ve worked in a health care facility, you’ve probably been given the tuberculin skin test. You get a little injection under the top layer of your skin, forming a bubble, and an allergic reaction means you’ve been infected in the past by the tubercle bacillus, or Mycobacterium tuberculosis as we now know it. If you haven’t been infected in the past, you’ll have slight discoloration and maybe slight pain.

Or it may mean you’ve been infected by another species of Mycobacterium. There’s a separate skin test for Mycobacterium avium complex, the “MAC infection” that’s becoming more common, but cases of M. avium often turn up positive from the M. tuberculosis test as well. The material used for the test consists of a purified solution of protein (PPD, or purified protein derivative) extracted from the bacteria.

* * *

The tuberculin skin test is also known as the Mantoux test, and has been for over a century, since Mantoux’s practical application of the hypersensitivity reaction discovered by von Pirquet. There were alternatives for much of that time, all variations on the theme of a small skin injection. The Heaf test, for example, was easier to administer consistently, and probably easier to interpret, but harder to manufacture.

And there were more unusual alternatives, early in the 20th century.

In 1908 three Philadelphia physicians, Samuel McClintock Hamill, Howard C. Carpenter and Thomas A. Cope, reported the results of comparisons of several diagnostic tests for tuberculosis. These tests involved administration of tuberculin to different sites in the body: conjunctiva (Calmette); deep muscle (Moro); and skin (von Pirquet).

(from “Orphans as guinea pigs: American children and medical experimenters, 1890-1930” by Susan E. Lederer)

Conjunctiva? That’s… the eye, right? They put tuberculin in the eye, creating an irritation at worst, and a major allergic reaction and possible scar tissue if the test was positive? This was done to people, just as a screening test?

Indeed. Remember, back then a simple injection was not as trivial as it is now. Needles and syringes were not disposable, so the Pirquet test involved scarifying the skin and applying tuberculin into the wound. And if a routine injection led to a hospital-acquired infection, there were no antibiotics to treat it. Dropping some liquid in the eye was easier. More from Lederer’s monograph:

 The physicians explained that before beginning the conjunctival test, they were unacquainted with any adverse effects associated with the procedure. The ease of implementing the test (application of a few drops of tuberculin to the surface of the eye) and the relatively quicker results obtained thereby made it attractive to clinicians in search of an effective diagnostic tool. However, in the course of testing, several disadvantages quickly became manifest. The reaction produced a ‘decidedly uncomfortable lesion’ and in several cases, serious inflammations of the eye resulted. In addition, the possibility that permanent impairment of vision might result for several children worried the physicians.

The test proved useful, revealing that many of the children had had undiagnosed cases of tuberculosis. But it was unpopular.

[image: clipping from the Reading Eagle newspaper, 1910]

* * *

What were the arguments for and against the eye test?

In the Journal of the Missouri State Medical Association (November 1908), L. M. Warfield explains that the skin test is more sensitive, as it gives positive results from people who have already recovered from tuberculosis, or who show no signs of disease.

[image: tuberculin-warfield]

This goes along with his instinct for which one is safer: “I have used the cutaneous reaction more than the ocular reaction, for the eye is too delicate an organ to be played with.”

Another complaint about Calmette’s ocular test is that it should not be done on eyes suffering from any other malady, which is hard to guarantee. In the Boston Medical and Surgical Journal (August 27, 1908; the journal now known as the New England Journal of Medicine), Dr. Egbert LeFevre illustrates how complications may arise.

[image: tuberculin-lefevre]

* * *

Within the first year of its introduction, the eye test for tuberculosis was already losing fans.

In February 1908, an article by Floyd and Hawes saw the eye test as safer than the skin test; their case can be summarized by this passage: “the advantages of the ophthalmo-tuberculin reaction over the cutaneous or subcutaneous methods is that it is absolutely painless, whereas both of the others are painful or disagreeable to say the least. Practically no constitutional symptoms follow the use of the eye, whereas in the subcutaneous test they are important to obtain and often very distressing, and also occasionally occur in the cutaneous method.”

Six months later, doctors were abandoning the procedure. T. Harrison Butler of Coventry, England laid out the empirical observations that changed his mind in the August 8, 1908 British Medical Journal.

[image: tuberculin-butler-dangers]

Further argument against the eye test came from L. Emmett Holt of New York, whose paper in the January 1909 Archives of Pediatrics (along with the Philadelphia one mentioned above) became a massive controversy when publicized by “anti-vivisection” activists. The title is a bit alarming. (“Babies Hospital” is now called Children’s Hospital of New York-Presbyterian.)

[image: tuberculin-holt-columbia]

According to Holt, not only does the eye test produce unnecessary discomfort, it’s actually harder to perform.

In ease of application there is a decided advantage in the skin test. The scarification is a trifling thing. The patient does not require continuous observation before or after, and the reaction lasts a considerable time. The ophthalmic cases need closer watching, the reaction is shorter and may be missed. It cannot be used well in ambulatory patients.

The 1909 Eye, Ear, Nose and Throat annual points out yet another practical limitation.

[image: tuberculin-cocain]

Still optimistic about the eye test, the New York State Journal of Medicine blames problems on improper technique.

In considering the ophthalmic test we must call attention to the fact that harmful results are in all probability due to the instillation of tuberculin into diseased eyes, to infection after instillation, or mechanical irritation, to the introduction of secretion by the fingers of careless patients into the untested eye and to the use of poor or faultily prepared tuberculin.

Calmette reports 13,000 instillations and states that in no case in which the tests were properly applied and controlled were there serious complications. Petit tabulated 2,974 instillations with no ill effects in 698 positive reactions. Smithies and Walker in 450 instillations in 377 patients had four stubborn reactions. It is wise to remind the profession that the eye needs to be thoroughly examined before the test is made and with the slightest abnormality, tuberculin should not be used.

It’s agreed that the test shouldn’t be given to people with any eye problems, and it can’t be given more than once on the same eye (in a lifetime?), and it shouldn’t be given to old people. And maybe you should keep some cocaine around to numb the eyes of children and “sensitive adults” so they don’t squeeze the irritant out of their eyes.

With all these limitations, you’ll have to learn how to use the skin test anyway. So you might as well use it all the time. By 1911, Theodore Potter of Indiana University could write that “the eye reaction has already largely fallen into disuse, being replaced by the von Pirquet test.”

The eye test is still good for cattle, though!

Viruses can be RE-activated by light?

When you’re always looking at old sources, you run the risk of condescending to the experts of the past, who believed scientifically plausible things that now seem obviously wrong. More than once I’ve been ready to point out some amusing practice of the distant past, only to find out that it’s a perfectly valid fact that I (having no medical or physiological training) had never heard of.

One example is “vicarious menstruation”. Is it possible that menstruation could manifest as a nosebleed, or sores in the mouth (sometimes called “herpes”)? Or as a pair of ulcers on the legs, as D.H. Galloway of Roswell, New Mexico reported in 1913? Isn’t it more likely that these stories are exaggerated, or are coincidental? But yes, some combination of hormone levels and blood pressure creates that phenomenon in some women.

[image: galloway-vicarious-menstruation]

Another one is “activated milk”, which contained a substance called “viosterol” that was in high demand for preventing rickets in children. Activated milk? Activated by what?

Ultraviolet light, it turns out. Did this work? Well, UV light turns a cholesterol derivative (7-dehydrocholesterol) into vitamin D3 when our own skin is exposed to the sun, and it turns the fungal (yeast) equivalent, ergosterol, into vitamin D2. Cows could be fed UV-activated yeast to make them produce “activated milk”, or activated yeast extract could be added directly to milk. Either of these was a way of “activating” milk that probably worked. Exposure of normal milk to UV light seems like it would be a waste of time.
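In schematic form (my gloss, using the photochemistry as it’s understood today, not the industrial protocol of the era): a UV-B photon opens the sterol’s B ring to give the “previtamin”, which then isomerizes with gentle heat into the vitamin itself.

$$\text{ergosterol} \xrightarrow{\text{UV-B}} \text{previtamin D}_2 \xrightarrow{\Delta} \text{vitamin D}_2$$

$$\text{7-dehydrocholesterol} \xrightarrow{\text{UV-B}} \text{previtamin D}_3 \xrightarrow{\Delta} \text{vitamin D}_3$$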

Whether its benefits were exaggerated or not, activation of milk and other foods was extremely popular, as described in Michael Holick’s great historical review in Public Health Reports, called “The Vitamin D Deficiency Pandemic: a Forgotten Hormone Important for Health”. The drug and food industries fought over whether companies like Fleischmann’s Yeast could claim their products were the equivalent of vitamin D supplements. Here’s a contemporary excerpt from Cartels: Challenge to a Free World, Wendell Berge’s 1944 classic of vaguely paranoid economics.

[image: cartels-activated-milk]

And all the way into the 21st century, there’s heated debate over whether vitamin D2 (the vitamin D in most supplements) is an appropriate substitute for our own vitamin D3.

* * *

So anyway, here’s another real thing that looked weird and debunkable at first glance.

In virology papers from the fifties and sixties, there are many mentions of something called “photoreactivation”. This started with 1949 work by future Nobel laureate Renato Dulbecco, done in the Indiana University laboratory of future Nobel and National Book Award laureate Salvador Luria. In 1950 Dulbecco summarized the story.

Kelner (1949), working with conidia of Streptomyces griseus, discovered that light belonging to the visible range is capable of reactivating biological material that has been rendered inactive by ultraviolet radiation (UV). Shortly after Kelner’s discovery was known, a similar phenomenon in bacteriophages (bacterial viruses) was observed by accident. Plates of nutrient agar containing UV-inactivated phage and sensitive bacteria had been left for several hours on a table illuminated by a fluorescent lamp. After incubation it was noticed that the number of plaques was higher on these plates than on similar plates incubated in darkness. A short report of this phenomenon of “photoreactivation” (PHTR) has already been published (Dulbecco, 1950).
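A note on how such results were read out: phage survival after a UV dose was typically described by single-hit kinetics, and photoreactivation shows up as if the illumination had reduced the effective dose. This is the standard framework of the era, sketched here from general radiobiology rather than taken from Dulbecco’s paper:

$$\frac{N}{N_0} = e^{-kD} \quad \text{(kept dark)}, \qquad \frac{N}{N_0} = e^{-k(1-f)D} \quad \text{(illuminated)},$$

where $N/N_0$ is the surviving (plaque-forming) fraction, $D$ is the UV dose, $k$ is an inactivation constant, and $f$ is the fraction of the lethal damage that the light treatment can undo.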

We’ve been using UV light, gamma rays, and chemical agents like nitrogen mustard to make “killed” versions of viruses, safe for use in vaccines. And now it’s possible that visible light could then re-activate these menaces? Should vaccines be stored in the dark?

Beyond bacteriophages, many other viruses were found to be capable of photoreactivation. A sample:

  • 1955: “Of the three viruses we studied earlier, tomato bushy stunt and the Rothamsted tobacco necrosis virus showed the phenomenon of photoreactivation, and tobacco mosaic virus did not … Of the six viruses that did [in this study], potato X showed it much the most strongly, tomato bushy stunt and a tobacco necrosis virus the least; cabbage black ringspot, cucumber mosaic and tobacco ringspot were intermediate.”
  • 1958: “Thirty minutes of illumination at 300-380 f.c. gave substantial photo-reactivation [of] potato virus X”
  • 1961: Tobacco mosaic virus particles can’t be photoreactivated, but RNA preparations from the virus can.
  • 1967: “Photoreactivation of UV-irradiated blue-green algal virus LPP-1”
  • 1967: “By contrast, photoreactivation of the irradiated [tobacco necrosis virus] was observed in French bean and tobacco, but not in Chenopodium.”
  • 1968: Pseudorabies virus can be photoreactivated in chick embryo cultures, but not in rabbit kidney cells.

In the last of those quotes, it’s becoming clear that viruses don’t photoreactivate on their own. They photoreactivate inside cells. You can use UV light to damage the DNA (or RNA) of a virus so it can’t multiply. But it may still infect cells if the protein coat is intact. Then once the viral DNA (or RNA) is inside the cell, the cell’s DNA repair mechanisms can go to work. One of these is photolyase, found in plants, bacteria, fungi, and some animals, but not placental mammals. Blue light activates this enzyme to reverse the DNA damage caused by UV light (specifically, covalently-linked pyrimidine dimers).
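Schematically (a textbook summary, not taken from any of the papers above): UV fuses two adjacent pyrimidine bases into a cyclobutane ring, and photolyase, energized by a blue or near-UV photon, splits the dimer back into the original bases.

$$\text{Py–Py} \xrightarrow{\text{UV}} \text{Py}\langle\rangle\text{Py (cyclobutane dimer)} \xrightarrow{\text{photolyase} \,+\, \text{light, ca. 350–450 nm}} \text{Py–Py}$$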

So instead of thinking of photoreactivation as something that happens to certain viruses, we should think of it as something that happens in certain types of cells, to viral DNA as well as cellular DNA.

By 1958, Dr. John Jagger (who does not have a Wikipedia page, though his wife, also a scientist, does) was already able to write a fantastic review of photoreactivation in general (not just viruses and bacteria), saying:

Photoreactivation seems to be possible whether the UV damage occurs in the liquid or the solid state. However, the reactivation seems to require not only the liquid state, but a rather complex environment, similar to that within a living cell.

It doesn’t quite require a living cell, but it requires “cellular material”. A cellular extract still contains the photolyase enzyme.

* * *

You’ll notice that the above examples are almost all plant viruses. This is partially because plants were a very convenient system for virology in the era before cell lines, but it also has to do with the importance of light in plant biology. Dependent on the sun, they need to be able to counteract the negative aspects of ultraviolet light.

But it’s also clear that photoreactivation takes place in insects and fish.

The data show that fish cells have an efficient photoreactivation system at wavelength > 304 nm that reverses cytotoxicity and dimer formation after exposure to filtered sunlamp irradiation of a shorter wavelength (λ > 290 nm). Shorter wavelengths in UVB (> 304 nm) are more effective in photoreversal than longer ones (> 320 nm). As a consequence, 50-85% of dimers induced by these wavelengths in fish are photoreactivated while they are being formed. A major cytotoxicological lesion is the cyclobutane pyrimidine dimers. Cultured human fibroblasts do not possess such a repair system.

What about that paper above, in which chicken embryo cells enable pseudorabies virus (a herpesvirus) to reactivate? That looks weird, to me at least. Shouldn’t chickens, being warm-blooded animals, be grouped with mammals rather than fish? But chickens aren’t mammals. This table, from “Photoreactivating-enzyme activity in metazoa” (Cook JS, McGrath JR [1967] PNAS 58(4):1359-1365), sums it up.

[image: photoreactivation-in-metazoa]

Mammals (at least we placental mammals) have other DNA repair mechanisms, but we lack photolyase. Which, as it turns out, makes us kind of weird.

Can nose-picking give you lupus?

[image: lupus-0-julius-bruess]

As the stereotype would predict (34% of the city’s residents were German as of the 1900 census), the Milwaukee Medical Journal had regular reports on what physicians were up to in the German and Austrian empires.

In addition to the dispatch from Marosvásárhely (a Hungarian city now part of Romania and called Târgu Mureș), Julius Bruess summarizes two German articles about “lupus”. There were two types of lupus at the time. He doesn’t need to specify which one, because only one is contagious.

[image: lupus-1-resorcin]

Here we see “lupus” grouped with “scropholoderma” and “tuberculosis verrucosa cutis” as skin conditions that can be treated with a paste containing resorcin, also known as resorcinol or 1,3-dihydroxybenzene. “This paste destroys all lupus tissue, but does not affect the healthy tissues. After 3 days a scab is formed. After this, for several days, application of Kaolin compound.”

The pros and cons of resorcin were covered in a review by Augustus Ravogli in the September 5, 1891 Cincinnati Lancet-Clinic. Ravogli says it often causes as much dermatitis as it cures, but is useful for contagious diseases like impetigo. Wikipedia calls it a “disinfectant” and “antiseptic”, but says it’s now given for eczema, psoriasis, dandruff and even allergies, none of which are caused by infections.

But back to lupus. As suggested above, this form of lupus is one of the many cutaneous manifestations of tuberculosis. Even today, there are several important types of cutaneous TB, with “lupus vulgaris” the most common. It’s generally found in people who have already suffered from TB in the lungs. And from the 19th century to today it’s been described as starting out as a soft “apple jelly” nodule, eventually turning into ulcers or dry lesions similar to “scropholoderma” (scrofuloderma).

[image: lupus-2-apple-jelly]

The apple jelly nodule, in F.J. Gant’s Science and Practice of Surgery (1886) and Narasimhalu et al., Infectious Diseases in Clinical Practice 21(3):183-184 (2013).

* * *

Tuberculosis, also known as “consumption,” “phthisis,” or the “white plague,” was the cause of more deaths in industrialized countries than any other disease during the 19th and early 20th centuries. By the late 19th century, 70 to 90% of the urban populations of Europe and North America were infected with the TB bacillus, and about 80% of those individuals who developed active tuberculosis died of it.

For most of the 19th century, tuberculosis was thought to be a hereditary, constitutional disease rather than a contagious one. By the end of the 19th century, when infection rates in some cities were thought by public health officials to be nearly 100%, tuberculosis was also considered to be a sign of poverty or an inevitable outcome of the process of industrial civilization. About 40% of working-class deaths in cities were from tuberculosis.

(from Harvard University Library’s CONTAGION: Historical Views of Diseases and Epidemics)

In the 1880s it became accepted that TB was a contagious disease, thanks to the work of Robert Koch: both his discovery of the “tubercle bacillus” and his preparation of sterilized extracts that could be used to test people for immune reactions against it. Thus the disease could be more objectively diagnosed… but not really treated, except by rest and fresh air in the sanatoria that quickly cropped up throughout the Western world.

Knowledge may not provide power, or freedom. In the absence of any effective treatment, it wasn’t necessarily helpful to find out that TB was unquestionably contagious. People tend to become more and more paranoid about the ways we “know” a disease can be transmitted, whether it was AIDS via toilet seats and pay phones in the 1980s, or TB via dry sputum particles stirred up by street sweepers in the 1890s.

[image: lupus-3-sprinkle]

People also become fixated on ways we “know” the disease can be controlled. We rationalize draconian measures, ranging from segregating infected people in the Carville Leprosarium to pursuing siege campaigns against badgers. And just like the Wassermann test for syphilis, covered in our very first post, the Mantoux test for tuberculosis often provided unwelcome and unhelpful information, no matter how accurate. If someone has recovered from their symptoms, but tests positive, what does that mean? What are their chances of manifesting more symptoms? Will they ever test negative?

* * *

Another indication that the tubercle bacillus remains within the body, even after lung infection clears up, is when it reemerges as lupus vulgaris. Today in countries where TB is rare, we use “lupus” as shorthand for “systemic lupus erythematosus”, but lupus vulgaris still exists. And though many investigations, from Dr. Henry G. Anthony in 1903 to Dr. Harry Keil in 1933, failed to find a conclusive link between TB and lupus erythematosus, there has never been a doubt about TB’s role in lupus vulgaris.

The last excerpt from the Milwaukee Medical Journal’s German dispatches:

A contribution to the hygiene of schools with reference to the jurisdiction of school physicians is furnished by Prof. Lassar in reporting a case in point on the etiology of tuberculosis of the skin. It is a known fact that teachers will with preference, in order to inflict a mild punishment, pull the ears of their scholars. This proportionately considered harmless encouragement may be followed by severe consequences. If the teacher is tubercular he cannot prevent impregnating his finger nails with sputum containing tubercular bacilli.

For any number of diseases, you could say “Hey, if you treat someone roughly and your fingers are contaminated, you might infect that person.” Right? Even in 1902 there were scarlet fever, tetanus, ringworm, the aforementioned impetigo, and so on. But to get people’s attention, you warned them about tuberculosis.

Prof. Lassar’s main case study, a woman who traced her longstanding case of lupus vulgaris back to a teacher’s habitual ear-related punishments, was picked up by some other journals. And he made one other point — which the Hahnemannian Monthly, for example, ignored, but the Milwaukee Medical Journal passed along to its subscribers.

[image: lupus-4-sputum]

Really, you COULD transmit any number of diseases by abusing the mucous membrane of the nose with the finger nails. But how often does it happen?