How did X-Rays gain mass adoption?
And what we can learn from this process to understand why cell-free DNA adoption has not skyrocketed yet
At the University of Würzburg, Wilhelm Röntgen took the first X-Ray (XR) and presented his work “On a New Kind of Rays” in December 1895, which was printed in January 1896. That same month, it was reprinted in English in Nature, The Electrician, The Lancet, and the BMJ. A lot of literature was written about XRs in the months that followed; William Morton (a US physician and professor of electrotherapeutics), for example, published The X Ray, or Photography of the Invisible and Its Value in Surgery. News outlets from across the world picked up the story, writing that “a professor from Wurzburg had successfully used a new type of light to take a photograph of a set of weights without opening the wooden box in which the weights were kept” and was able to "take a picture of the human hand showing the bones without the flesh”. Critics were loud: Otto Lummer, Röntgen’s colleague, said Röntgen had “otherwise always been a sensible fellow and it’s not carnival season yet”. The work spread regardless. Arthur Wright at Yale reproduced X-Rays in January 1896, Francis Williams published the first clinical report (using some of Edison’s work), Röntgen published on better tube design and image quality, and Tesla made tweaks to the tube to improve performance. There were 49 monographs and 1,044 research papers on XRs in 1896 alone. Physicians championed this. For example, William Stubenbord started using XR techniques in April 1896 and presented to his local society in June. Representative comments include:
At present we can detect deformities in bones and the bony pelvis; we can note the growth of bones and the malacosteon conditions important to the obstetrician; we can clearly see the shoulder-joint, the ribs, the vertebrae, the union of the pubes, the hip-joint and the curvatures of the spine. We can describe fractures of bones and readily distinguish them from dislocations, determine how to set them, and even watch their reunion without removing the splints and bandages. We can watch the absorption of a bone and readily find imbedded foreign substances. We can detect the presence of pus, watch the heart beating, see the form of the liver, the lungs, the kidneys and the spleen. I have seen with the fluoroscope the fetus in utero at the seventh month.
What is required are Crookes' tubes, a Ruhmkorff coil or a static machine, a fluoroscope, plate-holder, sensitized plates and other photographic apparatus. ... The expense of the machine costs only about $400 while a good coil is valued at from $200 to $600. But, of course, this need not be considered by physicians, since all are wealthy
These comments represent the instant click that many physicians felt when working with and looking at XRs. It was a tool to satisfy their own curiosities about the human body and to answer questions about conditions they spent much of their time thinking about and treating. And physicians like Stubenbord did not think about XRs in terms of return on investment - they had the money, and this was a cool tool to try out at the very least.
During the initial wave of XR interest, there were also physicians and scientists thinking about the risks associated with XRs. Exceedingly painful dermatitis (more like a severe burn) caused by XR use was described in 1896. Codman (an MGH surgeon) recognized the potential for XRs to diagnose and study injuries and disease, though he also paid close attention to reported injuries. In 1896, 55 injuries were reported, but by 1901 only 1 was. He felt the decline occurred because people chose not to publish injuries unless they exhibited unusual features, a reasonable conclusion. William Rollins (a scientist and dentist) published over 200 papers from 1896-1903 urging minimum exposure, lead enclosures, and goggles to minimize risks; he became the father of radiation protection. Radiologist Alan Hart and others criticized XRs as “large loud sparking smelly devices prone to mishap and injury even when fully under the control of the physicians who in droves invested money and prestige in them”. Some surgeons doubted the image quality, felt bullets could be left safely in the body, and worried about patient privacy (seeing through clothes). The public was fascinated by the technology, and studios offered “views of their bones” and “shoe fitting” images. These developments are expected of any new technology and necessary for its rapid adoption: people need to think about safety, and drumming up interest with the public creates demand (although in this case the studios were likely harming their visitors with the radiation dose, and medical societies condemned them). Carefree use led to calls for regulation at the 1905 German Radiology Congress and to the American Roentgen Ray Society’s protection committee in 1920. Wolfram Fuchs (1896) advised cutting exposure times. Pfahler argued XRs were “unlikely to reveal much that palpation cannot” because long exposures blurred images, fractures could already be set empirically, and bullets were often harmless if left in situ.
Physicians started buying and assembling XR machines in 1896. Shortly thereafter, XRs were used during wartime. In the Greco-Turkish war of 1897, the German Red Cross set up an XR machine in their hospital at Constantinople to aid wounded Turks. The British, who supported the Greeks, had their machine in a villa outside Athens. The apparatus was bulky and the power source unstable, so it was kept away from the front despite “British and German surgeons conclud[ing] that radiographs were of great use”. XRs were used in hospitals during the 1898 Spanish-American War, where they made probing for bullets obsolete and allowed medics to triage injuries, reduce unnecessary incisions, and decide which limbs to amputate in the field. Medical journals published article after article in the following period describing XRs as indispensable for fractures and foreign bodies. In 1908, at the American Roentgen Ray Society’s 9th meeting, it was stated that there was no excuse whatever for injury because radiographs could be taken in a fraction of a second. From 1896-1908, XRs were used for diagnosing conditions beyond fractures and bullet wounds, like gastric cancers and ulcers. Despite the military and academic interest, XR usage for diagnosing fractures at The Pennsylvania Hospital (TPH) was still not common in 1909. At TPH, the chief resident was in charge of the XR along with other medical equipment. In 1910, the chief resident at the time stayed on for two more years, insisted on purchasing new equipment, took charge of the XR machine alone, and limited his practice to it (radiology had yet to be created as a formal field); he was given a percentage of the fees generated from patients who paid for the service. Creating dedicated positions and aligned incentives helped accelerate XR use at TPH. Other centers followed in the coming years, and the first radiology residency program was established at MGH in 1915. Then, WW1 happened. This was a large catalyst for adoption. Marie Curie built mobile XR units and persuaded the Military Medical Service and Red Cross to use them. She was named the director of military radiology and built 20 mobile vans and 200 fixed stations, training operators at each site. In England, Robert Jones and John Hall-Edwards, who were both doctors and War Office advisors, ordered XR equipment for each medical station and created a Roentgen service group within the Royal Army Medical Corps. Germany, working with Siemens-Reiniger, equipped each field hospital with a wagon-mounted XR unit. US Surgeon-General William Gorgas commissioned a Division of Roentgenology, and mobile XR units became increasingly available towards the end of the war. WW1 accelerated XR usage and proved lifesaving for shrapnel triage. Usage curves rose steeply in the 1920s and onwards. In 1933, 55% of all patients admitted to a university hospital had at least one XR. At Mayo, it was around 80% of all patients.
XR use cases continued to expand, including Charles Leonard’s work (1900-1930) identifying renal stones missed on physical exam. Like many other XR superusers at the time, Leonard suffered numerous radiation-induced injuries, including amputation of his right arm, and died of radiation-induced cancer at 51. Alan Hart built statewide clinics in the 1920s-30s for TB screening, and in the early 1930s positive contrast was shown to visualize ulcers, tumors, and strictures. The use cases had quickly expanded from fractures and bullets to nearly every organ system. By the early 1930s, radiology residency existed, professional societies had been established, and hospital workflows were designed to rely on the XR for diagnosis and monitoring of disease.
XRs would not have taken off if patients and the public did not want to pay for them. In fact, hospitals charged $1-3 per image, billed directly to the patient in a fee-for-service model. Self-pay made up about 70-80% of hospital payments, charity represented 5-10%, and the remainder came from state or municipal lump-sum payments for patients who could not afford to pay. This is vastly different from today’s environment, where 42% of MGH’s payments come from Medicare, 14% from Medicaid, 37% from commercial insurance, 2% from self-pay, and 5% from workers’ comp/liability/health safety net. Hospitals in the early 1900s did not answer to large insurers and controlled their own payer mix, so they could do more of whatever they wanted.
In the early 1900s, new services like XRs were adopted if patients would pay for them or donors would fund the equipment and labor. So, to a first approximation, all that mattered was whether demand existed. Today, new services are adopted if insurance covers them or they create obvious increases in margin. Hospitals were then fee-for-service, largely operating off self-pay. Today, we have fixed payments, readmission penalties, value-based care that avoids excess procedures and emphasizes shortening length-of-stay, and regulatory oversight. These principles are good, but they may be preventing other XR-like advancements, such as cell-free DNA, from positively impacting patients. For example, when Röntgen discovered XRs there was no federal device regulation in place; the first law to touch devices came in 1938. Today, we have heavy pre-market regulation. This oversight has many pros (protecting patients, ensuring new tools meet or exceed the standard of care) but also its faults (limiting the pace of new technology). The new FDA under Makary has a bold vision for the future that could protect patients while accelerating the adoption of new technologies in the clinic. Beyond regulatory oversight, most technology requires CMS coverage (traditionally, commercial payers follow CMS) before mass adoption so hospitals know they will be reimbursed. An example is Galleri, the multi-cancer early detection test, which lacks CMS coverage and has seen low uptake.
How did the X-Ray gain mass adoption?
Military use in WW1 accelerated its adoption (led by individuals like Marie Curie who dedicated time to making this a reality)
The bar for comparison for fracture detection or bullet retrieval was palpation or surgical exploration - two crude methods. Today, the standard of comparison is far higher across the board.
Establishing incentives aligned with physician interest that allowed them to run with the technology, like the financial structure at The Pennsylvania Hospital
Lack of regulatory guardrails. There were none.
Physicians felt strongly about the XR and what it could do - it captivated their curiosity. They tried to image beating hearts and see what a fetus looks like in utero. On the whole, physicians threw caution to the wind, exposing themselves to burns, dermatitis, severe injuries (amputations of limbs), and death (mostly from cancer)
The wins piled up quickly across domains within the first 10 years. It started with fracture detection and bullet localization and soon spread to TB diagnosis, renal stones, strictures, ulcers, and tumors.
Establishing specialist societies that turned an informal practice into a job; establishing a residency program
Clear billing model (fee-for-service) that required only ~300 XRs to recoup the cost of capital (see the sketch after this list).
Anyone could interpret the end result. How well they could is a different question. But, at the least, they could look at it and go “there’s my hand bones”. It was a picture. And humans are visual in that “a picture is worth a thousand words”.
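To make that breakeven arithmetic concrete, here is a minimal back-of-the-envelope sketch using the period prices quoted earlier (the ~$400 machine and $200-600 coil from the Stubenbord quote, and the $1-3 per-image fee). The midpoints are my assumptions, not historical accounting.

```python
# Back-of-the-envelope breakeven for an early XR setup.
# All figures are the rough period estimates quoted above;
# the midpoints are assumptions, not historical accounting.
machine_cost = 400    # "the machine costs only about $400"
coil_cost = 400       # "a good coil is valued at from $200 to $600"; midpoint
fee_per_image = 2.50  # hospitals charged $1-3 per image; midpoint

capital = machine_cost + coil_cost
breakeven_images = capital / fee_per_image
print(f"~{breakeven_images:.0f} images to recoup ~${capital} in capital")
# -> ~320 images, consistent with the ~300 figure above
```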
Consider the assumption that cell-free DNA for monitoring response to treatment and diagnosing disease will become a reality in the future. In 2100, I doubt we will be imaging every tumor to judge response to treatment. Why has it not already gained mass adoption? XRs, introduced in 1896, gained “mass adoption” by 1925. Cell-free DNA gained popularity in the mid-2000s, so we have until roughly 2035 to match the XR adoption timeline. Nevertheless, compared to XRs, here is what makes cell-free DNA harder to adopt:
FDA review or a CLIA LDT pathway is needed. You can’t just roll it out and see what happens like we did with XRs.
Cell-free DNA reports are text-based, statistical, and require knowledge of genomics and statistics along with medicine in order to interpret. That is a lot of background knowledge. When we simplify the reports too much, they become unusable. But, at the most granular level, they become uninterpretable to basically every clinician. We need a middle ground where these reports not only fit into clinical workflows but can be interpreted by each physician.
We do a bad job of making cell-free DNA reports accessible in Epic. The Genomics tab with scanned PDFs is a horrible interface.
Medicare only covers ctDNA in narrow situations, and hospitals will not embed a test that generates denied claims. With XRs, insurers were not relevant; hospitals could do more of what they wished and less of what the insurer wanted.
A single broken bone convinced surgeons. No RCTs, no modeling. Now, we want trials that show survival benefits or cost savings relative to the standard of care (when that standard of care, imaging, is deeply embedded in hospital workflows).
An XR is produced in minutes, is visible to the physician, and immediately alters the next cut of the scalpel. cfDNA returns in days; the interpretation is statistical and triggers further workflow (scans, biopsies, genetic counseling). If the MAF of KRAS G12C is 5%, down from 7%, and your patient is on a G12Ci, should you continue? What if it were 4%? What if it were 8%? These are non-trivial decisions that clinicians are not trained to make (see the sketch after this list). It gets harder when they don’t have time to sit and deliberate because they are booked with patients or meetings all day.
Slower clinician demand for cfDNA. Maybe 30% of oncologists I have met are cfDNA enthusiasts. This number is probably skewed upwards because I love cfDNA. And non-oncologists? Forget it. Most of them couldn’t care less.
A positive factor driving cfDNA uptake in the US: COVID helped push liquid biopsies into the mainstream, since no procedure rooms were needed to make clinical decisions and keep revenue flowing. Now, every major insurer covers liquid biopsy therapy selection tests (like those for PARPi in prostate cancer), and prenatal cfDNA is basically universally covered. Guardant360 test volumes rose from 70k in 2019 to 200k in 2023. Natera test volumes increased from 800k in 2019 to 2.5M in 2023.
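To make the interpretation burden in the KRAS G12C example concrete, here is a minimal sketch of the statistics a clinician is implicitly being asked to do: a simple Wald confidence interval for a variant allele fraction at a given sequencing depth. The read counts and depths are hypothetical, and real cfDNA assays use more careful, UMI-aware, background-corrected error models; this is only an illustration of why “5%, down from 7%” is not self-interpreting.

```python
import math

def vaf_ci(alt_reads: int, depth: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wald confidence interval for a variant allele
    fraction. A crude normal approximation for illustration only; real
    cfDNA pipelines use background-corrected error models."""
    p = alt_reads / depth
    se = math.sqrt(p * (1 - p) / depth)
    return max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical KRAS G12C measurements: 7% VAF on the prior draw, 5% now.
for depth in (1_000, 20_000):
    lo7, hi7 = vaf_ci(round(0.07 * depth), depth)
    lo5, hi5 = vaf_ci(round(0.05 * depth), depth)
    print(f"depth {depth:>6}x: 7% CI [{lo7:.3f}, {hi7:.3f}], "
          f"5% CI [{lo5:.3f}, {hi5:.3f}]")
# At 1,000x the intervals overlap (the drop may be noise); at 20,000x
# they separate, and the drop looks real.
```

The point of the sketch: the same “7% to 5%” report means different things at different depths, and nothing in a scanned PDF tells the clinician which situation they are in.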
Thanks to Simon Barnett for the discussion that led to this post, and comments on the work.