Teaching X-rays to Speak: How AI Could Predict Fractures Before They Happen

Summary
- Sylvester Comprehensive Cancer Center researchers are using AI and machine learning to spot hidden bone weakness in X-rays before breaks occur.
- Early detection helps prevent emergency surgery and keeps cancer treatment on track.
- Backed by University of Miami’s Frost Institute for Data Science and Computing, the model could expand to osteoporosis and integrate into clinical practice.
Imagine a bridge under stress. To the casual eye, it looks sturdy, but deep within its beams, tiny fissures threaten collapse. Bones in the human body can be much the same, especially when weakened by metastatic cancer. A fracture can turn a patient’s world upside down, halting cancer treatment, causing pain and even shortening survival. Yet predicting which bones will break has long been a guessing game.
A team at Sylvester Comprehensive Cancer Center, part of the University of Miami Miller School of Medicine, is working to change that narrative. With funding from the University of Miami’s Frost Institute for Data Science and Computing (IDSC), researchers are training artificial intelligence (AI) to read X-rays like a seasoned detective, spotting clues invisible to the human eye.
The Challenge: Seeing Beyond the Obvious
Traditional imaging looks for fracture lines, the glaring evidence of a break. But by the time those lines appear, the damage is done.
“We need to catch bones before they fail,” said Brooke Crawford, M.D., M.B.A., division chief of orthopaedic oncology at Sylvester and associate professor of clinical orthopaedics at the Miller School. “Our goal is to identify the silent signals of weakness — patterns that suggest a bone is nearing its breaking point.”

Think of an X-ray as a mosaic. Each tile holds a piece of the story: texture, edges, density. The AI model doesn’t just glance at the picture. It studies every tile, learning how they fit together to reveal hidden vulnerabilities.
How It Works: Teaching Machines to See
The technology relies on a two-part system powered by machine learning:
• DINOv2 vision transformer: A self-supervised learning method that teaches the model to recognize features in medical images without labeled examples. It’s like giving the AI a library of books and letting it learn the language on its own.
• Binary classifier: After the AI slices an X-ray into small patches and converts each into numbers, the classifier maps relationships between patches to understand the big picture — the architecture of bone, subtle textures and faint whispers of stress.
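To make the patch-based idea concrete, here is a minimal, illustrative sketch in Python with NumPy. It is not the team's actual pipeline: in the real system a DINOv2 vision transformer produces rich embeddings for each patch, whereas here raw pixel patches and a toy logistic layer stand in for them, and the function names (`extract_patches`, `fracture_risk`) are invented for this example.

```python
import numpy as np

def extract_patches(image, patch_size=16):
    """Slice a 2-D X-ray array into non-overlapping square patches,
    flattening each patch into a vector (one 'mosaic tile')."""
    h, w = image.shape
    patches = []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            patch = image[y:y + patch_size, x:x + patch_size]
            patches.append(patch.ravel())
    return np.stack(patches)  # shape: (num_patches, patch_size**2)

def fracture_risk(patch_embeddings, weights, bias):
    """Toy binary classifier: pool the per-patch vectors into one
    image-level vector, then apply a logistic layer to produce a
    risk probability between 0 and 1."""
    pooled = patch_embeddings.mean(axis=0)   # mean-pool across patches
    logit = pooled @ weights + bias
    return 1.0 / (1.0 + np.exp(-logit))      # sigmoid -> probability

# Usage on a synthetic 64x64 "X-ray" with random classifier weights
rng = np.random.default_rng(0)
xray = rng.random((64, 64))
patches = extract_patches(xray, patch_size=16)  # 16 patches of 256 values each
risk = fracture_risk(patches, rng.normal(size=256) * 0.1, 0.0)
print(patches.shape, float(risk))
```

The design choice the sketch illustrates is the one described above: the model never looks for a fracture line directly; it aggregates signals from every tile of the image into a single prediction of bone integrity.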
“The model doesn’t hunt for a fracture line,” explained Anastasiya Drandarov, a Sylvester research assistant. “Instead, it learns complex patterns tied to bone integrity. These patterns become a kind of fingerprint, helping us predict whether a fracture is likely.”
The AI model has already learned common visual features such as shapes, edges and textures from millions or even billions of images, said Liang Liang, Ph.D., an associate professor in the Department of Computer Science at the University of Miami and the principal investigator on the technology side.
Why It Matters
For patients with metastatic cancer, timing is everything. A fracture can derail treatment plans, force emergency surgery and diminish quality of life. By predicting risk early, clinicians can intervene, reinforcing bones before they snap, sparing patients unnecessary pain and preserving their ability to continue cancer therapy. Preventing fractures isn’t just about comfort. It’s about survival.
“Every decision we make in oncology is a balancing act,” said Dr. Crawford. “If we can avoid surgery for patients whose bones are stable, we keep their cancer treatment on track. If we identify those at high risk, we act before disaster strikes.”
This research is especially relevant for patients with tumors that metastasize preferentially to bone, including breast, prostate, lung and kidney cancers. For these patients, predicting fracture risk can guide orthopaedic oncology teams in deciding whether to perform prophylactic stabilization surgery before a break occurs. Acting early can prevent complications and allow patients to continue life-saving cancer therapy without interruption.
The IDSC grant accelerates this vision. It funds dedicated research time and provides the computational horsepower needed to train the model. This project is part of a broader effort to combine imaging with biomarkers to create a comprehensive risk profile.
Next steps include training the AI on external X-ray datasets, then fine-tuning it with images from UM’s radiology archives. Eventually, the team plans to expand to multi-institutional datasets, ensuring the model works across diverse populations and imaging systems. The ultimate goal? A tool that integrates seamlessly into clinical practice. Picture a radiologist reviewing an X-ray with an AI assistant that quietly flags bones at risk, like a weather forecast for fractures.
Beyond Cancer: A Wider Horizon
While the immediate focus is metastatic disease, the implications stretch further. The same approach could one day help predict fractures in osteoporosis, a condition affecting millions worldwide.
“This isn’t just about cancer,” noted Drandarov. “It’s about giving clinicians a new lens, a way to see what’s coming and act before it happens.”
In essence, the team is teaching machines to speak the language of bone health—a dialect of shadows and shapes, learned from thousands of images. It’s a conversation that could transform patient care, turning uncertainty into foresight.
As Dr. Crawford put it, “Every fracture we prevent is a victory, not just for science, but for the human lives behind these images.”
Tags: AI, artificial intelligence, Department of Orthopaedics, Dr. Brooke Crawford, fracture, machine learning, Newsroom, orthopaedic oncology, Sylvester Comprehensive Cancer Center, technology