
Apr 21, 2026
And What It Actually Takes to Make It Work
By Tom Sanders, CEO of Aton Health
Clinical insight generated through observational trials has become central to how we think about clinical development. The premise is straightforward: broader populations, richer data, and insights that reflect how care actually happens outside controlled environments. In theory, it should make everything from feasibility to outcomes more grounded and more useful. And yet, for many sponsors, the experience hasn’t quite lived up to the promise.
Patient identification is slower than expected. Study execution varies widely from site to site. Data, while abundant, often lacks the context needed to be truly actionable. It’s easy to assume this is a data problem: that if we could just collect more data, or structure it differently, the gaps would close. But that’s not what’s actually happening.
The issue isn’t the absence of data or even observational work. It’s the difficulty of accessing and activating the right information within the reality of specialty care.
On paper, specialty environments are rich with information. Electronic medical records contain detailed patient histories. Large networks offer scale. Observational work is increasingly common. But in practice, much of this remains just out of reach, not because it isn’t there, but because it isn’t connected to the moments where decisions are made.
Clinical insight breaks down in that space between availability and usability. Traditional approaches haven’t fully solved this, in part because they still treat observational research as something that sits adjacent to care. Even when the research is embedded in name, the workflows often remain separate. Engagement happens site by site. Visibility is delayed. And the burden of connecting everything falls on already stretched clinical teams. The result is friction at every stage: feasibility becomes less certain, recruitment slows, and the insight that does emerge can feel disconnected from the clinical context that gives it meaning.
What’s beginning to work, by contrast, looks different. It starts with being truly embedded, not just contractually but operationally, inside the environments where care is delivered. It aligns with how clinicians already work, rather than asking them to adapt to something new. And it generates clinical insight in real time, through observational work that is part of the care process itself.
In that model, insight isn’t something collected after the fact. It’s something that can be acted on as care is happening. This is less about expanding datasets and more about building practical, usable access to the right patients and information at the right time. And fundamentally, that requires addressing the underlying issue: fragmented clinical pathways.
When patients fall out of those pathways, both care and insight suffer. When those pathways are connected, both improve.
As the demand for clinical insight continues to grow, that distinction becomes more important. The limiting factor won’t be how much data exists. It will be how effectively it can be reached, understood, and used within real clinical workflows. Sponsors who solve for that will move faster, design better studies, and generate insight that reflects reality more closely. Because in the end, clinical insight doesn’t break down for lack of information. It breaks down when information can’t be put to work.