It’s About Time: The Complexity Dynamic


Norman Goldfarb is Editor of the Journal of Clinical Research Best Practices and Chairman of MAGI. His passion is advancing the practice of clinical research by standardizing best practices in incremental steps, so every day is better than the last. He joins Clinical Informatics News with a monthly column highlighting new ideas for advancing clinical research. This month he speaks with David Morin, Director of Research at Holston Medical Group and CEO of TRIKE LLC, developer of SiteOptex Software.

David, what do you think it’s about time for the clinical research enterprise to start doing?

We need to get a better handle on protocol complexity. The problem is that, when the complexity of a protocol increases by, say, 10%, the impact on the research site is not just 10% — it increases exponentially. Until sponsors, CROs, and sites understand this dynamic, we’ll keep getting surprised by high workloads and long timelines.

What is the complexity dynamic?

Look at it this way: First, adding complexity to a protocol increases the workload per enrolled and completed patient. Second, adding complexity also decreases the site’s ability to enroll and retain study participants. In other words, adding complexity means more work to enroll and complete fewer study participants. A simple measure of productivity is output/input. Increasing complexity hurts both the numerator and the denominator of this formula, your classic double whammy.

Can you give us an example?

Sure, let’s focus on the process of recruiting and enrolling study participants. The output here is enrollments; the input is hours of labor (and out-of-pocket costs for advertising). Imagine making two changes to a protocol: First, add an exclusion criterion. Second, add a procedure. These changes will decrease the output numerator (enrolled patients) because it will be harder to find eligible patients who are also willing to endure the added procedure. They will also increase the input denominator (work) because the site will have to run more ads, screen more patients, and spend more time in the consent process to enroll those patients. If output decreases by 10% and input increases by 10%, productivity decreases by 1 − 90%/110% ≈ 18%. And, of course, protocol complexity has increased by a lot more than 10% over the past decade, to say nothing of other sources of complexity, such as the proliferation of service providers in a given study.
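The arithmetic above can be sketched in a few lines. This is an illustrative helper (the function name and signature are my own, not from any tool mentioned in the interview) that computes the relative change in productivity when output and input shift by given fractions:

```python
def productivity_change(output_change, input_change):
    """Relative change in productivity (output/input), given fractional
    changes to output and input, e.g. -0.10 for a 10% decrease.

    Productivity goes from O/I to O*(1+output_change) / (I*(1+input_change)),
    so the relative change is (1+output_change)/(1+input_change) - 1.
    """
    return (1 + output_change) / (1 + input_change) - 1

# 10% fewer enrollments and 10% more work:
delta = productivity_change(-0.10, 0.10)
print(f"{delta:.1%}")  # productivity falls by about 18%, not 20%
```

Note that the two 10% changes do not simply add: the ratio 90%/110% is what drives the roughly 18% drop.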

That helps explain why so many studies miss their timelines. Are there any other effects?

Yes, here are three important ones: First, as it has become more difficult for sites to find and enroll patients, sponsors have reduced the enrollment target for each site. With a smaller enrollment target, sites (and sponsors) have to spread their startup costs over fewer study participants. Second, if the study budget does not reflect the complexity dynamic, sites will lose money on the study. Third, sites may try to make up the loss by overworking the study coordinator, who might then lose motivation and even leave, with all the consequent impacts on productivity.

Measuring study complexity sounds like a complex endeavor in and of itself. Are there any tools for measuring complexity, so that sites and sponsors can understand its full impact?

There are a number of tools out there that score protocol complexity in a fairly straightforward manner. At my site, we have adapted one of the popular methodologies to measure the effect of complexity on productivity, and it yields very interesting data.

Well, David, it sounds like you have identified a huge problem that has been hiding in plain sight for years. Good luck with your new tool!

Norman M. Goldfarb is Editor of the Journal of Clinical Research Best Practices and Chairman of MAGI. Contact him at 1.650.465.0119 or ngoldfarb@firstclinical.com.