This past week we launched a web app for our network providers that lets our physicians instantly pull up and analyze their stem cell results. Sometime in the next one to two months, we’ll add a similar feature to the main website for patients. Both bring complete transparency to the reporting of patient outcomes. Contrast that with some of the fictional outcomes we’re beginning to see reported by other clinics, and it’s clear that I need a blog post on being a good consumer of outcome data.
The outcome of a procedure is whether it worked or not. There are many different ways to measure outcome for painful conditions—from pain scores to validated functional questionnaires to percentage improvement (called a SANE score for Single Assessment Numeric Evaluation). Each has its advantages and disadvantages. For example, a 0–10 pain score can tell you how much pain relief a patient obtained, but it can’t tell you if the patient can now walk long distances. A functional questionnaire does ask about walking, running, climbing, and so on, but it doesn’t tell you how much impact the patient felt the procedure had on his or her life. A SANE score can give you a quick sense of how well the patient felt the procedure worked with a percentage improvement (e.g., I feel 50% better), but it says nothing about pain or function. As a result, in our patient registry of almost 9,000 stem-cell-treated patients, we have always used all three types of metrics.
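To make the distinction between the three metric types concrete, here’s a small sketch using made-up numbers for a single hypothetical patient (the scores and questionnaire items are illustrative, not from our registry):

```python
# Hypothetical example: the same patient's outcome expressed three ways.
pre_pain, post_pain = 7, 3  # 0-10 numeric pain scores, before and after

# 1. Pain-score change: tells you how much pain relief, nothing about function.
pain_relief = pre_pain - post_pain  # 4 points of relief

# 2. Functional questionnaire: asks about specific activities (items invented here).
pre_function = {"walk long distances": 0, "climb stairs": 0, "run": 0}   # 0 = can't, 1 = can
post_function = {"walk long distances": 1, "climb stairs": 1, "run": 0}
activities_regained = sum(post_function.values()) - sum(pre_function.values())  # 2

# 3. SANE score: the patient's single overall estimate of improvement.
sane_percent = 50  # "I feel 50% better"

# Each number captures a different slice of "did it work?", which is why
# a registry benefits from collecting all three.
print(pain_relief, activities_regained, sane_percent)
```

No single metric above tells the whole story, which is the point: the pain score misses function, the functional items miss overall impact, and the SANE score misses both.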
How outcome data is collected also matters. There are many collection methods in use, and some produce data that looks better than it should. On top of that, there are many different questionnaires to choose from.
Common sense suggests that if the physician who performed the procedure is also the one collecting the outcome information, there’s a serious problem. Patients are human and want to make their doctor happy, so they’re likely to report a better result to that doctor than they would to a disinterested third party. Despite this, we still see it happen.
In addition, outcome data needs to be collected only from the patient, as the doctor has a huge incentive to believe that his or her treatment works (a tendency called “confirmation bias”). You’d be surprised to learn that this last important point is mostly lost on the orthopedic-surgery community, which has the distinction of being the only medical specialty that routinely allows physicians a say in the final outcome. As I’ve blogged before, this is likely one reason that orthopedic-surgery outcomes look so good in case-series studies only to crash and burn in placebo-controlled trials. In the former you can game the system and make sure your surgery results look great; in the latter you don’t know who got the surgery, so there’s nothing to game.
Finally, many different outcome questionnaires are in use, and the good ones are “validated.” This simply means that someone has done the research to make sure the questionnaire accurately measures improvement. You’ll also see terms like “MCID,” which stands for minimal clinically important difference: the smallest change in score that a patient would associate with a meaningful result.
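A minimal sketch of how an MCID gets used in practice, assuming a hypothetical questionnaire scored 0–100 with a published MCID of 10 points (both numbers are invented for illustration):

```python
# Hypothetical questionnaire: 0-100 scale, assumed published MCID of 10 points.
MCID = 10

def clinically_meaningful(pre_score: float, post_score: float) -> bool:
    """True if the improvement meets or exceeds the MCID.

    A change smaller than the MCID may be real statistically, but it's
    too small for the patient to notice as a meaningful result.
    """
    return (post_score - pre_score) >= MCID

print(clinically_meaningful(40, 55))  # 15-point gain: meets the MCID
print(clinically_meaningful(40, 45))  # 5-point gain: below the MCID
```

This is why a study reporting a “statistically significant” improvement can still be clinically underwhelming: the average change may never clear the MCID.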
I’d like to show you how to be a good consumer of outcome information by showing you first some problems and then giving you some rules to follow. It’s not hard to navigate these waters as long as you’re armed with information and a good “gut sense.”
Our first example refers back to my blog on Stem Cell Institute of America. Yesterday a colleague showed me a statement about outcomes from their website:
“Regenerative Cell Therapy has amazing results on a wide range of conditions. Most people generally feel significant, if not complete relief from this non-surgical procedure. In a recent study from an amniotic manufacturer, they found that in a group of over 60 participants the average pain scale went from an 8 to a 0 in just 5 weeks for all the participants!”
OK, let’s break down the issues:

- There is no citation for this “recent study.”
- The average pain score going to 0 for a pain treatment is pretty much impossible.
- A 100% success rate (“all the participants”) is fantasy.
- A preprocedure pain score of 8 is out of place in an arthritis study.
- We purportedly only have results from 5 weeks of follow-up.

Conclusions? This report of outcome information is likely fiction or heavily doctored.
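A quick sanity check shows why an average pain score of 0 is such a red flag. On a 0–10 scale no score can be negative, so the only way the mean can be exactly 0 is if every single participant reported 0 (the group size below is taken from the claim; the scores are hypothetical):

```python
# On a 0-10 pain scale, scores can't be negative, so a mean of exactly 0
# forces every single participant to report 0/10. Even one honest patient
# at 1/10 breaks the claim:
scores = [0] * 59 + [1]  # 60 participants, just one reporting any pain at all
mean = sum(scores) / len(scores)
print(mean)  # already greater than 0

# So "the average pain scale went to 0 for all the participants" is not a
# plausible average; it's a claim of a 100% perfect result.
```

Real pain data essentially never looks like this, which is why the quoted statement fails the gut-sense test before you even ask for a citation.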
To follow up on the above fantasy report of outcomes, my second example comes from a conference where I was asked to serve as an expert reviewer of the stem cell talks. One report of knee arthritis outcomes stood out. It was submitted by a third-party fat-stem-cell company and attributed to a physician in Florida who used bone marrow concentrate and fat stem cells. What was odd was an 8+/10 preprocedure average pain score for knee arthritis patients. When I asked our biostatistician to pull the average preprocedure pain score for the more than 1,000 knee arthritis patients we had treated, it was a 5–6. I actually knew the consultant this physician had hired to collect the data, so I hunted that person down. I quickly learned that the physician’s staff had neglected to collect most of the preprocedure data as the consultant had instructed. Hence, given the physician’s claim of 100% data collection, the 8/10 starting point seemed to be “an estimate” provided by him or his staff. Is that kosher? Nope. So I told the conference organizers that the slide had to go.
This wasn’t the first time I had this experience with this physician. Way back in 2009, when he first began using stem cells to treat knees, I found our outcome information on his site. I confronted him: he was using a completely different procedure, so our information didn’t apply to his therapy. He removed our information but replaced it with a general statement saying that his patients reported about 80% improvement. When I asked him where he got that number, he admitted that it wasn’t based on any data he had collected; it was merely his estimate. Arggghhh!
This last one I’ve blogged on before, but it fits well here. Regrettably, this is a study published in a journal, and it illustrates the issues with much of the orthopedic-surgery literature. The study looked at fat stem cells or a fat graft for knee arthritis. A startling 91% of the approximately one thousand treated patients were classified as having more than 50% improvement at one year after the injection. Those results look fantastic, easily beating any other report for any other injectable knee stem cell treatment, until of course you take a peek under the outcome covers. When I did, it turned out that the doctors who ran the study counted their own assessment of outcome as a whopping 60% of the final outcome score! Hence, the results reported were mostly an optimistic, physician-generated guess, just like the examples discussed above.
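To see how much a 60% physician weighting can distort a composite score, here’s a sketch. The 60/40 weighting matches the study as described above; the individual scores are hypothetical:

```python
# Composite outcome score where the physician's assessment carries 60% of the
# weight (per the study described above); the scores below are hypothetical.
def composite_score(physician_score: float, patient_score: float,
                    physician_weight: float = 0.6) -> float:
    """Weighted blend of physician and patient assessments, 0-100 scale."""
    return physician_weight * physician_score + (1 - physician_weight) * patient_score

# Physician rates the result 90/100; the patient rates it only 40/100.
print(composite_score(90, 40))  # 70.0 - reads as a "good" outcome
```

With this weighting, an unimpressed patient can still end up counted as a success, because the optimistic physician assessment swamps the patient’s own report.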
Regenexx is very different in every way, and outcome reporting is no exception. Here’s what we do:
If you want to find out if the data you’re being shown is doctored or real, follow these three simple rules:
The upshot? I hope educating you on outcome data has helped you become a better consumer of healthcare in general and, more specifically, a better consumer of stem cell procedures. We at Regenexx take great pride in the fact that we’ve always tried to do this right, so it’s upsetting to see clinics fabricating or massaging data. Just like with anything you buy, “caveat emptor.”
About the Author
Christopher J. Centeno, M.D. is an international expert and specialist in regenerative medicine and the clinical use of mesenchymal stem cells in orthopedics. He is board certified in physical medicine and rehabilitation as well as in pain management through The American Board of Physical Medicine and Rehabilitation.…