Creating Practical Value in Practice (of Radiology)

There is a lot written these days about the shift from volume-based to value-based care in Radiology (and other medical specialties).

The thing is: volume is real easy to measure. And what gets measured, gets managed.

So, how do we measure value?

One can measure the time it takes to complete the report, sign it, and make it available to physicians and other members of the care team. Radiology practitioners call this Turnaround Time (or TAT). This is pretty easy to do.
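
For those who like to see the metric written down, here is a minimal sketch of a TAT calculation in Python. The timestamps and the choice to measure from exam completion to signed-report availability are assumptions on my part; practices define the start and end points differently.

```python
from datetime import datetime

def turnaround_time_minutes(exam_completed: datetime, report_available: datetime) -> float:
    """TAT in minutes, measured here from exam completion to signed-report availability."""
    return (report_available - exam_completed).total_seconds() / 60.0

# Hypothetical example: exam completed at 09:12, signed report available at 10:47
tat = turnaround_time_minutes(datetime(2014, 1, 6, 9, 12), datetime(2014, 1, 6, 10, 47))
print(f"TAT: {tat:.0f} minutes")  # TAT: 95 minutes
```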

We could try to measure whether the report is correct. In other words, is what the Radiologist concludes actually what is wrong (or not wrong) with the patient? This can be harder to measure, as it may take a lot of work to correlate many different data points, or a lengthy period of time for proof to be found.

There are a couple of activities that Radiologists, and other people working in the department, can do to improve the perceived value of Radiology.

In this article, a number of suggestions are made as to how to increase the visibility of Radiologists, as well as improve relationships and trust among other physicians and even patients.

And this WSJ article focuses on simply improving the clarity of the report by improving the language and writing skills of Radiologists. The value seems obvious when you read it, but how many Radiologists routinely attend training on how to communicate better?

While improving how Radiologists interact with the outside world—whether through better interactions or better writing—will help the Radiologist’s career, one would hope that it would also improve care. Better communication certainly couldn’t hurt.

Article – Who is the better radiologist? Hint, it’s not that easy.

I really enjoyed this article. It gets into the specifics of what we could mean by quality of Radiology reading.

I think it gets to the crux of the problem in any domain where quality is desired: a trade-off is necessary. It may be in cost, or in the experience of the user, but it will be somewhere.

Let’s use a similar evaluation in software development.

Coders who are fast are lauded as innovative and bright and creative, until their barely-tested and unscalable application fails in operations, or a security hole results in a data breach. These folks are often called “hackers” (but generally in a positive way).

The more thorough developer is criticized for taking too long and keeping the application in the lab (instead of the “real world”) for too long. They spend significantly more time on design, testing, and documentation, so to outsiders they appear slow. Their products take more time (missing some early opportunities seized by the hackers), but the applications are much more reliable and supportable in operations. They are professional software developers.

As someone who has managed R&D teams before, I can say you always want both behaviors (and results), but as the article posits, you often cannot have both. You certainly shouldn’t expect to get both.

I often say: Decisions are easy (I decide I want innovation and reliability, and I want it fast), but Choices are hard. I value people who can make choices, and live with them, much more than so-called “decision makers”.

The Gamification of Radiology

Check out this article on gamification and clinicians.

In Radiology practices, obvious applications of gamification include using its inherent social pressure to improve report turnaround/signing times and peer review quota compliance, or even clinician satisfaction with the report.

It could also be used to provide rewards/advantages to technologists who provide superior service to patients and acquire good quality imaging exams.

Participating in continuing education opportunities (say, attending the SIIM Annual Meeting) could also earn “points” toward rewards.

To work, it needs to be based on meaningful activities, include an aspect of social pressure, and provide rewards that matter to the participants.
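
Just to make the mechanics concrete, here is a minimal sketch of how such a points scheme might be scored; the activities, point values, and counts are entirely hypothetical.

```python
# Hypothetical point values for departmental gamification activities.
POINTS = {
    "report_signed_within_tat_target": 5,
    "peer_review_completed": 3,
    "positive_clinician_feedback": 10,
    "ce_event_attended": 25,  # e.g., a session at the SIIM Annual Meeting
}

def score(activity_counts: dict) -> int:
    """Total points for a participant, given counts of each completed activity."""
    return sum(POINTS.get(name, 0) * count for name, count in activity_counts.items())

weekly = {"report_signed_within_tat_target": 40, "peer_review_completed": 8, "positive_clinician_feedback": 2}
print(score(weekly))  # 40*5 + 8*3 + 2*10 = 244
```

A leaderboard built on totals like these is where the social pressure comes in; the rewards still have to matter to the participants.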

Article – Forecasting a New Reality for Radiology — An Investment Banker’s Thoughts on How Imaging Will Evolve

A lot has been written on the consolidation of Radiology practices in the U.S. This article in Radiology Today reiterates the economic and regulatory forces behind this trend, but also includes some points on the emotional aspects felt by those who built Radiology practices and are now faced with selling.

One point not raised in the article is the operational efficiency that can be found in IT consolidation. An effective IT organization using a modern image management platform, backed by skilled staff, can enable Radiologists to focus their efforts on quality of service delivery, and not on IT installation, configuration, upgrades, etc.

JDI Article Published – REST Enabling the Report Template Library

I contributed to an article recently published in the Journal of Digital Imaging. The primary author is Brad Genereaux (@IntegratorBrad). His blog is here.

This article examines the use of a REST API to discover, retrieve, and use structured radiology report templates from an online report repository.
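
As a rough illustration of the pattern (not the actual API described in the paper), here is how a reporting client might discover and retrieve a template over REST; the base URL, endpoints, and query parameters below are all hypothetical.

```python
import requests

BASE_URL = "https://example.org/templates"  # hypothetical repository endpoint

def search_templates(modality: str, body_part: str) -> list:
    """Discover templates matching a modality and body part (parameter names are assumed)."""
    resp = requests.get(BASE_URL, params={"modality": modality, "bodyPart": body_part})
    resp.raise_for_status()
    return resp.json()

def get_template(template_id: str) -> str:
    """Retrieve a single structured report template by its identifier."""
    resp = requests.get(f"{BASE_URL}/{template_id}")
    resp.raise_for_status()
    return resp.text

# Example: find chest CT templates and pull the first match into the reporting client.
results = search_templates("CT", "CHEST")
if results:
    template = get_template(results[0]["id"])
```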

Check it out and let me know what you think.

Reflections on RSNA 2013

I have attended the RSNA show for over a decade, but always as a vendor. My days consisted of many meetings with many customers of varying needs, trying to convince them that the products that our company made were superior to those offered across the aisle by the competitors.

This year was my first year as a consultant. I attended on behalf of clients, meeting with several vendors to discover how their solutions could help my clients meet their business and clinical objectives.

In short, I was on the other side of the fence for the first time. And it was enlightening.

First, as a vendor, you are on your feet for hours, actively listening, talking, and demonstrating software or presenting information at a high energy level. It is exhausting and your body feels it by the end of the show.

As an attendee, representing a recognized and respected healthcare institution, I had a much different experience. Upon arrival at the vendor’s booth, we (I was accompanied by one or more representatives from the hospital) were led to the comfy couches. We were offered water or coffee or a latte. Everyone was attentive and polite. I would be lying if I said that I did not enjoy this more than the grueling schedule of vendor staff (do thank them when you see them—they work very hard at RSNA).

My personal user experience aside, I had some observations on the way vendors manage the interaction with a potential (or existing) customer.

Caveat: I am not a sales person and have not been one in any capacity since working retail just out of high school. I do not profess to be a sales expert, but I have observed some of the best and worst at their craft, so I know some things about the art of selling.

Observation #1: Vendors do not ask enough questions

I thought this was Sales 101. Qualify the lead.

What problem are they trying to solve? Why did they come to the booth? What are they trying to learn/accomplish during the appointment? Where are they in the buying cycle? Do they have budget? Who is involved in making a decision? What solutions are under consideration?

It did vary from vendor to vendor, but I was amazed at how few questions were asked. Most just went right into their pitch, often trying to convince us of something that we already knew or believed.

Observation #2: Vendors fail to understand the roles of the people in the meeting

Vendors need to remember who the actual buyer is in the meeting. In every meeting, we clearly stated our titles and roles. I was always identified as a consultant. My client representatives were in roles that make buying decisions, yet in some meetings the sales person made all their eye contact with, and spoke directly to, me. In one case, they did this so much that I felt awkward for my client, who was practically ignored. Consultants may be decision influencers, but when you have an actual decision maker in the meeting, pitch to them.

Observation #3: Vendors don’t prepare well enough for meetings with existing customers

If you are a vendor that already does business with the customer, be prepared for the meeting. Know the outstanding issues that customer is having. Know which of your company’s products are installed there and what version they are on. Know the basic installation details (e.g. physical deployment) and which user communities are using the product.

If you don’t know these things, asking them in the meeting does not instill confidence in the customer, especially if there are some outstanding issues to be resolved.

And don’t tell the customer that they are the only one having these problems. It only makes them feel worse.

Observation #4: Solutions are stabilizing

I didn’t see anything that really amazed me. As a person involved mostly in product definition and development with a vendor, I was always told (often by sales people) that everyone else had amazing products and that we were so far behind. In my experience, the solutions offered in various categories do vary in their strengths, but none are abjectly poor at what they are intended to do.

The quality of sales professional varied more than the quality/functionality of the products offered, quite frankly.

In seeking solutions, it is not so much about finding the best product, but the product that fits the institution’s needs the best. Which requires that you know what those needs are, of course.

Observation #5: Analytics are evolving; so are monitoring solutions

Lots of vendors are offering some form of analytics package, especially those offering products to optimize workflow (there is a lot of information in those HL7 messages that they can make use of).
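
As a small illustration of the kind of information sitting in those messages, here is a sketch that pulls two timestamps out of a pipe-delimited HL7 ORU result message to compute a rough report turnaround. The segment and field positions are typical for HL7 v2, but treat them as assumptions about any particular feed.

```python
from datetime import datetime

def parse_hl7_dtm(value: str) -> datetime:
    """HL7 date/time values typically start with YYYYMMDDHHMM; keep just to the minute here."""
    return datetime.strptime(value[:12], "%Y%m%d%H%M")

def report_turnaround_minutes(oru_message: str) -> float:
    """Rough TAT from an ORU message: OBR-7 (observation date/time) to
    OBR-22 (report status change date/time). Field usage varies by sending system."""
    for segment in oru_message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "OBR" and len(fields) > 22:
            return (parse_hl7_dtm(fields[22]) - parse_hl7_dtm(fields[7])).total_seconds() / 60.0
    raise ValueError("No usable OBR segment found")
```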

System monitoring is improving, but still has a ways to go. I think customers need to become better educated as to what is possible with a well-designed system monitoring solution, and the benefits (so that they can get the budget approval needed to put it in place).

Article – Radiology Staffing: How to Do More with Less

A lot of people are talking about using analytics to make operational improvements (read as: lowering costs while improving quality of service), but this article describes some specific ways to do this within a Radiology practice.

Examples (from the article)…

  • Use actual procedure data to determine the specialty needed, as well as the number of staff needed in each facility/location. This also helps determine whether full-time or part-time staff are needed.
  • Adapt the daily shift schedule based on hourly exam volume peaks.
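
To make the second bullet concrete, here is a minimal sketch of turning exam completion times into an hourly volume profile that a scheduler could shape shifts around; the timestamps are made up for illustration.

```python
from collections import Counter
from datetime import datetime

def hourly_volume(exam_times: list) -> Counter:
    """Count completed exams per hour of day; the peaks suggest where shifts should overlap."""
    return Counter(t.hour for t in exam_times)

# Hypothetical completion times for a single day
times = [datetime(2014, 1, 6, h, m) for h, m in [(8, 5), (8, 40), (9, 15), (9, 20), (9, 55), (14, 10)]]
for hour, count in sorted(hourly_volume(times).items()):
    print(f"{hour:02d}:00  {count} exams")  # 08:00 2, 09:00 3, 14:00 1
```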

The article also explains how technology is used to improve efficiency…

  • Cloud-based image sharing, integrated with PACS, to distribute reading of exams among distributed Radiologists.
  • Shared worklist across facilities

Article – CPOE use can reduce unneeded CT scans

Not a mind-blowing revelation, but when doctors are told that the information they want already exists, they don’t order more tests (usually).

And while the results of the study summarized in this article reflect only a small decrease in new CT exams being ordered (“physicians canceled orders after receiving the alerts about 6 percent of the time, making for a net cancellation of 1.7 percent of studies. In a control group, physicians canceled only .9 percent of alerts.”), every bit counts.

And it reduces the radiation the patient receives, as well as helping keep the Radiology schedule free for really important exams.

A goal of simply reducing the number of exams performed is misguided. This blog post summarizes a proposed model to help separate necessary from unnecessary exams.