
Electronic Registers and Best Practices to Improve Patient Care in Rheumatic Disease

Thomas R. Collins  |  Issue: September 2017  |  September 17, 2017

MADRID—Determining what is a best practice in rheumatology and then implementing improvements based on what you find can be fraught with complexity, an expert said during the 2017 Annual European Congress on Rheumatology (EULAR). Examples are emerging of benchmarking projects in which electronic registers are used to improve patient care, said William Dixon, MD, chair of digital epidemiology at the University of Manchester, UK.

Dr. Dixon outlined how successful benchmarking—which Xerox pioneered in the 1990s to regain ground lost to competitors in the photocopying business—involves asking the right questions and coming up with good answers to those questions. Benchmarking, as he described it, is a seven-step process (a rough illustration in code follows the list):

  1. Identify what to benchmark;
  2. Determine what to measure to assess that benchmark;
  3. Identify who to benchmark;
  4. Collect the data;
  5. Analyze the data;
  6. Set goals and develop a plan of action; and
  7. Monitor the process.
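
To make the steps concrete, here is a minimal, hypothetical sketch in Python of that seven-step loop applied to a made-up quality metric (the share of encounters with a documented disease-activity score). The metric, practice names and numbers are illustrative assumptions, not data from any real registry.

    from statistics import mean

    # Steps 1-2: identify what to benchmark and what to measure
    # (here, the share of encounters with a documented disease-activity score).
    METRIC = "documented disease-activity score"

    # Steps 3-4: identify who to benchmark and collect the data
    # (toy values: 1 = the encounter met the metric, 0 = it did not).
    practice_data = {
        "Practice A": [1, 1, 0, 1, 1],
        "Practice B": [1, 0, 0, 1, 0],
        "Practice C": [1, 1, 1, 1, 0],
    }

    # Step 5: analyze the data.
    scores = {name: mean(values) for name, values in practice_data.items()}
    benchmark = max(scores.values())

    # Step 6: set goals and develop a plan of action (here, close the gap
    # to the best-performing practice).
    goals = {name: benchmark for name in scores}

    # Step 7: monitor the process by reporting each practice's gap.
    for name, score in sorted(scores.items(), key=lambda item: item[1], reverse=True):
        gap = goals[name] - score
        print(f"{name}: {METRIC} = {score:.0%} (gap to benchmark: {gap:.0%})")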

Whatever is measured should be relevant, important, reliable, unambiguous, feasible to measure and standardize, valid and actionable, Dr. Dixon said. But what to measure may not be as clear-cut as it seems. Example: In an intensive care unit, mortality is an obvious data point to compare from institution to institution, but what if one unit treats older patients with a higher comorbidity burden?
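
The ICU example is essentially a case-mix problem: crude rates can flip once each unit is judged against the outcomes expected for its own patients. The short Python sketch below illustrates the idea with an observed-to-expected (O/E) ratio; the patient counts and expected risks are invented for illustration, not taken from the talk.

    # Two hypothetical units, each with 100 patients. The per-patient expected
    # death risks stand in for a case-mix model that accounts for age and
    # comorbidity; all numbers are invented.
    units = {
        "Unit 1": {"observed_deaths": 12, "expected_risks": [0.05] * 100},  # younger, healthier patients
        "Unit 2": {"observed_deaths": 20, "expected_risks": [0.15] * 100},  # older, more comorbidity
    }

    for name, data in units.items():
        n_patients = len(data["expected_risks"])
        expected_deaths = sum(data["expected_risks"])
        crude_rate = data["observed_deaths"] / n_patients
        oe_ratio = data["observed_deaths"] / expected_deaths
        print(f"{name}: crude mortality {crude_rate:.0%}, O/E ratio {oe_ratio:.2f}")

    # Unit 2 has the higher crude mortality (20% vs. 12%) but the better
    # case-mix-adjusted result (O/E 1.33 vs. 2.40).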

“If we think about rheumatology, what is it that we should be benchmarking that meets all of these criteria?” Dr. Dixon asked.

He said the ACR’s RISE Registry deserves praise for having collected data from about a quarter-million patients and more than 300 doctors in at least 55 practices, according to data presented last year. [Editor’s note: As of June 2017, 719 U.S. providers participate in the RISE Registry, representing more than 1 million patients and roughly 20% of practicing U.S. rheumatologists. RISE has amassed data on approximately 6 million patient encounters.] “[After] you’ve done the plumbing [i.e., set up the EHR software], you don’t have to do any more,” he said. “Those data just flow out to the RISE Registry.”

Other examples: In Portugal, the Rheumatic Diseases Portuguese Register (Reuma-pt) involved creating a core set of data that was considered good clinical practice before the registry was established. There, just the process of measuring these data led to improved care, Dr. Dixon said.

In Denmark, the Danbio registry could be a model to follow. One of its features is that it allows patients to enter their data in real time before a consultation. Dr. Dixon said it’s a “fantastic example of how we can collect structured data for quality improvement.”

He noted that it’s not just a matter of collecting the data in the clinic, but also of organizing them into a repository in a usable format.
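
As one illustration of what a usable format can mean, the sketch below stores each encounter as a structured, machine-readable record rather than free text. The field names are hypothetical and are not drawn from RISE, Reuma-pt or Danbio.

    from dataclasses import dataclass, asdict
    from datetime import date
    from typing import Optional
    import json

    @dataclass
    class Encounter:
        """One structured clinic visit, ready to flow into a central register."""
        patient_id: str
        visit_date: date
        diagnosis_code: str                          # e.g., an ICD-10 code
        das28: Optional[float] = None                # disease-activity score, if recorded
        on_dmard: bool = False
        patient_reported_pain: Optional[int] = None  # 0-10 scale, entered by the patient

    record = Encounter("p-0001", date(2017, 6, 14), "M05.79",
                       das28=3.2, on_dmard=True, patient_reported_pain=4)

    # Serializing to JSON keeps the record usable for pooling and benchmarking.
    print(json.dumps(asdict(record), default=str, indent=2))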

Options for analyzing the data and giving feedback range from static reports to face-to-face meetings to interactive reports and graphics. A lot isn’t known about the best way to give feedback, but it’s fairly well established that prompt feedback is best and that simply telling people what they’ve done is not very effective, he said.

With Danbio, the Danish registry, a public meeting is held for every annual Danbio report that’s issued. “In the first year, in 2005, this public meeting caused substantial anxiety,” Dr. Dixon said. “But since then, everyone has viewed it favorably, and the meeting is always seen as very fruitful, with exchange of experience across the country.”

Building better systems and tools for benchmarking is a worthwhile endeavor. “We need to build and invest in infrastructure that facilitates real-time data extraction,” he said.

It will also be important to keep the patient perspective in mind. “We should also think about feedback to patients and what should be presented there,” he said. “What’s benchmarking from their point of view?”


Improving Care

In another talk on improving care in a systematic way, Tore K. Kvien, MD, professor of rheumatology at the University of Oslo, said that guidelines and recommendations seem to be leading to better care, at least in some areas, but it’s clear that clinicians don’t automatically adjust their care for the better simply because of official recommendations.

“We can know about recommendations … but whether we are really applying them in clinical practice, that is something different—and that is the most important thing,” Dr. Kvien said.

A study on physician compliance with EULAR recommendations on RA management—such as early start on disease-modifying drugs and monitoring of treat-to-target goals—found that doctors tend to say they comply more often than they actually do.1 Example: 98% of physicians said they were committed to the recommendation on early DMARD use, but this treatment was actually done only 67% of the time, according to a review of records. And 83% of physicians said they were committed to monitoring treat-to-target progress, but this approach was actually done only 27% of the time.

Nonetheless, practice patterns have improved in some areas after recommendations were issued, including the timely use of methotrexate (MTX), Dr. Kvien said. Other areas, such as the management of gout, still need much improvement.

At Dr. Kvien’s center, the rheumatology department at Diakonhjemmet Hospital in Norway, a system of evaluating cardiac risk in patients with rheumatic diseases—in a preventive cardio-rheuma clinic—has been put into place in response to recommendations. This clinic has led to reduced risk, he said.

“I think this is a good example of how recommendations have been introduced into clinical practice, and I am quite certain that this is really benefiting patients regarding cardiac comorbidity and, perhaps, mortality,” Dr. Kvien said. “I think management recommendations will contribute to improved quality of care—if implemented into clinical practice.”


Thomas R. Collins is a freelance writer living in South Florida.

Reference

  1. Gvozdenovic E, Allaart CF, van der Heijde D, et al. When rheumatologists report that they agree with a guideline, does this mean that they practise the guideline in clinical practice? Results of the International Recommendation Implementation Study (IRIS). RMD Open. 2016 Apr 28;2(1):e000221. doi: 10.1136/rmdopen-2015-000221. eCollection 2016.
