Keep Providers Focused on Quality Improvement and Implementation
An orthopedic clinic shares its experience so you can see how to improve patient care using evidence-based analysis.
Quality improvement has been the focus of healthcare providers, managers, and administrators for many years. Much of the information gleaned in measuring, tracking, and analyzing quality improvement can benefit an organization, but changes can be difficult to implement with providers and employees.
Define the Challenge
Along with a greater focus on quality care, implementation science is being used to describe the incorporation of evidence into healthcare practice.
What exactly are quality care and implementation science?
Value and cost commonly play a primary role in the definition of quality care. One typical equation is:
Value = (Quality + Service) / Cost, or V = (Q + S) / C
Quality components include safety, efficiency, and effectiveness. Effectiveness relates to value and is typically a function of patient outcomes and satisfaction.
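One reading of the equation above sums quality and service before dividing by cost. As a rough illustration only, here is a minimal sketch of that calculation; the function name, scales, and numbers are invented for this example and are not part of any standard instrument:

```python
def value_score(quality: float, service: float, cost: float) -> float:
    """Compute a value score using V = (Q + S) / C.

    quality and service might come from outcome measures and
    satisfaction surveys; cost could be a normalized episode cost.
    All inputs here are hypothetical, for illustration only.
    """
    if cost <= 0:
        raise ValueError("cost must be positive")
    return (quality + service) / cost

# Hypothetical example: quality 8/10, service 7/10, cost index 3
print(value_score(8, 7, 3))  # 5.0
```

The point of the sketch is simply that value rises with quality and service but falls as cost grows, which is why cost sits in the denominator.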
Implementation science is “the study of methods to promote the integration of research findings and evidence into healthcare policy and practice,” as defined by the Fogarty International Center. Key elements of implementation science include:
- Investigating and resolving biopsychosocial barriers that hinder implementing successful changes;
- Testing innovative methods developed to sustain and advance health programs; and
- Determining the relationship between employed interventions and their effect(s).
Quality Improvement in Action
To examine how quality improvement and implementation science can benefit patient care in the clinic setting, consider the following example.
The outpatient orthopedic physical therapy clinic where I work recently implemented several projects to improve how patient care is provided. With more than 30 providers at the clinic, questions regarding how best to address specific patient disorders and diagnoses are a part of good evidence-based practice. These questions are often best answered by outcomes data (both patient and provider-driven measures). Patient satisfaction surveys and patient-reported functional scores are common measures for such data. In the clinic, we frequently use patient functioning scores, such as:
- Lower Extremity Functional Scale (LEFS)
- Disabilities of the Arm, Shoulder and Hand (DASH)
- Oswestry Disability Index (ODI)
- Neck Disability Index (NDI)
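To give a sense of how such patient-reported measures work, the ODI sums ten sections, each scored 0 to 5, and expresses the total as a percentage of the 50-point maximum, with higher percentages indicating greater disability. A minimal sketch, assuming a fully completed questionnaire (the helper name and example scores are hypothetical):

```python
def oswestry_percent(item_scores):
    """Oswestry Disability Index: 10 sections, each scored 0-5.

    The raw sum (max 50) is expressed as a percentage; higher
    means greater disability. Assumes all 10 sections answered;
    real scoring adjusts the denominator for skipped sections.
    """
    if len(item_scores) != 10 or any(not 0 <= s <= 5 for s in item_scores):
        raise ValueError("expected 10 item scores in the range 0-5")
    return sum(item_scores) / 50 * 100

# Hypothetical patient responses for the 10 sections
print(oswestry_percent([2, 3, 1, 4, 2, 3, 2, 1, 3, 2]))  # 46.0
```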
Our projects are most rewarding, effective, and meaningful when implemented by clinical staff who work day to day with patients. But first, providers and staff must be empowered to recognize current and potential obstacles in everyday care.
Facilitating meaningful projects without adversely affecting billable patient care time — or adding to providers’ already busy schedules — can be challenging. We took this challenge on and have learned a great deal.
We initiated the project during a brainstorming session, while considering the quality goals of our healthcare system. We then asked for ideas related to specific quality projects and discussed those suggestions.
After narrowing the number of projects to six, we asked providers to choose a project to work on. Each resulting group included two to eight providers. Early on, it was clear that some providers were more engaged than others, but this was expected.
Three initial steps helped lay the groundwork for making changes:
- Reviewing established, as well as planned, clinic processes;
- Determining if the work was implemented as intended; and
- Evaluating if the processes addressed patient needs.
Each team’s process was outlined in a specific format referred to as a value summary. The value summary organizes the project and documents the ideas and goals in a standardized format. When this was complete, we began assessing what changes could be employed to best meet the needs of our patients.
We learned that clarifying the goals and purpose of each idea is critical to success. This took several meetings to accomplish. Without this step, there was a risk that everyone on the team would go in a different direction.
A process map, or flow chart, proved helpful in clarifying the current process and outlining the desired, ideal process. As shown in Figure A, a process map visually steps out a process. For us, this helped to expose barriers to the project and patient care.
We also learned that it’s helpful to answer key questions early on, such as:
- What key tools are needed?
- What metrics and baseline data are required to determine success?
- Who has access to the needed data?
- How long do data requests take to fulfill?
- Is the data easily interpreted or will outside input be needed?
- Are there additional staffing needs?
For us, key tools included the aforementioned value summary and process map.
Implement Your Process
Access to data can be challenging. Many entities may be vying for the time and resources of those responsible for data acquisition, and your team’s request could be further down the queue than desired. It’s important to keep your team motivated and engaged during the wait time.
Implementation of the new process is where your team needs to remain vigilant. Pay attention to details, and make sure that providers, support staff, and anyone else who is involved communicates openly.
Learn from Your Data
When your new process is implemented, everyone is communicating, and enough time has passed to acquire new data, you can evaluate the success of the process. This allows you to see the impact of the changes and to celebrate the wins, adjust the process where your expectations were not met, and move on to a new project.
Because quality improvement is a desired part of the clinic culture — not just a project with a start and end point — it’s important to keep the initial teams you formed, and ask the groups to choose a new project or perfect the process they implemented initially.
Here’s an example:
- Question: Which providers are most effective with patient care (i.e., the fewest visits with the greatest improvement in patient-reported functional outcomes)?
- Goal: Determine best practices for clinical care in patients with low back pain and apply those practices more broadly across the clinic to improve care and decrease patient costs.
- Process: Review data regarding patient outcomes and the number of visits for clients with low back pain for all providers.
- Barriers: Assess data accessibility, and whether providers are putting information into the electronic health record consistently.
- Results: Providers who apply evidence-based practices for low back pain have the fewest visits and the greatest improvement in patient-reported functional outcomes (hypothesized).
- Plan: Educate all providers to apply evidence-based practices consistently with patients experiencing low back pain.
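The review step above amounts to a simple per-provider summary: total the visits and the functional-score improvement for each provider's low back pain episodes, then rank providers by improvement per visit. A minimal sketch of that analysis, using invented episode records (provider labels, field layout, and numbers are all hypothetical):

```python
from collections import defaultdict

# Hypothetical records: (provider, visits_in_episode, functional_score_improvement)
episodes = [
    ("A", 6, 20), ("A", 8, 18),
    ("B", 12, 15), ("B", 10, 12),
    ("C", 5, 22), ("C", 7, 25),
]

# Accumulate visits and improvement per provider
totals = defaultdict(lambda: {"visits": 0, "improvement": 0.0})
for provider, visits, improvement in episodes:
    totals[provider]["visits"] += visits
    totals[provider]["improvement"] += improvement

# Improvement per visit: a crude efficiency measure for ranking
efficiency = {p: t["improvement"] / t["visits"] for p, t in totals.items()}

for provider, score in sorted(efficiency.items(), key=lambda kv: -kv[1]):
    print(provider, round(score, 2))
```

A real analysis would need to adjust for case mix and baseline severity before comparing providers; this sketch only shows the mechanical shape of the review.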
This example describes an actual project in our clinic, and we are still awaiting the data to determine whether our hypothesis is true. So far, the data tell us that providers collect the necessary information much more consistently when asked (or told) to do so at every visit. Significant pushback has occurred, with complaints coming from front desk staff, providers, and patients. However, other clinics in our system that attempted to gather the necessary data were unsuccessful because patient-reported outcomes were collected only every third visit, or only at intake and discharge. Educating the front desk staff and providers about these previous failures has been beneficial. Providers have begun to educate patients about the importance of the data and how it can help improve their care.
We are looking forward to continuing the projects that prove fruitful in terms of improving quality, costs, and service. The projects that are not fruitful, where the work effort exceeds the benefits of the potential outcomes, will be re-evaluated and either terminated or redesigned. Several groups are moving on to new projects, and our improved focus on quality improvement and implementation science is gradually embedding itself in our culture. These changes will be most rewarding for those who enjoy working with colleagues in a structured way, measuring what they are doing, and implementing meaningful changes that result in improved patient care.
Fogarty International Center: www.fic.nih.gov/ResearchTopics/Pages/ImplementationScience.aspx
Kim Cohee, DPT, PT, MBA, OCS, is the clinical operations director of the University of Utah Orthopaedic Center Therapy Services. She graduated from the University of Utah with undergraduate and doctorate degrees in Physical Therapy and a Master of Science in Exercise Physiology. Cohee received a Master of Business Administration from Western Governors University in 2009, and achieved Orthopedic Clinical Specialist designation in 2006.