
Using VCSQI Data to Drive Meaningful Change


A Practical Guide for Understanding and Acting on Your Reports

At VCSQI, our strength is not just in collecting data—it’s in how we use it to improve patient outcomes across our collaborative.


Our Quarterly Reports provide valuable insight into performance across programs. However, insight alone does not drive improvement—action does. Meaningful change happens when data is clearly understood, translated into action, and consistently applied within clinical practice.


This guide is designed to support that process.


Whether you are reviewing a report for the first time or working to address a specific performance metric, this resource is intended to guide action—not just interpretation.

The sections below can be used together or individually to help your team understand your data, identify priorities, and implement targeted improvements.

Understanding the Report Structure

While metrics may differ across registries, VCSQI reports are built on a consistent framework.


1. Rolling Time Periods

Most reports compare rolling 4-quarter periods, allowing you to:

  • Assess performance over time

  • Identify trends (not just isolated results)

  • Evaluate the impact of changes

👉 Focus on direction, not a single data point.
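To see why rolling periods reveal direction rather than noise, the arithmetic can be sketched directly: pool numerators and denominators across the four most recent quarters, then slide the window forward. This is a minimal illustration with hypothetical quarterly counts, not the actual VCSQI report methodology:

```python
# Minimal sketch: rolling 4-quarter event rates from quarterly counts.
# The quarterly numbers below are hypothetical, for illustration only.
events = [4, 6, 3, 5, 7, 4]            # e.g., complications per quarter
cases  = [110, 120, 95, 105, 130, 115]  # cases per quarter

def rolling_rates(events, cases, window=4):
    """Pooled event rate over each trailing `window` of quarters."""
    rates = []
    for i in range(window - 1, len(events)):
        e = sum(events[i - window + 1 : i + 1])
        c = sum(cases[i - window + 1 : i + 1])
        rates.append(round(e / c, 4))
    return rates

# Successive rolling rates show the trend; any single quarter can mislead.
print(rolling_rates(events, cases))
```

Comparing consecutive rolling values (rising, falling, or flat) is what "focus on direction" means in practice.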


2. Volume-Based Grouping

Programs are grouped by volume (e.g., low, medium, high) to ensure fair comparison.

👉 Always interpret your performance within your volume group first before comparing broadly.


3. Performance Ordering

Hospitals are typically sorted by current performance, allowing you to see:

  • Where your program falls within the range

  • The spread of performance across the collaborative

👉 This is where variation becomes visible—and where opportunity exists.


4. Statistical Indicators

  • (+) = Statistically better than peers

  • (*) = Statistically worse than peers

👉 These are not decorations—they are signals for action.


5. Benchmark Comparisons

Metrics are often presented against:

  • VCSQI collaborative averages

  • National benchmarks (e.g., STS, ACC)

👉 Benchmarks provide context—but they are not the end goal.

Key Terms to Know

  • Observed-to-Expected (O/E) Ratio: Compares actual outcomes to what was expected based on patient risk, using STS or ACC predicted risk algorithms.

    • < 1.0 = Better than expected

    • > 1.0 = Worse than expected

  • Risk Adjustment: Accounts for differences in patient complexity so programs can be compared fairly.
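The O/E ratio is simple arithmetic once the risk model has done its work: observed events divided by the sum of each patient's predicted probability of the event. A minimal sketch, with hypothetical predicted risks standing in for STS or ACC model output:

```python
# Minimal sketch: observed-to-expected (O/E) ratio.
# The per-patient predicted probabilities would come from STS or ACC
# risk models; the values here are hypothetical, for illustration only.
predicted_risks = [0.02, 0.05, 0.10, 0.03, 0.08]
observed_events = 2  # events actually observed in this group

expected_events = sum(predicted_risks)       # expected count for the group
oe_ratio = observed_events / expected_events

# A ratio above 1.0 means worse than expected after risk adjustment;
# below 1.0 means better than expected.
print(round(oe_ratio, 2))
```

Because the denominator reflects each patient's risk, a program treating sicker patients is not penalized for a higher raw event count.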

How to Get Started

When you first open your report, focus your efforts on a few concrete steps:

  1. Identify 1–2 priority metrics that are:

    • O/E > 1.0

    • Marked with (*)

    • Trending worse over time

  2. Determine:

    • Is this a new issue or an ongoing trend?

  3. Pull 5–10 recent cases tied to that metric

  4. Schedule a focused review with key stakeholders

👉 This step turns your report from information into action.

How to Approach Your Data

1. Focus on Trends, Not Snapshots

Ask:

  • Are we improving?

  • Are we declining?

  • Are we consistently below expected?

👉 One quarter doesn’t define performance—patterns do.


2. Identify Variation

Variation across programs is one of the most powerful drivers of improvement.

Ask:

  • Where do we fall (top, middle, lower range)?

  • How far are we from the collaborative average?

  • How wide is the gap between best and worst performers?


Interpretation:

  • Top range → Sustain and share

  • Middle range → Refine

  • Lower range → Immediate review

👉 The goal is not to identify who is best—it’s to understand what is possible.


3. Understand What the Metric Represents

Every metric reflects a process, not just an outcome.


Examples:

  • Complications → care delivery + response systems

  • Length of stay → care coordination + discharge processes

  • Procedural metrics → technique + standardization

👉 If you don’t understand the process behind the metric, you can’t improve it.

Turning Data into Action

Data Alone Doesn’t Drive Improvement

This is the most important principle.


Reviewing reports is not the goal. Changing outcomes is.


Use your data to:

  • Identify performance gaps

  • Prioritize areas of focus

  • Implement targeted changes

  • Monitor impact over time



1. Break Metrics Across the Care Continuum

Every outcome is influenced across phases of care:

  • Pre-Procedure: Risk identification, patient optimization

  • Intra-Procedure: Technique, decision-making, resource use

  • Post-Procedure: Monitoring, recovery, complication management

👉 Most problems are not isolated—they span multiple phases.


2. Prioritize What Matters Most

Focus first on:

  • Metrics worse than expected

  • Statistically worse indicators (*)

  • Areas trending in the wrong direction


Then consider:

  • High-impact outcomes (mortality, major complications)

  • Operational drivers (LOS, efficiency, utilization)

👉 You cannot fix everything at once—prioritization is critical.


Assign Ownership

For each priority metric:

  • Assign a clinical lead

  • Assign a data/quality lead


Define:

  • What will change

  • By when

  • How success will be measured

👉 If no one owns the metric, improvement will not happen.


Connecting Data to Clinical Practice

Every metric reflects:

  • A process

  • A behavior

  • A decision point


Ask:

  • What are we doing differently in these cases?

  • Where is variation in practice?

  • What should be standardized?

👉 Improvement comes from changing behavior, not just reviewing numbers.


3. Conduct Targeted Case Reviews

For priority metrics:

  • Review 5–10 recent cases

  • Identify:

    • Patterns

    • Process gaps

    • Variation in care

👉 This is where data becomes real. If you skip this step, improvement will stall.


4. Focus on Internal Processes

Avoid relying solely on comparison.

Instead ask:

  • What in our process is driving this result?

  • Where are we inconsistent?

  • What can we standardize?

👉 Sustainable improvement comes from process reliability, not benchmarking alone.


5. Leverage the Collaborative

VCSQI exists to accelerate improvement.


We support members by:

  • Highlighting high-performing practices

  • Sharing protocols and strategies

  • Facilitating peer discussion

  • Providing implementation tools

👉 You are not expected to solve this alone.

Measure the Impact of Change

After implementing changes:

  • Re-evaluate performance in the next reporting cycle


Ask:

  • Did the metric improve?

  • Did variation decrease?

  • Are changes being consistently applied?

👉 If you don’t measure impact, you are guessing.

Establish a Consistent Review Process

Without structure, data review becomes passive.


Recommended Monthly Data Review


Participants:

  • Physicians

  • Nursing leadership

  • Procedural team (e.g., cath lab, OR, perfusion)

  • Quality/data managers


Agenda:

  1. Review key metrics

  2. Identify one priority area

  3. Discuss contributing factors

  4. Define action steps

  5. Assign accountability

👉 Consistency in review leads to consistency in outcomes.

What High-Performing Programs Do Consistently

  • Review data monthly without exception

  • Focus on 1–2 priorities at a time

  • Conduct routine case reviews

  • Standardize key processes

  • Share learnings across teams

👉 High performance is driven by consistency, not one-time efforts.

Common Pitfalls to Avoid

  • Treating reports as a scorecard instead of a tool

  • Focusing only on benchmarks

  • Attributing results solely to patient complexity

  • Reviewing data without acting on it

  • Assuming processes are consistently followed

👉 If nothing changes operationally, nothing changes in outcomes.

Defining Success

Successful programs will:

  • Show sustained improvement over time

  • Reduce variation within their program

  • Align data insights with daily clinical workflows

  • Build a culture of continuous review and action


VCSQI reports—whether STS, CathPCI, or otherwise—are not just reports.

They are tools for understanding performance, identifying opportunity, and driving change.


The goal is not to simply review your data.

The goal is to use it to consistently deliver better care for every patient.

Using the Worksheet to Drive Action

VCSQI reports provide valuable insight into performance, but insight alone does not improve outcomes—action does.


This worksheet was developed to support that process. It guides teams through identifying a priority metric, understanding what is driving performance, implementing targeted changes, and tracking progress over time.


The sections that follow are designed to be completed as a team and used consistently. When applied in this way, the worksheet supports focused discussion, clear accountability, and measurable improvement in patient outcomes.

Consistent improvement comes from how data is used—not just how it is reviewed.


Use this worksheet regularly—focus on one priority at a time and track progress across reporting cycles.


VCSQI offers monthly check-ins to support this work—providing an opportunity to review progress, discuss challenges, and align on next steps.


When used together, this approach supports focused action, clear accountability, and measurable improvement in patient outcomes.


For questions or support, please contact us at info@vcsqi.org.

