The C-CDA Has Come a Long Way

Steven Posnack | June 8, 2017

The Consolidated Clinical Document Architecture (C-CDA) standard (version 1.1, C-CDA 1.1) was first adopted in 2012 as part of the Office of the National Coordinator for Health Information Technology’s (ONC’s) 2014 Edition final rule. It took nearly three years after that rulemaking for certified health information technology (health IT) with C-CDA 1.1 capabilities to be widely deployed among health care providers. Today’s experience with C-CDA-based interoperability reflects this now five-year-old version and, in some respects, the software implementation and usability decisions that were made at that time.

We believe this experience is about to change. Here’s why.

Since 2015, ONC has made significant investments and implemented a three-part strategy designed to:

  • Improve the C-CDA’s implementation consistency;
  • Enhance the C-CDA’s computability and usability; and
  • Provide more transparency and feedback to industry about the quality of C-CDAs generated in production.

To tackle several implementation issues that originate from the standard itself, ONC established a cooperative agreement with Health Level Seven International (HL7). An early technical critique of C-CDA 1.1 was that it needed more implementation testing, guidance, and best-practice examples to be implemented with a high degree of consistency. Our work with HL7 addressed this head-on by focusing on projects that would yield high-value implementation guidance informed by hands-on implementation experience.

What happens at an Implementation-a-thon doesn’t stay at one

Since the start of our cooperative agreement, HL7 has conducted four C-CDA Implementation-a-thons, with more planned. At these events, engineering leads from different health IT developers exchange C-CDAs among their products in real-time working sessions. They collaborate on specific implementation nuances to work out “bugs” and document implementation improvements and potential best practices.

The C-CDA Implementation-a-thons have served as tangible input to two other resources. The first is an expanded C-CDA Example Search Tool, which gives implementers code snippets and concrete examples of the right way to implement specific C-CDA sections and entries. The second is the C-CDA R2.1 Companion Guide, which provides additional technical guidance, including clinically valid examples, on how to implement C-CDA 2.1 consistent with ONC’s 2015 Edition certification criteria.
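
To give a flavor of what those section-and-entry examples enable, here is a minimal sketch in Python (standard library only) that locates a C-CDA section by the LOINC code that identifies it. The file name and the choice of the allergies section (LOINC 48765-2) are illustrative assumptions, not content from the Example Search Tool:

    # Minimal sketch: find a C-CDA section by its LOINC code.
    # The file name and section code below are illustrative assumptions.
    import xml.etree.ElementTree as ET

    NS = {"hl7": "urn:hl7-org:v3"}  # C-CDA uses the HL7 v3 XML namespace

    def find_section(path, loinc_code):
        """Return the first <section> whose <code> matches loinc_code, or None."""
        root = ET.parse(path).getroot()
        for section in root.iter("{urn:hl7-org:v3}section"):
            code = section.find("hl7:code", NS)
            if code is not None and code.get("code") == loinc_code:
                return section
        return None

    section = find_section("sample_ccda.xml", "48765-2")  # allergies section
    if section is not None:
        title = section.find("hl7:title", NS)
        print(title.text if title is not None else "(untitled section)")

Every C-CDA section is addressed this way, which is what makes the tool’s per-section examples directly actionable for implementers.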

In parallel, ONC’s 2015 Edition certification criteria reinforce this work. Two criteria in particular are aimed squarely at the C-CDA’s implementation consistency and usability.

  • The “C-CDA Creation Performance” certification criterion requires extensive testing that was not part of the 2014 Edition. Additionally, it is a conditional criterion, which means that any health IT software presented for certification to a criterion that references the C-CDA must also pass these additional testing requirements. In short, testing requires comparisons to a “gold standard” reference file as well as verification that required data is structured the right way and placed in the right location (a sketch of this flavor of check appears after this list).
  • The “Transitions of Care” certification criterion requires health IT to include certain capabilities aimed at improving health care providers’ experience with the C-CDA upon receipt. Specifically, health IT certified to this criterion must enable a user to: display only the data within a particular C-CDA section; set a preference for the display order of specific sections; and set the initial quantity of sections to be displayed. Health IT developers have free rein to innovate and design the best way to meet these technical outcomes. To show what could be possible, ONC and HL7 administered a C-CDA Rendering Challenge in the summer of 2016, which required participants to develop an open source tool that could meet these requirements. I encourage you to check out the two-minute video demo produced by the winner.
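
As a companion to the “gold standard” comparison mentioned above, here is a minimal sketch that diffs the set of section codes between a generated C-CDA and a reference file. It is illustrative only: the actual 2015 Edition test procedure checks far more (template identifiers, vocabulary bindings, and data placement), and the file names are assumptions:

    # Minimal sketch of a gold-standard comparison: report sections that
    # appear in the reference file but not in the generated document.
    # Illustrative only; real certification testing checks much more.
    import xml.etree.ElementTree as ET

    NS = {"hl7": "urn:hl7-org:v3"}

    def section_codes(path):
        """Collect the LOINC code of every section in the document."""
        root = ET.parse(path).getroot()
        codes = set()
        for section in root.iter("{urn:hl7-org:v3}section"):
            code = section.find("hl7:code", NS)
            if code is not None and code.get("code"):
                codes.add(code.get("code"))
        return codes

    gold = section_codes("gold_standard.xml")       # reference file (assumed name)
    produced = section_codes("generated_ccda.xml")  # system under test (assumed name)
    for missing in sorted(gold - produced):
        print(f"Section in reference but missing from output: LOINC {missing}")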

We often hear from health care providers that C-CDAs are too long and run to too many pages. This, however, is not an issue with the standard itself; it indicates an issue with how data is rendered upon receipt (i.e., what data, how much, and in what manner it is displayed), among other dynamics. In response, a group within HL7 dubbed the “Relevant and Pertinent team” worked to right-size the document, since the C-CDA is primarily meant for computers to communicate with each other (see the final guide and wiki page). So if your health IT renders a C-CDA not like the challenge winner’s tool (or better), but as a mass of text, as tens of pages, or in a table format that requires endless scrolling, you should probably ask why.
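
To make those display capabilities concrete, here is a minimal sketch of user-configurable section ordering with an initial display count. It is emphatically not the challenge winner’s tool; the file name, preferred section codes, and count are assumptions:

    # Minimal sketch of two Transitions of Care display capabilities:
    # a user-preferred section order and an initial quantity of sections.
    import xml.etree.ElementTree as ET

    NS = {"hl7": "urn:hl7-org:v3"}

    def ordered_sections(path, preferred_order, initial_count):
        """Return up to initial_count (code, title) pairs, preferred codes first."""
        root = ET.parse(path).getroot()
        sections = []
        for section in root.iter("{urn:hl7-org:v3}section"):
            code = section.find("hl7:code", NS)
            title = section.find("hl7:title", NS)
            sections.append((
                code.get("code") if code is not None else "",
                title.text if title is not None else "(untitled)",
            ))
        rank = {c: i for i, c in enumerate(preferred_order)}
        sections.sort(key=lambda s: rank.get(s[0], len(rank)))  # stable sort
        return sections[:initial_count]

    # Show the problem list and medications first, and three sections initially.
    for code, title in ordered_sections("my_ccda.xml", ["11450-4", "10160-0"], 3):
        print(code, title)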

To support real-world testing across development, implementation, and post-deployment assessments, ONC released the C-CDA Scorecard last year for developers to use when readying their systems for deployment. Now at version 1, the C-CDA Scorecard has been updated to help both health care providers and health IT implementers test the C-CDAs produced by their systems. In addition to a numerical score and a letter grade, the C-CDA Scorecard provides a user-friendly, categorized report that pinpoints areas for improvement.

It’s time to schedule your C-CDA check-up

Today, ONC is announcing that a new benchmarking tool is available specifically for health care providers to test the quality of the C-CDAs created by their certified health IT as implemented in production. The new benchmarking tool, called the “One Click Scorecard,” tests both Direct transport and C-CDA conformance. It is the health IT equivalent of an internet speed test, designed with health care providers in mind to give them visibility into the quality of the C-CDAs their health IT generates. To use it, a health care provider (with a DirectTrust-accredited HISP) needs to do just one thing: send a C-CDA (via Direct) to scorecard@direct.hhs.gov. In a few moments, the benchmarking tool will test the C-CDA and respond to the sender’s Direct account with a PDF “report card” that includes scoring information about the C-CDA.
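
Most providers will simply send that message from their Direct-enabled inbox. For the curious, here is a hypothetical sketch of the same step scripted against a HISP that exposes an authenticated SMTP gateway; the host, credentials, addresses, and file name are all placeholders, and in this setup the HISP, not the script, handles Direct security (such as S/MIME):

    # Hypothetical sketch only: send a C-CDA to the One Click Scorecard
    # through a HISP's SMTP gateway. All names below are placeholders.
    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "provider@direct.example-clinic.org"  # your Direct address
    msg["To"] = "scorecard@direct.hhs.gov"              # One Click Scorecard
    msg["Subject"] = "C-CDA scorecard request"

    with open("my_ccda.xml", "rb") as f:                # the C-CDA to score
        msg.add_attachment(f.read(), maintype="application",
                           subtype="xml", filename="my_ccda.xml")

    with smtplib.SMTP("smtp.example-hisp.net", 587) as smtp:  # placeholder host
        smtp.starttls()                                 # HISP-provided TLS
        smtp.login("provider@direct.example-clinic.org", "app-password")
        smtp.send_message(msg)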

Notably, health care providers and their business associates (if authorized by their business associate agreements) may disclose protected health information (PHI) to ONC while self-testing ONC-certified health IT via the “One Click Scorecard” benchmarking tool without patient authorization and without a business associate agreement with ONC. Once a C-CDA is tested and scored, the entire C-CDA, along with any PHI, is immediately erased and is not retained. ONC will retain only the numerical scores generated by the tool during testing, for benchmarking and industry-reporting purposes. Read the full notice regarding the disclosure of PHI.