C-CDA® Rendering Tool Challenge
HL7 and the Office of the National Coordinator for Health Information Technology (ONC) are holding a challenge to encourage the development of HL7 tools. Will your team be the one to develop the solution and take home the prize?
Want to learn more? View our recorded webinar and the accompanying slide presentation.
Challenge Participants can view the recorded Question and Answer session held on May 4, 2016.

Prize
1st place $15,000
2nd place $5,000

Submission Dates
January 12, 2016 – May 31, 2016

Winner Announced
September 2016 - during HL7's 30th Annual Plenary Meeting in Baltimore, MD
Challenge update:
Winners will be recognized at HL7's 30th Annual Plenary & Working Group Meeting in Baltimore, MD this September
First place winner:
The Backbeach Software C-CDA Viewer, developed by: Bryn Lewis, PhD, Principal Software Development Consultant, Backbeach Software
https://github.com/brynlewis/C-CDA_Viewer
Try out the tool yourself! Click on any of the Sample CDA documents appended below the tool at:
http://backbeachsoftware.com.au/?cmd=getsql&sid=&id_site=1&id_tag=47
Second place winner:
Patient Insight, developed by: Will Tesch, CEO, HealthLX inc.
https://github.com/healthlx/HL7Challenge.git
HL7's Consolidated Clinical Document Architecture (C-CDA®) standard is an XML-based document markup standard that specifies the structure and semantics of clinical documents such as Discharge Summaries and Imaging Reports that are exchanged between healthcare providers and patients. One of the six characteristics of a C-CDA document is human readability. The process by which these documents are made human readable is called rendering. One of the requirements of C-CDA is that all relevant clinical content of a C-CDA document shall be present (and thus renderable) in human readable form.
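To make the rendering target concrete, here is a minimal, illustrative sketch of pulling the attested narrative out of a C-CDA document in the browser. It assumes a well-formed document in the standard urn:hl7-org:v3 namespace; the function name and return shape are hypothetical rather than anything prescribed by the challenge.

```typescript
// Minimal sketch only (browser DOM APIs; assumes a well-formed C-CDA string).
// Element names (ClinicalDocument, title, section, text) come from the CDA schema;
// the function name and return shape are illustrative.
const CDA_NS = "urn:hl7-org:v3";

interface RenderedSection {
  label: string;     // Section.title, or a placeholder when the section is unlabeled
  narrative: string; // text content of Section.text (the attested narrative block)
}

function extractNarrative(xml: string): { title: string; sections: RenderedSection[] } {
  const doc = new DOMParser().parseFromString(xml, "application/xml");
  const root = doc.documentElement; // <ClinicalDocument>

  // The document title, if present, is a direct child of the root element.
  const title = Array.from(root.children)
    .find(el => el.localName === "title")?.textContent ?? "";

  const sections = Array.from(doc.getElementsByTagNameNS(CDA_NS, "section")).map(sec => ({
    label: sec.getElementsByTagNameNS(CDA_NS, "title")[0]?.textContent ?? "(unlabeled section)",
    // A real viewer would transform the narrative markup (tables, lists, <content> IDs)
    // rather than flattening it to plain text as this sketch does.
    narrative: sec.getElementsByTagNameNS(CDA_NS, "text")[0]?.textContent ?? ""
  }));

  return { title, sections };
}
```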
GUIDELINES
Entries must adhere to the current principles and requirements for rendering a CDA document (as found in the Clinical Document Architecture standard):
- There must be a deterministic way for a recipient of an arbitrary C-CDA document to render the attested content.
- If the C-CDA document has a title it must be rendered.
- If the C-CDA Body is structured, the label of a section, as conveyed in the Section.title component, must be rendered. The absence of the Section.title component signifies an unlabeled section.
- When structured content is derived from narrative, there must be a mechanism to describe the process (e.g. by author, by human coder, by natural language processing algorithm, by specific software) by which machine-processable portions were derived from a block of narrative.
- When narrative is derived from structured content, there must be a mechanism to identify the process by which narrative was generated from structured data.
NOTE: These principles and requirements have led to the current approach, in which the material to be rendered is placed into Section.content, and it is frequently unclear where the narrative can consistently be found.
Additionally:
- Submissions must be open source
- Submissions must meet the minimum certification requirements for viewing C-CDA documents out of a health IT application module, as stated in the 2015 Certification Criteria (see page 34 at https://www.gpo.gov/fdsys/pkg/FR-2015-10-16/pdf/2015-25597.pdf); a minimal sketch of these three behaviors follows this list:
  - Directly display only the data within a particular section
  - Set a preference for the display order of specific sections
  - Set the initial quantity of sections to be displayed
- The winning entries will reside on GitHub and be made freely available on the HL7 website for use
- The overall winner will receive a complimentary registration and be encouraged to attend the HL7 30th Annual Plenary meeting (September 18-23, 2016 in Baltimore, Maryland) where they can demonstrate their tool for attendees.
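As a rough illustration of the three certification behaviors cited in the list above, a viewer might expose a preference model along the following lines. The interface and function names are hypothetical and are not taken from the Final Rule or the challenge rules; RenderedSection refers to the type from the earlier sketch.

```typescript
// Hypothetical preference model for the three certification viewing behaviors above.
// Names and shapes are illustrative only.
interface ViewPreferences {
  visibleSections?: string[];   // directly display only the data within particular sections
  sectionOrder?: string[];      // preferred display order of specific sections
  initialSectionCount?: number; // initial quantity of sections to be displayed
}

function applyPreferences(sections: RenderedSection[], prefs: ViewPreferences): RenderedSection[] {
  // 1. Filter to the requested sections, if any were specified.
  let result = prefs.visibleSections
    ? sections.filter(s => prefs.visibleSections!.includes(s.label))
    : sections.slice();

  // 2. Apply the preferred display order; sections not listed sort last.
  if (prefs.sectionOrder) {
    const rank = (label: string) => {
      const i = prefs.sectionOrder!.indexOf(label);
      return i === -1 ? Number.MAX_SAFE_INTEGER : i;
    };
    result = [...result].sort((a, b) => rank(a.label) - rank(b.label));
  }

  // 3. Limit the initial quantity of sections shown.
  if (prefs.initialSectionCount !== undefined) {
    result = result.slice(0, prefs.initialSectionCount);
  }
  return result;
}
```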
Helpful resources:
- HL7's Clinical Document Architecture (CDA) standard
- HL7's Consolidated CDA Implementation Guide
- MU3 Final Rule
- 2015 Certification Criteria - Final Rule at https://www.gpo.gov/fdsys/pkg/FR-2015-10-16/pdf/2015-25597.pdf
Questions and inquiries can be directed to David Hamill, Director, HL7 Project Management Office at pmo@HL7.org.
How to Enter
- Enroll in and attend a brief webinar to learn more about the C-CDA Rendering Tool Challenge, hear the announcement of the judges, and ask questions of the Challenge team. The webinar will be held on Thursday, February 4 at 1:00 pm ET.
- Complete the sign up and activation process for an account on the HL7 GForge site (if you don't already have one). HL7's GForge administrator will create a private project for you on the HL7 GForge site, and will send instructions to you on how to upload files to the site.
- Declare your intent to participate by completing the Entering the Tooling Challenge form by April 30, 2016
- Upload your completed submission and supporting material to your GForge project by May 31, 2016.
Please submit the following items within your GForge project site:
- The name of your tool (submitters should also label the tool itself with this name)
- The URL to your GitHub repository (where judges can run test data against the tool)
- The URL to your YouTube video demo of the tool (please limit the duration of the video to five minutes or less)
- A slide presentation PDF containing technical specifications and other information about the tool (7 slide maximum)
Submission requirements/checklist:
- Be sure to test your tool with the following: C-CDA Sample Test Data and Additional C-CDA Test Data.
- All submissions must be open source and are not returnable
- You are responsible for any and all costs associated with tool development, YouTube video production, and submission of the tool
- One entry per person/team. In the event of a dispute over the identity of an entrant or ownership of the submission, the submission will be deemed to have been submitted by the authorized account holder of the email address provided during the registration process
Frequently Asked Questions (FAQs)
Additional questions and answers beyond the ones listed below can be viewed via the recorded Question and Answer session held on May 4, 2016.
Does the tool need to accommodate data entry of patient data?
No. The tool only needs to be able to read and render patient data.
Will HL7 provide Challenge participants sample test data to assist in development of the tool?
Yes. HL7 assembled C-CDA Sample Test Data and Additional C-CDA Test Data to assist you in development of your tool. During the judging period, HL7 will test the entries with a different set of robust test data for evaluation purposes.
What documents should the tool focus on rendering?
At a minimum, the tool should be able to render the following, as these are the documents we'll be testing and judging against:
- CCD: R1.1 and R2.1
- Discharge Summary: R1.1 and R2.1
- Referral Note R2.1
- Care Plan R2.1
Is it expected that the challenge tool will be able to render any other CDA documents (i.e. a complete implementation of the standard)?
No, testing and judging will only be performed against the document types/releases listed above.
How will conflicts of interest of the judges be handled?
We will rely on judges to identify whether they have a conflict of interest and ask that they recuse themselves from judging that particular entry while still participating in judging the other entries.
Must the tool be built on open source technology?
The tool does not need to be built on open source technology, or even licensed technology, but there should be no fees, licensing, etc. associated with redistributions or use of the application. Basically, the distribution must be free to vendors, implementers, users, etc.
Is it required to show integration of the tool with other EHR content? In other words, will the tool ONLY be judged on its functionality, etc. and use of solely the content in a particular CDA?
The demo of the tool is not expected to be integrated, but the judging will assess how readily the tool could be integrated. Participants don't have to show that the tool is integrated, but they are encouraged to provide some technical background on how it could be integrated with other systems.
While the tool must be presented as open source, can the intellectual property be protected by means of an open source license (that, for example, prohibits commercial use)?
Tools should not incorporate any license or software that will disallow the use of the solution by any user without an additional commercial license (& fee). Apache 2.0 has been used in other challenge tool applications conducted by HL7 and ONC, and it allows reuse of the tool and placement of the tool in commercial applications.
Can you comment on the plans for what will happen to/with the submitted viewers after the competition is completed?
The winners' tools will be displayed by HL7; for the remaining submissions, it's entirely up to the submitters.
Is there any requirement that it has to be a web or desktop application or it is up to the team developing it?
It's up to the development team.
Must the tool run on any specific operating systems?
No; the tool just needs to be viewable. However, the more systems the tool can run on, the better.
Can the tool be a browser based solution, e.g. similar to the current XSL renderer?
Yes.
Can we use our own test data to show the full capability of our rendering tool?
Yes.
Hospitals and clinical offices have very tight restrictions on access to websites and installing software. How will that be addressed?
By requesting that submissions be open source, we're hoping to alleviate those types of issues.
Will test data be provided so every tool renders the same set of data or should participants provide their own test data?
Yes, test data is provided; see the C-CDA Sample Test Data and Additional C-CDA Test Data referenced above. Participants are welcome to use their own test data as well.
Is this challenge only for a viewer or also an application with a database?
Only the viewer is relevant for the challenge, however this doesn't mean a tool can't be its own application with a database.
Can you give examples as to what innovative tools you expect as an outcome?
See healthdesignchallenge.com.
Can the tool be a desktop application that users can download?
Yes.
Will ease of integration of this tool into existing HIT systems -- e.g., by a simple API -- factor into the judging?
Yes.
Who are the target users of the tool? Clinicians only? Patients?
The target users are clinicians. However, feel free to expand your tool and make it usable for patients too.
Is the competition limited to US participants?
The contest is open to everyone, worldwide.
It seems like the Challenge is for the 'best' style sheet, is that correct?
A simple stylesheet may be the best solution. Our goal is something that is usable for a 3-, 30-, or 300-page C-CDA.
Is there a requirement to render from the narrative blocks or the structured entries?
Tools should render the header plus section/text (aka narrative).
Is there an assumption that the inputs for rendering are well-formed CDA-compliant artifacts?
Yes.
Is it okay to use languages besides XSLT? One issue with XSLT-based translation is occasional support issues with browser security settings. Would a JavaScript-based view generator be acceptable if it can be run in-browser or server-side in Node.js?
The final product should be readily deployable in existing web browsers. Our goal is a display that existing or new vendors can integrate into their system.
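As a hedged sketch of what "readily deployable in existing web browsers" could look like, the snippet below mounts extracted narrative into a host page using plain DOM calls. It builds on the hypothetical extractNarrative helper shown earlier and is not a statement about how any particular vendor integration should work.

```typescript
// Illustrative only: inject extracted C-CDA narrative into a host page.
// Relies on the hypothetical extractNarrative() sketch shown earlier.
function renderToPage(xml: string, mount: HTMLElement): void {
  const { title, sections } = extractNarrative(xml);

  const heading = document.createElement("h1");
  heading.textContent = title || "Untitled C-CDA document";
  mount.appendChild(heading);

  for (const s of sections) {
    const h = document.createElement("h2");
    h.textContent = s.label;
    const p = document.createElement("p");
    // A production viewer would transform the narrative block's markup
    // rather than flattening it to plain text as this sketch does.
    p.textContent = s.narrative;
    mount.append(h, p);
  }
}
```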
How can we keep the YouTube video private during submissions to protect it while others are still working on their entries?
Our recommendation is to set the video's YouTube privacy setting to 'Unlisted'. Anyone who has the link to an unlisted video can view it, so simply share the link with the people who need access. Unlisted videos don't appear in the Videos tab of your channel page, and they don't show up in YouTube's search results unless someone adds the video to a public playlist.
More information on YouTube Privacy Settings is available at:
https://support.google.com/youtube/answer/157177?hl=en
Information that may influence your tool development
We are sharing some information with you on behalf of David Tao, an active member of HL7's Structured Documents Work Group (SDWG). The SDWG discussed a suggestion that could help the C-CDA Rendering Tool Challenge participants and wanted to promote the availability of certain information that participants could benefit from as they develop their solutions. The "Relevant and Pertinent" (RnP) project, commissioned under SDWG, is in progress and has produced information that would help participants in the Rendering Challenge. We surveyed multiple professional societies and have results from over 600 clinicians on what they like, dislike, and would like to see in the C-CDAs they receive for transitions of care. They answered questions about what they need versus what they are currently getting, e.g., what types of data and what time dimensions are most relevant to them. They also commented on C-CDA organization and display. More than 60% of them favor the idea of better rendering and filtering of C-CDAs, rather than the sender limiting the content, and they also offered some specific ideas for better rendering.
A letter from David Tao:
Thank you for your efforts to "develop a viewer that enables clinicians to efficiently review the patient data from C-CDA documents that is most clinically relevant to them" as stated on the C-CDA Rendering Tool Challenge web page. We wish you the best success as you develop these tools.
We have some information from our "Relevant and Pertinent" (RnP) project, based on survey responses from over 600 physicians, that we believe would inform and assist you as you prioritize features and design for your tools. The RnP survey was motivated by the same problem you are trying to solve.
Please see the attached slide deck for a brief introduction. It then references additional slide decks showing the results of the survey, which include comments about what physicians consider valuable, and what they consider not valuable, with respect to C-CDA data contents, time dimensions, organization, filtering, and display.
More information is available on the RnP wiki: http://wiki.hl7.org/index.php?title=Relevant_and_Pertinent
Official Rules
Eligibility Rules for Participating in the Competition:
To be eligible to win a prize under this challenge, an individual or entity:
- Shall have registered to participate in the competition under the rules promulgated by HL7 and ONC
- Shall have complied with all the requirements under this section
- May not be a Federal entity or Federal employee acting within the scope of their employment
- Shall not be an employee of HL7 or ONC
- Shall not use HHS' or ONC's logos or official seals in the submission, and must not claim endorsement
A Submission may be disqualified if it fails to function as expressed in the description provided by the user, or if it provides inaccurate or incomplete information.
Submissions must be free of malware. Contestant agrees that HL7 may conduct testing on the app to determine whether malware or other security threats may be present.
An individual or entity shall not be deemed ineligible because the individual or entity used Federal facilities or consulted with Federal employees during a competition if the facilities and employees are made available to all individuals and entities participating in the competition on an equitable basis.
HL7 and ONC reserve the right to cancel, suspend, and/or modify the Contest, or any part of it, for any reason, at HL7 or ONC's sole discretion.
The Judges
Judging Criteria
The criteria on which the submissions will be judged will be:
- Functionality – Ability of the tool to fulfill the requirements as specified for this Challenge when using a predetermined set of marked up CDA document(s) developed by the judges
- Usability – Quality of the tool's interface, including design, layout, efficiency, navigation, etc. with respect to the functionality as specified for this Challenge
- Innovativeness – The tool incorporates a new method, process, product, etc.
- Graphical attractiveness – Visual appeal of the tool and the rendered data
Submissions will be accepted until May 31, 2016
Winners will be announced at HL7's 30th Annual Plenary & Working Group Meeting (September 18-23, 2016 in Baltimore, Maryland)