Real World Testing Plan 2025

General Information

Plan Report ID Number:
Developer Name: Medical Informatics Engineering
Product Name(s): WebChart EHR
Version Number(s): 8.4
Certified Health IT Product List ID(s): 0015E8UJ8KHX8QL
Developer Real World Testing Page URL: https://docs.webchartnow.com/resources/system-specifications/ehr-certification/real-world-testing/
Plan Submission Date: 10/31/2024

Certification Criteria to be Tested

  • Care Coordination
    • § 170.315(b)(1) Transitions of care
    • § 170.315(b)(2) Clinical information reconciliation and incorporation
    • § 170.315(b)(3) Electronic prescribing
    • § 170.315(b)(7) Security tags - summary of care - send
    • § 170.315(b)(8) Security tags - summary of care - receive
    • § 170.315(b)(9) Care plan
    • § 170.315(b)(10) Electronic Health Information export
  • Clinical Quality Measures
    • § 170.315(c)(1)—record and export
    • § 170.315(c)(2)—import and calculate
    • § 170.315(c)(3)—report
  • Patient Engagement
    • § 170.315(e)(1) View, download, and transmit to 3rd party
  • Public Health
    • § 170.315(f)(1) Transmission to immunization registries
    • § 170.315(f)(2) Transmission to public health agencies - syndromic surveillance
    • § 170.315(f)(5) Transmission to public health agencies - electronic case reporting
  • Application Programming Interfaces
    • § 170.315(g)(7) Application access - patient selection
    • § 170.315(g)(9) Application access - all data request
    • § 170.315(g)(10) Standardized API for patient and population services
  • Electronic Exchange
    • § 170.315(h)(1) Direct Project

Criteria-Measure Matrix

Each requirement is listed under its criterion, with the measure number(s) that cover it in parentheses.
  • §170.315(b)(1): Transitions of Care
    • (b)(1)(i)(A) (Alternative) - Send Using Edge Protocol for SMTP/IHE XDR (Measure 17)
    • (b)(1)(i)(B) (Alternative) - Receive Using Edge Protocol for SMTP/IHE XDR (Measure 17)
    • (b)(1)(i)(C) (Conditional) - XDM Processing (Measure 17)
    • (b)(1)(ii)(A) - Receive, Parse, and Process (Measures 7, 19)
    • (b)(1)(ii)(B) - View (Measure 7)
    • (b)(1)(ii)(C) - Section Display (Measure 7)
    • (b)(1)(iii) - Create (Measure 7)
    • (b)(1)(iii)(A) - Assessment, Plan, Goals, Health Concerns (Measure 7)
    • (b)(1)(iii)(B) - Diagnoses (Measure 7)
    • (b)(1)(iii)(C) - Cognitive Status (Measure 7)
    • (b)(1)(iii)(D) - Functional Status (Measure 7)
    • (b)(1)(iii)(E) - Ambulatory Referral Summary (Measure 7)
    • (b)(1)(iii)(F) - Inpatient Discharge Instructions (Measure 7)
    • (b)(1)(iii)(G) - Patient Matching (Measure 7)
  • §170.315(b)(2): Clinical information reconciliation and incorporation
    • (b)(2)(ii) - Correct Patient (Measure 7)
    • (b)(2)(iii)(A) - Simultaneous Display (Measure 9)
    • (b)(2)(iii)(B) - Reconciled List (Measure 9)
    • (b)(2)(iii)(C) - User Review (Measure 9)
    • (b)(2)(iii)(D) - List Acceptance (Measure 9)
    • (b)(2)(iv) - CCD Creation (Measure 9)
  • §170.315(b)(3): Electronic prescribing
    • (b)(3)(ii)(A)(1) - NewRx (Measure 3)
    • (b)(3)(ii)(A)(2) - RxChangeRequest, RxChangeResponse (Measure 3)
    • (b)(3)(ii)(A)(3) - CancelRx, CancelRxResponse (Measure 3)
    • (b)(3)(ii)(A)(4) - RxRenewalRequest, RxRenewalResponse (Measure 3)
    • (b)(3)(ii)(A)(5) - RxFill (Measure 3)
    • (b)(3)(ii)(A)(6) - RxHistoryRequest, RxHistoryResponse (Measure 3)
    • (b)(3)(ii)(A)(7) - Status (Measure 3)
    • (b)(3)(ii)(A)(8) - Error (Measure 3)
    • (b)(3)(ii)(A)(9) - Verify (Measure 3)
    • (b)(3)(ii)(C)(1) - Primary/Secondary Diagnosis (Measure 4)
    • (b)(3)(ii)(E) - Metric Units (Measure 5)
    • (b)(3)(ii)(F) - Decimal Format (Measure 6)
  • §170.315(b)(7): Security tags - summary of care - send
    • (b)(7) - CDA Generated with Privacy & Security Markings (Measure 27)
  • §170.315(b)(8): Security tags - summary of care - receive
    • (b)(8)(i) - Security Tags Document (Measure 28)
    • (b)(8)(ii) - Preserve Privacy Markings (Measure 28)
  • §170.315(b)(9): Care plan
    • (b)(9) - Record (Measure 24)
    • (b)(9) - Change and Access (Measure 24)
    • (b)(9) - Create (Measure 25)
    • (b)(9) - Receive (Measure 26)
  • §170.315(b)(10): Electronic Health Information export
    • (b)(10)(i) - Single patient export (Measure 18)
    • (b)(10)(ii) - Patient population export (Measure 18)
    • (b)(10)(iii) - Documentation (Measure 34)
  • §170.315(c)(1): CQMs – record and export
    • (c)(1)(i) - Record (Measure 1)
    • (c)(1)(ii) - Export (Measure 1)
  • §170.315(c)(2): CQMs – import and calculate
    • (c)(2)(i) - Import (Measure 2)
    • (c)(2)(ii) - Calculate (Measures 1, 2)
  • §170.315(c)(3): CQMs – report
    • (c)(3)(i) - Report (Measures 1, 2)
  • §170.315(e)(1): View, download, and transmit to 3rd party
    • (e)(1)(i) - Web Content Accessibility (Measure 21)
    • (e)(1)(i)(A) - View (Measure 14)
    • (e)(1)(i)(A)(1) - USCDI (Measure 23)
    • (e)(1)(i)(A)(3)(i) - Assessment and Plan of Treatment (Measure 23)
    • (e)(1)(i)(A)(3)(ii) - Goals (Measure 23)
    • (e)(1)(i)(A)(3)(iii) - Health Concerns (Measure 23)
    • (e)(1)(i)(A)(4) - Provider Data (Measure 23)
    • (e)(1)(i)(A)(6) - Laboratory Test Report (Measure 23)
    • (e)(1)(i)(A)(7) - Diagnostic Imaging Report (Measure 23)
    • (e)(1)(i)(B)(1)(i) - Download Human Readable (Measure 15)
    • (e)(1)(i)(B)(1)(ii) - Download CCD (Measure 15)
    • (e)(1)(i)(B)(2) - CCD Human Readable (Measure 15)
    • (e)(1)(i)(C)(1)(i) - Email (Measure 16)
    • (e)(1)(i)(C)(1)(ii) - Encrypted Transmission (Measure 16)
    • (e)(1)(i)(D)(1) - Specific Date (Measures 14, 15, 16)
    • (e)(1)(i)(D)(2) - Date Range (Measures 14, 15, 16)
    • (e)(1)(ii)(A) - Activity Log (Measures 14, 15, 16)
  • §170.315(f)(1): Transmission to immunization registries
    • (f)(1)(i) - Create Content (Measure 10)
    • (f)(1)(ii) - Query Records (Measure 11)
  • §170.315(f)(2): Transmission to public health agencies — syndromic surveillance
    • (f)(2) - Create Content (Measure 32)
  • §170.315(f)(5): Transmission to public health agencies — electronic case reporting
    • (f)(5)(i) - Consume trigger codes (Measure 33)
    • (f)(5)(ii) - Match encounter to trigger codes (Measure 33)
    • (f)(5)(iii) - Create case report (Measure 33)
  • §170.315(g)(7): Application access – patient selection
    • (g)(7)(i) - Query processing and response (Measure 20)
    • (g)(7)(ii)(A)(1) - Functional Documentation (Measure 8)
    • (g)(7)(ii)(A)(2) - Implementation Requirements (Measure 8)
    • (g)(7)(ii)(A)(3) - Terms of Use (Measure 8)
    • (g)(7)(ii)(B) - Public Link (Measure 8)
  • §170.315(g)(9): Application access—all data request
    • (g)(9)(i)(A)(1) - Demonstrate API (Measure 20)
    • (g)(9)(i)(A)(3) - Data Classes (Measure 20)
    • (g)(9)(i)(B) - Data Return (Measure 20)
    • (g)(9)(ii)(A)(1) - Documentation (Measure 8)
    • (g)(9)(ii)(A)(2) - Implementation Requirements (Measure 8)
    • (g)(9)(ii)(B) - Public URL (Measure 8)
  • §170.315(g)(10): Standardized API for patient and population services
    • (g)(10)(i) - Data response: USCDI v1 + US Core STU v3.1.1 (Measures 29, 30, 31)
    • (g)(10)(ii) - Supported search operations (Measures 29, 30, 31)
    • (g)(10)(iii) - Application registration (Measures 29, 30, 31)
    • (g)(10)(iv) - Secure connection (Measures 29, 30, 31)
    • (g)(10)(v)(A) - Authentication and authorization for patient and user scopes: SMART 1.0.0 (Measures 29, 30)
    • (g)(10)(v)(B) - Authentication and authorization for system scopes (Measures 29, 31)
    • (g)(10)(vi) - Patient authorization revocation (Measures 29, 30)
    • (g)(10)(vii) - Token introspection (Measures 30, 31)
    • (g)(10)(viii) - Documentation (Measure 22)
  • §170.315(h)(1): Direct Project
    • (h)(1)(i) - Send (Measure 12)
    • (h)(1)(i) - Receive (Measure 13)
    • (h)(1)(ii) - Message Disposition Notification: Processed (Measure 12)
    • (h)(1)(ii) - Message Disposition Notification: Failed (Measure 12)

Justification for Real World Testing Approach

WebChart EHR is a cloud-based, fully inclusive EHR solution. All certified functionality is delivered in all instances of the product regardless of the care setting, size of practice, or required use cases for a given practice. Each production client is maintained in a separate database; however, the implementation of the environment is identical with the exception of optional increased security protocols that a client may choose to add for enhanced data protection. Additionally, the only differences between the client-facing portion of each system are a result of configuration settings that can be selected at go-live or updated at any time during a client’s contract. Due to this philosophy of product delivery, not every certified capability is actively used in every marketed care setting, and some capabilities may not be actively used in any current client production system.

To address the Real World Testing requirements, MIE will use a hybrid approach. Testing will primarily be conducted using de-identified real patient data from production systems as recorded in database tables and log files. For those criteria where this live production recording is not available, or is minimal due to lack of client usage, client-reported issues will be tracked and reported, and automated tests of the certified functionality will be run in a test system in a production environment. The automated tests will be run daily or weekly as appropriate in a system that is identical in substance and delivery to a client production system, with the only exception being live real patient data. This blended approach will allow MIE to prove ongoing maintenance of WebChart EHR’s certified technology regardless of the level of implementation by current clients.

Standards Updates (SVAP and USCDI)

All certified criteria in WebChart EHR use the current standard or implementation specification version and will remain conformant to that version throughout the 2025 Real World Testing period unless otherwise stated in the tables below. Key current versions include the following:

  • QRDA Category I, Release 1, Standard for Trial Use Release 5.3 with errata (published December 2022)
  • QRDA Category III, Release 1 (published September 2021)
  • HL7® CDA R2 Implementation Guide: C-CDA Templates for Clinical Notes R2.1 Companion Guide, Release 2-US Realm, October 2019
  • United States Core Data for Interoperability (USCDI), Version 1, July 2020 Errata
  • HL7® FHIR® US Core Implementation Guide STU 3.1.1, August 8, 2020
  • HL7® FHIR® SMART Application Launch Framework Implementation Guide Release 1.0.0, November 13, 2018
  • HL7® FHIR® Bulk Data Access (Flat FHIR®) (v1.0.0: STU 1), August 22, 2019

FHIR SVAP - Planned

Standard and version
  • HL7® FHIR® US Core Implementation Guide STU 6.1.0, June 30, 2023
  • HL7® FHIR® SMART Application Launch Framework Implementation Guide Release 2.0.0, November 26, 2021
  • HL7® FHIR® Bulk Data Access (Flat FHIR®) (v2.0.0: STU 2), November 26, 2021
Updated certification criteria: § 170.315(g)(10) - Standardized API for patient and population services
Associated product: WebChart EHR v8.4
Health IT Module CHPL ID: 0015E8UJ8KHX8QL
Method used for standard update: SVAP
Date of ONC ACB notification: TBD 2025, through quarterly attestation
Date of customer notification (SVAP only): TBD 2025

USCDI SVAP - Planned

Updated product: WebChart EHR v8.4
Health IT Module CHPL ID: 0015E8UJ8KHX8QL
Method used for standard update: HTI-1 update attestation
USCDI updated certification criteria:
  • § 170.315(b)(1) Transitions of care
  • § 170.315(b)(2) Clinical information reconciliation and incorporation
  • § 170.315(b)(9) Care Plan
  • § 170.315(e)(1) View, download, and transmit to 3rd party
  • § 170.315(g)(9) Application access - all data request
  • § 170.315(g)(10) - Standardized API for patient and population services
Planned SVAP version: United States Core Data for Interoperability (USCDI), Version 3, October 2022 Errata
Planned SVAP date: Q1 2025

Care Setting(s)

WebChart EHR is a scalable, web-based system designed for ambulatory practices and clinics. The same product is distributed to all care settings with many configuration options. Each practice can use the available configuration to tailor the product to fit their workflows and use requirements.

  • Primary Care: The WebChart EHR clients are divided primarily between primary care and specialty practices. Testing in a primary care setting will cover a large and important portion of our business.
  • Specialty Practice: The WebChart EHR clients are divided primarily between primary care and specialty practices. Configuration selections are all that differentiate WebChart EHR implementations; however, we will test with several specialty practices to ensure configuration does not impact the functionality of certified capabilities.
  • Pediatrics: Pediatric clinics are typically configured differently than adult primary care clinics. We will test in a pediatric setting in addition to primary care to again ensure that configuration does not impact the functionality of certified capabilities.
  • Small/Rural/Underserved Practice: The size and location of a practice can impact their interoperability options. We will test with both small/rural and large/urban practices to ensure all practices have full interoperability functionality.
  • Large Multi-practice Clinic: The size and location of a practice can impact their interoperability options. We will test with both small/rural and large/urban practices to ensure all practices have full interoperability functionality.

Measures Used in Overall Approach

The following measures outline and justify how each requirement of every criterion to which WebChart EHR is certified will be tested during the 2025 Real World Testing year. Please refer to the Criteria-Measure Matrix above to see which measure(s) cover a specific requirement.

Measure 1: Clinical Quality Measures Outgoing

Description

This measure will review WebChart EHR’s ability to measure clinical quality and export the required information. Compliance will be tested both manually by developers and clients as well as automatically by reporting bodies and the Cypress CUV+ test system.

Associated Certification Criteria

  • §170.315(c)(1): CQMs – record and export
    • (c)(1)(i) - Record
    • (c)(1)(ii) - Export
  • §170.315(c)(2): CQMs – import and calculate
    • (c)(2)(ii) - Calculate
  • §170.315(c)(3): CQMs – report
    • (c)(3)(i) - Report

Justification

WebChart EHR should accommodate the full range of §170.315(c)(1), §170.315(c)(2), and §170.315(c)(3) to support providers participating in MIPS and other quality measures. Most data supporting these measures for existing clients will come from data generated internally by their standard clinical workflows of seeing patients or incorporating the CCDA of transitioning patients. Numerical compliance calculations and reporting will be monitored by MIE and the practices selected for testing. The export and report QRDA formats will be validated by reporting partners and Cypress CUV+ to ensure data collected and calculated in WebChart EHR remains interoperable.

Test Methodology

First, MIE will install an instance of Cypress 7+ on our production servers following all of our protocols for maintaining the security of PHI. Cypress CUV+ supports the validation of QRDA reports containing PHI and will be used monthly to validate a random selection of QRDAs from the care settings identified. Any errors identified by Cypress CUV+ will be tracked, reported, and addressed, then followed with testing of a larger sample of files.

Additionally, WebChart EHR has two customers that participate in quarterly attestations using both QRDA I and QRDA III reports. These customers regularly inspect their CQM compliance numbers and will alert MIE to any perceived errors. MIE will then collect and track the attestation results from the reporting bodies including any errors so as to report a success/failure rate.

Expected Outcome(s)

It is expected that calculation, reporting, and QRDA format errors will be rare. Any errors reported by customers or the recipients of their quarterly attestations will be tracked and reported as a baseline. Additionally, any formatting or coding errors identified by Cypress CUV+ will be tracked, reported, and resolved. These errors are also expected to be rare both before and after using SVAP to update the version of QRDA WebChart EHR supports.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 2: Clinical Quality Measures Incoming

Description

This measure will review WebChart EHR’s ability to import clinical quality data, calculate CQMs, and report the required information. Compliance will be tested both manually by developers and clients as well as automatically by reporting bodies and the Cypress CUV+ test system.

Associated Certification Criteria

  • §170.315(c)(2): CQMs – import and calculate
    • (c)(2)(i) - Import
    • (c)(2)(ii) - Calculate
  • §170.315(c)(3): CQMs – report
    • (c)(3)(i) - Report

Justification

WebChart EHR should accommodate the full range of §170.315(c)(1), §170.315(c)(2), and §170.315(c)(3) to support providers participating in MIPS and other quality measures. It is rare that an active production client will import a QRDA I file for use in their CQM calculations. To maintain that WebChart EHR is capable of importing and calculating when this does occur, QRDA I files from Cypress will be imported into a test system in a production environment, CQMs will be automatically calculated, and QRDA files will be exported back to Cypress for content and calculation validation.

Test Methodology

MIE will install an instance of Cypress 7+ on our production servers following all of our protocols for maintaining the security of PHI. Automated testing will download QRDA I files from Cypress for each certified CQM, import the files to WebChart EHR, calculate the CQMs, and export the QRDA files for Cypress validation of both the content and calculations to verify that the import was successful. Any errors identified by Cypress will be tracked, reported, and addressed.

Expected Outcome(s)

It is expected that calculation, reporting, and QRDA format errors will be rare. Any formatting, content, or calculation errors identified by Cypress will be tracked, reported, and resolved. These errors are expected to be rare both before and after using SVAP to update the version of QRDA WebChart EHR supports.

Care Setting(s)

Primary Care, specialties, small, large

Measure 3: E-Prescribing Messages Sent and Received

Description

This measure will verify that all supported e-prescribing message types are in use in WebChart EHR, including inbound and outbound message types.

Associated Certification Criteria

  • §170.315(b)(3): Electronic prescribing
    • (b)(3)(ii)(A)(1) - NewRx
    • (b)(3)(ii)(A)(2) - RxChangeRequest, RxChangeResponse
    • (b)(3)(ii)(A)(3) - CancelRx, CancelRxResponse
    • (b)(3)(ii)(A)(4) - RxRenewalRequest, RxRenewalResponse
    • (b)(3)(ii)(A)(5) - RxFill
    • (b)(3)(ii)(A)(6) - RxHistoryRequest, RxHistoryResponse
    • (b)(3)(ii)(A)(7) - Status
    • (b)(3)(ii)(A)(8) - Error
    • (b)(3)(ii)(A)(9) - Verify

Justification

WebChart EHR should support all of the required e-prescribing messaging types outlined in §170.315(b)(3). Messages are stored locally in each client system in addition to being transmitted to/from pharmacies via the Surescripts network.

Test Methodology

MIE will report a count of messages for each supported message type:

* NewRx
* RxChangeRequest
* RxChangeResponse
* CancelRx
* CancelRxResponse
* RxRenewalRequest
* RxRenewalResponse
* RxFill
* RxHistoryRequest
* RxHistoryResponse
* Status
* Error
* Verify

The report will also include a count of outbound messages unable to be transmitted due to connectivity issues or other errors, for each message type. This report will be based on the contents of each client’s local database table of stored messages. MIE will run the report for each client under consideration and aggregate the results.

Expected Outcome(s)

Each of the supported message types should have a total greater than zero, and the number of outbound messages with errors should be significantly lower than the total number attempted. Based on past usage patterns of outbound message types, the number of NewRx messages is anticipated to be greater than the number of RxChangeResponse, CancelRx, or RxRenewalResponse messages.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 4: E-Prescribing Diagnosis Codes

Description

This measure will verify that all diagnosis elements are present in some e-prescribing messages as required by §170.315(b)(3), including inbound and outbound message types.

Associated Certification Criteria

  • §170.315(b)(3): Electronic prescribing
    • (b)(3)(ii)(C)(1) - Primary/Secondary Diagnosis

Justification

WebChart EHR must be able to send Diagnosis codes in outbound e-prescribing messages, and receive inbound messages that include them.

Test Methodology

MIE will examine the contents of each stored message in a client’s local database table of stored messages and count the inbound and outbound messages that include Diagnosis elements. MIE will run the report for each client under consideration and aggregate the results.

Expected Outcome(s)

As Diagnosis code is not a required element for transmission, we expect that these elements will be present in some, but not all inbound and outbound messages.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 5: E-Prescribing Oral Liquid Units

Description

This measure will verify that prescriptions for medications with an oral liquid form will have a quantity unit of measurement of mL, not cc or English units as outlined in §170.315(b)(3).

Associated Certification Criteria

  • §170.315(b)(3): Electronic prescribing
    • (b)(3)(ii)(E) - Metric Units

Justification

WebChart EHR should prevent prescriptions of oral liquid medications from being sent electronically if they have an inappropriate quantity unit of measurement.

Test Methodology

MIE will create a system report that examines the contents of each stored NewRx message in a client’s local database table of stored messages, limiting to oral liquid medications, and provides a count of each distinct quantity unit of measure used. MIE will run the report for each client under consideration and aggregate the results.
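
A minimal sketch of the aggregation, assuming the quantity unit-of-measure codes have already been extracted from the stored NewRx messages; the sample values are illustrative.

```python
from collections import Counter

# Hypothetical unit-of-measure codes pulled from stored NewRx messages for oral
# liquid medications; C28254 is the code for milliliters (mL).
sample_uom_codes = ["C28254", "C28254", "C28254", "C28254"]

uom_counts = Counter(sample_uom_codes)
total = sum(uom_counts.values())
for code, count in uom_counts.most_common():
    print(f"{code}: {count} ({count / total:.1%})")
```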

Expected Outcome(s)

It is expected that code C28254 (milliliters) will be the most commonly sent unit of measure for oral liquid medications. Other units should be extremely rare.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 6: E-Prescribing Decimal Format

Description

This measure will verify that numeric amounts in prescriptions include leading zeros before decimal points and do not allow trailing zeros after a decimal point.

Associated Certification Criteria

  • §170.315(b)(3): Electronic prescribing
    • (b)(3)(ii)(F) - Decimal Format

Justification

WebChart EHR should prevent prescriptions from being sent electronically if they have directions or total quantity that are missing leading zeros or include trailing zeros. This is essential for preventing misunderstanding by pharmacists regarding the amount to dispense and patients regarding the amount of medication to take.

Test Methodology

MIE will create a system report that examines the contents of each stored NewRx message in a client’s local database table of stored messages, and provides a count of prescription messages that include inappropriate trailing zeros, and a count of those missing leading zeros. MIE will run the report for each client under consideration and aggregate the results.
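
A minimal sketch of the numeric checks, assuming the quantity strings have already been extracted from the stored NewRx messages; the regular expressions flag values that lack a leading zero or carry a trailing zero.

```python
import re

# Hypothetical quantity strings extracted from stored NewRx messages.
sample_quantities = ["0.5", ".5", "1.50", "2", "10", "0.25"]

MISSING_LEADING_ZERO = re.compile(r"(?<!\d)\.\d")   # e.g. ".5" instead of "0.5"
TRAILING_ZERO = re.compile(r"\.\d*0$")              # e.g. "1.50" instead of "1.5"

missing_leading = [q for q in sample_quantities if MISSING_LEADING_ZERO.search(q)]
trailing = [q for q in sample_quantities if TRAILING_ZERO.search(q)]

print("missing leading zero:", missing_leading)  # ['.5']
print("trailing zero:", trailing)                # ['1.50']
```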

Expected Outcome(s)

It is expected that prescriptions sent with inappropriate trailing zeros or missing leading zeros will be very rare.

Care Setting(s)

Primary care, specialties, small, large

Measure 7: CDA Download

Description

This measure will verify that the system can accept a CDA document uploaded into the system, assign it to the appropriate chart, and display the document with a standard stylesheet with all sections accepted and visible.

Associated Certification Criteria

  • §170.315(b)(2): Clinical information reconciliation and incorporation
    • (b)(2)(ii) - Correct Patient
  • §170.315(b)(1): Transitions of Care
    • (b)(1)(ii) - All paragraphs
    • (b)(1)(iii) - All paragraphs

Justification

Webchart EHR should be able to accept a CDA document and place it into the correct chart based on information within the document. It should also be able to display the CDA documents with an appropriate stylesheet.

Test Methodology

MIE will report on the number of CDA formatted documents uploaded into tracked Webchart systems and the number of upload attempts that failed as stored in client databases and error log files.

MIE will report on the number of requests to view a CDA document within the system, the number of times the document displayed correctly, and the number of times there were errors in display.
Any errors reported by customers or the recipients of their quarterly attestations will be tracked and reported as a baseline. These assumptions for customer reporting align with the “visual inspection” aspects of the test lab tests.

Expected Outcome(s)

It is expected that CDA upload and stylesheet errors will be rare. Any errors reported by customers or the recipients of their quarterly attestations will be tracked and reported as a baseline.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 8: Application Access Documentation

Description

This measure will verify that WebChart EHR’s API documentation is publicly and perpetually available. Compliance will be recorded by an external uptime monitor and reported quarterly. Upon request, or in the event of downtime, data can additionally be reported in daily, weekly, or monthly increments.

Associated Certification Criteria

  • §170.315(g)(7): Application access – patient selection
    • (g)(7)(ii)(A)(1) - Functional Documentation
    • (g)(7)(ii)(A)(2) - Implementation Requirements
    • (g)(7)(ii)(A)(3) - Terms of Use
    • (g)(7)(ii)(B) - Public Link
  • §170.315(g)(9): Application access—all data request
    • (g)(9)(ii)(A)(i) - Documentation
    • (g)(9)(ii)(A)(ii) - Implementation Requirements
    • (g)(9)(ii)(B) - Public URL

Justification

WebChart EHR should provide public access to all API documentation, implementation requirements, and terms of use as outlined in 170.315(g)(7) and 170.315(g)(9). This documentation should be available at all times throughout the year.

Test Methodology

An external uptime monitor will check the availability of all documentation available at https://docs.webchartnow.com/resources/system-specifications/application-programming-interface-api.html. Both up- and downtime will be logged and reported quarterly. The cause of any downtime and its duration will also be logged. In the event of any downtime, the amount of downtime can be reported at daily, weekly, or monthly intervals in addition to the quarterly reports, and the cause of each downtime occurrence will be reported.
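
A minimal sketch of one availability probe is shown below; the treatment of non-200 responses and the logging destination are illustrative assumptions rather than the monitor's actual configuration, and a scheduler would run the probe on the chosen interval.

```python
import datetime
import requests

DOCS_URL = (
    "https://docs.webchartnow.com/resources/system-specifications/"
    "application-programming-interface-api.html"
)

def probe(url: str) -> dict:
    """Record one availability observation for the public documentation page."""
    observed_at = datetime.datetime.now(datetime.timezone.utc).isoformat()
    try:
        response = requests.get(url, timeout=10)
        up = response.status_code == 200
        detail = f"HTTP {response.status_code}"
    except requests.RequestException as exc:
        up, detail = False, type(exc).__name__
    return {"observed_at": observed_at, "url": url, "up": up, "detail": detail}

if __name__ == "__main__":
    print(probe(DOCS_URL))  # one observation; a scheduler would repeat this
```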

Expected Outcome(s)

It is expected that all documentation will maintain an uptime of greater than 99.9%. Any downtime is expected to be caused by minor errors, such as a connection timeout or planned maintenance, and last for a duration of 10 minutes or less.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 9: Clinical Information Reconciliation and Incorporation

Description

This measure will verify that the system can take a CCDA transition of care/referral summary formatted according to the standards adopted in §170.205(a)(3) and §170.205(a)(4), read the medications, allergies, and conditions from the document, reconcile them into the chart, and fully incorporate the data into the chart.

Associated Certification Criteria

  • §170.315(b)(2): Clinical information reconciliation and incorporation
    • (b)(2)(iii)(A), (B), (C), (D)
    • (b)(2)(iv) - System Verification

Justification

Webchart EHR should be able to reconcile CCDA data for medications, allergies, and conditions into a patient’s chart as outlined in § 170.315(b)(2).

Test Methodology

MIE will report on the number of CDA formatted documents reconciled via our reconciliation process.

Following each reconcile, if a temporary CDA for the chart is created as part of the process, it will be validated to ensure the reconciled data can be incorporated into a CDA created free of schematic errors (the CDA document will NOT be kept, only the result of the validation). Additionally, any client complaints that data is not being imported correctly from the tool will be tracked, investigated, and reported.

Expected Outcome(s)

It is expected that errored reconciliations and invalid CDA checks should be extremely rare.

Care Setting(s)

Primary care, specialties, small, large

Measure 10: Transmission to Immunization Registry: Create Content

Description

This measure will verify that the system can generate a VXU conforming to the HL7 v2.5.1 standard, CDC guidance for communication to Immunization Registries and state/local guidance. The VXU messages shall contain information related to the demographics and vaccination administration record.

Associated Certification Criteria

  • §170.315(f)(1): Transmission to immunization registries
    • (f)(1)(i) - Create Content

Justification

WebChart EHR should be able to generate and send valid VXU messages.

Test Methodology

MIE will report from the database the number of successfully sent VXU messages acknowledged as received by the state immunization registry. MIE will also report from the database the number of records rejected by the state registry due to error, whether the failure was due to registry internal errors, clinical data entry issues, or a message that was not well formed. Finally, MIE will report from the database the number of messages that the system declined to generate because data entry issues caused them to fail message pre-validation.

Expected Outcome(s)

It is expected that failures to generate messages will be rare, as will rejections of generated messages by the registry. In all cases of failures, users will be made aware of the failure through a registry status dashboard with information pertaining to how the errors may be resolved and options for attempting a resubmission of the vaccination record to the registry.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 11: Transmission to Immunization Registries: Query Records

Description

This measure will verify that the system can generate a QBP conforming to the HL7 v2.5.1 standard, CDC guidance for communication to Immunization Registries and state/local guidance. Furthermore, the system shall be able to retrieve, consume and display to the end user the results of any such query.

Associated Certification Criteria

  • §170.315(f)(1): Transmission to immunization registries
    • (f)(1)(ii) - Query Records

Justification

WebChart EHR should be able to request, consume and display an evaluated patient history and forecast.

Test Methodology

MIE will report the number of successful retrievals of evaluated history and forecasting operations from the database. MIE will report the number of failed retrievals, including those resulting from an internal registry error that prevented a response from being consumed. MIE will manually track, resolve, and report issues resulting from WebChart EHR application errors as reported by end users.

Expected Outcome(s)

It is expected that failures will be rare. In the case of reported WebChart application errors, issues shall be tracked and resolved. In the case of registry internal errors, the registry shall be notified of the issue.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 12: Direct Project: Send

Description

This measure will verify that the system can transmit Direct Project-conforming S/MIME messages to a HISP. The measure will also verify the receipt of those transmissions by verifying the status of the resultant MDN messages.

Associated Certification Criteria

  • §170.315(h)(1): Direct Project
    • (h)(1)(i) - Send
    • (h)(1)(ii) - Message Disposition Notification: Processed
    • (h)(1)(ii) - Message Disposition Notification: Failed

Justification

WebChart EHR should be able to generate valid S/MIME messages, transmit them via Direct Project specifications and consume the resulting MDN from the recipient.

Test Methodology

MIE will report from log files the number of messages transmitted. MIE will report from logs the number of messages which failed to be transmitted whether due to internal error, external failures or inability to verify trust of the recipient. MIE will report from logs the number of Processed MDN messages received. MIE will report from logs the number of Failed MDN messages received.
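
A minimal sketch of the log aggregation, assuming a simple line-oriented log format; the event labels and field names are hypothetical placeholders for whatever the HISP logs actually record.

```python
from collections import Counter

# Hypothetical log lines; the real HISP logs would be tallied the same way,
# just with their actual field layout.
sample_log_lines = [
    "2025-01-07T12:01:05Z SEND message_id=abc123 status=transmitted",
    "2025-01-07T12:01:09Z MDN message_id=abc123 disposition=processed",
    "2025-01-07T12:02:31Z SEND message_id=def456 status=failed reason=untrusted_recipient",
]

def tally(lines):
    """Count transmissions, transmission failures, and MDN dispositions."""
    counts = Counter()
    for line in lines:
        if " SEND " in line:
            counts["transmitted" if "status=transmitted" in line else "send_failed"] += 1
        elif " MDN " in line:
            counts["mdn_processed" if "disposition=processed" in line else "mdn_failed"] += 1
    return counts

print(tally(sample_log_lines))
```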

Expected Outcome(s)

It is expected that errors will be rare with the most common cause of error being a lack of verified trust with the intended recipient.

Care Setting(s)

Primary, specialties, small, large

Measure 13: Direct Project: Receive

Description

This measure will verify that the system conforms to Direct Project message receipt requirements for validation.

Associated Certification Criteria

  • §170.315(h)(1): Direct Project
    • (h)(1)(i) - Receive

Justification

WebChart EHR should be able to receive, validate and deliver Direct Project messages transmitted to its HISP.

Test Methodology

MIE will report from logs the number of messages transmitted to the HISP. MIE will report from logs the number of messages failing to conform to Direct Project specifications. MIE will report from logs the number of messages which are successfully delivered to recipients.

Expected Outcome(s)

It is expected that many messages transmitted to the public facing HISP listener will not meet the requirements for Direct Project messages and will be rejected by the HISP. It is expected that failures for conforming messages from receipt through delivery will be rare.

Care Setting(s)

Primary, specialties, small, large

Measure 14: Patient Portal View

Description

This measure will verify that a patient can view various document types within the patient portal.

Associated Certification Criteria

  • §170.315(e)(1): View, download, and transmit to 3rd party
    • (e)(1)(i)(A)(1), (2), (3), (4), (5)
    • (e)(1)(i)(D)(1), (2)
    • (e)(1)(ii)(A)

Justification

WebChart EHR should be able to provide a mechanism for a patient to read documents sent to them within a patient portal as required by § 170.315(e)(1).

Test Methodology

MIE will report a number of measurements surrounding documents, including:

  • Number of documents made available to patients in the patient portal
  • Number of documents read by patients in the patient portal
  • Number of failures in the ability to read messages in the patient portal

Results will be retrieved from database tables and aggregated for reporting. Any failures will be reported from the information found in log files as well as any client reported issues tracked during the testing period.

Expected Outcome(s)

The percentage of messages read in the portal may be influenced by the engagement of the patients themselves and will be analyzed if it appears to be lower than expected. In the case of low viewership, MIE will encourage clients to investigate methods of increasing patient engagement, or to validate that documents are being sent to patients who are truly interested in using an online patient portal.

Care Setting(s)

Primary, specialties, small, large

Measure 15: Patient Portal Download

Description

This measure will verify that a patient can download various document types within the patient portal.

Associated Certification Criteria

  • §170.315(e)(1): View, download, and transmit to 3rd party
    • (e)(1)(i)(B)(1), (2), (3)
    • (e)(1)(i)(D)(1), (2)
    • (e)(1)(ii)(A)

Justification

WebChart EHR should be able to provide a mechanism for a patient to download documents sent to them within a patient portal.

Test Methodology

MIE will report a number of measurements surrounding documents, including:

  • Number of documents made available to patients in the patient portal
  • Number of documents successfully downloaded from the patient portal
  • Number of documents that could not be downloaded from the patient portal

Results will be retrieved from database tables and aggregated for reporting. Any failures will be reported from the information found in log files and third party reports as well as any client reported issues tracked during the testing period.

Expected Outcome(s)

There is expected to be an extremely low occurrence of messages unable to be downloaded from the patient portal.

If there is a lack of downloads in a certain timeframe by patients within the workflow, MIE may conduct internal testing of message downloads to maintain measure compliance.

Care Setting(s)

Primary, specialties, small, large

Measure 16: Patient Portal CCDA Transmit

Description

This measure will verify that a patient can transmit various document types within the patient portal to other entities.

Associated Certification Criteria

  • §170.315(e)(1): View, download, and transmit to 3rd party
    • (e)(1)(i)(C)(1), (2)
    • (e)(1)(i)(D)(1), (2)
    • (e)(1)(ii)(A)

Justification

WebChart EHR should be able to provide a mechanism for a patient to transmit documents sent to them within a patient portal to other entities.

Test Methodology

MIE will report a number of measurements surrounding documents, including:

  • Number of documents made available to patients in the patient portal
  • Number of documents successfully transmitted from the patient portal
  • Number of documents that could not be transmitted from the patient portal

Results will be retrieved from database tables and aggregated for reporting. Any failures will be reported from the information found in log files and third party reports as well as any client reported issues tracked during the testing period.

Expected Outcome(s)

There is expected to be an extremely low occurrence of messages unable to be transmitted from the patient portal to third parties.

MIE may produce test messages to transmit to mieweb.com and ccme.com (our Direct compliant email domain) to validate this functionality if necessary.

Care Setting(s)

Primary, specialties, small, large

Measure 17: Send Using Edge Protocol for SMTP / XDM

Description

This measure will verify that the system is able to utilize an SMTP edge protocol for sending and receiving Direct Project messages. As part of receiving messages, XDM shall be handled when applicable.

Associated Certification Criteria

  • §170.315(b)(1): Transitions of Care
    • (b)(1)(i)(A) (Alternative) - Send Using Edge Protocol for SMTP/IHE XDR
    • (b)(1)(i)(B) (Alternative) - Receive Using Edge Protocol for SMTP/IHE XDR
    • (b)(1)(i)(C) (Conditional) - XDM Processing

Justification

WebChart EHR should be able to receive and send Direct Project messages to a HISP utilizing an SMTP edge.

Test Methodology

MIE will report from logs the number of messages transmitted to the HISP by SMTP. MIE will report from logs the number of messages received from the HISP by SMTP. MIE will report from logs the number of XDM packages processed. In the case where insufficient real-world data is available, data resulting from regular testing with DirectTrust shall be included in the reporting.

Expected Outcome(s)

It is expected that the current usage shall be low, with the exception of regular DirectTrust testing. Any errors resulting from real-world transmission of messages are expected to be rare, but may skew results due to the low volume of transmitted messages.

Care Setting(s)

Primary, specialties, pediatrics, small, large

Measure 18: EHI Export

Description

This measure will review WebChart EHR’s ability to export a designated record set of patient data.

Associated Certification Criteria

  • §170.315(b)(10): Electronic Health Information export
    • (b)(10)(i) - Single patient export
    • (b)(10)(ii) - Patient population export

Justification

WebChart EHR should allow a user to export the designated record set of a patient in a timely manner without requiring developer assistance. To ensure that this functionality is available when a user requires it, MIE will monitor requests and perform regular testing on the export.

Test Methodology

When a user requests data to be exported, if the export fails, an automatic error message will be generated and stored in a searchable database. The total number of attempts will be tracked based on activity logs in the EHR. Additionally, an example patient will be tested nightly to make sure that all expected data is returned and new errors are not introduced.
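
A minimal sketch of the nightly example-patient check, assuming the export lands as a set of files that can be compared against an expected manifest; the directory and file names are illustrative.

```python
from pathlib import Path

# Illustrative expected manifest for the nightly example-patient export; in
# practice this would list every file the designated record set should contain.
EXPECTED_FILES = {"demographics.json", "encounters.json", "documents.json"}

def check_export(export_dir: str) -> dict:
    """Compare the files produced by the nightly export against the manifest."""
    produced = {p.name for p in Path(export_dir).glob("*") if p.is_file()}
    return {
        "missing": sorted(EXPECTED_FILES - produced),
        "unexpected": sorted(produced - EXPECTED_FILES),
        "ok": EXPECTED_FILES <= produced,
    }

if __name__ == "__main__":
    print(check_export("/tmp/ehi-export-example-patient"))
```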

Expected Outcome(s)

It is expected that failures will be rare and that exports will succeed at least 95% of the time. Some potential causes of failure would be a failure in network connectivity or a system timeout due to an unexpected size of data. These will be tracked as they happen and resolved.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 19: CDA Validation

Description

This measure will verify that CDAs both created by and received by a Webchart EHR system pass basic CDA validation.

Associated Certification Criteria

  • §170.315(b)(1): Transitions of Care
    • (b)(1)(ii)(A)
  • §170.315(b)(6): Data Export
    • (b)(6)(ii), (A)-(F)

Justification

Webchart EHR should be able to validate that CDAs that are stored within webchart either do or do not conform to basic CDA schema requirements.

Test Methodology

All CDAs stored within a WebChart EHR system will be run through schema validation regardless of the document’s origin. Documents may originate within the WebChart EHR system or be imported from a third-party application or manual upload. The schema validator will be installed within the MIE production environment to ensure the security of all PHI contained in the documents. Only the results of the validation will be made available; document content will not be revealed to developers during testing.

The number of valid vs. invalid CDAs and their sources will be reported.
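
A minimal sketch of the per-document validation step using lxml; the schema and document paths are illustrative, and only the validation result (not document content) is surfaced.

```python
from lxml import etree

# Illustrative paths; in practice the CDA schema ships with the validator install
# and the documents are pulled from each client's document store.
CDA_SCHEMA_PATH = "CDA.xsd"

def validate_cda(document_path: str, schema: etree.XMLSchema) -> tuple[bool, str]:
    """Return (is_valid, first error message) without exposing document content."""
    doc = etree.parse(document_path)
    if schema.validate(doc):
        return True, ""
    errors = list(schema.error_log)
    return False, str(errors[0]) if errors else "no error detail"

if __name__ == "__main__":
    schema = etree.XMLSchema(etree.parse(CDA_SCHEMA_PATH))
    valid, detail = validate_cda("example_ccd.xml", schema)
    print("valid" if valid else f"invalid: {detail}")
```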

Expected Outcome(s)

Any formatting or stylesheet errors identified by the CDA validator will be tracked, reported, and resolved. These errors are expected to be rare for documents created by WebChart EHR. If CDA documents received from third parties are identified to have schematic errors, those third parties will be notified whenever possible.

Care Setting(s)

Primary, specialties, small, large

Measure 20: Patient Data Requests via API

Description

This measure will verify that the API as outlined in WebChart EHR’s documentation is functional. A valid request for patient information must provide that information.

Associated Certification Criteria

  • §170.315(g)(7): Application access – patient selection
    • (g)(7)(i) - Query processing and response
  • §170.315(g)(9): Application access—all data request
    • (g)(9)(i)(A)(1) - Demonstrate API
    • (g)(9)(i)(A)(3) - Data Classes
    • (g)(9)(i)(B) - Data Return

Justification

WebChart EHR should provide patient information to requesters with the proper access to the information. In production environments of WebChart EHR, the use of the documented API is rare; therefore, MIE will conduct dual-level testing of the API: first, using automated testing of a test system in a production environment, and second, manually tracking any client-reported issues with the API functionality against the automatically tracked API requests that are made.

Test Methodology

To address the overall automated testing, the following test requests will be made daily against a test system in a production environment.

  • Issue a request in the browser to search for a patient (patient selection)
  • Issue a request in the browser to request demographics of a patient (data category request)
  • Issue a request using the export tool described in the documentation.

All API requests made in production systems are recorded in log files. The number of requests logged will be reported against the number of issues with API functionality that are reported.
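
A minimal sketch of the daily automated requests; the base URL, endpoint paths, and authorization header below are hypothetical placeholders, and the real requests follow the publicly documented API.

```python
import requests

# Hypothetical base URL and endpoints for the test system; the real requests
# follow the published WebChart API documentation and use real test credentials.
BASE_URL = "https://rwt-test.example.com/api"
SESSION_HEADERS = {"Authorization": "Bearer <test-system-token>"}

def run_daily_checks() -> dict:
    """Issue the three daily test requests and record pass/fail for each."""
    checks = {
        "patient_selection": f"{BASE_URL}/patients?name=Smith",
        "demographics": f"{BASE_URL}/patients/12345/demographics",
        "export": f"{BASE_URL}/patients/12345/export",
    }
    results = {}
    for name, url in checks.items():
        try:
            response = requests.get(url, headers=SESSION_HEADERS, timeout=30)
            results[name] = response.status_code == 200
        except requests.RequestException:
            results[name] = False
    return results

if __name__ == "__main__":
    print(run_daily_checks())
```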

Expected Outcome(s)

It is expected that automated testing will be successful more than 97% of the time. Any errors will be logged and reported; however, errors are expected to be minimal, such as a lost server connection. Both production uses of the API and reports of errors are expected to be rare.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 21: Web Content Accessibility

Description

This measure will verify that all certified content in the patient portal will maintain accessibility conformance as outlined in the Web Content Accessibility Guidelines (WCAG) 2.0.

Associated Certification Criteria

  • §170.315(e)(1): View, download, and transmit to 3rd party
    • (e)(1)(i) - Web Content Accessibility

Justification

The certified content of the patient portal should be accessible to all users regardless of abilities or impairments as outlined in the Web Content Accessibility Guidelines (WCAG) 2.0.

Test Methodology

MIE will conduct monthly third-party production accessibility scanning as well as automated nightly internal accessibility scanning of a test system in a production environment.

Expected Outcome(s)

It is expected that no urgent non-conformance issues will be identified and that the number of secondary issues will be minimal. Any non-conformance that reaches a production system will be tracked and reported.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 22: FHIR API Documentation

Description

This measure will verify that WebChart EHR’s FHIR API documentation is publicly and perpetually available. Compliance will be recorded by an external uptime monitor and reported quarterly. Upon request, or in the event of downtime, data can additionally be reported in daily, weekly, or monthly increments.

Associated Certification Criteria

  • §170.315(g)(10): Standardized API for patient and population services
    • (g)(10)(viii) - Documentation

Justification

WebChart EHR should provide public access to all FHIR API documentation, software components, software configurations, registration instructions, and terms of use as outlined in 170.315(g)(10). This documentation should be available at all times throughout the year.

Test Methodology

An external uptime monitor will check the availability of all documentation available at https://docs.webchartnow.com/resources/system-specifications/fhir-application-programming-interface-api/ and the linked subpages. Both up- and downtime will be logged and reported quarterly. The cause of any downtime and its duration will also be logged. In the event of any downtime, the amount of downtime can be reported at daily, weekly, or monthly intervals in addition to the quarterly reports, and the cause of each downtime occurrence will be reported.

Expected Outcome(s)

It is expected that all documentation will maintain an uptime of greater than 99.9%. Any downtime is expected to be caused by minor errors, such as a connection timeout or planned maintenance, and last for a duration of 10 minutes or less.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 23: CCDA Content

Description

This measure will verify that CCDAs generated in WebChart EHR systems include all USCDI data and other required data.

Associated Certification Criteria

  • §170.315(e)(1): View, download, and transmit to 3rd party
    • (e)(1)(i)(A)(1) - (e)(1)(i)(A)(7)

Justification

WebChart EHR should generate CCDAs that include the sections required by USCDI.

Test Methodology

We will run weekly automated tests that choose a certain number of random patient CCDAs in specific live systems and verify that the required sections exist in the documents.
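
A minimal sketch of the per-document section check; the namespace handling reflects standard CDA XML, while the required section code list is a placeholder for the codes the criterion mandates.

```python
import xml.etree.ElementTree as ET

CDA_NS = {"cda": "urn:hl7-org:v3"}

def section_codes(ccda_path: str) -> set[str]:
    """Collect the code attribute of every <section> in a C-CDA document."""
    tree = ET.parse(ccda_path)
    codes = set()
    for section in tree.iter("{urn:hl7-org:v3}section"):
        code_el = section.find("cda:code", CDA_NS)
        if code_el is not None and code_el.get("code"):
            codes.add(code_el.get("code"))
    return codes

# REQUIRED_SECTION_CODES is a placeholder for the section codes the criterion
# requires; the weekly job flags any that are absent from a sampled document.
REQUIRED_SECTION_CODES = {"<loinc-code-1>", "<loinc-code-2>"}

if __name__ == "__main__":
    found = section_codes("example_ccda.xml")
    print("missing sections:", REQUIRED_SECTION_CODES - found)
```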

Expected Outcome(s)

All CCDAs tested should include all of the sections required, if applicable.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 24: Record and Change Care Plan

Description

This measure will track that users can create and change care plan data.

Associated Certification Criteria

  • §170.315(b)(9): Care plan
    • Record, Change and Access

Justification

Per certification requirements, WebChart EHR must be able to provide a way to enter and update Care Plan data.

Test Methodology

We will report on the following data elements being created or edited in patient charts:

  • Goals
  • Health concerns
  • Health status evaluations and outcomes
  • Interventions

Expected Outcome(s)

We expect to see user engagement in editing care plan data.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 25: Create Care Plan CCDA Documents

Description

This measure will track that users can create Care Plan CCDA Documents.

Associated Certification Criteria

  • §170.315(b)(9): Care plan
    • (b)(9) - Create (per the Criteria-Measure Matrix)

Justification

Per certification requirements, WebChart EHR must be able to generate Care Plan CCDA Documents.

Test Methodology

We will report on the number of encounters with Care Plan information, and the number of Care Plan CCDAs generated.

Expected Outcome(s)

The number of CCDA Care Plans generated should be the same as the number of encounters with Care Plan information.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 26: Receive Care Plan CCDA Documents

Description

This measure will track that the system can receive Care Plan CCDA Documents.

Associated Certification Criteria

  • §170.315(b)(9): Care plan
    • (b)(9) - Receive (per the Criteria-Measure Matrix)

Justification

Per certification requirements, WebChart EHR must be able to receive Care Plan CCDA Documents.

Test Methodology

We will report on:

* the number of Care Plan CCDAs received from outside sources.
* Pass or fail count on the Care Plan CCDAs received.

Expected Outcome(s)

The number of CCDA Care Plans received into systems may not be very high. MIE may produce test Care Plans to send into test systems to validate this functionality.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 27: Create CCDA Documents with Security Tags

Description

This measure will track that the system can create CCDA Documents with valid security tags.

Associated Certification Criteria

  • §170.315(b)(7): Security tags - summary of care - send
    • (b)(7)

Justification

Per certification requirements, WebChart EHR must be able to generate CCDA Documents with valid security tags.

Test Methodology

We will have automated tests that run at minimum weekly to test that the software is still able to generate CCDAs with Security Tags.

If we determine that we are seeing usage of the security tagging within Production systems, we will report:

* the number of CCDAs generated during the RWT period.
* The number of CCDAs with security tags generated during the RWT period.

Expected Outcome(s)

We expect our software to demonstrate a consistent ability to generate CCDA documents with Security Tags.

Based on discussions with others in the industry who handle a large volume of CDA creation and transmission, there is currently little to no use of DS4P within documents created by production systems. We likewise do not anticipate client use of security tags in generated CCDA documents, at least at the start of 2025. If this changes, we will update the testing to track those numbers as listed above.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 28: Receive and Display CCDA Documents with Security Tags

Description

This measure will track that the system can receive CCDA Documents with security tags and properly display them to end users.

Associated Certification Criteria

  • §170.315(b)(8): Security tags - summary of care - receive
    • (b)(8)(i) and (ii)

Justification

Per certification requirements, WebChart EHR must be able to receive CCDA Documents with security tags and properly display them to end users.

Test Methodology

We will have automated tests that run at minimum weekly to test that the software is still able to receive and display CCDAs with Security Tags.

Based on discussions with others in the industry who handle a large volume of CDA creation and transmission, there is currently little to no use of DS4P within documents created by production systems. If we determine that we are seeing usage of security tagging within production systems, we will report:

* the number of CCDAs received during the RWT period.
* The number of CCDAs with security tags received during the RWT period.

Expected Outcome(s)

We expect our software to demonstrate a consistent ability to display CCDA documents with Security Tags.

We do not anticipate our clients receiving CCDA documents with security tags, at least at the start of 2025. If this changes, we will update the testing to track those numbers as listed above.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 29: FHIR Sandbox Testing

Description

This measure will use the Inferno Test suite to validate all types of secure connections and search operations supported by the FHIR API within a publicly available production sandbox system.

Associated Certification Criteria

  • §170.315(g)(10): Standardized API for patient and population services
    • (g)(10)(i) - Data response: USCDI v1 + US Core STU v3.1.1
    • (g)(10)(ii) - Supported search operations
    • (g)(10)(iii) - Application registration
    • (g)(10)(iv) - Secure connection
    • (g)(10)(v)(A) - Authentication and authorization for patient and user scopes: SMART 1.0.0
    • (g)(10)(v)(B) - Authentication and authorization for system scopes
    • (g)(10)(vi) - Patient authorization revocation

Justification

WebChart EHR’s FHIR API is still newly available to clients and has no adoption as of the writing of this plan. Therefore, to cover testing before live clients are actively using the API, a publicly available production sandbox will be tested using Inferno. FHIR adoption is expected to be slow, but increasing, throughout 2025, leading to improved app support in WebChart EHR as well as increased real-world data being available, at which time Measures 30 and 31 will provide a more complete view of the production FHIR capabilities.

Test Methodology

MIE will run nightly automated testing on the public FHIR R4 sandbox system using Inferno, and using log files stored in a QA database, MIE will report the success rate of the full (g)(10) test suite. Any errors will be tracked, reported, and addressed.
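
A minimal sketch of the nightly pass-rate calculation, assuming the Inferno results can be exported as JSON with a per-test result field; the exact export shape shown is an assumption, not Inferno's documented schema.

```python
import json

# Hypothetical shape of an exported Inferno results file; the real export format
# may differ, but the nightly job only needs per-test pass/fail to compute a rate.
sample_results = json.loads("""
{"run_date": "2025-01-07", "tests": [
  {"id": "g10_smart_standalone_launch", "result": "pass"},
  {"id": "g10_bulk_data_export", "result": "pass"},
  {"id": "g10_token_revocation", "result": "fail"}
]}
""")

def pass_rate(results: dict) -> float:
    """Fraction of tests in a nightly Inferno run that passed."""
    tests = results["tests"]
    passed = sum(1 for test in tests if test["result"] == "pass")
    return passed / len(tests) if tests else 0.0

print(f"{sample_results['run_date']}: {pass_rate(sample_results):.1%} passing")
```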

Expected Outcome(s)

It is expected that test failures will be rare and that the automated tests will pass successfully in a minimum of 95% of nightly runs. Any failures are expected to be due to a failure in the testing infrastructure rather than in the functionality of the API.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 30: FHIR Patient Scope

Description

This measure will review WebChart EHR’s ability to connect to an app within a patient scope and provide the user with the requested data.

Associated Certification Criteria

  • §170.315(g)(10): Standardized API for patient and population services
    • (g)(10)(i) - Data response: USCDI v1 + US Core STU v3.1.1
    • (g)(10)(ii) - Supported search operations
    • (g)(10)(iii) - Application registration
    • (g)(10)(iv) - Secure connection
    • (g)(10)(v)(A) - Authentication and authorization for patient and user scopes: SMART 1.0.0
    • (g)(10)(vi) - Patient authorization revocation

Justification

WebChart EHR’s FHIR API is still newly available to clients, and has no adoption as of writing this plan. FHIR adoption is expected to be slow, but increasing, throughout 2025 leading to improved app support in WebChart EHR as well as increased real world data being available. Until that time when clients are actively using the FHIR API, MIE will conduct testing using a publicly available production sandbox system and a patient app recommended to our clients. As clients continue adoption of the FHIR API, real patient use of the patient app will be reported.

Test Methodology

MIE will report from de-identified log files an analysis of authentication and data searches using a patient app. Specific rates can be reported from the sandbox system as the automated testing setup will indicate what actions should yield successful authentication or data return. An overall analysis will be reported for the real world patient data since we cannot estimate failures due to patients correctly being denied access.

Expected Outcome(s)

It is expected that WebChart EHR will be conformant to all (g)(10) requirements and that overall error rates will be low.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 31: FHIR EHR Provider Scope

Description

This measure will review WebChart EHR’s ability to connect to an app within an EHR provider scope and provide the user with the requested data.

Associated Certification Criteria

  • §170.315(g)(10): Standardized API for patient and population services
    • (g)(10)(i) - Data response: USCDI v1 + US Core STU v3.1.1
    • (g)(10)(ii) - Supported search operations
    • (g)(10)(iii) - Application registration
    • (g)(10)(iv) - Secure connection
    • (g)(10)(v)(B) - Authentication and authorization for system scopes

Justification

WebChart EHR’s FHIR API is still newly available to clients, and has no adoption as of writing this plan. FHIR adoption is expected to be slow, but increasing, throughout 2025 leading to improved app support in WebChart EHR as well as increased real world data being available. Until that time when clients are actively using the FHIR API, MIE will conduct testing using a publicly available production sandbox system and a provider app recommended to our clients. As clients continue adoption of the FHIR API, real provider use of the provider app will be reported.

Test Methodology

MIE will report an analysis of authentication and data searches performed with a provider app, drawn from de-identified log files. Specific rates can be reported from the sandbox system, as the automated testing setup indicates which actions should yield successful authentication or data return. For the real world provider data, an overall analysis will be reported, since failures caused by providers correctly being denied access cannot be estimated.
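
For illustration only, the sketch below shows a SMART Backend Services token request of the kind covered by (g)(10)(v)(B), in which a registered backend app authenticates with a signed JWT and requests system scopes. The token endpoint and client assertion are placeholders, not details of any specific provider app.

```python
# Minimal sketch of a system-scope (SMART Backend Services) token request.
# The token endpoint and signed JWT assertion are placeholders; in practice
# the assertion is an RS384-signed JWT built from the registered app's key pair.
import requests

TOKEN_URL = "https://fhir.example.org/oauth2/token"   # placeholder endpoint
SIGNED_CLIENT_ASSERTION = "<RS384-signed JWT for the registered backend app>"

resp = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "scope": "system/Patient.read system/Observation.read",
        "client_assertion_type": "urn:ietf:params:oauth:client-assertion-type:jwt-bearer",
        "client_assertion": SIGNED_CLIENT_ASSERTION,
    },
)
resp.raise_for_status()
access_token = resp.json()["access_token"]  # used for subsequent system-scope queries
```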

Expected Outcome(s)

It is expected that WebChart EHR will be conformant to all (g)(10) requirements and that overall error rates will be low.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 32: Transmission to public health agencies — syndromic surveillance: Create content

Description

This measure will validate that WebChart EHR can produce valid HL7 v2 ADT messages, conforming to the HL7 2.5.1 PHIN Messaging Guide Release 2.0 and associated Erratum, upon a patient’s admission to and discharge from care, as well as upon new registrations or demographic updates.

Associated Certification Criteria

| Certification Criteria | Requirement(s) |
| --- | --- |
| §170.315(f)(2): Transmission to public health agencies — syndromic surveillance | (f)(2) - Create content |

Justification

Per certification requirements, WebChart EHR must be able to produce HL7 v2 ADT messages for specific patients. It is appropriate to distinguish between ambulatory settings and emergency department, urgent care, and inpatient settings.

Test Methodology

We have automated tests that perform the scenarios to register a patient, admit them for both ambulatory and urgent care, make demographic updates, and discharge them. The automated tests validate that the HL7 interface constructs valid ADT messages that conform to the certification specification. RWT in WebChart EHR would include creating 3 separate Refer to Systems endpoints with interface-specific configuration and 3 separate Auto Routes to capture the new registrations, demographic updates, admissions, and discharges. When WebChart EHR is used to register a patient, update demographics, admit, or discharge, the system will automatically produce the HL7 messages and send them to the configured endpoint in an HL7 format that meets the requirements of the NIST validation tool and will be accepted by the public health agency.
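
As a simplified illustration of the message type exercised above, the sketch below holds an invented ADT^A04 (patient registration) example and performs a basic structural check. The field values are made up, the check is far weaker than the NIST validation tool, and the snippet is not part of MIE's actual automated test suite.

```python
# Illustrative sketch only: a simplified ADT^A04 (patient registration)
# message and a basic structural check. Field values are invented examples
# and do not represent a fully PHIN-conformant production message.
SAMPLE_ADT_A04 = "\r".join([
    "MSH|^~\\&|WEBCHART|FACILITY|SS_RECEIVER|PHA|202501011200||ADT^A04^ADT_A01|MSG0001|P|2.5.1",
    "EVN|A04|202501011200",
    "PID|1||12345^^^FACILITY^MR||DOE^JANE||19800101|F",
    "PV1|1|O",
])

def has_required_segments(message: str) -> bool:
    """Check that the minimal segments expected for an ADT event are present."""
    segments = {line.split("|", 1)[0] for line in message.split("\r")}
    return {"MSH", "EVN", "PID", "PV1"}.issubset(segments)

print(has_required_segments(SAMPLE_ADT_A04))  # True
```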

Expected Outcome(s)

It is expected that test failures will be rare and that the automated tests will pass successfully in a minimum of 95% of nightly runs. Any failures are expected to be due to a failure in the testing infrastructure rather than in the functionality of the HL7 interface. In the case of reported WebChart application errors, issues shall be tracked and resolved. In the case of public health agency internal errors, the public health agency shall be notified of the issue.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 33: eCR Reporting

Description

Public health case reporting is required by law in every state and territory. eCR is the automated generation and transmission of case reports that originate from a Healthcare Organization’s Electronic Health Record (EHR) system and are sent to state and local public health agencies (PHAs) for review and action. This measure tracks how many eCRs were created in specific systems during the reporting period, based on trigger codes.

Associated Certification Criteria

| Certification Criteria | Requirement(s) |
| --- | --- |
| §170.315(f)(5): Transmission to public health agencies — electronic case reporting | (f)(5)(i) - Consume trigger codes |
| | (f)(5)(ii) - Match encounter to trigger codes |
| | (f)(5)(iii) - Create case report |

Justification

When a system is configured to report electronic case reports to the appropriate public health agencies for their jurisdiction, we need to verify that eCR CCDAs are being generated based on the consumed trigger codes. An eCR CCDA should be automatically generated, stored in the patient’s chart, and transmitted via Direct to the configured PHA.

Test Methodology

The number of eCR documents generated in configured systems will be tracked throughout the year using database queries, which can be sliced by date to report various trends. Again using database queries, we will report on which codes trigger eCR generation.
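
For illustration only, the sketch below shows the style of reporting query described above, assuming a hypothetical table ecr_documents(created_at, trigger_code). The schema and database file are examples, not the production reporting setup.

```python
# Minimal sketch of the kind of reporting query described above, assuming a
# hypothetical table ecr_documents(created_at, trigger_code). Table and
# column names are illustrative, not the production schema.
import sqlite3

conn = sqlite3.connect("reporting.db")  # placeholder database path

# eCR volume by month, to show trends across the reporting period.
monthly_counts = conn.execute(
    "SELECT strftime('%Y-%m', created_at) AS month, COUNT(*) "
    "FROM ecr_documents GROUP BY month ORDER BY month"
).fetchall()

# Which trigger codes drove report generation.
by_trigger_code = conn.execute(
    "SELECT trigger_code, COUNT(*) FROM ecr_documents "
    "GROUP BY trigger_code ORDER BY COUNT(*) DESC"
).fetchall()

conn.close()
print(monthly_counts)
print(by_trigger_code)
```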

Expected Outcome(s)

Failures should be rare, with a success rate of at least 99% expected when systems are properly configured.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Measure 34: EHI Export Documentation

Description

EHI export documentation including the specifications of the export format should remain accurate and available at all times to users authorized to perform exports.

Associated Certification Criteria

| Certification Criteria | Requirement(s) |
| --- | --- |
| §170.315(b)(10): Electronic Health Information export | (b)(10)(iii) - Documentation |

Justification

EHI export is designed to export the designated record set for a single patient or patient population. Regular code changes are pushed to clients every 2 weeks. These changes will be reviewed during each 2 week cycle and documentation will be updated if any code change would introduce a change to the designated record set, export process, or export format. All documentation will be reviewed for accuracy on a quarterly basis.

Test Methodology

During a standing bi-weekly code review meeting, code changes will be reviewed to determine if any modifications of the EHI export documentation are required. Reviews of the overall documentation, independent of the software lifecycle, will also be conducted on a quarterly basis. The number of documentation reviews and modifications will be tracked. Any required modifications to documentation will be made within 2 weeks of the code change and before the code change is pushed to live client systems.

Expected Outcome(s)

It is expected that updates to the EHI documentation will rarely be necessary; however, all updates will be made promptly, before the live EHI functionality changes in client systems.

Care Setting(s)

Primary care, specialties, pediatrics, small, large

Schedule of Key Milestones

| Key Milestone | Care Setting | Date/Timeframe |
| --- | --- | --- |
| Release of documentation for the Real World Testing to be provided to ACB and providers | All settings | October 31, 2024 |
| Begin collection of information as laid out by the plan | All settings | January 1, 2025 |
| Follow-up with providers and authorized representatives to understand any issues arising with the data collection | All settings | Quarterly, 2025 |
| Data collection and review | All settings | Quarterly, 2025 |
| Additional CQM or criteria certification as determined by the developer | All settings | Q3, 2025 |
| Update standards via SVAP as determined by the developer | All settings | Q3, 2025 |
| End of Real World Testing period/final collection of all data for analysis | All settings | December 31, 2025 |
| Data analysis and report creation | All settings | January, 2026 |
| Submission of Real World Testing Results to ACB | All settings | Per ACB instructions |

Attestation

This Real World Testing plan is complete with all required elements, including measures that address all certification criteria and care settings. All information in this plan is up to date and fully addresses the health IT developer’s Real World Testing requirements.

Authorized Representative Name: Doug Horner
Authorized Representative Email: horner@mieweb.com
Authorized Representative Phone: 260-459-6270
Authorized Representative Signature: Doug Horner
Date: 10/30/2024
