Data migration is the process of transferring data between different storage types, formats, or systems, and is an important consideration for clients replacing an existing EHR. At WebChart, we have imported data from a number of different systems, including Medgate, OHM, Sentry, Flow Gemini, many other custom databases, and countless proprietary spreadsheets. All of these imports follow a similar process, as described in this document.
The client should weigh the cost of discrete data conversion against having users or clinicians manually enter discrete values after reviewing historical summary documents with the patient. Converting discrete data into documents is more cost-effective for medications, injections, and other types of discrete data that may be of limited quality and are routinely reviewed and re-entered in the course of patient visits.
The client must provide a backup or snapshot of their legacy database (DB). A mysqldump database backup is ideal. MS Access database files, CSV, or other delimited text (UTF-8) backup files are also acceptable. Text files of legacy data must follow MIE's data requirements for formatting.
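As a hedged illustration of the delimited-text option, the sketch below exports a legacy table to a UTF-8 CSV file. The table and column names are hypothetical stand-ins, and sqlite3 stands in for the real legacy database, which would be queried through its own driver (MySQL, MS Access via ODBC, etc.).

```python
# Minimal sketch: exporting one legacy table to a UTF-8 CSV file.
# Table/column names are illustrative only.
import csv
import sqlite3

# Stand-in for the legacy database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clinicvisit (visit_id INTEGER, visit_date TEXT, provider TEXT)")
conn.execute("INSERT INTO clinicvisit VALUES (1, '2015-03-02', 'Dr. Smith')")

cursor = conn.execute("SELECT * FROM clinicvisit")
header = [col[0] for col in cursor.description]

# newline='' and encoding='utf-8' keep the output compliant with
# typical delimited-text import requirements.
with open("clinicvisit.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(cursor)
```

One file per table, with a header row naming every column, keeps the export unambiguous for the import side.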
WebChart may optionally add the legacy DB tables to the client's WC system DB as custom tables, which can be accessed and queried later using the system's built-in reporting and querying capabilities. This allows errors or omissions to be corrected even after go-live.

WC takes the table and column data provided by the backup or snapshot of the legacy DB and loads the schema into a data migration map in a spreadsheet. The data migration map documents all tables and columns from the DB, along with usage statistics and (optionally) sample data from the backup for each column.

**Tools for Data Acquisition**

The following tools may be useful in exporting data to a format we can consume.
- MySQL Workbench can export MSSQL databases to MySQL backup files: http://www.mysql.com/products/workbench/migrate/
- MSSQL tools can export to Access, and we'll run MDB Tools to load the Access backups into MySQL
- csvkit can also read DBs to create properly formatted CSV files: https://csvkit.readthedocs.io
See this document for notes on building Python RPMs and packages for csvkit, wc-map-database, and MIE File Import.
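The data migration map described above can be sketched as a small script that walks every table and column, recording the row count, how often each column is actually populated, and one sample value. This is an illustrative sketch, not MIE's wc-map-database tool; sqlite3 and the `immunizations` table stand in for the real legacy DB.

```python
# Hedged sketch of building a data-migration map: for every table and
# column, record row count, populated count, and one sample value.
import csv
import sqlite3

# Stand-in legacy database with illustrative data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE immunizations (emp_id INTEGER, vaccine TEXT, given TEXT)")
conn.execute("INSERT INTO immunizations VALUES (1, 'Influenza', '2014-10-01')")
conn.execute("INSERT INTO immunizations VALUES (2, 'Tdap', NULL)")

rows = []
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
for table in tables:
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    for _, name, *rest in conn.execute(f"PRAGMA table_info({table})"):
        # COUNT(column) counts only non-NULL values: a usage statistic.
        populated = conn.execute(
            f"SELECT COUNT({name}) FROM {table}").fetchone()[0]
        sample = conn.execute(
            f"SELECT {name} FROM {table} WHERE {name} IS NOT NULL LIMIT 1"
        ).fetchone()
        rows.append([table, name, total, populated,
                     sample[0] if sample else ""])

with open("migration_map.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["table", "column", "rows", "populated", "sample"])
    writer.writerows(rows)
```

The resulting spreadsheet gives the mapping team an at-a-glance view of which columns hold real data and which are effectively empty.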
The Onboarding Coordinators and Electronic Data Interchange (EDI) team review the data migration map with the client's legacy DB subject matter experts (SMEs) to assist in the data mapping. It is imperative that the client understands the legacy DB and specifies which tables and columns should be extracted, either as historical summary documents or as discrete data where necessary. For discrete data conversion requests that are not part of the standard migration, WC references a library of migration code to search for existing modules and data previously encountered. Discrete data that is not supported by the standard legacy data import tools requires custom interfaces that are scoped separately from the standard legacy data conversion. The contract or statement of work may also dictate what is or is not considered in scope for a data migration project. These issues are fleshed out during discovery.
Once the scope is understood, WC loads the legacy data into the client’s development (dev) system or a separate sandbox for legacy data conversion testing. The EDI team uses the in-house library of migration code and/or the data migration map to pull data from the legacy tables into historical summary documents, encounters, and discrete data. During this phase, rapid development and testing occurs in the following manner:
- Screenshots from the legacy system are captured and deidentified to use for internal verification of migrated data for each migration module.
- Modules are tested internally after they are converted using the standard legacy data conversion tools or custom interfaces as needed.
- Modules are tested by the client in a testing sandbox or their dev system with migrated data.
- Modules in the data migration map are verified as complete as they are approved by the client.
- Modules that require bug fixes that are the responsibility of WC are fixed.
- Modules that the client finds incomplete or were not mapped correctly during the data migration mapping phase are subject to the WC change control process.
- Repeat these steps until all modules are verified by the client.
After the client has verified the migration modules, WC and the client complete a timed final dress rehearsal. The following process is completed for the timed dress rehearsal:

1. WC creates a clean, configured environment where the data may be loaded for testing. Typically, this is a clone of the configured production system before go-live.
2. The client provides a new snapshot/backup of the data.
3. WC obtains the data.
4. WC optionally converts the data to the client's WC DB as custom tables in the newly created target environment.
5. WC performs the data migration along with any custom interfaces to convert the data to documents and discrete data. Each step is timed. A data migration checklist is used to document the steps and timing for all steps.
6. WC onboarding specialists perform internal verification as data is loaded into the system.
7. WC onboarding specialists perform any post-process configuration that is required after data is migrated. Again, all steps are documented on the data migration checklist.
8. The client validates that the data in the dev system or sandbox is correct. Validation is also timed and documented.
9. The client provides a "go" or "no-go" to WC to signal the end of timing.
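The timed-checklist bookkeeping behind the dress rehearsal can be sketched as follows. This is an illustrative sketch, not MIE's actual tooling; the step names and `timed_step` helper are hypothetical.

```python
# Illustrative sketch of a timed data migration checklist: each step
# is timed and recorded so the dress rehearsal can be used to predict
# the go-live window.
import time
from datetime import timedelta

checklist = []  # (step description, elapsed seconds)

def timed_step(description, action):
    """Run one checklist step and record how long it took."""
    start = time.monotonic()
    action()
    checklist.append((description, time.monotonic() - start))

# Hypothetical steps; sleep() stands in for real migration work.
timed_step("Load legacy tables", lambda: time.sleep(0.01))
timed_step("Convert to summary documents", lambda: time.sleep(0.01))

for description, seconds in checklist:
    print(f"{description}: {timedelta(seconds=round(seconds, 2))}")
```

Summing the recorded durations from the rehearsal gives a defensible estimate of the downtime window needed at go-live.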
If a “no-go” is provided, a meeting is scheduled to discuss the issues that prevented a successful conversion. Changes are made as needed, adhering to the change control process if applicable. The data is re-converted in the dev system or sandbox if possible, or the system is cleaned and reloaded if necessary, and the client validates again.
The go-live procedure is similar to the dress rehearsal. The data migration checklist created during the dress rehearsal is used to migrate the data step-by-step. The following is a sample procedure for the go-live of a migrated system:
- The client stops using their historical product and puts it in a read-only state. Depending on the size of the database and timing of the migration, this is often done on a Friday after the end of normal business hours while data is migrated over a weekend. If more time is needed to load the data and allow for validation, then the schedule is adjusted to allow for extra time. Note that while the historical product is in a read-only state and the WC system is not yet loaded with data, clinics will only be able to reference the legacy system. Any new data will have to be logged on paper charts and entered into the WC system after go-live.
- The client exports the data. For example, a Medgate migration typically exports the following tables, each converted to the target noted in parentheses:
- audiometry (discrete data)
- caseclinichistory (summary documents linked to incidents)
- casemaster (summary documents plus discrete incidents)
- clinicaltest (summary documents)
- clinicvisit (discrete encounters)
- clinicvisitreferrals (summary documents with discrete orders)
- contraindications (discrete observations)
- employeedocuments (documents)
- employeeexposuregroup (summary documents)
- employeelosttime (discrete data)
- employeerestrictions (discrete data)
- immunizations (discrete vaccines, observations and orders for tests, plus summary documents)
- medications (summary documents)
- pft (discrete data)
- questionnaireresponses (summary documents plus discrete observations)
- respiratorfittesting (discrete data)
- visiontests (summary documents plus discrete observations and lab requests)
Our experience with Medgate has been converting versions 6.0 SP4c - rev 79885 and 6.0 SP3a - rev 46384. See Medgate Conversion Queries for example queries used.
- Vision Health
- Past Work History
- Blood Chemistry
- Micro Urinalysis