Author: Kerry Barker, RN, BSN
Epic upgrades, integrations, and new implementations are still necessary, even with travel restrictions in place long after anyone expected. Case in point: new units and clinics created to handle COVID-19 testing or vaccine administration. We must continue delivering new hospital and clinic functionality to keep healthcare moving.
The show must go on(line)
To deliver needed Epic upgrades and new functionality, the show must go online. This includes adjusting testing processes to the virtual landscape in which we now work. We no longer have the option of gathering all the required analysts in a room for an in-person testing event. CereCore Epic Services team members are located in different cities, different states, and different time zones and support a large number of Epic-based facilities. Here we discuss some tools and tips that worked best for our team for virtual Epic testing.
- Utilize online tools for tracking, communication, and project management.
Our central point for test scripts and trackers is SharePoint. These scripts and trackers were all created in Microsoft Excel and are small enough to use in Excel Online, so multiple users can update them simultaneously. On occasion, a script or tracking file is too large for online use. In that case, we use SharePoint’s library-style check-out and check-in function to avoid overwriting another team member’s edits. We also made sure our entire team was comfortable with SharePoint by hosting online in-service meetings and providing educational materials.
During testing, we created a dedicated chat space in Webex Teams for each application. During integrated testing, a separate space was created for each integrated script, and the “baton” was passed from user to user in that space along with any issues. We implemented this after first trying a single chat space for all our scripts; we quickly learned the chat became too large, and analysts lost their messages and their spot in the script. With each script in its own chat, users could easily see where the baton had been passed and on which line to take over.
Testing trackers were used to track script updates, data prep, and testing progress. Trackers listed not only the script but also which applications and individuals were involved in testing the integrated scripts, including interfaces and third-party vendors for downstream testing. Defects discovered during testing were entered into a dedicated issue-tracking column. Our scripts include very specific goals to keep the focus on the priority needs for each workflow, and they are kept in an online library where we can update and reuse them as needed.
Another tool we used was Epic’s Orion software for project management. Project managers assigned tasks for the various applications and individuals at each stage of planning. For testing, each testing date was assigned to script owners and application leads to communicate deadlines. We found that more specific tasks were the most helpful for our analysts. This software gave the project management group quick feedback on gaps and successes.
- Create rules of engagement.
At the start of integrated testing, we met as a group and agreed to rules of engagement or etiquette for virtual testing. This reduced friction and established realistic expectations for everyone involved.
One of these rules addressed our team and users being spread across multiple time zones. We set clear expectations that analysts would only be expected to test during business hours (8 am to 5 pm) in their own time zone. A spreadsheet listing each analyst’s time zone was also posted on SharePoint.
Other rules covered how to pass the script to the next analyst, how long each member had to respond to a baton pass, how long each analyst had for their steps, and how to communicate defects. Setting these expectations up front allowed us to deal with issues efficiently and supported script progression.
- Utilize user-friendly tools.
It took some trial and error before we chose Sherlock for the Epic EHR as our tool for defect reporting. We tried various tools for tracking defects; many required obtaining detailed security access, and some were not user-friendly.
The Epic testing toolkits included suggestions for using Sherlock as a defect tracking tool. CereCore Epic analysts were already familiar with Sherlock as a means of reaching Epic for issue resolution, so adding this use case meant less ramp-up and no security changes.
Within Sherlock, we created “tags” for each phase of testing (app testing, integrated testing, etc.) to extract the data easily. For issues where Epic support wasn’t needed, we used the status “Customer – In Progress” and the ticket type “project tracking”. To analyze defects reported during testing, we used the search functionality to search by the specific tag and download the results as an .xml file. From there, we were easily able to create an Excel pivot table for analysis.
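The tag-then-pivot step above can also be scripted. The sketch below is a made-up illustration in Python's standard library: the export structure, attribute names, and application names are hypothetical, since Sherlock's actual .xml schema will differ, but the aggregation mirrors what the pivot table summarizes.

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical export; real Sherlock field names and structure will differ.
SAMPLE_EXPORT = """
<tickets>
  <ticket tag="integrated-testing" application="Willow" status="Customer - In Progress"/>
  <ticket tag="integrated-testing" application="Beaker" status="Customer - In Progress"/>
  <ticket tag="app-testing" application="Willow" status="Closed"/>
</tickets>
"""

def defects_by_application(xml_text: str, tag: str) -> Counter:
    """Count tickets carrying a given testing-phase tag, grouped by
    application -- the same rollup the Excel pivot table produces."""
    root = ET.fromstring(xml_text)
    return Counter(
        ticket.get("application")
        for ticket in root.iter("ticket")
        if ticket.get("tag") == tag
    )
```

Filtering by tag first is what makes the per-phase tags worthwhile: one export can be sliced into app-testing and integrated-testing views without re-running the search.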
- Create an environment of ownership and collaboration.
Internally referred to as the Epic “One” team, our CereCore Epic resources were brought together to support growth and enhance collaboration. This created a unique challenge in that some of these team members had not interacted with each other before. We supported collaboration by empowering multiple script owners (who were application leads) to oversee each script and to create their own community during testing. The testing coordinator oversaw script owners and helped to answer questions and navigate any roadblocks.
Script owner meetings pulled in all project leads for each application needed for their script. The leadership group was intentionally not included in these discussions so the teams could work together, complete the scripts, and have candid conversations. The testing coordinator assigned dates for data prep, script completion, and testing; the rest of the effort was handled by the application analysts. During testing, each script owner reported the status of their script in twice-daily huddles. Any defects were discussed as a group, with all applications represented in the discovery discussion. If a script failed, the team discussed how to retest or mitigate the issues.
Through these virtual testing communities, our team members became much more familiar with each other. They began to quickly identify gaps in workflows that needed to be fixed and to lean on each other’s strengths as they talked through script goals. In my role as testing coordinator, I noticed the team became much more at ease discussing issues openly without leadership needing to intervene.
Results that speak for themselves
We have been able to successfully complete testing virtually for both software upgrades and new implementations. Comparing our most recent upgrade to prior ones, we reduced our scheduled downtime for the move to production by one hour (from a four-hour window to three). We were also able to reduce the time needed to support upgrade issues from one week to three days, because normal incident reporting levels resumed within a couple of days of the upgrade. This can be attributed to multiple factors, such as strong team coordination, transparency in the process, and solid project management, but the results speak for themselves. I hope the insights we’ve learned encourage EHR leaders on any platform to embrace new methods for project execution.