Annotation Tool

Medal 2019
About Medal

Medal enterprise software enables secure, HIPAA-compliant sharing and cataloguing of clinical information from any electronic medical records system, including reports and summaries parsed with natural language processing. Simple and inexpensive, it installs in seconds, creating a virtual, secure, shared record for each patient encounter while managing access and consent. Up-to-date individual data is collected and used to build holistic patient profiles that feed decision support for providers and patients, driving improved health outcomes.

What are we solving for?

Over 450 million faxes and forms are submitted by members and providers every year, often with missing data, and the cost of manually entering that data is high.

Medical Errors: Human data entry errors are bound to happen. Patient test results and reports can contain a myriad of data points, and clinicians making decisions in real time rely on those data points being accurate to correctly diagnose and treat patients. If a report is incorrect or missed entirely, it can drastically threaten patient care: medical error is the third leading cause of death in the U.S.

High Administrative Costs: Manually processing documents, such as recording test results, submitting insurance claims, or completing the prior authorization process, is physically demanding and time-consuming. Time that could be spent on patient care is instead spent on processes that can be automated faster and with higher accuracy.

  • Automate data entry to save time and costs
  • Increase data accuracy to reduce catastrophic errors
  • Introduce a co-editing, teamwork aspect to reduce errors
  • Introduce versioning for annotations so every annotator can edit independently

The Team
  • 2 Front-end Engineers
  • 1 Product Manager
  • 2 Back-end Engineers
  • 1 Machine Learning advisor

And me, tasked with leading all design efforts for Medal.

Medical Annotators want to
  • be able to automate the annotation process
  • have the ability to collaborate and correct AI-driven annotations

Admins want to
  • be able to view, edit, and manage teams of annotators on any given record
  • have the ability to customize their menu so they have control over the organization of labels

Figuring out Versioning

The next step was figuring out how versioning would actually work when multiple people annotate the same document. We decided to save a version per person's unique session, with each session visible only to the original annotator and the admin or manager overseeing the project. The MVP features weren't fleshed out yet, but this was a good point to start prioritizing the roadmap. For example, being able to create a custom category was essential, since there's no way to predict how many or which labels would be needed.
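The per-session visibility rule described above could be modeled along these lines; a minimal TypeScript sketch with hypothetical names, not Medal's actual data model:

```typescript
// One annotation version per annotator session (illustrative shape).
interface AnnotationVersion {
  id: string;
  documentId: string;
  annotatorId: string; // the session owner
  createdAt: Date;
  labels: string[];
}

// A version is visible only to its original annotator and to admins
// (or managers) overseeing the project.
function canView(
  version: AnnotationVersion,
  userId: string,
  isAdmin: boolean
): boolean {
  return isAdmin || version.annotatorId === userId;
}
```

The design choice here is that visibility is derived from ownership rather than stored per version, so no access list needs updating when annotators join a project.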

Brainstorming a Solution

The first step in designing a solution starts with a sketch. I brainstormed how a team of healthcare professionals could collaborate and thought a task list could make sense. But building a notification system that linked to versions of an annotated medical document was not feasible within our timeline and could raise HIPAA concerns. The vaguest email message we could send would be "An update on < redacted file name > made on < date >", and that did not seem useful.

I started to experiment with post-its to figure out a different way to capture versions by different annotators. I had the idea of a changelog that would be viewable to users whenever a new version of annotations was available. I still played with the idea of sending an email notification to redirect them to the annotation view, since there was no other way to notify users outside the platform at the time. This post-it exercise helped define the skeleton of what I wanted to achieve in the final designs: capture versions, make those versions accessible, and notify users of changes. But there was a pivotal part missing: how were we going to let users better navigate the menu for annotating records? At this point, the annotation menu was one massive dropdown with a search field at the top. It was a known issue that the experience was frustrating because the menu required scrolling and lacked any sort of organization.

Annotation Menu Explorations

I explored how the annotation menu could function with hotkeys after hosting a couple of user interviews. Both interviewees used Medal's annotation system, and both mentioned wanting a faster, more efficient way to navigate the menu. The menus can be vast, and while searching for a label was useful, it still required some scrolling to find the exact label they were looking for. We discussed hotkeys as a method of quickly navigating the menu, and these are those explorations:
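One way a hotkey scheme like this could work is mapping number keys to label categories so annotators jump straight to a section of the menu instead of scrolling; a sketch in TypeScript, where the key bindings and category names are illustrative assumptions:

```typescript
// Hypothetical hotkey map: number keys jump to a label category,
// mirroring the default category colors mentioned later in the case study.
const categoryHotkeys: Record<string, string> = {
  "1": "Problems",
  "2": "Allergies",
  "3": "Medications",
};

// Returns true if the keypress was consumed as a hotkey; otherwise the
// keystroke falls through to the menu's search field.
function handleKey(
  key: string,
  openCategory: (name: string) => void
): boolean {
  const category = categoryHotkeys[key];
  if (!category) return false;
  openCategory(category);
  return true;
}
```

In a real menu component, `handleKey` would be wired to a keydown listener, with the fallthrough case keeping the existing search behavior intact.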

Final Design
Annotations for Medical Records

After a couple more iterations, we landed on the designs below. I introduced color-coded labels to make it easy to distinguish labels from one another at a glance. These colors can be customized through the Admin Portal. For the default colors, we adhered to how Medal color-coded parts of their patient summary page: blue for problems, purple for allergies, and orange for medications.

Manage Annotation Mocks

Below are mocks showing how users would manage their annotations from the Admin Portal. The general flow: admins log in, click "Manage Annotations", and either upload a CSV of labels (this bulk import was implemented later) or add each label category manually. Once a category was added, users could add individual labels to it. Requiring categories before labels was an effort to make users think about the organization of their working menu of labels; we are essentially letting admins design their own labeling dropdown menu for their teams. While creating a label, users can also customize the label color to help differentiate labels. My main concern was contrast, so we implemented a live preview to show users what the labels would look like and forced a white or black font color based on the hex code picked.
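The white-or-black font rule can be implemented with the WCAG relative-luminance formula: compute the label color's luminance, then pick whichever text color yields the higher contrast ratio. A sketch in TypeScript with illustrative function names, not Medal's actual code:

```typescript
// WCAG 2.x relative luminance of a hex color like "#ffa500".
function relativeLuminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16);
  const [r, g, b] = [(n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff].map((c) => {
    const s = c / 255;
    // Linearize the sRGB channel value.
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Pick black or white text, whichever contrasts more with the label color.
// Contrast ratio is (lighter + 0.05) / (darker + 0.05).
function textColorFor(labelHex: string): "black" | "white" {
  const L = relativeLuminance(labelHex);
  const contrastWithBlack = (L + 0.05) / 0.05;
  const contrastWithWhite = 1.05 / (L + 0.05);
  return contrastWithBlack >= contrastWithWhite ? "black" : "white";
}
```

Wired into the live preview, this runs on every change of the color picker, so admins immediately see a readable label regardless of the hex code they choose.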

User Flow
Editor View

This is a basic mock flow of how individual editors would create their own annotations. Each version is only viewable to the original annotator and the admin.

User Flow
Admin View Experience

An admin logging in to view annotations will be able to see everyone's contribution to the document and can select which one will be shown on the patient profile. Admins will also be allowed to invite specific users to annotate any version.

We should test more often.

Without interviewing users, I wouldn't have realized how frustrating the menu could be to use. Because of its length and position, the menu would sometimes overlay pertinent information the user needed to see to annotate accurately. I recorded my session and delivered my insights to the team, prioritizing the menu redesign as part of annotations because I knew it would add value and improve the user experience.