While a great resource, technology should never replace the clinical decision-making of a licensed nurse or interdisciplinary team (IDT) member. The resident assessment instrument (RAI) process is complex and case-specific; it requires professional expertise to navigate appropriately.

While technology mishaps can affect anyone, the nurse assessment coordinator (NAC) and the IDT members responsible for coding the minimum data set (MDS) are particularly vulnerable. In addition to the complexities of the process, the software itself can present its own challenges. Inexperience, second-guessing a decision, or being overly trusting of the technology often leads to coding mistakes, which can have an adverse effect on care plans, resident outcomes, survey results, and reimbursement. By following these two steps, the NAC and IDT members can reduce the risk of technology-induced errors.

1. Validate autofill data.
When juggling many tasks and multiple deadlines, it can be tempting to simply accept information that automatically populates into the MDS as correct, but doing so creates a risk of inaccurate coding. When signing for an MDS item or section at Z0400, the assessor attests to the accuracy of the coding to the best of their knowledge. The attestation statement does not give the assessor a pass simply because the information was pulled from elsewhere in the electronic health record (EHR).

Consider the following scenarios:

  • Delayed data entry. Mr. Linden experiences a decline in function and ambulation and meets the criteria for a Significant Change in Status Assessment (SCSA). Upon identification of the decline, the physician orders physical therapy. The NAC sets the assessment reference date (ARD) of the SCSA to capture the first five days of therapy. The facility where Mr. Linden resides is in a Medicaid case-mix state, so capturing five days of therapy is necessary to achieve a rehab case-mix group. The morning after the ARD, the NAC refreshes section O of the MDS module to automatically populate the therapy days, minutes, and modalities from the therapy EHR. The NAC is not aware that the therapy software was down the day before, so the therapist has not yet entered all treatment minutes. The NAC signs the completion of the MDS without double-checking the case-mix score to ensure achievement of a rehab group.
  • Not using all available data. The MDS software pulls information for section G, activities of daily living (ADLs), from the nurse aide electronic documentation. The software is very complex and calculates the Rule of 3 based on the episodes of care documented throughout the seven-day look-back period. The nurse aide documentation reflects that Mrs. Buckthorn did not ambulate during the look-back period. The NAC accepts the auto-populated ADL coding without verification. The NAC also codes Mrs. Buckthorn's restorative programs in section O, which include a walking program. The documentation supports that Mrs. Buckthorn ambulated with a four-wheeled walker while the restorative aide provided balance support and guided maneuvering for 25 feet in the corridor on six days during the look-back period. The NAC does not recognize the error in the ADL coding in section G and does not consider other supporting documentation in the medical record, observation, or discussion with direct care staff.

When using autofill for any MDS item, it is important that the facility has a process to prevent errors. For example, before signing the completion of any auto-filled item, the IDT members must refresh the MDS assessment with any new or updated information, such as the Medicaid number or other data. For any auto-filled MDS item, the IDT must validate the source of the information against the medical record, a source document, or a report. Additionally, if the MDS item autofills from one source of supporting data, it is important to also consider all other medical record documentation that would impact coding. For some items, the coder may also need to interview direct care staff to validate the item.
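
As one way to picture such a process, the simplified sketch below models a facility-level validation checklist in which an auto-filled item cannot be signed until it has been refreshed, traced to a source, and, when required, discussed with direct care staff. The item names, data fields, and functions are illustrative assumptions only; they do not reflect any particular MDS software or vendor interface.

```python
# A simplified, hypothetical validation checklist for auto-filled MDS items.
# Item names, fields, and functions are illustrative assumptions only and do
# not represent any specific MDS software or vendor interface.
from dataclasses import dataclass, field

@dataclass
class AutofilledItem:
    item: str                        # e.g., "Section O therapy minutes"
    pulled_from: str                 # system the value auto-populated from
    refreshed: bool = False          # re-pulled after late or corrected entries
    sources_checked: list = field(default_factory=list)  # record, source doc, or report
    staff_interviewed: bool = False  # only some items require this step

def ready_to_sign(item: AutofilledItem, interview_required: bool = False) -> bool:
    """The assessor signs only after completing the human checks."""
    if not item.refreshed:
        return False   # e.g., the therapy EHR was down and charting is incomplete
    if not item.sources_checked:
        return False   # the value must be traced to supporting documentation
    if interview_required and not item.staff_interviewed:
        return False   # direct care staff input is still needed
    return True

# Mr. Linden's scenario: section O pulled before the therapist finished charting
therapy = AutofilledItem(item="Section O therapy minutes", pulled_from="therapy EHR")
print(ready_to_sign(therapy))  # False -- refresh and source validation still pending
```

The point of the sketch is simply that signing is gated on the human verification steps, not on the software having pulled a value.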

2. Use clinical decision-making when considering MDS coding suggestions.
MDS software and scrubbers are critical tools that can help ensure the integrity and accuracy of the assessment—when used correctly. Such software may automatically check or suggest diagnoses for section I or provide helpful tips on MDS coding. However, the assessor who signs the completion of these MDS items must ensure accuracy based on RAI coding instructions.
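
One way to keep that division of labor explicit is to treat any software-suggested diagnosis as pending until a clinician confirms it meets the active-diagnosis criteria. The hypothetical sketch below illustrates that idea; the class, field names, and logic are assumptions for illustration, not the data model of any real product.

```python
# Hypothetical sketch: a software-suggested section I diagnosis remains pending
# until a clinician confirms the active-diagnosis criteria. Names and fields are
# illustrative assumptions, not any vendor's actual data model.
from dataclasses import dataclass

@dataclass
class SuggestedDiagnosis:
    mds_item: str                      # e.g., "I2000" (pneumonia)
    on_signed_dx_list: bool            # supported by a physician-signed diagnosis list
    active_in_lookback: bool = False   # clinician judgment; never set by the software

    def may_code(self) -> bool:
        # Both conditions are required; the second is a human decision.
        return self.on_signed_dx_list and self.active_in_lookback

suggestion = SuggestedDiagnosis(mds_item="I2000", on_signed_dx_list=True)
print(suggestion.may_code())          # False until the clinical review is done
suggestion.active_in_lookback = True  # set only after the clinician's determination
print(suggestion.may_code())          # True
```

Whether or not a facility's software works this way, the underlying habit is the same: the checkbox the software sets is a suggestion, and the active-diagnosis determination belongs to the clinician.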

Consider these scenarios:

  • Software suggests or automatically checks a diagnosis based on the diagnosis list. Mrs. Elderberry was admitted nearly three months ago with a diagnosis of pneumonia and a functional decline. The NAC is completing a quarterly assessment on Mrs. Elderberry and has just refreshed the diagnoses for section I. The software has checked I2000, indicating an active diagnosis of pneumonia. The NAC checks the diagnosis list, verifies that the physician signed it within the last 60 days, and accepts the diagnosis in section I. However, the NAC has failed to follow all of the MDS coding instructions for section I, which also require that the diagnosis be active during the seven-day look-back period. The NAC must make a clinical decision to determine whether the pneumonia diagnosis has a "direct relationship to the resident's current functional, cognitive, or mood or behavior status, medical treatments, nursing monitoring, or risk of death during the 7-day look-back period" (RAI User's Manual, p. I-7). The software cannot make this decision.
  • Scrubber software suggests changing coding. Mr. Spruce recently had a surgical procedure. The documentation during the seven-day look-back period supports the presence of a surgical wound and surgical wound care; however, the NAC codes only M1040E, Surgical wound, and misses coding the surgical wound care at M1200F on the SCSA. Prior to submitting the assessment, the NAC uses the scrubber software per the facility policy. It generates 12 alerts, including one asking whether the resident received surgical wound care since a surgical wound is present. The NAC reviews the first several alerts, notes that no changes are needed, and, to hurry the process, resolves the remaining alerts without reviewing them. The NAC submits the assessment with the surgical wound care still omitted.

The errors in these scenarios are avoidable. The NAC or IDT member responsible for signing for the item must complete the clinical decision-making process, apply the RAI User's Manual coding instructions, and validate that the coding is correct. Scrubber software is an additional check that supplements the clinician's efforts; its alerts and suggestions should never be applied or dismissed without working through the clinical decision-making process.
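
To make that principle concrete, the hypothetical sketch below models an alert workflow in which every scrubber alert must be resolved with an explicit clinician decision and rationale before the assessment can be submitted. The alert structure and function names are assumptions for illustration and do not describe any real scrubber product.

```python
# Hypothetical sketch: every scrubber alert needs an explicit clinician decision
# and rationale -- there is no "resolve all" shortcut. The alert fields and
# functions are illustrative assumptions, not a real scrubber's behavior.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScrubberAlert:
    message: str                    # e.g., "Surgical wound at M1040E but no wound care at M1200F"
    decision: Optional[str] = None  # "update_coding" or "keep_coding_as_is"
    rationale: Optional[str] = None # the clinical reasoning behind the decision

def resolve(alert: ScrubberAlert, decision: str, rationale: str) -> None:
    """Record the clinician's decision; a rationale is required."""
    if not rationale.strip():
        raise ValueError("A clinical rationale is required to resolve an alert.")
    alert.decision = decision
    alert.rationale = rationale

def ready_to_submit(alerts: list) -> bool:
    """The assessment is not submitted while any alert lacks a reviewed decision."""
    return all(a.decision is not None and a.rationale for a in alerts)

# Mr. Spruce's scenario: the wound-care alert gets a reasoned resolution, not a bulk click
alerts = [ScrubberAlert("Surgical wound at M1040E but no surgical wound care at M1200F")]
resolve(alerts[0], "update_coding", "Wound care documented daily in the look-back; code M1200F.")
print(ready_to_submit(alerts))  # True only once every alert has been reviewed
```

However a given scrubber actually behaves, the workflow discipline is the same: each alert gets a reasoned decision, and bulk-resolving alerts to save time defeats the purpose of running the scrubber at all.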

The bottom line: technology is great, but it cannot replace the clinical decision-making of the human brain.

Jessie McGill, RN, RAC-MT, RAC-MTA, is a curriculum development specialist for the American Association of Post-Acute Care Nursing (AAPACN).