Top E/M Office Visit Chart Audit Findings

In my work with clients, I often identify potential coding issues around the frequency of evaluation and management (E/M) visits compared to a benchmark. For example, based on Medicare distribution data, Chart 1 illustrates possible over-coding (relative to 46 percent level 3 and 53 percent level 4 visits) and under-coding (relative to 0 percent level 2 and 1 percent level 5). Chart 2 shows another example of potential under-coding of established patient visits.
[Chart 1 and Chart 2: practice E/M code distributions compared to Medicare benchmark data]
When we find E/M distributions like those illustrated above, we recommend a small chart audit of E/M codes for the physicians involved. The resulting recommendations tend to be repetitive from practice to practice. Below, we review the top findings from a sampling of recent E/M chart audits.
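The benchmark comparison behind charts like these is simple arithmetic. Here's a minimal sketch in Python, using illustrative benchmark and practice numbers (not figures from any actual audit); the 10-point flagging threshold is an assumption, not a published standard:

```python
# Illustrative benchmark for established patient visits, percent of total.
# These numbers are for demonstration only; pull real figures from CMS data.
MEDICARE_BENCHMARK = {"99212": 0, "99213": 46, "99214": 53, "99215": 1}

def em_distribution(visit_counts):
    """Convert raw visit counts per E/M code to percentages of the total."""
    total = sum(visit_counts.values())
    return {code: round(100 * n / total, 1) for code, n in visit_counts.items()}

def flag_outliers(practice_pct, benchmark_pct, threshold=10):
    """Return codes whose share deviates from benchmark by more than
    `threshold` percentage points (positive = possible over-coding)."""
    flags = {}
    for code, bench in benchmark_pct.items():
        diff = practice_pct.get(code, 0) - bench
        if abs(diff) > threshold:
            flags[code] = diff
    return flags
```

For example, a practice billing 30 level 3, 60 level 4, and 10 level 5 visits would be flagged as light on 99213 relative to this benchmark; codes flagged this way are candidates for the small chart audit described above, not proof of miscoding.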
Common E/M Documentation Shortcomings
Support for E/M coding comes down to documentation. In addition to the NCQA Guidelines for Medical Record Documentation (see the accompanying sidebar “The Basics of Medical Record Documentation”), use the E/M chart auditing scorecard from your practice’s regional Centers for Medicare & Medicaid Services (CMS) contractor. For clients in Colorado, we rely on the “Novitas E/M Score Sheet,” available on the Novitas Solutions website. Using this or a similar template, we have identified common weaknesses in E/M documentation across many practices.
New patient vs. established patient: 
The perspective of a chart auditor is usually limited to the hard-copy printout of the chart received. Unlike the provider, the chart auditor needs to know, within each record, whether the patient is new or established. This is often not clear on each date of service, so we are left to infer it. To support the higher-reimbursing new patient codes, it’s advantageous to note clearly, “This is a new patient,” or similar, within the entry.
Chief complaint:
Chief complaint (CC) is a concise statement that describes the symptom, problem, condition, diagnosis, or reason for the patient encounter. The CC is usually stated in the patient’s own words (e.g., “I’m here for a rash”). Don’t use the CC as an internal note to staff regarding scheduling, who needs to see the patient, or who referred the patient to the practice, for example.

  • Review of systems (ROS): Often, all charts in a practice include a generic statement, such as, “All others negative except those mentioned in HPI.” This is not ideal. A simple change, such as documenting the patient’s actual ROS, can affect the level of history supported.
  • Past/Family/Social history (PFSH): PFSH should be incorporated into the patient paperwork at the practice. A simple change may affect the practice’s level of history (and, therefore, E/M code) supported.

Medication list: 
When the list of medications is pulled forward from every prior visit, it’s rarely clear when it was last updated. When auditing, ask, “What was the old list? What are the new medications?” and “Was it really reviewed during this visit?” Physicians should use more specific language, not template entries such as, “Medication list reviewed and reconciled with the patient.”
Exam: 
Often, it’s difficult to discern whether the 1995 or 1997 Documentation Guidelines for Evaluation and Management Services were used. Establish a practice policy on which exam guidelines are used, whether they are used 100 percent of the time, and when exceptions are made.
Typically, comparing documentation templates (prompts for the providers) to the 1995 and 1997 guidelines reveals areas of extra documentation that can support additional organ systems or body areas examined. An example for vision providers is a prompt in the EHR to document both “Exam of lids” and “Exam of conjunctiva,” instead of a generic prompt, such as “Eyes.” Another example is to be sure that one additional vital sign, such as pulse, respiration, or height is added to the EHR documentation template if only temperature and weight are included.
Further exam element clarifications allow the documentation to align better with the Medicare exam documentation guidelines, which makes chart auditing more straightforward. Examples include:

  • For “Assessment of hearing,” a practice may want to add “conversational speech,” if this is how it’s done in an exam.
  • Change an EHR prompt from “Facial mobility” to “Assessment of facial strength.”
  • Change “Neurologic” from “Higher integrative functions: Normal orientation, memory, attention span and concentration, language, and fund of knowledge” to add “Orientation to time, place and person” and “Mood and affect (e.g., depression, anxiety, agitation).”

Note: We see the most “cloning” in the exam documentation in the EHR. Every chart looks identical, no matter the complexity of the visit. Be careful that the documentation isn’t embellished beyond the reason the patient is there.
Diagnosis coding: 
There’s usually room to improve diagnosis code capture to create a full picture of the complexity of the patient’s story. For example, many conditions are alluded to in the medication list or past medical history, but are overlooked in the ICD-10 list.
Another area of omission is when a provider lists detailed ICD-10 codes in the assessment/plan, but doesn’t pull them onto the claim form for that day’s visit.
Another problem found is the “search and replace” error in an EHR: The ICD-10 codes are correct, but some descriptions remain as they were under ICD-9, or vice versa. For example, J01.90 may be listed as Acute sinusitis, unspecified; however, the updated ICD-10 diagnosis code description is Acute bacterial rhino-sinusitis. Also, be sure to verify how the ICD code is pulled from the chart, and how it is linked to each of the visit’s procedures.
Medical decision-making: 
In general, the first two components of an E/M visit — history and exam — are relatively easy to audit. Code selection typically boils down to medical decision-making (MDM), and that documentation is often lacking. This is the most subjective part of the audit, and providers are well served to improve their MDM documentation:

  • Number of diagnoses: It is often difficult to ascertain if the condition is new or established to the provider, worsening or improving, etc. These documentation cues are useful to include and help auditors to “score” the first component of MDM.
  • Additional services: Did the patient provide the history? Was another physician consulted on the case? Were records reviewed? Usually, the record is silent on these possible areas of additional MDM points.
  • Consultation, laboratory, and imaging reports often are filed in the chart, but are not included in the documentation for the corresponding office visit. Verify the status and timeliness of reviewing results, and make sure follow-up plans are noted for abnormal results. If the order and results are kept outside of the E/M visit, and no mention is made, the provider gets no credit for reviewing/considering these additional sources of clinical information.
  • Table of Risk: In the absence of a specialty-specific Table of Risk, we make a lot of interpretations (judgment calls) on where things fit in the Table of Risk (what is low? moderate? etc.) For example, do antibiotic injections at the time of the visit count as prescription drug management? We recommend additional research on what is supported here, and this might be a great area of collaboration for AAPC members. For another example, “Where would a nebulizer treatment fall in the Table of Risk?”

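Once the three MDM elements above are scored, many audit tools (the Marshfield Clinic-style score sheets commonly used with the 1995/1997 guidelines) set the overall MDM at the level met by at least two of the three elements. This is a sketch of that two-of-three rule, not a statement of official CMS policy:

```python
# MDM levels in ascending order, per the 1995/1997 audit-sheet convention.
MDM_LEVELS = ["straightforward", "low", "moderate", "high"]

def overall_mdm(diagnoses, data, risk):
    """Return the overall MDM level: the highest level that at least
    two of the three elements (number of diagnoses, amount/complexity
    of data, table of risk) meet or exceed."""
    ranks = sorted(MDM_LEVELS.index(x) for x in (diagnoses, data, risk))
    # The middle value is, by definition, met or exceeded by two elements.
    return MDM_LEVELS[ranks[1]]
```

For instance, a visit scoring moderate on diagnoses, low on data, and high on risk nets out at moderate MDM, which is why a silent record on data review (no noted records, labs, or consults) can drag the whole visit down a level.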
Time: 
Rarely are face-to-face start and end times with the patient documented; instead, providers often use template language such as, “I spent 50 percent of the visit counseling and coordinating care.” Additional support is necessary if time, rather than history, exam, and MDM, is relied on to determine the E/M service level.
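When time does drive the code, the logic can be sketched as follows. The typical times are the pre-2021 CPT® figures for established patient office visits; treating “highest code whose typical time is met” as the selection rule is one common interpretation, so verify against your contractor’s guidance:

```python
# Pre-2021 CPT typical face-to-face times (minutes), established patients.
TYPICAL_MINUTES = {"99212": 10, "99213": 15, "99214": 25, "99215": 40}

def time_based_code(total_minutes, counseling_minutes):
    """If counseling/coordination of care dominated (more than half of
    the face-to-face time), return the highest code whose typical time
    the documented total meets; otherwise return None (code on
    history, exam, and MDM instead)."""
    if counseling_minutes * 2 <= total_minutes:
        return None  # counseling did not dominate the visit
    eligible = [c for c, t in TYPICAL_MINUTES.items() if total_minutes >= t]
    return max(eligible, key=TYPICAL_MINUTES.get) if eligible else None
```

Note that the sketch needs both numbers, which is exactly why documented start/end (or total and counseling) times matter: a bare “50 percent of the visit” statement gives the auditor neither input.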
When auditing E/M visits, also look at the charges billed on the claim. We often see areas for improvement, such as billing for supplies, correct doses of injections, and arranging CPT® codes in descending relative value unit order.

Medicare Data Sets the Standard for E/M Distribution

Because Medicare primarily serves patients aged 65 and over, Medicare claims data isn’t necessarily comparable to all patients in a practice; however, the “bell curve” that Medicare’s data illustrates for established and new patient visits is an industry norm. In the absence of electronic health record (EHR) benchmark data for a certain specialty, we use the Medicare MEDPAR claims database, which has data from 2015. You can find Medicare E/M distribution data on the CMS website.

The Basics of Medical Record Documentation

One tool you can use to look at the overall acceptability of a practice’s documentation is the National Committee for Quality Assurance (NCQA) Guidelines for Medical Record Documentation. This is the industry standard for consistent, current, and complete documentation in the medical record.
There are 21 elements that reflect commonly accepted standards for medical record documentation. The NCQA considers six of the 21 elements to be core components, indicated by an asterisk (*). Based on our sample of chart audits, practices’ medical records most often have problems in the following areas:

  • Personal Biographical and Patient Data: The No. 1 item on the NCQA list is, “Each page in the record contains the patient name or ID number.” The second item on the list stresses the importance of having the patient’s basic demographic information, and that it is easy to read. You may augment personal biographical data to include address, employer, home and work telephone numbers, and marital status.
  • Authorship: This is No. 3 on the NCQA list. All entries should contain the author’s identification. In most audited charts, there is no indication of who is entering the items in the medical record (e.g., scribe, medical assistant, physician, nurse). Be sure this function is turned “on” in your EHR by entry (not for the entire chart).
  • Medication Allergies: This core component of the NCQA list is No. 7. Audit findings often show no documentation of medication allergies, adverse reactions, or no known allergies (NKA) status. If a patient has allergies, they should be noted prominently in the medical record, especially medication allergies.
  • Follow-up: Item No. 18 on the NCQA list is “Consultation and abnormal laboratory and imaging study results have an explicit notation in the record of follow-up plans.” Notes should include follow-up care, calls, or visits, if applicable. The specific time of return should be noted in weeks or months, or as needed.
  • Chart Sign-off: This isn’t on the NCQA list, but Medicare requires that chart documentation be completed within 48 hours. This is a significant, common problem area. For example, one practice averaged nine days before physicians signed off on charts, with a range of 0-58 days. This meant claims often were filed and paid before the chart was complete.


Marcia Brauchler

Marcia Brauchler advises clients on how best to negotiate managed care contracts, increase reimbursements to the practice, and stay in compliance with healthcare laws. Her firm sells updated HIPAA policies and procedures. She is a member of the South Denver, Colorado, local chapter.
