Get Answers to 5 Burning AI Questions From Coding Industry Experts
Could coders be used to train AI models in the future? Medical coders and auditors have voiced concerns about the growth and expansion of artificial intelligence (AI) into the revenue cycle management (RCM) space. At AAPC’s AUDITCON 2025, panelists held an information-rich session in which they candidly answered attendees’ questions about AI in medical coding and what they see for the future. Read on to find out what the panelists had to say about AI’s role in medical coding and auditing.

What Skills Should Coders and Auditors Focus On as AI Evolves?

One skill that coders and auditors should build as AI continues to evolve is critical thinking. Until an AI model is properly trained for your organization, it can generate incorrect results, look to the wrong information in a medical record to suggest codes, and misinterpret guidelines. If your critical thinking skills are strong, you have less chance of being replaced by an algorithm.

“What the AI can’t replace is the human aspect of looking at a chart from top to bottom and saying, ‘Yes, this does appear in the medical record documentation, but with all the other context of the record, it doesn’t make sense,’” said Colleen Gianatasio, CPC, CPCO, CPC-P, CDEO, CPMA, CPPM, CRC, CPC-I, during the session.

Gianatasio encouraged attendees to review their organization’s internal policies, evaluate whether staff are reading the information in medical records correctly, and check whether official guidance, rather than an arbitrary internal policy, is being applied.

Another area where critical thinking comes into play is applying regulatory and compliance expertise correctly. “AI might be able to assign codes quickly, but it struggles with clinical nuance, right? It can’t really, truly reflect medical necessity or apply guideline exceptions,” answered Rhonda Buckholtz, CPC, CDEO, CPMA, CRC, CENTC, CGSC, COBGC, COPC, CPEDC, AAPC Approved Instructor, when posed the question.
How Do You Ensure AI Is Interpreting Guidelines Correctly?

Once again, your critical thinking skills, as well as your ability to verify information, are crucial to making sure AI is interpreting the official coding guidelines correctly. The AI is a tool that can point you to an ICD-10-CM official guideline, but it is then on you to verify that the guideline does in fact apply to the AI-suggested code or codes.

“I think our roles will be augmented. It’s going to be more of the validation and monitoring of the outputs. You’re never going to take the answer as the answer without verifying it. You’re going to audit the output and make sure that it’s agreed on,” said Raemarie Jimenez, CPC, CDEO, CIC, CPB, CPMA, CPPM, CPC-I, CANPC, CRHC, president of Membership and Content at AAPC.

How Can You Document and Defend AI-Assisted Decisions if You’re Audited?

Ultimately, the buck stops with you, so you need to know who built the AI tool your organization is using.

Consider this scenario: Your physician coded medical records, and you approved the claims for submission. Later, an external entity audited the claims and found several records to be incorrect. If the auditor asked who was responsible for the incorrect codes, you couldn’t blame the provider, because it was your responsibility to review the codes before submission. The same goes for claims coded by AI.

“That’s where you have to know who built your AI. You’re responsible for whatever you put out there. The buck stops with you, so you really do need to know how the AI rules were built,” Gianatasio said.

You should perform regular audits of your vendors, and especially of vendor contracts, so you know what they’re supposed to be doing and what they’re not doing. “Whether it’s technology or a person in your organization that created that output, you are still responsible,” Jimenez added.
How Do You Make Sure Innovation Doesn’t Outpace Governance?

You can’t, because innovation will always outpace governance. “You can’t make up the rules before you know what you’re making the rules up for,” Buckholtz said. However, you can examine a new technology or innovation to see all the different ways a device or software will interact with your systems, then put rules in place to protect yourself and the organization until industry regulations are established.

What Will Coders and Auditors Be Doing 5 Years From Now?

The growth of AI will offer ample opportunities for professionals in the RCM space, such as medical coders and auditors, to train and retrain AI models. Medical codes, healthcare rules, and AI regulations change frequently, so human personnel will be needed to ensure the models are operating correctly. At the same time, coders will have a chance to be in the room as AI models are developed and tested.

“As somebody who has purchased a lot of different solutions from vendors, a vendor who brings a coder to the table for one of these solutions as part of that initial conversation already is bumped up five times in my head,” Gianatasio shared. As a coder, you can answer the specific questions coders would have about the AI technology, such as whether the model can handle combination codes.

“I think this expands our opportunities to be part of the building, and then be part of the implementation, the selling, etc. I think it’s going to result in a lot of job opportunities,” Gianatasio added.

Mike Shaughnessy, BA, CPC, Production Editor, AAPC
