Leading the CHARGE: AI Leadership Insights Series #4
- CHARGE
- Feb 25
- 5 min read
Featuring Helen Lu, FNP-BC, Informatics & Analytics
AI is transforming healthcare, but can smaller health systems keep pace? While large organizations have the resources to invest in cutting-edge AI, smaller health systems face steep barriers—from high costs and governance gaps to integration challenges and regulatory hurdles—widening the divide in care quality.
For the fourth edition of AI Leadership Insights, we sat down with Helen Lu, Clinical Director of Informatics & Analytics, to discuss the real-world challenges of AI adoption in under-resourced health systems. Helen shares her personal insights on navigating AI costs, establishing responsible oversight, and addressing the growing AI divide—along with practical strategies for making AI accessible across all healthcare settings.
Read the full conversation below.

Q: Can you share a bit about your role and what inspired you to focus on AI in healthcare?
A: I am currently the Clinical Director of Informatics and Analytics at a community health center. My role spans all clinical applications within Epic; I focus much of my work on ongoing training, data analytics, AI integration, and optimization of the EHR to enhance clinical workflows, improve provider efficiency, and drive data-informed decision making. I was inspired to focus on AI when I saw the opportunity to leverage technology to improve outcomes and provider workflows. With the EHR being a leading cause of burnout, there had to be a better way to use technology and the vast amount of data we generate in healthcare. I realized that informatics, combined with AI, could be transformative in bridging the gap with clinical practice.
Q: How did your health system start incorporating AI into its operations, and what challenges have you faced along the way?
A: We have taken a phased approach. Conventional rule-based clinical decision support tools, algorithms, and pathways can be considered artificial intelligence, as can machine learning models like those that predict the risk of opioid abuse or of admission for heart failure, but I think what's different is generative AI. Our journey into generative AI is relatively new, starting with ambient listening technology for clinical documentation and then AI-generated drafts of patient advice messages. The challenges we faced were around workflow and EHR integration, change management, driving adoption, data quality, and regulatory and ethical concerns. For example, when implementing any AI tool that touches data privacy, we have to find common ground: a health system's legal/compliance department aims to protect the health system at all times, informatics sits in the middle trying to navigate, vendors want to keep the data to improve their models, and patients either don't want their data shared or do (for research purposes). The data is so rich that everyone has a stake in it.
And patients have questions. In my system, new technology goes through a rigorous evaluation to ensure it complies with the various federal and state laws that protect the privacy, security, and confidentiality of patient information. We ask for written authorization from patients so they are clear on how we use ambient listening technology and can make an informed decision about whether to participate. All vendors listed in the authorization agreement have the required business associate agreements, which satisfy the regulatory obligation that stored data be retained and secured in the manner required by law. The recordings and data are never sold or shared with any individuals or outside companies or organizations. Overall, the feedback from patients and providers has been very positive.
Q: You mentioned the divide in AI adoption for smaller health systems. Could you elaborate on the unique challenges these systems face?
A: I think a lot of smaller systems are going to lack the budget to invest in AI tools, as well as the infrastructure, training, and talent to support them. There's currently no standard way these tools are priced: some are priced by number of tokens, others on a per-user basis, by enterprise contract, and so on. When the price is based on tokens or usage, it's nearly impossible for health systems to budget. Many tools are also designed for large or integrated health systems, which makes them less adaptable for smaller ones. Most importantly, many smaller health systems may lack the legal and ethical oversight necessary to evaluate AI risk and bias. IT manages software assets and Analytics manages data assets, but who manages AI assets and tools?
Q: How do the high costs of AI tools exacerbate inequities between well-resourced and under-resourced health systems?
A: The largest inequity will be in predictive analytics. Well-resourced health systems can use tools for early disease detection and risk stratification to get patients care sooner, while those without the tools will not have the same ability, and potentially not the same outcomes. Imagine having a stroke: a well-resourced hospital can identify it right away using an AI-powered imaging tool, while an under-resourced health system without that technology cannot.
Q: Given their cost, how should health systems measure their return on investment when adopting AI tools?
A: Close tracking of KPIs and honing in on what truly matters. Hard ROI measures (a measurable increase in revenue, tangible cost savings, cost avoidance, improved patient care) are important. However, things like provider satisfaction and reduced burnout don't have a measurable dollar amount tied to them, which makes it hard to build a case for a health system to buy the tool. The other thing is that AI shouldn't be implemented just for its own sake or because it sounds cool; it should be implemented as part of a goal to solve a specific problem.
Q: How can we address the high costs of AI tools to make them more accessible for under-resourced systems?
A: I think a lot of the tools need to be democratized and open-sourced for clinical decision support, imaging, and similar uses. Public funding and grants can be provided to support AI adoption at health systems that may be underfunded (like county, rural, and community hospital systems). Sharing AI models between academic medical centers or larger health systems and smaller health systems could also help. In addition, community-based platforms like Epic's Cosmos could be used to share and host AI-powered tools and analytics.
Q: Beyond cost, what are some other barriers that smaller health systems face when adopting AI tools?
A: Lack of AI and data literacy and training. People don't understand the basics of data, which then affects data quality and the risks of AI tools. Interoperability will also be a challenge. Currently, AI is often treated as an add-on tool rather than an integrated part of clinical workflows, but as it evolves into a more active role, like a clinical assistant, its seamless integration will be critical.
Q: What best practices or strategies have you seen or implemented to help ensure successful AI adoption?
A: Starting with low-risk use cases, such as ambient documentation, and piloting with champions who understand the tool's limitations and can deal with its shortcomings early on. Tools supported by the highest levels of the organization will fare better than smaller individual department tools. It also helps to have all impacted departments at the table well before adoption to evaluate the tool and ensure it meets privacy standards and will have the intended impact.
Q: How should smaller health systems think about governance of their AI tools to ensure their responsible deployment?
A: Federated learning can help. Smaller health systems need to establish AI oversight by extending governance structures that already exist (like data governance or IT systems steering committees). AI policies need to be in place to ensure responsible use, with the proper people involved. Organizations need to think about who is responsible for the tools and their monitoring, and need a deep understanding of how the tools work.
Q: What advice would you give to smaller health system leaders who are just beginning to think about their AI strategy?
A: Prioritize clinician and patient experience, collaborate with vendors and larger health systems, invest in AI education and change management (not just in the tool), and ensure that the tools you implement align with the strategic goals of the organization and solve real problems.