Predictive analytics is transforming health care
“If you challenge conventional wisdom, you will find ways to do things much better than they are currently done.” — Bill James, as quoted in Michael Lewis’ Moneyball: The Art of Winning an Unfair Game
The health care field is on the cusp of entering the era of “Moneyball” medicine. Just ask Eric Topol, M.D., director of the San Diego-based Scripps Translational Science Institute. He recently hired Paul DePodesta, the data analytics guru who transformed the Oakland Athletics baseball team and now has his algorithms set on doing the same for health care. And Topol is not alone. More than 200 data analytics companies are vying for the attention of health care organizations, which are sitting on an untapped trove of data.
With the near universal adoption of electronic health records, large hospitals and health systems have begun to recognize something that consumer retailers have relied on for more than a decade: With the right analytics, data can predict the future and help organizations get out in front of consumer trends. In the context of health care, predictive analytic systems are being used, for instance, to understand which patients are at higher risk for hospital readmission, to reduce hospital stays after joint replacement and to anticipate staffing needs while reducing overtime.
Some health care organizations have opted to build their own data analysis systems to suit their needs, while others have found industry partners offering predictive-analytic systems tailored to the health care industry. Analytics-provider options range from huge general-use vendors such as SAS, IBM and Oracle to niche players developing specialized tools for the health care market. For organizations looking to implement analytics, those who have already taken the plunge suggest starting by taking stock of your organization’s current state.
Before the sexy stuff
“The first thing you need to know is what is happening in your population,” says Rishi Sikka, M.D., senior vice president of clinical transformation for Advocate Health Care in Illinois. “Do you know who is an attributed patient in your population? Do you know who is being readmitted today in your population? Do you know who is visiting the ER? Everyone wants to do all the sexy models and advanced analytics, but just understanding that current state, what is happening, is the first and the most important challenge.”
Advocate has operated as a value-based care organization for several years, moving from fee-for-service revenue models toward value-based reimbursement. That shift is driving health care organizations toward sophisticated analytics systems that can quantify quality measures and track process improvements.
Sikka says that five years ago Advocate had a big problem with siloed data spread across many EHR systems that did not play well together. Motivated to improve patient care and control costs, leaders chose to invest in a partnership with Cerner, whose cloud-based analytics platform could integrate data from all of the EHRs within Advocate’s existing information technology infrastructure.
Cerner’s EHR-agnostic approach was attractive. “Being able to have a data platform that can take in all that data regardless of source and then push it out into a variety of EMRs was important,” says Sikka.
Capture clinicians’ hearts, minds
Cerner worked directly with the physician-led teams to pick a case study in which an intervention could truly make a difference in patient care. To get physicians on board, the Advocate clinical innovation team held focus groups and explained how analytic models that integrate data from many sources can have more predictive power than models that rely solely on retrospective administrative data.
“As an industry, when we talk about population health we tend to talk a lot about cost and utilization and those are extremely important … but that’s not something that gets clinicians excited,” Sikka says. “If you start with the clinical scenario and why is it important to patients … that’s where you really start to capture the hearts and minds of physicians.”
Like many hospitals, Advocate was struggling to reduce 30-day hospital readmissions, a key benchmark for Medicare reimbursement. An earlier risk-stratification model developed in-house at Advocate had little predictive power, a result typical of attempts to classify patients using administrative data alone. The model developed in partnership with Cerner automated the process, identifying patients deemed at high risk of readmission and embedding the information within the EHR. The first iteration of the model yielded only modest predictive value. But after only a year of use, readmissions from all causes had dropped by 20 percent among the highest-risk patients within the Advocate system, comparing the first half of 2013 with the same period in 2014. Cerner has since made the model, now called Project Boost, available to clients in more than 100 hospitals nationwide, says Bharat Sutariya, M.D., chief medical officer and vice president of population health for Cerner. Northern Arizona Healthcare, a Cerner client hospital, has reduced its readmission rate by more than 40 percent since implementing the model in mid-2014, he adds.
Developers acknowledge that the model is only as good as the data available to it, and Cerner’s model is still missing some relevant data, such as socioeconomic factors, which may limit its predictive ability.
Further, Sutariya says, being able to identify a person as high-risk doesn’t always mean that you can mitigate that risk.
The team currently is working on stratifying groups of patients into clusters and targeting those for whom intervention is possible and may make a difference in outcome. For instance, the team is implementing a predictive tool to identify patients who could most benefit from care managers for care coordination, Sutariya says. He counsels prospective clients to think about using predictive analytics in clinical areas where changes are feasible and can make the biggest impact.
Making analytics predictive
True predictive analytics differs from bread-and-butter statistical methods such as logistic regression in an important way: it doesn’t require a human analyst to identify the relevant variables or formulate a hypothesis in advance. Often, predictive models use machine-learning techniques that find patterns in data where no one thought to look.
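The difference can be sketched in a few lines of code. Below is a minimal illustration on synthetic data (the feature names, coefficients and readmission scenario are all invented for the example): a logistic regression scores only the variables an analyst hands it, while a tree-based machine-learning model can pick up an interaction no one hypothesized, such as prior admissions mattering more for patients who live alone.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
age = rng.normal(70, 10, n)          # hypothetical patient features
prior_admits = rng.poisson(1.0, n)
lives_alone = rng.integers(0, 2, n)

# True readmission risk includes an interaction nobody hypothesized:
# prior admissions matter far more for patients who live alone.
logit = 0.04 * (age - 70) + 0.4 * prior_admits \
        + 1.5 * prior_admits * lives_alone - 1.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)
X = np.column_stack([age, prior_admits, lives_alone])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
gb = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

auc_lr = roc_auc_score(y_te, lr.predict_proba(X_te)[:, 1])
auc_gb = roc_auc_score(y_te, gb.predict_proba(X_te)[:, 1])
print(f"logistic regression AUC: {auc_lr:.2f}, boosted trees AUC: {auc_gb:.2f}")
```

In this toy setup the machine-learning model tends to score higher because it discovers the interaction on its own; the broader point is that it does so without anyone specifying the hypothesis up front.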
Mercy, the seventh largest Catholic health care system in the United States, includes 45 acute care and specialty hospitals, primarily in the Midwest. Mercy’s leadership team had already committed to developing care pathways that deliver optimal care across all of its clinical operations. As an early adopter of EHRs, Mercy had many years of historical data but found that traditional descriptive analytics was not providing solutions for what leaders saw as unacceptable variation across its member hospitals.
“We found out that descriptive analytics will tell us how we’ve done in kind of a rearview-mirror look, but we need to find ways to predict into the future,” says Vance Moore, senior vice president of operations at Mercy.
Mercy, he says, wants to use analytics to make meaningful improvements on both the clinical and operational sides of its business. “We are trying to predict the future and stage human and physical assets near their point of need so we can service the customer much better and we become much more efficient in doing so, as well,” he says. He gives the example of the common hospital practice of waiting until a patient is ready to be discharged to find a wheelchair, when it should be possible to anticipate that need well in advance.
To meet its goals, Mercy partnered with Ayasdi, an analytics company that combines machine learning with algorithms that render big data sets as geometric shapes. The visual aspect of the search structure helps analysts identify data forming meaningful clusters that can be further investigated. In an early use case, Mercy wanted to understand the factors that influence hospital length of stay after joint replacement surgery.
Searching across Mercy’s network, the Ayasdi algorithms identified a cluster of patients with shorter length of stay. Looking more closely, the analyst team identified a group of providers that was using pregabalin, a neuropathic pain reliever, in the acute aftermath of surgery. The patients who received this treatment used less opioid pain reliever and were ambulatory more quickly than other patients.
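The pattern-first workflow described here can be mimicked with ordinary clustering. The sketch below uses k-means on synthetic joint-replacement encounters as a simple stand-in (the numbers and the pregabalin flag are invented, and Ayasdi’s actual approach is far more sophisticated): cluster on outcome variables first, then inspect what distinguishes the short-stay group.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Simulate 300 joint-replacement encounters: the first 100 from providers
# using pregabalin (shorter stays, lower opioid use), the rest standard care.
pregabalin = np.array([1] * 100 + [0] * 200)
los_days = np.where(pregabalin == 1,
                    rng.normal(2.2, 0.4, 300), rng.normal(3.8, 0.5, 300))
opioid_mme = np.where(pregabalin == 1,
                      rng.normal(40, 8, 300), rng.normal(90, 12, 300))

# Cluster on outcomes only -- the treatment is NOT an input to the model.
X = np.column_stack([los_days, opioid_mme])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Identify the shorter-stay cluster, then ask what distinguishes it.
means = [los_days[labels == k].mean() for k in (0, 1)]
short = labels == int(np.argmin(means))
print(f"short-stay cluster: mean LOS {los_days[short].mean():.1f} days, "
      f"pregabalin rate {pregabalin[short].mean():.0%}")
```

The key design point mirrors the Mercy story: because the treatment is withheld from the clustering step, discovering that the short-stay cluster is dominated by pregabalin patients is a finding, not a built-in assumption.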
“It would have been a much longer process to have arrived at this finding any other way,” says Sangeeta Chakraborty, chief customer officer at Ayasdi.
Mercy has now implemented an automated system of 84 care paths covering 80 percent of the care delivered within its system using Ayasdi algorithms. Data from each patient encounter are evaluated in real time, and as new technologies are introduced, the software constantly measures whether the technology is making a positive difference for patients. The plan, Moore says, is to measure the difference between target and reality for each care path and make changes as the data warrant, so that what works in terms of cost, quality and efficiency rises to the top.
Doctors are fully informed about why any change of care is being considered and have input into any changes, says Todd Stewart, M.D., vice president of clinical integrated solutions at Mercy.
But health care analytics vendors emphasize that hospital systems must be open to going where the data take them, even if the answers defy expectations. “Our mission is to get into the hands of doctors all of these findings that are coming out of these reams of data,” says Chakraborty. “Then it is up to the hospital system to decide the best way to make change happen.”
Homegrown analytic solutions
Some health systems have opted to build their enterprise analytics from within, hiring data scientists and developing their own predictive systems.
The University of Pennsylvania Health System has invested heavily in a centralized data warehouse and development of machine-learning models that can make forecasts and then push the results back out to Penn’s EHR system.
Chief data scientist Michael Draugelis is the architect of that system. Having come from a data-analysis career in the missile defense industry, Draugelis was well-versed in delivering predictive analysis under tight deadlines. When he was hired 18 months ago, he promised to deliver a working model in six weeks and to provide information of value every month.
“Typically, when going through this [process] for the first time, there is not an appetite to say ‘we’ll spend seven months of really expensive people to come out with a failure,’” says Draugelis. “So we are picking projects where there is immediate value, and we know we can capture some.”
Penn’s early projects have centered on spotting patients at high risk for heart failure and sepsis. Until the machine-learning model was implemented, only heart and vascular service line patients were being evaluated for heart failure, and the system was only capturing about half the patients at risk, says Draugelis. Now, all patients deemed at high risk through the machine-learning algorithm are flagged for follow-up, and the hospital is piloting a project to connect high-risk patients with home care.
“Really, the value is in the data,” Draugelis says. “With a small team of developers and data scientists, you can build a system. What’s happening in the marketplace right now is there are a lot of vendor solutions out there, but I’m not sure the market is valuing the right thing right now.”
Penn plans to scale up its model, named Penn Signals, and publish it as an open source solution for others to use in mid-2016. “What I want to do is make the technology freely available, and I think that’s a first step to lowering the barrier to making these solutions available,” he says.
Draugelis does point out that any institution that wanted to deploy his open-source system also would have to invest in the data scientists who know how to use its capabilities.
First steps
For health systems with more modest resources, Seattle-based Tableau offers a data-visualization system that can import data from myriad sources and combine them in a visually intuitive dashboard. Given its origins as a collaborative visualization project by graduate student Chris Stolte and his adviser, Pat Hanrahan, a founding member of the movie animation company Pixar, the software is flashy, with lots of visualization options. Its biggest strength may be its ease of use for data novices, which may account for its explosive growth: in a 2014 HIMSS survey, Tableau was reported to be the most commonly used data-visualization software.
Andy Dé, Tableau’s managing director for health care and life sciences, says the beauty of the software is that it allows clinicians to ask and answer their own questions. The system can be deployed across an organization in weeks, not months.
While Tableau works with traditional spreadsheet-style data, it has the flexibility to incorporate much more sophisticated back-end predictive analysis, says Dé. For instance, Tableau can render visualizations of the output of external predictive analytic tools.
Rajib Ghosh, chief data and transformation officer at Community Health Center Network in central California, says Tableau helps the organization’s eight community health centers manage an attributed population of about 200,000 within their accountable care organization. For two years the network has been using the system to combine financial, EHR and pharmacy data to understand how people access services; now it is working to predict how demographic shifts will affect utilization. During this early phase of the rollout, the organization has focused on putting a data-governance structure in place and making sure data definitions are uniform.
Ghosh says that for his organization, having business analysts on staff who can look at the data critically to solve priority issues is crucial. “Tableau, data, those are tools,” he says. “They are a means to an end; they are not an end in themselves. If you don’t have a person who can do that and you just have some IT developers, that’s not going to cut it.”
Karyn Hede is a freelance writer in Chapel Hill, N.C.
Trustee Takeaways
Pam Arlotto, president and chief executive of Maestro Strategies, a health care consulting firm based in Alpharetta, Ga., is the former president of the Healthcare Information and Management Systems Society. She works with large health care systems and state health exchanges, providing strategic planning services around information technology design and implementation, with an eye toward building accountable care organizations.
There are many vendors vying for market share in the analytics space, so it would be tempting for boards to make lists and bring vendors in for interviews. But Arlotto says that evaluating vendors should not be the first consideration.
She suggests a series of steps that trustees should take when delving into the question of whether to invest in an analytics system.
- First, senior leadership should understand the business case for analytics in your health system. Analytics is not “one size fits all” and depends on enterprise strategy, readiness, direction and market needs.
- Evaluate the “current state” of your analytics environment. What tools and systems do you already have in place? How are these systems currently being used?
- What business and clinical problems are being solved today using data? “For example, analytics for enterprise performance improvement is very different than that required for population health management,” she says.
- Define the vision and desired “future state” of analytics for your organization. What tools, dashboards and people will be required to be successful? What will be the role of IT, data scientists, business/clinical intelligence teams, etc.?
- Identify a potential “proof-of-concept” or demonstration project focused on helping the organization learn and show the value of analytics.
- Develop a data-governance process and define decision-making processes.
Finally, start a vendor short list. Develop detailed requirements based on findings from current and future state analyses. Example questions might include:
• Does the vendor offer a comprehensive enterprise solution or just a point solution?
• Does the offering align with health system needs across all organizational levels and functional areas?
• What are the vendor’s corporate vision and long-term plan for development?
• How does the vendor complement our existing systems?
• Does the vendor offer true analytics or just query and reporting?
Facts you should know going in
- Uniform, high-quality data are essential for any analytical results to be trustworthy.
- Predictive analytics is only effective if stakeholders buy into using the information.
- The chief analytics officer (CAO) is a new C-suite position being created as organizations realize that analytics can be a competitive differentiator.
- Strategic analytic leadership matters. Hospital systems with a CAO on the executive team have greater analytics maturity than organizations where the CAO is not on the executive team.
- Executive leadership at companies with the highest analytics maturity place high importance on the use of data throughout the organization. They also recognize their organizations are not as effective as they need to be in using data.
Sources: “The State of Analytics Maturity for Healthcare Providers,” Feb. 24, 2014, survey report from HIMSS and the International Institute for Analytics; “Data Needed for Systematically Improving Healthcare,” July 31, 2015, report of the National Quality Forum.