Friday, September 20, 2013

On Obamacare: Here's What You Should Tell Congress


The House defunds Obamacare... for the 40th time.

This is a ridiculous state of politics. It would be a disaster to repeal the law in its entirety and start over.

The vast majority of the concepts behind Obamacare are very sound and should be retained, but the way the law was written creates enormously costly and inefficient bureaucratic overhead. The overhead is a disincentive to reform.

This is the message that we should send to Congress: "Keep the core concepts of Obamacare, but rewrite and trim down the law. Remove the long-standing laws that Obamacare retained which allow the pharmaceutical and insurance companies to operate as protected monopolies. Rather than telling the healthcare industry how to reform, act like the world's largest healthcare customer-- that's the US government-- and simply tell the industry to reform in three years."  

This is what Congress and a new version of Obamacare should say to the healthcare industry: "We are not going to pay you for every procedure anymore. We are going to pay you to treat, and prevent, the entire injury or disease, from start to finish, and provide high quality, affordable care to everyone in the country. Those organizations that provide the highest quality care and health at the lowest price possible will earn more of our business. You, the healthcare industry, figure out how to transform yourself."


Tuesday, September 3, 2013

The 7 Simple Practices Of Data Governance


Data is now one of the most valuable assets in any healthcare organization, especially as we transition into a more analytically driven industry. Data is the longest lasting asset in any organization, outliving facilities, devices and people.
In the past few years, as the value and longevity of data have become better recognized, the term ‘data governance’ has emerged to describe the concept of managing and influencing the collection and utilization of data in an organization. The adoption and creation of ACOs are motivated as much by the acquisition of more data to manage risk and understand outcomes as by the acquisition of clinicians, patients, and facilities. If we accept the assertion that healthcare is a knowledge delivery industry, it is our obligation to exploit the data assets in our environment to augment and optimize that knowledge.
While information and data security is a long-standing body of practice and knowledge in corporations, data governance is less mature, especially in healthcare. As a result, there is a tendency to operate in extremes, with either too much governance or too little. Over time, as data and analytic maturity increases, the healthcare industry will find a natural equilibrium. For example, in the Healthcare Analytic Adoption Model, a robust data governance function is required in order to achieve Level 5 maturity.
A new body of knowledge can be fertile ground for confusion and over-complication, and many vendors and consultants are inclined to benefit unfairly from this complexity in these formative stages. Below are the seven simple practices of data governance that can be used as a self-guided tour through the maze of puzzling advice.
1. Balanced, Lean Governance
The Data Governance Committee should practice a cultural philosophy that believes in governing data to the least extent necessary to achieve the greatest common good. Quite often, organizations will either over-apply data governance in their enthusiasm for the new function, or under-apply it due to their lack of experience. The best approach is to start off with a broad vision and framework — but limited application — and expand the governance function incrementally (only as needed, and no more). The Data Governance Committee should be a subcommittee to an existing governance structure, with the influence necessary to institute changes to workflows, resolve data quality conflicts, and develop complex data acquisition strategies to support the strategic clinical and financial optimization of the organization.
The Committee should also enlist front-line employees as Data Stewards who are knowledgeable about the collection of data in source transaction systems such as the EMR, cost accounting, scheduling, registration, and materials management systems. CIOs who function horizontally, across business lines, at the application and data content layers of the IT stack (as opposed to those who operate primarily at the infrastructure layers) are a natural fit for facilitating and leading the Data Governance Committee.
When in doubt, govern less, not more. Keep it lean. Grow slowly and carefully into the need for more.
2. Data Quality
Overseeing and ensuring data quality is probably the single most important function of data governance. When low quality data has a negative impact on the accuracy or timeliness of the organization’s decision making, the Data Governance Committee must be capable of quickly reacting to these issues and enforcing the changes required in source data systems (not the analytic systems) and workflows that are necessary for raising data quality. Simply defined, Data Quality = Completeness of Data x Validity of Data x Timeliness of Data. The Committee must make each of these variables in the data quality equation a leadership priority.
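The equation above can be operationalized as a simple score. The sketch below is illustrative only: the record fields, validity check, and 30-day timeliness window are hypothetical assumptions, not part of any standard, but the structure shows why a weakness in any one factor drags the whole product down.

```python
from datetime import datetime, timedelta

def data_quality_score(records, required_fields, is_valid, max_age_days=30):
    """Illustrative Data Quality = Completeness x Validity x Timeliness.

    Each factor is scored as the fraction of records that pass its check,
    so the overall product penalizes any single weak factor.
    """
    if not records:
        return 0.0
    total = len(records)
    # Completeness: every required field is present and non-null
    complete = sum(
        1 for r in records
        if all(r.get(f) is not None for f in required_fields)
    )
    # Validity: record passes the caller-supplied validity rule
    valid = sum(1 for r in records if is_valid(r))
    # Timeliness: record was updated within the allowed window
    cutoff = datetime.now() - timedelta(days=max_age_days)
    timely = sum(
        1 for r in records
        if r.get("updated_at") is not None and r["updated_at"] >= cutoff
    )
    return (complete / total) * (valid / total) * (timely / total)
```

For example, if half the records are missing a required field while all are valid and timely, the score is 0.5, which immediately tells the Data Stewards which factor to attack first in the source system.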
3. Data Access
Increasing access to data across all members of the enterprise, including external stakeholders, members of the community, and especially patients, is a critical function of the Committee. While the information security committee tends to protect data and restrict access to data, the Data Governance Committee should create a productive tension in the opposite direction. In the most effective organizations, the data governance and information security committees are combined, thus forcing the members to balance the tension internally and streamlining what can otherwise be lengthy decision making and reconciliation between the two committees.
4. Data Literacy
It serves no purpose to increase the quality of or access to data if the intended beneficiaries are not educated on the interpretation of data as it applies to their role in the organization. Data literacy can be increased by: (1) teaching the users how to distinguish good data from bad data in the context of their decision-making environment and role in the organization; (2) data analysis tools; (3) process improvement techniques that are driven by data; (4) statistical techniques that can be applied to improve decision making when data is incomplete or scarce; and (5) the very deliberate collection and dissemination of metadata, especially that which is associated with enterprise data warehouse (EDW) content. The Data Governance Committee should champion the cause of data-driven decision making and data transparency around quality and cost. These campaigns should include the use of slogans, spokespeople, role models and other attributes of successful causes.
5. Data Content
The Data Governance Committee should plot a multi-year strategy for data acquisition and provisioning, seeking to constantly expand the data ecosystem. Activity-based costing data, genetic and familial data, bedside devices data, and patient-reported observations and outcomes data are all critically important to the evolution of analytics in the industry. Building and acquiring systems to collect this data is the first step in the analytic journey and can take as long as five years to complete. All of the aforementioned data sources are required to progress through the Healthcare Analytic Adoption Model.
6. Analytic Prioritization
The Data Governance Committee should play a major role in developing the strategic and analytic plan for the C-level suite, and play a key role in ensuring the requirements of that plan are implemented. Inevitably, there will be more demand for analytic services than there are resources available to meet that demand. The Data Governance Committee cannot resolve every priority, but it can balance top-down corporate priorities with bottom-up requests from the clinical and business units by advocating a resource allocation of 60/40 between centralized and decentralized analytic resources — that is, 60 percent of the organization’s analytic resources should be dedicated to top-down, centrally managed priorities, while 40 percent of the resources should be distributed to support the tactical requirements of departments, business units, clinical service lines, and research.
7. Master Data Management
As the organization progresses in analytic maturity and utilization, the Data Governance Committee will become the steward for defining, encouraging the utilization of, and resolving conflicts in master data management. This role will cover local data standards (facility codes, department codes, etc.), as well as regional and industry standards (CPT, ICD, SNOMED, LOINC, etc.). In addition to coded data standards, the Committee will also become involved in binding data into analytic algorithms that should be consistently used throughout the organization, such as calculating length of stay, defining readmission criteria, defining patient cohorts, and attributing patients to providers in accountable care arrangements.
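To illustrate why binding these algorithms matters, here are two candidate definitions, sketched in code. The specific rules below (midnights-crossed LOS, a 30-day readmission window) are hypothetical examples, not endorsements; the Committee's job is to pick one rule and make everyone use it.

```python
from datetime import datetime

def length_of_stay_days(admit, discharge):
    """One candidate LOS definition: count whole midnights crossed,
    so a same-day admit and discharge yields an LOS of 0."""
    return (discharge.date() - admit.date()).days

def is_readmission(prior_discharge, next_admit, window_days=30):
    """One candidate readmission rule: a subsequent admission within
    window_days of a prior discharge counts as a readmission."""
    delta = (next_admit - prior_discharge).days
    return 0 <= delta <= window_days
```

Two analysts using "discharge minus admit in hours, divided by 24" versus "midnights crossed" will report different LOS figures for the same patients; binding the definition once, in the warehouse, eliminates that argument.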
If you are struggling to understand and implement a data governance function in your organization, following these seven simple practices will help you avoid all of the major pitfalls of either under-governing or over-governing. Of utmost importance, a lean and balanced data governance function will help your organization maximize the value of your data to deliver the best possible care at the lowest price.

Tuesday, July 30, 2013

The Five Critical Information Systems of Accountable Care


More and more, healthcare is molded and critically impacted by the software and information technology that surrounds and supports the industry.  As a consequence, the C-level suite beyond the CIO must actively participate in the evolution of their organization’s IT strategy, particularly at the layer of technology where software directly supports workflows and business processes.
There are five information systems that are indispensable to the success of an Accountable Care Organization (ACO). Those five critical information systems are listed below.
  1. An Electronic Medical Record (EMR) used in a consistent and meaningful way across the accountable care enterprise to document patients’ healthcare status and treatment and support safe, evidence based care.
  2. A Health Information Exchange (HIE) to enable the sharing of patients’ clinical data across disparate EMRs in the accountable care enterprise.
  3. An Activity Based Costing (ABC) system to enable detailed, patient-specific collection of cost data that in turn enables the accountable care organization to precisely understand cost of production and revenue margins in capitated payment models.
  4. A Patient Reported Outcomes (PRO) system to enable the complete understanding of clinical outcomes and quality, from the patient’s perspective. This is not a patient satisfaction system—it is a clinical outcomes assessment system, tailored to the patient and their protocols of treatment.
  5. An Enterprise Data Warehouse (EDW), which is central to enabling the analysis of data collected in the information systems described above—and more. Without the EDW, the data collection systems described above are relegated to small or non-existent ROI.  It is the exposure and integration of the data in the EDW that liberates the ROI from those systems.  It is common for EDWs to realize an ROI as high as 450% in two years.
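Item 3 above can be made concrete with a minimal sketch of activity-based costing. The activity names and per-minute rates below are invented for illustration: each clinical event consumes minutes of some activity (OR time, nursing, imaging), and each activity has a measured unit cost.

```python
def patient_episode_cost(events, cost_per_minute):
    """Illustrative activity-based costing: the cost of one patient's
    episode is the sum over events of minutes consumed times the
    per-minute cost of that activity."""
    return sum(e["minutes"] * cost_per_minute[e["activity"]] for e in events)
```

Costing at this per-patient level, rather than with ratio-of-cost-to-charges averages, is what lets an organization price a capitated contract knowing its actual cost of production.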
Over the past seven years, significant national policy and strategy attention was focused on EMR and HIE adoption, but very little attention has been paid to the other three components of ACO IT described above. As evidenced by the recent departure of several organizations from the Pioneer ACO program, the inability to integrate and analyze data in an EDW is “the single most frustrating issue” those organizations face.  Even less national attention is being paid to the development and adoption of systems to collect Activity Based Costing and Patient Reported Outcomes data.  Without the former, C-levels cannot accurately measure the true cost of care and the risks of capitated payments. Without the latter, C-levels cannot accurately understand patient- and protocol-specific clinical quality.  Also, precise predictive analytics and patient risk stratification are unachievable without patient reported outcomes.
At the center of these data collection systems is the EDW—the platform that enables the analysis of clinical, financial, and patient reported data in a single database repository.  The EDW is the system that drives the Return in the ROI of healthcare IT for the ACO.

Background: ACO

The term ACO is used here in a broader sense, beyond the burdensome federal definition.  In this context, the term ACO refers very simply to the shift in reimbursement from procedure-based, fee-for-service towards fee-for-quality, disease- or condition-based reimbursement with capitated payments to healthcare delivery organizations on a per-case and per-capita basis.  I also strongly imply a governance structure in which the accountable organization offers an insurance plan alongside healthcare delivery services, under a single CEO and Board.  A quick assessment of the most successful U.S. healthcare systems, such as Intermountain Healthcare, reveals one very simple common trait:  They offer health insurance plans and health delivery services under a single CEO who balances the economics of care with access to care and quality of care.
Federal funding has essentially guaranteed the industry’s adoption of EMRs and HIEs. One could argue that the quality of the products underwritten by federal incentives (e.g., Meaningful Use) and grants is mediocre at best, but, nonetheless, some version of these products is necessary to electronically collect and share patient care data. There is little argument in opposition to that concept. But, despite the investment in EMRs and HIEs, our national health strategy has not gone far enough in its encouragement of IT adoption.  The EDW is the most important link in the chain of IT products required for accountable care.
In the past six years, the U.S. healthcare industry experienced an unprecedented investment in software and technology, particularly EMRs and HIEs. Conservative estimates place that investment at over $100 billion.  Despite those massive investments, there is no compelling or defendable evidence of any notable return on investment.  The needle is not moving on the dashboard of U.S. quality of care nor per capita cost of care.  To worsen the bleak picture, physician satisfaction with the EMR products stimulated by these federal incentives is only 39% and declining—six out of 10 physicians are dissatisfied with the EMR they must use to support the treatment of their patients.  Seventy-eight percent of public HIEs fail due to financial insolvency after federal and state grant monies are removed from the model.  One-third of the organizations in the CMS Pioneer ACO Program are dropping out because, despite their investments in EMRs and HIEs, these organizations are unable to adequately quantify the quality of care and financial risk for managing patients in the ACO.
The federal policy and financial incentives for increasing the adoption of EMRs and HIEs were nearsighted. Without an investment in the software and technology of an EDW, the healthcare industry will not reap any worthy benefits from the investment in EMRs and HIEs.
Unlike the ROI track record of EMRs and HIEs, EDWs have consistently shown GAAP-certified, double and triple digit ROI in under two years. In other industries, the average ROI from an EDW is 431% over five years. At Intermountain Healthcare, the EDW that my team and I designed and implemented with a late binding architecture was assessed in 2004 by a GAAP-certified third party, and the results were amazing—1,468% ROI in less than two years. Allina Healthcare’s EDW was recently assessed for ROI, and that very conservative case study revealed a 52% ROI in 18 months.
Surprisingly, and despite these impressive ROI results, our national strategy for the computerization of healthcare stopped one step short of effective.  It’s not enough for us to invest in EMRs and HIEs.  Less than 25% of healthcare organizations have any type of EDW and less than 10% of those few organizations are operating consistently at Level 5 of the Healthcare Analytic Adoption Model, summarized in the figure below.  C-level executives need to invest in EDWs on their own, and those healthcare organizations that do so will soon be in a position to distance themselves from competitors and the mediocrity of the current U.S. healthcare market.

Organizations like Intermountain Healthcare and Allina have been operating on their own as accountable care organizations for decades, without federal incentives for IT investments and without the complicated business model and administrative overhead of a CMS ACO. They understand the value of an EDW in maximizing the quality-cost ratio of healthcare delivery. If you want to be an ACO—federal or otherwise—follow their role model.

Summary

Although the healthcare industry is progressing in its adoption of information technology to enable healthcare that is accessible, affordable, and measurable, we are overlooking three of the five most important IT investments necessary for accountable care—i.e., the EDW, activity based cost accounting systems, and patient reported outcomes systems. Vendors are not yet offering products that support activity based costing or patient reported outcomes. The lack of these products is a significant gap in our national health data strategy, but also an opportunity for entrepreneurs to develop highly valued products.
The EDW will enable a deep understanding of care quality, variations in care, and costs of care. The EDW market is emerging, with over 40 vendors offering various products in this segment. In the next three years, we will see major market consolidation and shakeout, similar to that seen with EMR vendors. At the end of this consolidation period, I predict six vendors will remain standing that are capable of supporting analytics up to Level 8 in the Analytic Adoption Model. Unfortunately, healthcare executives cannot wait three years for the market to consolidate. Market forces are accelerating the need for analytics as a tool to survive, first, and thrive, second. C-levels must choose a partner now and hope that the partner they choose can provide a solution that will survive the shakeout.

Deep Background: Principles of IT

There are three principles of information technology that are common across industries. As healthcare becomes a more tech-savvy, computerized industry, it’s important for healthcare executives beyond the CIO to understand these principles and their impact on the business.

Principle #1: Business Moves at the Speed of Software

Information technology—specifically software—is now the dominant variable in business agility and adaptability.  In the past, the dominant variables were the culture and skills of the people of the organization.  But even the best, most talented employees can perform their jobs no better and no faster than the functionality of the software and information technology that surrounds them.  The best software and information technology can drive unprecedented business insight and business models.  The worst software can drive a business into the ground.  Accepting this principle implies that business leaders in the C-level must engage themselves in the strategy and decisions about their organization’s software acquisitions and configurations to an unprecedented level of detail.

Principle #2:  Big Value Comes From Big Data

The software and underlying technology that enables the analysis of patterns in large data sets is also enabling extraordinary new ability for businesses to identify opportunities for greater efficiency, higher quality, lower overhead, higher margins, and new products.  Big data, business intelligence, analytics—these are all buzzword synonyms used to describe the same underlying combination of software and technology—that is, the EDW.

Principle #3: Data Management Evolves Predictably

Every industry passes through three phases of data management maturity. Phase 1 is the Data Collection Phase. Data is collected in a transaction workflow system, such as a point-of-sale system, to support a particular workflow (i.e., a purchase). Phase 2 is the Data Sharing Phase. The data that is collected in the transaction systems of Phase 1 is shared with other downstream data collection systems and workflows, such as a general ledger or supply chain management system. Phase 3 is the Data Analysis Phase. In this Phase, the data that is collected and shared to support specific transactions and workflows in Phases 1 and 2 is aggregated across thousands and millions of those individual transactions to analyze macroscopic patterns in the data. In healthcare, Phase 1 is characterized by registration, scheduling, laboratory, and electronic medical record (EMR) systems. Phase 2 is characterized by the implementation of health information exchanges. Phase 3, barely underway, is characterized by the implementation of the EDW.

Healthcare Analytics: Looking Past The Smoke & Mirrors


I had a recent opportunity to engage in an online discussion with a well-known healthcare analytics vendor about the value of comparative analytics, predictive analytics, and natural language processing (NLP) in healthcare. This vendor was describing a beautiful new world of the future, in which comparative data in particular would be the cornerstone of our industry’s turnaround. The executive summary of my response: Beware the smoke and mirrors.
My full response is below.
We’ve had comparative data for years in the US healthcare system, and it hasn’t moved the needle at all. In fact, the latest OECD (Organisation for Economic Cooperation and Development) data rank the US worse than we’ve ever been in healthcare quality and cost. Comparative data, like the OECD’s, is interesting and certainly worth looking at, but it’s far from enough to drive improvements in an organization down to the individual patient. To drive that sort of change, you have to get your head and hands dirty in your own data ecosystem — not somebody else’s that is, at best, a rough facsimile of your organization.
There are too many variables and variations in healthcare delivery right now that add too much noise to the data to make comparative analytics as valuable as some pundits advocate. We don’t even have an industry standard and clinically precise definition of patients that should be included (and excluded from routine management) in a diabetes registry, much less the other 15 chronic diseases and syndromes we should be managing.
We’ve also had predictive analytics supporting risk stratification for years in healthcare, particularly in case management. But without outcomes data, what are we left to predict? Readmissions. That’s a sad state of affairs. Before we start believing that predictive analytics is going to change the healthcare world, we need to understand how it works, technically and programmatically. Without protocol- and patient-specific outcomes data, predictive analytics is largely vendor smoke and mirrors in all but a very small number of use cases.
We’ve had NLP for years in healthcare as well, with essentially no impact on the industry. When I joined healthcare, I brought with me a deep background in NLP and predictive analytics from the military, national intelligence, and credit reporting environments, in hopes that we could revolutionize the industry. We’ve made incremental progress, but there are fundamental gaps in our industry’s data ecosystem — missing pieces of the data puzzle — that inherently limit what we can achieve with NLP.
Google revolutionized the world of NLP, but Google leveraged a metadata ecosystem that is layered on top of traditional NLP strategies to achieve the revolution. In healthcare, we don’t have the same metadata ecosystem within the current generation of EMRs, much less across EMRs. In today’s EMRs, we have little more than expensive word processors. I keep hoping that the Googles, Facebooks, and Amazons of the world will quietly build a new generation EMR, but with $29B in federal money now squandered on our existing generation of EMRs, there’s very little motivation in the market for innovation.
Does my cynicism and caution imply that we should turn away from comparative data, predictive modeling, or NLP? Absolutely not. I’ve been advocating these analytic tools in healthcare for 17 years. It implies that we should capture the easy, high-value analytic victories first. Don’t chase the asymptote. Don’t chase the latest fad and vendor hype. Deliberately, but quickly, move your organization up the levels of the Analytic Adoption Model. It’s going to take at least five years, maybe longer, and a new generation of EMRs, patient reported outcomes systems, and activity based cost accounting systems before we can close the gaps in our data ecosystem to make predictive models and NLP widely valuable in the industry.
Comparative data will not be as valuable as it should be until we squeeze the variability out of our healthcare practices and standardize our data definitions of diseases and syndromes. If that happens in five years, I’d be pleasantly amazed. Variability analysis, not benchmarking, might be the most useful application of comparative data in our current healthcare environment.
Beware the vendors that oversell what’s possible, my friends. We are in the uphill climb of the hype cycle right now.

Wednesday, May 29, 2013

The CIO as Analytic Hero


As Vi Shaffer of Gartner said in her Fall 2012 CHIME keynote:  “The I in CIOs of the future stands for analytics.”

All of us quietly yearn to be heroes. CIOs are no exception.  We want to harness the power of information technology to dramatically improve healthcare quality and costs.

Despite their privileged position atop the IT food chain, though, only a handful of healthcare CIOs ever get to realize this dream.

Why? Simply put, CIOs never own both the data content and application layers of any meaningful technology, at the business transformation level. With rare exceptions, the CIO’s role in any enterprise-wide technology implementation is second chair to leaders in other verticals.  The CEO, along with HR, owns the enterprise resource planning (ERP) system.  Clinicians ultimately own the EHR.  CFOs own revenue cycle and general ledger.  CIOs own email.  In every case, the institutional power of the vertical most affected by the technology tends to lead its implementation.

Which is why the Enterprise Data Warehouse (EDW) represents a CIO’s chance to be a transformational hero in healthcare.  I’ve been a CIO for 22 years of my 30-year career, with a good track record in a variety of areas in that role, but the one area that is consistently recognized as the most valuable to the organizations that I served is my leadership of the EDW and analytics strategy.

Unlike other enterprise-wide applications, the EDW crosses all verticals but fits comfortably within none.  It draws data from multiple source applications serving multiple verticals, including the EHR, ERP, revenue cycle, performance management and patient satisfaction applications. It empowers leaders in virtually every vertical from clinical to quality to finance. And due to its highly technical nature, the EDW is one application that few outside of IT will have skills or interest in understanding. It fits the data-driven, nuts-and-bolts details personality of a CIO like a (sensor-laden, wearable computing) glove.

For a CIO who’s up to the challenge, the EDW is a good thing to own. It’s that rare opportunity to be involved at both the application and data content layers in a transformative way. Through the EDW, a CIO can truly manage the data needs of everyone in an organization rather than one vertical at a time.  Drawing financial, clinical, operational and patient experience data from isolated silos across the network, the EDW integrates and combines data, making it, ultimately, actionable by the organization. No one is better suited to this task than the CIO—the technology specialist and the business generalist in the organization.

The Three Types of CIO

A CIO’s odds of becoming an analytic hero are deeply affected by their capabilities, their leadership style, their affinity for technology, and the culture of their organization.  There are three basic types of CIO that I’ve witnessed in others and seen in myself – the Technologist, the MBA, and the Integrator.

The Technologist: This first mode of CIO leadership describes someone whose primary interest is infrastructure technology. In the IT stack, they are naturally attracted to the layers below the application and data content layers, focusing on data centers, networks, operating systems, storage, servers, security, desktops, and smartphones.  Many CIOs are highly skilled in this area and return a lot of value to their organizations in this role.  Even so, these CIOs are not often included in the circle of strategic decision making with the rest of the C-levels.

The MBA: These CIOs see themselves more as business leaders than technologists, and they typically do not have a deep IT background. They are attracted more to the upper layers of the technology stack, where the software and data meet the vertical business and clinical users.  They play a more significant role as a member of the C-level suite.  The downside of this type of CIO is that they are challenged working their way down into the technology stack, just as the Technologist is challenged working their way up in the stack. This type of CIO can get into trouble by under-managing the importance of a solid and affordable technology infrastructure.  They often spend too much or too little on infrastructure, either of which has dangerous consequences.  More often than not, I see CIOs in this category overspending on the infrastructure layers, throwing money at misplaced risk management, because they know no better.

The Integrator: The third type of CIO can move up and down the layers of the IT stack, with ease.  These are the rare veterans that many of us aspire to be.  He or she can talk to the CMO, CMIO, or CNO and understand the organization’s needs at the application and data content layers, near term and long term. The Integrator can then go back to their IT teams and understand the capabilities and the possibilities of the technology, and work with vendors to help bridge any gaps. Because of their mastery of both the data content and application layers, Integrator CIOs can make a huge contribution to the leadership of a healthcare organization.  They sit at the executive table and are highly respected in the C-level suite.

Probably fewer than 5 percent of CIOs fall into this last category – largely because the culture of healthcare doesn’t yet recognize that ours is fundamentally an information-enabled industry. As a result, many CIOs who are capable of becoming Integrators never get the chance.  Their executive teams simply can’t envision a CIO contributing to the strategic decisions and value of the company.  There’s no seat at the executive table, even if the CIO is capable of sitting in it.

For CIOs who have the skills, attraction, and aspiration towards the Integrator role, leading the strategy behind analytics and the EDW can be the winning ticket to becoming an Integrator and thus an Analytic Hero for your organization.  Not only does the EDW supersede verticals in its potential ROI, but when properly deployed in tandem with cross-functional teams from clinical, quality, analytics and finance, it can relieve the IT department from its “report factory” mode. In turn, the CIO is freed to become a strategic contributor to the larger organization.

If you’re a CIO, do these three models accurately represent your experience of the role?  If they do, which of the three best describes your role in the organization?  Are you already an analytic hero in your organization?  If not, leading the charge on the deployment of an EDW offers that chance.

Thursday, April 11, 2013

From Governance to Standards: Data Warehousing Decoded


Originally appeared on www.healthsystemcio.com:
This post consists of questions posed by a healthcare organization looking to implement a data-warehousing solution. The answers come from Dale Sanders, senior vice president at Health Catalyst and CIO mentor and senior technology advisor for the Cayman Islands Health Services Authority. Sanders, a former CIO at Northwestern Medical Faculty Foundation, has spent the last 15 years applying IT to improve quality and efficiency while also reducing expenses.
The questions were posed by representatives from eHealth, the IT department of a healthcare system.
Dale Sanders, Senior Technology Advisor & CIO Mentor, Cayman Islands HSA
Q: We want to proceed with a data warehousing initiative in increments, with budgets of $300-500K at a time. Steps must be opportunistic in order to avoid mistakes due to poor planning and a lack of insight. We need to know what crucial decisions we need to make now. There are some cultural issues in regards to data ownership and data sharing. We want to ensure we don’t make fundamental mistakes early on.
Although eHealth is the technology department, there are pockets of data, analysts, and developers that sit outside of eHealth. The Center for Health Policy has its own data repository, governance, and processes. We would like to start with our top operational systems (ADT, Emergency). By demonstrating results, we will be able to fund a system expansion. From the business perspective, if we had the top 20 data marts, we could start conforming the dimensions that are common to those data marts. This would support buy-in by reducing maintenance and offering more comprehensive reporting.
A: One of the challenges you will have is to decide whether to go back to the source systems or to extract from your existing data marts. In either case, it is essential to identify which core dimensions of analysis are most important (provider ID, patient ID, etc.). For these core dimensions, you need to define naming conventions, data types, values, etc. You’ll need to add the attributes to the source systems or the data marts; probably the latter.
The first thing you should do is define approximately 20 dimensions (standards and naming conventions). I call these the core data elements of the “Bus Architecture.” Having this in place will allow you to query across all of the different systems — whether through a source system or a data mart — without remodeling the data.
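To make the Bus Architecture idea concrete, here is a minimal sketch in Python. The mart names, columns, and values are all hypothetical; the point is only that two independently structured data marts can be queried together once they share a single conformed dimension (here, `patient_id`) with the same name and data type.

```python
# Two independent data marts (hypothetical examples), each keeping its own
# internal structure but sharing one conformed dimension: patient_id.
adt_mart = [
    {"patient_id": 1001, "admit_date": "2013-01-05", "unit": "ICU"},
    {"patient_id": 1002, "admit_date": "2013-01-07", "unit": "Med/Surg"},
]
emergency_mart = [
    {"patient_id": 1001, "triage_level": 2, "chief_complaint": "chest pain"},
]

# Because patient_id follows the same naming convention and data type in
# both marts, we can query across them without remodeling either one.
def cross_mart_query(patient_id):
    admits = [r for r in adt_mart if r["patient_id"] == patient_id]
    ed_visits = [r for r in emergency_mart if r["patient_id"] == patient_id]
    return {"admits": admits, "ed_visits": ed_visits}

result = cross_mart_query(1001)
```

Notice that neither mart had to change its internal columns; only the shared dimension had to agree.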
Q: What is the best approach to ensure that we cover standards for the systems in the data warehouse (demographics, naming conventions, etc.)? We currently have the standards, but we don’t know if all of the source systems are using them. 
A: First you should consider the ROI for the data acquisition. You want to go after the high-visibility and high-value data sources first. My mantra for organizations is: “No data governance before it is required.” Oftentimes organizations try to govern too much and too early. You do not want to govern just for the sake of governing. Allow for and tolerate a certain amount of ambiguity in data governance, initially.  Focus those limited governance activities on increasing access to data; improving data quality; and increasing the data literacy of the organization.  The key is to start small and build some success stories. Others will be attracted to that, and the governance structure will coalesce around that rather than inhibit the evolution of the data warehouse.
One concept that is a bit counterintuitive to most health IT types is that you do not have to conform all of the dimensions to a transaction system standard like HL7. Here’s an example: a team got wrapped around itself standardizing the data types and lengths for patient and provider names when there were no analytic use cases for them. All that effort was unnecessary because analysts rarely join across source systems based on name; they use a numeric person identifier — an MPI. Do not try to conform all of the dimensions and adopt all the standards that are out there, all at once. Focus on a small number of core dimensions of analysis. Over time, as the number of analytic use cases expands, you can standardize as you need to. Don’t try to boil the ocean.
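A small sketch of the MPI point, with hypothetical names and identifiers: different source systems can spell the same person's name differently, but if a master person index maps those spellings to one numeric identifier, analysts join on the number and the name formats never need to be conformed.

```python
# Hypothetical master person index (MPI): several name spellings from
# different source systems resolve to one numeric identifier.
mpi = {
    ("SMITH", "JOHN"): 42,
    ("Smith", "John Q."): 42,   # same person, different source-system spelling
    ("DOE", "JANE"): 77,
}

labs   = [{"mpi": 42, "test": "HbA1c", "value": 9.2}]
claims = [{"mpi": 42, "paid": 1200.0}]

# Joining on the numeric MPI sidesteps name formatting entirely -- there is
# no need to standardize name lengths or data types across source systems.
def records_for(person_id):
    lab_rows = [r for r in labs if r["mpi"] == person_id]
    claim_rows = [r for r in claims if r["mpi"] == person_id]
    return lab_rows, claim_rows

lab_rows, claim_rows = records_for(42)
```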
Q: How do you approach bringing in data sources while integrating enterprise information standards?
A: The data model issue is the greatest cause of failure for healthcare enterprise data warehouses (EDW) in the U.S. There are four data models to consider with a healthcare EDW:
1)    Star schema, advocated by Ralph Kimball
2)    Enterprise Information Model/Corporate Information Model, advocated by Bill Inmon and Claudia Imhoff
3)    i2b2, which is a variation on a star schema designed at Harvard Medical School to facilitate information exchange between academic medical centers
4)    The “Late Binding Bus Architecture” that I advocate. This is the approach we used at Intermountain, Northwestern and another 20 to 30 organizations across the U.S. This model is significantly different from the others, because there is very little data modeling that goes on.  The focus is on data relating, not data modeling.  Why remodel the data when the source systems have already modeled it for you?
When defining high-value data sources, consider the following:
  • The volume and breadth of data. Simply put, when there is a lot of data, there is usually value, analytically.
  • What are the pressing issues in your organization or patient population? Start where there is a hot topic of analysis or a vexing issue with strong leadership committed to solving it. By addressing these high-profile questions first, you will attract other analytic use cases.
  • Analytic needs and strategy for the patient population in the Province.
Q: What would the warehouse look like, and where would business rules be applied?
A:  Business rules in a data warehouse can be applied at one of six “binding points” in the flow of data, from the source system (Binding Point 1) to the visualization layer (Binding Point 6).  The fundamental healthcare value equation is quality divided by the cost of production.  Clinical effectiveness analytics — the attempt to measure clinical quality, effectiveness, and adherence to best practices — is becoming a greater focus of the healthcare analytics community across North America, so this is something to take into account when designing the warehouse.  The rules around quality are changing all the time, so you want to bind to those rules very late in the flow of data in the EDW.  The rules around cost are not quite as volatile, even though the data quality is terrible in the US, so you can bind to cost rules earlier in the architecture.  In Canada, I suspect you would need to build the infrastructure around the numerator (quality) first, and then build the data content to support the cost of production over the upcoming years.  In the US, the cost of care is an increasing concern.
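The late-binding idea can be sketched in a few lines of Python. The encounter records, field names, and the glycemic-control rule below are all hypothetical; what matters is that the raw data is stored unmodified and the volatile quality rule is a plain function applied at query time, so when the rule's definition changes you swap the function rather than reload or remodel the data.

```python
# Raw clinical facts land in the warehouse unmodified (early binding points).
encounters = [
    {"patient_id": 1, "hba1c": 9.2, "cost": 1200.0},
    {"patient_id": 2, "hba1c": 6.8, "cost": 800.0},
]

# A volatile quality rule, bound late -- at query time -- as a function.
def poor_glycemic_control(encounter, threshold=9.0):
    return encounter["hba1c"] > threshold

# The value equation: quality (share of patients in control) over cost.
def quality_over_cost(rows, rule):
    in_control = [r for r in rows if not rule(r)]
    total_cost = sum(r["cost"] for r in rows)
    return len(in_control) / len(rows), total_cost

rate, cost = quality_over_cost(encounters, poor_glycemic_control)
```

If the quality measure tightens (say the threshold drops), only the rule function changes; the stored encounter data is untouched.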
Q: What are your thoughts on what governance should be in place and how it would develop as the EDW develops?
A: There are several things you can do to successfully evolve and gain buy-in for data governance structures:
  • Communicate broadly to let stakeholders know you are engaged in a new and integrated EDW effort. This will alleviate the knee-jerk concerns that an EDW can evoke.
  • Consider publishing a three-year roadmap that outlines development of content, security, auditing and layers of the EDW to the executive sponsors and stakeholders. This roadmap can also demonstrate the evolution of the data stewardship and data governance structures.
  • Consider involving the CIO heavily as part of the governance structure to help break down barriers of access to data.
  • I’ll share with you what I call the Library Metaphor. There isn’t a lot of need for governance of a library while it is being constructed. It isn’t until books and periodicals are ordered that governance becomes an issue. Access and security come after the building. Over time, those working with the EDW as a career become librarians who don’t always understand how the content is being used, but know that people are accessing it and doing useful work. As the community’s literacy with the data expands, so will the need to expand the library. The same is true with the EDW: start out with the core data, and then users will ask for new and different content which needs to be expressed in different ways. As librarians, we need to be aware of and in tune with the community we are supporting.  It makes no sense to build a highly complex and capable data warehouse in a community that has little or no data literacy.
Q: Regarding the roles for those working directly with the EDW, would they come from a core group or from different functional areas?
A: I recommend a staffing model consisting of 60 percent regionally or centrally assigned personnel and 40 percent business-unit assigned. This holds true both for projects and operationally. This model balances nicely for the evolution of the EDW. It allows for business cases to percolate up while allowing for centralized analytic use cases. You definitely want to allow for both.

Wednesday, March 27, 2013

Commit and The Path Will Clear

Wednesday afternoon, 2:30.  I've told a longer version of this story before, but it's good for me to revisit it.  Revisit the gratitude.

Just got off the phone with Mom for our afternoon check-in call.

My father passed away when I was 17.  In reality, we lost him a year sooner, because a stroke left him a shell of the person he once was for the last year of his life.  I was the youngest of six children, and my siblings had all moved away and started families of their own by the time Dad died.  After his death, I changed college plans, stayed in Durango, graduated from the small local college, and then went into the Air Force Officers Candidate School and information systems engineering program.  With my departure, Mom started living alone in 1983.  For virtually my entire adult life, I wondered and worried how, given the nature of my profession, I would ever be able to move back to Durango, make an affordable living in a town of 12,000, and care for Mom when the time came.

In December of 2010, while sitting on the couch one morning, staring out at the beach in the Cayman Islands, I decided to stop worrying about how a move back to Durango might occur and simply commit to the decision and hope that the "how" of the details would iron themselves out, leaving those details to God. I was willing to live like a pauper if that's what it took.  Mom's time on earth was slipping away while I let the fear of an unknown path stand in the way of the commitment.

The rest is history. It all came together.  Later this evening, I'll stop by Mom's house on the way back from the gym and check in on her.  She'll be 89 years old in a few months.

It seems there's a bigger lesson underlying this specific situation. The path is not always clear, but the commitment can be. Make the commitment, and if the commitment is pure, the path will clear.


