Hospital Cybersecurity in Healthcare
Why is it like this? How did it come to this point? Security in information technology is the next big thing.
Thank you all for joining me for my second iteration on Healthcare Information Technology. Since my first post, I’ve published a few posts in between on data breaches and a more general cybersecurity discussion. This might not follow the overall order, but it continues from the first part, so if I repeat some things that have already been mentioned, SOWWI! Simply put, I ended off last time discussing EHRs and their original emergence into the mainstream healthcare industry. Though HITECH has been implemented, as I’ve mentioned, those products have lacked usability, and very few vendors have shown real interest in listening to customer complaints or pushing for a newer generation of technology. But it is not just the product vendors’ fault; some of the blame falls on the hospitals and environments that use EHRs. Perhaps it is their unwillingness to upgrade from Windows XP and old CRT monitors that inherently keeps product vendors from advancing. Perhaps the current generation of systems is simply getting too old for the incoming generation of users?
Although there was an EHR rollout, simply paying doctors to use EHRs was not enough. There were a few examples in which EHR adoption did not lead to systemic failure, but the majority of systems were unable to achieve the growth that was expected. The exceptions are Kaiser and the VA, both LARGE healthcare systems that work and employ primarily within their own networks. As a result, each has been able to build good software that can sustain frequent and nimble changes made to IMPROVE the quality of care, not just to check off another box. Aside from sheer size, these systems succeed because their software is built primarily for a defined set of people, namely their own members. They run a large number of clinics and hospitals that effectively coordinate care between locations, and their financial incentives reward improved quality of care rather than simply increasing the amount of care.
Now that EHRs and the systems that contain them have been introduced and explained, it’s important to understand the phrase “Meaningful Use”.
An attempt by the U.S. government to define a baseline for what an EHR should be able to accomplish, meaningful use has been an ambiguous and confusing standard in recent years. It’s confusing because it is not just one set of requirements: its definitions constantly change, and it has become more and more stringent over time as awareness and understanding of the risks associated with EHRs have grown, with the goal of eventually encompassing everything necessary to improve clinical care. And because it never specifically listed “flexible software that conforms to clinicians’ needs” as a goal, the EHRs of today are built only to fulfill the meaningful use requirements of today, and maybe not those of tomorrow.
These requirements are a series of standards and metrics that are measurable and reportable. If a system can demonstrate them and use its EHR in a valuable way, it earns the financial incentive. It was planned that institutions adopting systems could earn about $50,000 per doctor in total payments by being meaningful users of certified EHR technology. Early adopters fought through the growing pains, while late adopters got to see what works, though they earned less money from the incentives. The biggest change comes in 2017, when Medicare/Medicaid will actually cut payments to systems that have not yet adopted EHRs. The meaningful use criteria will stay, but the financial process will change from a carrot into a stick.

As I mentioned in the first post, money was set aside under HITECH for government incentives to hospitals and institutions that switch over to EHRs, right? $840 billion was allotted for the stimulus, $30 billion of that was meant for hospitals and providers, and at this point about $28 billion has been spent, yet there has been no significant improvement in the problems that existed before HITECH. Some of the largest EHR companies today are large not because of revolutionary technology; they rose to their market share because they were the fastest to release a comprehensive suite of programs, software, and services. Reading through an article from Mother Jones, I could not help but steal the quote below. The only reason these large companies are used is the breadth of their services, which means easier setup and maintenance without having to string together a variety of software. The only thing missing is a framework that allows interoperability.
All together, it’s like the Microsoft Office of health care software—more comprehensive than any of its competitors, even if its individual components are kind of meh.
In fact, one study found that 43 percent of physicians thought that EHRs make their entire process more difficult and time-wasting. More time is spent inputting data, but the painful reality is that the data isn’t used for anything other than internal reference. In most situations, records are printed and then scanned or faxed to outpatient services or primary care offices. Furthermore, as the Mother Jones article put it, “EHR companies were engaging in ‘information blocking’ ‘to control referrals and enhance their market dominance.'”
Putting all things front-end and interface interaction aside (I’ll save that for another post and iteration)… one of the problems with the rush toward meaningful use is the lack of standardization and interoperability of data. At first glance, the meaningful use requirements are short and concise, which makes everybody happy. The discomfort comes when medical information is needed by doctors in a system that does not use the same EHR, or even by doctors using the same EHR product at a different institution that is not interconnected. How does data get transmitted? How do the fields map to each other? None of these decisions were specified in Meaningful Use, and as a result there was no standardization of data. That makes life difficult for providers who treat the same patient but have different EHR systems. Let’s take dates as an example, primarily birthdates. Anyone who has taken a coding or database class knows that dates are not “one size fits all”; in fact there are many variations.
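To make the date ambiguity concrete, here is a minimal Python sketch (the specific date string is my own illustration, not one from any real record). The exact same characters parse into two different birthdates depending on which convention the receiving system assumes:

```python
from datetime import datetime

# One written date, two plausible readings.
raw = "03/04/1985"

as_us = datetime.strptime(raw, "%m/%d/%Y")    # read as March 4, 1985
as_intl = datetime.strptime(raw, "%d/%m/%Y")  # read as April 3, 1985

print(as_us.date())    # 1985-03-04
print(as_intl.date())  # 1985-04-03

# Abbreviated years add another layer: "85" gets expanded to a
# century by a pivot rule the end user never sees, and different
# software can pivot differently.
short = datetime.strptime("03/04/85", "%m/%d/%y")
print(short.date())
```

Neither parse fails or warns; both produce a perfectly valid-looking birthdate, which is exactly why unspecified field formats are so dangerous when records cross system boundaries.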
You get the point: are the years abbreviated? What about the month? Is it MM/DD/YYYY or DD/MM/YYYY? Each is a different value. Now pair that with other fields that can vary, such as a name. Jim Smith vs. Jimmy Smith? Or Richard Walker vs. Richie vs. Rich vs. Dick vs. Dickie… who knows whether the different nicknames in use can be consolidated without accidentally merging the wrong health information? How will you know whether a person with a very common name has the correct SSN? Does the system react? Does the end user react? Do the providers react?
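The nickname problem above can be sketched with a toy Python example. The nickname table and the `canonical` helper are hypothetical illustrations (real master-patient-index software uses far larger lookup sets plus probabilistic matching), but they show both why consolidation is needed and why it is risky:

```python
# Hypothetical nickname table for illustration only.
NICKNAMES = {
    "jim": "james", "jimmy": "james",
    "rich": "richard", "richie": "richard",
    "dick": "richard", "dickie": "richard",
}

def canonical(full_name: str) -> tuple[str, str]:
    """Reduce a 'First Last' name to a (canonical_first, last) matching key."""
    first, last = full_name.lower().split()
    return (NICKNAMES.get(first, first), last)

# The mapping consolidates obvious variants...
print(canonical("Dick Walker") == canonical("Richard Walker"))  # True
print(canonical("Jimmy Smith") == canonical("Jim Smith"))       # True

# ...but two *different* patients named James Smith and Jim Smith
# also collapse into one key, so a second identifier (birthdate,
# SSN) is still required before merging records.
print(canonical("James Smith") == canonical("Jim Smith"))       # True
```

The design trade-off is exactly the one the questions above point at: a looser matcher consolidates more legitimate variants but merges more wrong records, and nothing in Meaningful Use specified where that line should sit.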
Meaningful Use just makes me picture the scramble that happens whenever institutions receive a request to share medical information for a shared patient.
Fundamentally, Meaningful Use is a good concept with good intentions that can make a difference in healthcare without being too difficult to fulfill. Although it has taken a decade to get up to speed on Health IT, and it was a decade of confusion and frustration, with the miscommunication and misunderstanding out of the way it should be possible to streamline the standards and build new EHR systems efficiently without consuming too much time, effort, and resources.
The sorry fact is that electronic health records, which in theory should reduce errors and allow for more consistent delivery of medical services, were instead designed only with patient billing and control over doctors in mind.
It was meant to transform health care, to improve the quality of care and the services provided to patients while containing costs and allocating resources effectively. In short, you could say the initial rollout was almost a tragedy of the commons, a disappointment to the wicked, and, worst of all, the most overrated of the most expected. It’s a shame that a process meant to push the health community into a better place ended up “crappifying” it for a bit. Now, don’t get me wrong, I am ALL for technology, but I feel that in order to really push for greater innovation it’s important to study the foundation and the reasons it hadn’t succeeded. Why else are there case studies? Why else do we learn history? To learn from the past and to avoid making similar mistakes in the future.

Going through my Comparative Health Systems course, I’ve learned that many countries, both developing and developed, have found relative success with Health IT without costly EHR implementation, startup fees, and overhead costs… does that mean the US could learn from overseas implementations and try something similar? Unfortunately, that wouldn’t work. Health systems are just too vastly different; whatever the surface similarities, at the internal and governmental levels there is too much politicking for a similar implementation to ever succeed. It’s not always about the product, but also about implementation practices. I learned that it isn’t the “concept or thought” of Health IT and EHRs that is at fault; it is the poor implementation that has made EHR such a shudder-inducing word in healthcare today. Bad health IT is the norm.