I/O Performance a Big Problem for Big Data in Healthcare

For all of the difficulties linked to using artificial intelligence (AI) and big data in the healthcare sector, one glaring inefficiency seems to have been overlooked. According to a recently released IT survey, I/O performance is hampering the uptake of big data for healthcare applications, and things will only get worse if those inefficiencies are not addressed.

I/O stands for input/output: the movement of data between a computer's processor and memory on one side and its storage, networks, and peripherals on the other. Input and output govern nearly everything we use computers for, from checking e-mail to keeping track of budgets. When I/O performance falls short, applications slow down, stall, or crash.

Databases represent just one area in which I/O performance isn’t making the grade; applications backed by SQL databases are apparently the most vulnerable. According to the 2019 I/O Performance Survey from Condusiv, nearly a third of the IT professionals responsible for I/O performance receive regular complaints about sluggish applications.
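
Before blaming storage, it helps to measure. The sketch below is a minimal illustration, not something taken from the Condusiv report: it assumes a SQL Server instance and the pyodbc driver (both hypothetical choices for this example) and pulls per-file I/O stall statistics so an administrator can check whether a "sluggish application" really traces back to storage I/O.

```python
# Hypothetical check: read per-file I/O stall statistics from SQL Server's
# sys.dm_io_virtual_file_stats view to see whether slow-application complaints
# line up with storage latency. Server and driver names are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-server;DATABASE=master;Trusted_Connection=yes;"
)

QUERY = """
SELECT DB_NAME(vfs.database_id)               AS database_name,
       vfs.file_id,
       vfs.num_of_reads,
       vfs.io_stall_read_ms,
       vfs.num_of_writes,
       vfs.io_stall_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
ORDER BY vfs.io_stall_read_ms + vfs.io_stall_write_ms DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for row in conn.cursor().execute(QUERY):
        reads = max(row.num_of_reads, 1)
        writes = max(row.num_of_writes, 1)
        # Average stall per operation is a rough proxy for storage latency.
        print(f"{row.database_name} (file {row.file_id}): "
              f"{row.io_stall_read_ms / reads:.1f} ms/read, "
              f"{row.io_stall_write_ms / writes:.1f} ms/write")
```

High average stall times per read or write point at the storage path; low stall times suggest the bottleneck sits in the application or query design instead.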

I/O Performance in Healthcare

I/O performance issues are a problem everywhere. In the healthcare sector, though, they are exacerbated by two things: the push to convert to electronic health records (EHRs) and the introduction of artificial intelligence as a means of making use of massive amounts of data.

If the healthcare sector had kept up with technology all these years, there wouldn’t be as much of a problem, explains Rock West Solutions. Instead, the industry now faces a big data situation akin to pushing heavy 21st-century traffic down old roads that were never built to handle so many vehicles.

Healthcare’s IT infrastructure was not up to the task of EHRs when they were first proposed a decade ago, and it wasn’t capable of handling big data either. Seemingly overnight, healthcare IT professionals found themselves mired in two big problems, each requiring a brand-new foundation.

Poor I/O performance therefore creates additional challenges in healthcare. You will have a hard time convincing a large healthcare system to adopt an EHR package that is sluggish and prone to crashes. Likewise, a company such as Rock West Solutions will have a harder time building medical diagnostic devices if the only software it can draw on suffers from I/O performance issues.

Not Solvable with Hardware Alone

An interesting part of the Condusiv report is its conclusion that new hardware alone won’t solve the problem. The report explains how IT departments are adding more hardware, especially larger and more robust servers, to address some of their I/O performance problems. But when the bottleneck lies in a database or application that isn’t fit for the task at hand, adding server capacity isn’t going to help.

According to Condusiv, the biggest problem with modern I/O performance is the inefficiency of the Windows write system. And because Windows dominates the desktop environment, it has the single largest influence on I/O performance, or the lack thereof.
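
It is worth seeing why write patterns matter at all. The micro-benchmark below is a generic illustration under a simple assumption (that issuing many small writes costs more than issuing fewer large writes for the same volume of data); it is not Condusiv's methodology, and the file names and sizes are arbitrary.

```python
# Illustrative micro-benchmark: write the same total amount of data as many
# small unbuffered writes versus fewer large ones, then compare elapsed time.
import os
import time

TOTAL_BYTES = 64 * 1024 * 1024  # 64 MiB of data in both cases

def timed_write(path: str, chunk_size: int) -> float:
    payload = b"x" * chunk_size
    start = time.perf_counter()
    # buffering=0 keeps Python from coalescing the small writes for us.
    with open(path, "wb", buffering=0) as f:
        for _ in range(TOTAL_BYTES // chunk_size):
            f.write(payload)
        os.fsync(f.fileno())  # force the data to disk so we time real I/O
    elapsed = time.perf_counter() - start
    os.remove(path)
    return elapsed

small = timed_write("io_test_small.bin", 4 * 1024)      # many 4 KiB writes
large = timed_write("io_test_large.bin", 1024 * 1024)   # fewer 1 MiB writes
print(f"4 KiB writes: {small:.2f}s, 1 MiB writes: {large:.2f}s")
```

On most systems the small-write pass takes noticeably longer, which is the general shape of the penalty the report is pointing at.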

In essence, improving I/O performance has to happen at the software level. Perhaps that means thoroughly overhauling SQL databases. Perhaps it means abandoning SQL altogether. It might even mean creating a whole new operating system apart from Windows and macOS.

At any rate, artificial intelligence and big data for healthcare both have a long way to go. They are going to have a harder time getting there as long as I/O performance continues to be a problem. Maybe it’s time for the IT sector to back off from big data and AI and, at least for the time being, focus on improving I/O performance.
