Degraded I/O Performance: Software as a Solution


By James D’Arezzo, CEO of Condusiv Technologies

Companies can solve degraded I/O performance with software, rather than spending heavily on hardware that often is not the best solution.

The latest worldwide survey of IT professionals by Condusiv Technologies revealed the growing impact of degraded I/O performance on key applications—and the fact that costly hardware can be unnecessary.

While global IT spending is expected to reach $3.87 trillion by 2020, many companies are not paying enough attention to their overwhelmed IT infrastructure. In particular, they overlook massive data-related performance issues and associated costs.

And the data tsunami will only grow. Experts estimate that 40 zettabytes (43 trillion gigabytes) of data will be created by 2020 at a rate of 2.5 quintillion bytes per day.

To put it in perspective, that is 300 times the amount of data reportedly in circulation in 2005.

With that increase, of course, will also come a massive shift in how organizations use that data, as we are now seeing many enterprises transition from “big data” to “fast data.” In other words, the focus shifts from having huge amounts of data available to being able to use that data rapidly, accurately and for a host of applications in daily operations.

Cost vs. Performance Challenges

In over 30 years in the IT business, I can count on one hand the number of times I’ve heard an IT manager say, “The budget is not a problem; cost is no object.”

Increasing pressure on the IT infrastructure, rising data loads and demands for improved performance are pitted against tight budgets. 

IT management and operations professionals are increasingly recognized as essential to company operations, as IT and automation have become integral to the success of almost any business.

However, demand for end-user performance has skyrocketed, the amount of data processed has exploded, and the growing number of data applications is a rising tide threatening to swamp even the best-staffed and best-financed IT organizations.

Our survey of nearly 1,000 IT professionals in the third and fourth quarters of 2018 found that:

• I/O performance is important. The vast majority of those surveyed (88.6 percent) consider I/O performance an important part of their responsibilities.

• Application performance is suffering. Nearly half of respondents (45.8 percent) indicate they have applications that are difficult to support from a systems performance standpoint.

• SQL is the most troublesome application. The survey confirms that SQL databases are the top business-critical application platform and also the environment that generates the most I/O traffic. More than a quarter of respondents (27.7 percent) report staff or customer complaints due to sluggish applications running on SQL.

The balance between keeping IT operations up and continuously serving the end-user, while keeping costs manageable, is one of the IT industry’s biggest challenges.

Companies need to invest in capital expenditures, like infrastructure, and operational expenditures like personnel and cloud-based services. An IT executive must be very knowledgeable about changes in key areas: technology, the business itself and the existing infrastructure. This is essential to extending the life of equipment while meeting performance demands—and knowing what expenditures will actually solve problems.

Performance demands keep IT professionals awake at night. Server crashes or network shutdowns during a critical business period (such as end-of-year closing, peak sales season or an inventory cycle) are extremely costly.

Windows I/O Problems

Our survey revealed that nearly half of IT pros don’t realize that 30 to 40 percent of performance is robbed by small, fractured, random I/O generated by the Windows operating system. While Windows is a capable system used by the majority of companies globally, it handles I/O logically rather than physically. This means it breaks reads and writes down to the lowest common denominator, creating tiny, fractured, random I/O that results in a “noisy” environment. Add a growing number of virtualized systems into the mix and the overhead compounds (a phenomenon known as the “I/O blender effect”).
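The cost of fragmented I/O described above can be illustrated with a short, generic sketch (ordinary Python against the local filesystem, not Condusiv’s product): the same payload is written once as a single sequential request, then again as hundreds of separately flushed 4 KiB fragments. The `timed_write` helper and the 4 KiB fragment size are illustrative assumptions, not figures from the survey.

```python
import os
import tempfile
import time

def timed_write(path, chunks):
    # Write each chunk and force it to storage, returning elapsed seconds.
    start = time.perf_counter()
    with open(path, "wb") as f:
        for chunk in chunks:
            f.write(chunk)
            f.flush()
            os.fsync(f.fileno())
    return time.perf_counter() - start

payload = b"x" * (1024 * 1024)  # 1 MiB of data
# The same data split into 4 KiB pieces: 256 separate I/O requests.
fragments = [payload[i:i + 4096] for i in range(0, len(payload), 4096)]

with tempfile.TemporaryDirectory() as tmp:
    big_path = os.path.join(tmp, "big.bin")
    small_path = os.path.join(tmp, "small.bin")
    big = timed_write(big_path, [payload])      # 1 large sequential write
    small = timed_write(small_path, fragments)  # 256 small flushed writes
    sizes = (os.path.getsize(big_path), os.path.getsize(small_path))

print(f"1 large write:    {big:.4f}s")
print(f"256 small writes: {small:.4f}s")
```

On most systems the fragmented run takes noticeably longer, even though the bytes written are identical; the gap is pure request overhead, which is the effect virtualization multiplies.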

Software vs. Hardware

One of the biggest mistakes an IT decision-maker can make is to think that the only way to improve system and application performance is to buy new hardware. Certainly, at some point hardware upgrades are necessary, but much of performance degradation is a problem that can be solved by software. New hardware can improve performance for a while, but it will inevitably become bogged down again. That’s because a primary cause of I/O bottlenecks is the way data is handled by software, and that can’t be fixed with new hardware. A software solution to this software problem can offload 30 to 50 percent of I/O to dramatically improve performance. Besides costing much less, software avoids the disruption of migrating to new systems, rip-and-replace projects, end-user training and other challenges.
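One common way software reduces I/O is by coalescing many small application writes into fewer, larger requests before they reach storage. The sketch below is a minimal, hypothetical illustration of that idea; the `CoalescingWriter` class and its `threshold` parameter are invented for this example and are not Condusiv’s actual API or method.

```python
import io

class CoalescingWriter:
    """Buffer small writes in memory and release them downstream as one
    large sequential write once a size threshold is reached.
    Hypothetical illustration of I/O reduction, not a real product API."""

    def __init__(self, sink, threshold=1024 * 1024):
        self.sink = sink
        self.threshold = threshold
        self.buffer = bytearray()
        self.writes_issued = 0  # actual I/O requests sent downstream

    def write(self, data):
        self.buffer += data
        if len(self.buffer) >= self.threshold:
            self.flush()

    def flush(self):
        if self.buffer:
            self.sink.write(bytes(self.buffer))
            self.writes_issued += 1
            self.buffer.clear()

# 1,000 tiny application writes reach "storage" as only 8 large ones.
sink = io.BytesIO()
writer = CoalescingWriter(sink, threshold=64 * 1024)
for _ in range(1000):
    writer.write(b"x" * 512)
writer.flush()
print(writer.writes_issued)  # prints 8
```

The application still issues 1,000 writes, but the storage stack sees only a handful of large sequential ones; that request reduction, not faster hardware, is where the performance gain comes from.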

About the Author

Jim D’Arezzo is the CEO of Condusiv Technologies, a world leader in I/O reduction software.
