A better understanding of your mainframe’s performance is one of the most important and cost-saving actions your team can take, and keep taking.
Mainframe Performance and Optimization Services for z/OS Sub-Systems and Db2
Your mainframe is the heart of your technical infrastructure. It is entrusted with your organization’s critical data and crucial processes, and your business relies directly on its stability and availability to keep running.
Sub-system files are the building blocks on which ALL your mainframe software is constructed and operated. Without regular improvements to sub-system file definitions, performance inconsistency and stability issues become disruptive and costly. We’ve harnessed the expertise gained over 36 years to quickly identify and correct sub-system file issues that cause instability, performance concerns and, most importantly, the need for unnecessary and costly upgrades.
Mainframe Cost Containment
Technology is increasingly relied on to reduce bottom-line costs. The data center accounts, on average, for 20%–25% of IT spending after stripping out application development and support costs. Modernization initiatives, whilst providing long-term benefits of agility, loose coupling, hardware independence and more, call for significant upfront investment and long ROI horizons. Often these investments are unnecessary and cannot be justified.
Even with all the recent technological advances, about 80% of modern-day enterprise IT systems still run on what some call legacy platforms. Reinforcing the value of the mainframe to organizations, the 2019 Arcati Mainframe User Survey found that 84 percent of sites surveyed have seen some kind of increase in capacity. Maintaining the efficiency of these modernized mainframe platforms is therefore crucial to cost containment, and it begins with ensuring performance at the sub-system level.
Outcome Based Mainframe MIPS Reduction
Mainframe performance tuning for “optimum performance” assumes a new significance this year with IBM pricing-model changes and tight economic conditions. The proliferation of batch jobs, multiple test environments, sub-systems and shared resources is also thrusting optimization to the forefront. Mainframe performance management offers a way to measure infrastructure and application health, and to lower IT costs relatively quickly. For this reason it is receiving increased attention in organizations across the globe.
Even a well-tuned application can develop performance problems over time as program file attributes change. Mainframe performance “ignorance” reduces application availability and escalates costs through extraneous processing, surges in batch processing with excessive CPU utilization, and unnecessary job wait time.
Any mainframe optimization initiative that relies on major design changes is risk-prone. Optimization through local sub-system file definition improvements avoids program changes and reduces structural inefficiencies. Mainframe optimization and validation of environmental factors have a high degree of success, are achieved in a short time frame, and deliver a faster ROI.
Mainframe Db2 and SQL Optimization
One of the top MIPS-consuming operations in the production environment is the use of Db2. Query optimization is one of the factors that affect application performance and therefore CPU consumption. Achieving MIPS reduction at low risk and low cost by improving SQL code is the basis of our Db2 optimization platform. Identify and improve inefficiencies in Db2 and reduce bottom-line IT costs.
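As a hedged illustration of the kind of SQL improvement involved (the table and column names here are invented for the example), one classic Db2 rewrite replaces a predicate that applies a function to an indexed column, which typically forces a scan, with an equivalent range predicate the optimizer can satisfy through the index:

```python
# Hypothetical before/after SQL rewrite; ORDERS and ORDER_DATE are invented
# names, not from any real schema.
# Applying a scalar function to an indexed column usually defeats index use,
# while an equivalent range predicate lets Db2 use the index.
inefficient_sql = """
SELECT ORDER_ID, AMOUNT
FROM   ORDERS
WHERE  YEAR(ORDER_DATE) = 2019
"""

optimized_sql = """
SELECT ORDER_ID, AMOUNT
FROM   ORDERS
WHERE  ORDER_DATE BETWEEN '2019-01-01' AND '2019-12-31'
"""
```

The two queries return the same rows, but only the second gives the optimizer a sargable predicate to work with.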
MIPS & MSUs
Mainframe CPU consumption is expressed in MIPS (millions of instructions per second). Mainframe capacity is designated in MSUs (million service units); on the most recent mainframe machines (for example the latest z14), 1 MSU equals around 8 MIPS.
According to Gary Crook, CEO of Heirloom Computing (June 2018), for a large mainframe of more than 11,000 MIPS, the average annual cost per installed MIPS is about 1,600 USD (12,800 USD per MSU) for larger users exclusive of software MLC.
Consequently, the annual MIPS cost for an 11,000 MIPS mainframe is approximately 17.6 million USD. For smaller users the per-MIPS cost escalates considerably.
Using the figures above, the choice to optimize as a function of cost reduction becomes simple. A 10% reduction in CPU consumption equates to annual savings of nearly 1.8 million USD. Our platforms have been proven to reduce CPU consumption by more than 30% annually.
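The arithmetic behind these figures can be sketched directly; the rates are the assumptions quoted above (Crook, 2018), not guaranteed pricing:

```python
# Worked example using the figures quoted above; the per-MIPS rate is an
# assumption from the cited source (large users, excluding software MLC).
mips_installed = 11_000
cost_per_mips_per_year = 1_600          # USD per installed MIPS per year

annual_cost = mips_installed * cost_per_mips_per_year
savings_10pct = annual_cost * 0.10      # savings at a 10% CPU reduction
savings_30pct = annual_cost * 0.30      # savings at a 30% CPU reduction

print(f"Annual MIPS cost:  ${annual_cost:,.0f}")    # $17,600,000
print(f"10% CPU reduction: ${savings_10pct:,.0f}")  # $1,760,000
print(f"30% CPU reduction: ${savings_30pct:,.0f}")  # $5,280,000
```

At a 30% reduction, the savings pass 5 million USD per year on this assumed cost base.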
Is your mainframe at risk?
Companies that decide to not take any action put themselves at risk.
Ignorance is a recipe for catastrophe! In business, some of the worst problems are those identified too late. Why wait for a business interruption to realize that the mainframe sub-system is in need of optimization? Sub-system optimization can prevent instability and ABENDs, ensuring employees are productive and customers are happy.
Some companies stop or reduce mainframe budget allocations under the assumption that nothing can be done to cut the related costs. Critical Path Software employs a unique “gain-sharing” approach allowing companies to significantly reduce data center costs and experience a fast, elevated ROI while nearly eliminating risk.
Key challenges to mainframe sub-system optimization:
Analytics teams often overlook important opportunities within IT initiatives where fact-based decision making can deliver clear, measurable value. Let Critical Path Software show you the facts at our cost.
Leaders often make sub-optimal, value-destroying decisions that end in across-the-board budget cuts because their cost optimization programs lack an analytical fact base. Critical Path Software provides verifiable, measurable analytics.
Analytics teams are often not at the table for cost optimization meetings, and their opportunity to guide better prioritization is lost. Critical Path Software wants your analytics people looking at our impossible-to-deny, cost-reducing analysis. When they see the potential that executing on our platforms provides, our process will become your priority.
Critical Path Software’s TurboTune & TurboTuneSQL
A comprehensive solution designed for performance improvement in applications management, ideal for production environments with large-scale online/batch workloads and data processing.
Each data center is unique. Our experts prioritize improvements to the system ensuring the largest savings are achieved with the least amount of effort.
Historical as well as current MIPS operational metrics are collected and analyzed to identify the “hotspots”. These can be online transactions, batch jobs or Db2 packages.
Our process achieves the following:
- Free up critical resources (CPU, DASD) for further expansion
- Defer expensive MIPS upgrades
- Reduce MIPS/CPU consumption (sub-system and Db2)
- Improve on-line performance
- Reduce the batch window
- Optimize on-line transactions
- Optimize the critical path
- Optimize applications
Critical Path Software provides immediate benefits in terms of ROI by reducing MIPS and I/O usage.
Critical Path Platforms Eliminate:
High CPU costs:
- Inefficient/poorly structured code
- Inefficient SQL
High DASD costs:
- Redundant data
- Poor archiving policies
Batch overruns:
- Jobs in the critical path taking an enormous amount of time
- Contention for files
- Excessive I/O calls on VSAM
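As one hedged example of the kind of low-risk file-definition change this involves (the data set name below is invented), raising VSAM buffer allocations via the JCL AMP parameter often cuts I/O waits in batch steps without touching program logic:

```python
# Hypothetical JCL fragment; PROD.CUSTOMER.KSDS is an invented data set name.
# BUFND raises the number of VSAM data buffers and BUFNI the index buffers,
# a common low-risk way to reduce I/O on heavy sequential/keyed processing.
jcl_fragment = """
//CUSTIN   DD  DSN=PROD.CUSTOMER.KSDS,DISP=SHR,
//             AMP=('BUFND=30,BUFNI=10')
"""
print(jcl_fragment)
```

Appropriate buffer counts depend on the access pattern and available region storage, which is exactly the kind of environmental factor the analysis above is meant to surface.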
Key Features of Our Platforms:
- Leverage the existing operational management tools available in the environment. (IBM vanilla)
- Provide an elevated and fast ROI, typically in months.
- Easily pays for itself from the savings in the operational costs of the IT organization.
- No inherent risk. The core logic of the components is left “as is”.
- There is no fundamental redesign of databases in Db2.
- Strategic cost optimization programs provide the opportunity to align with higher-level initiatives that may occur as follow-ons.
- Critical Path Software offers flexible billing models to meet the needs of the customer, including an “outcome-based” gain-sharing model that allows Critical Path Software to prioritize your savings above our own fees.
- The metrics produced provide analytics beneficial to capacity planning.