How to Understand HANA’s High CPU Consumption

Executive Summary

  • HANA's high CPU consumption is a consequence of HANA's design.
  • SAP has offered an explanation for this CPU consumption; we review whether that explanation makes sense.

Introduction to HANA CPU Consumption

At Brightwork we have covered the real issues with HANA that are censored by SAP, SAP consulting firms, and the IT media. As a result of this censorship, many SAP customers have been hit with surprises when implementing HANA.

In this article, we will address HANA’s CPU consumption or overconsumption.

HANA CPU Overconsumption

A second major issue with HANA, in addition to memory overconsumption, is CPU consumption.

HANA is designed to load data into memory even when that data is not planned to be used in the immediate future. SAP has softened its rather unrealistic original position of "loading everything into memory," but HANA still loads a great deal into memory.
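Whether this is happening on a particular system can be checked directly. The following is a rough sketch only, assuming access to HANA's standard monitoring views (exact column names can vary between revisions); it lists the column-store tables currently held in memory and their footprint.

-- Rough sketch: column-store tables currently loaded into memory and their size.
-- M_CS_TABLES is a standard monitoring view; exact columns vary by revision.
select SCHEMA_NAME, TABLE_NAME, LOADED,
round(MEMORY_SIZE_IN_TOTAL/(1024*1024), 2) as "Memory MB"
from PUBLIC.M_CS_TABLES
where LOADED <> 'NO'
order by MEMORY_SIZE_IN_TOTAL desc;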

The CPU overconsumption follows from how a server behaves when this much data is loaded into memory: the CPU spikes. This is also why CPU monitoring, along with memory monitoring, is considered so necessary for running HANA effectively, while it is not normally an issue with competing databases from Oracle, IBM, or others.
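As an illustration of what that monitoring involves, the sketch below samples host-level CPU and memory indicators from the standard M_HOST_RESOURCE_UTILIZATION view. The TOTAL_CPU_* values are cumulative counters, so two samples taken an interval apart are needed to derive a utilization percentage; the columns available can differ by HANA revision.

-- Rough sketch: host-level CPU and memory indicators (columns vary by revision).
-- The TOTAL_CPU_* values are cumulative counters in milliseconds; sample twice
-- over an interval to derive a utilization percentage.
select HOST,
TOTAL_CPU_USER_TIME,
TOTAL_CPU_SYSTEM_TIME,
TOTAL_CPU_IDLE_TIME,
round(USED_PHYSICAL_MEMORY/(1024*1024*1024), 2) as "Used Physical GB",
round(FREE_PHYSICAL_MEMORY/(1024*1024*1024), 2) as "Free Physical GB"
from PUBLIC.M_HOST_RESOURCE_UTILIZATION;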

SAP’s Explanation for Excessive CPU Utilization by HANA

SAP offers a peculiar explanation for CPU utilization.

“Note that a proper CPU utilization is actually desired behavior for SAP HANA, so this should be nothing to worry about unless the CPU becomes the bottleneck. SAP HANA is optimized to consume all memory and CPU available. More concretely, the software will parallelize queries as much as possible to provide optimal performance. So if the CPU usage is near 100% for query execution, it does not always mean there is an issue. It also does not automatically indicate a performance issue”
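The parallelization SAP describes can, at least in principle, be bounded by the administrator rather than left to consume every core. The sketch below uses the documented max_concurrency parameter in the execution section of global.ini; the value shown is a placeholder, not a recommendation, since the appropriate setting depends on the workload and the number of logical cores.

-- Hypothetical illustration: cap the number of threads HANA uses per statement.
-- '40' is a placeholder value, not a recommendation; consult sizing guidance first.
alter system alter configuration ('global.ini', 'SYSTEM')
set ('execution', 'max_concurrency') = '40'
with reconfigure;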

Why Does SAP’s Statement Not Make Sense

This statement is unusual, and it leaves several issues unexplained.

Why Does HANA Time Out if the CPU Is Actually Being Optimized?

Timing out is an example of HANA being unable to manage and throttle its own resource usage. A timeout requires manual intervention to reset the server. If a human has to intervene, or the database ceases to function, then resources are clearly not being optimized. We have received repeated reports of companies whose HANA servers need to be reset multiple times per week.

Resource management of this kind is supposed to be a standard capability of any database one purchases. Even free open-source databases do not have the problem that HANA has.

If an application or database continually consumes all available resources, the likelihood of timeouts increases. SAP's statement also presents the false construct that HANA has optimized its usage of the CPU, which is not correct. It seeks to present what is a bug in HANA as a design feature.
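For completeness, a full server reset is not always the only form of manual intervention; an administrator can usually identify the offending session and cancel it. The following is a hedged sketch assuming the standard M_CONNECTIONS monitoring view and the documented CANCEL SESSION statement; the connection id shown is invented for illustration.

-- Hypothetical illustration: list running connections, then cancel one of them.
-- Column names and status values can vary by HANA revision.
select HOST, PORT, CONNECTION_ID, CONNECTION_STATUS, USER_NAME
from PUBLIC.M_CONNECTIONS
where CONNECTION_STATUS = 'RUNNING';

-- '203112' is an invented connection id, used purely for illustration.
alter system cancel session '203112';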

Why HANA Leaves A High Percentage of Hardware Unaddressed

Benchmarking investigations into HANA's hardware utilization indicate clearly that HANA does not address all of the hardware it is given. While we have not seen both issues evaluated in the same benchmark, it seems very probable that HANA times out without ever addressing all of the hardware. Customers who purchase the hardware specification recommended by SAP often do not know that HANA leaves so much of it unaddressed.

The Overall Misdirection of SAP’s Explanation

This paragraph attempts to explain away HANA's consumption of hardware resources, which in fact should be a concern to administrators. The statement is also inconsistent with SAP's other explanations of HANA's use of memory, as can be seen from the SAP graphic below.

(SAP graphic: notice the pool of free memory.)
(Second SAP graphic: once again, notice the free memory.)

This is also contradicted by the following statement.

“As mentioned, SAP HANA pre-allocates and manages its own memory pool, used for storing in-memory tables, for thread stacks, and for temporary results and other system data structures. When more memory is required for table growth or temporary computations, the SAP HANA memory manager obtains it from the pool. When the pool cannot satisfy the request, the memory manager will increase the pool size by requesting more memory from the operating system, up to a predefined Allocation Limit. By default, the allocation limit is set to 90% of the first 64 GB of physical memory on the host plus 97% of each further GB. You can see the allocation limit on the Overview tab of the Administration perspective of the SAP HANA studio, or view it with SQL. This can be reviewed by the following SQL statement

select HOST, round(ALLOCATION_LIMIT/(1024*1024*1024), 2)
as "Allocation Limit GB"
from PUBLIC.M_HOST_RESOURCE_UTILIZATION”
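To make the default allocation limit concrete, consider a hypothetical host with 512 GB of physical memory: 90% of the first 64 GB is 57.6 GB, and 97% of the remaining 448 GB is 434.56 GB, giving an allocation limit of roughly 492 GB. The same arithmetic expressed as SQL (the 512 GB figure is an assumption for illustration):

-- Default allocation-limit formula applied to a hypothetical 512 GB host:
-- 0.90 * 64 GB + 0.97 * (512 - 64) GB = 57.6 + 434.56 = 492.16 GB
select round(0.90 * 64 + 0.97 * (512 - 64), 2) as "Default Allocation Limit GB"
from DUMMY;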

Introduction to HANA Development and Test Environments

Hasso Plattner has routinely discussed how HANA simplifies environments. However, the hardware complexity imposed by HANA is overwhelming for many IT departments. This means that to get the same performance as other databases, HANA requires not only far more hardware but also far more hardware maintenance.

Development and testing environments are always critical, but with HANA the issue is of particular importance. The following quotation explains the frequency of changes to HANA.

“Taking into account, that SAP HANA is a central infrastructure component, which faced dozens of Support Packages, Patches & Builds within the last two years, and is expected to receive more updates with the same frequency on a mid-term perspective, there is a strong need to ensure testing of SAP HANA after each change.”

The Complications of So Many Clients and Servers

HANA relies on a combination of clients and servers that must work in unison. This creates complications, namely a higher testing overhead whenever any change is made to any one component. SAP has a process for running test cases that accounts for these servers.

“A key challenge with SAP HANA is the fact, which the SAP HANA Server, user clients and application servers are a complex construct of different engines that work in concert. Therefore, SAP HANA Server, corresponding user clients and applications servers need to be tested after each technical upgrade of any of those entities. To do so in an efficient manner, we propose the steps outlined in the subsequent chapters.”

Leveraging AWS

HANA development environments can be acquired through AWS at reasonable rates. This has the added advantage that, because of AWS's elastic offering, volume testing can be performed on AWS hardware without having to purchase that hardware. Moreover, the HANA AWS instance can be scaled down as soon as the testing is complete. The HANA One offering allows HANA licenses to be used on demand. This is a significant value-add, as sizing HANA has proven to be extremely tricky.

SAP’s Inaccurate Messaging on HANA as Communicated in SAP Videos

Fact-Checking SAP’s HANA Information

The SAP video reviewed here is filled with falsehoods. We address them in the sequence in which they are stated in the video.

SAP Video Accuracy Measurement

SAP's Statement | Accuracy | Brightwork Fact Check | Link to Analysis Article
HANA is a platform. | 0% | HANA is not a platform; it is a database. | How to Deflect You Were Wrong About HANA
HANA runs more "in-memory" than other databases. | 10% | HANA uses a lot of memory, but the entire database is not loaded into memory. | How to Understand the In-Memory Myth
S/4HANA simplifies the data model. | 0% | HANA does not simplify the data model from ECC, and there are significant questions about the benefit of the S/4HANA data model over ECC's. | Does HANA Have a Simplified Data Model?
Databases that are not HANA are legacy. | 0% | There is zero basis for SAP to call all databases that are not HANA legacy. | SAP Calling All Non-HANA DBs Legacy
Aggregates should be removed and replaced with real-time recalculation. | 0% | Aggregates are very valuable, every RDBMS has them (including HANA), and they should not be removed or minimized in importance. | Is Hasso Plattner Correct on Database Aggregates?
Reducing the number of tables reduces database complexity. | 0% | Reducing the number of tables does not necessarily reduce the complexity of a database; the smaller number of tables in HANA is more complicated than the larger number of tables before HANA. | Why Pressure SAP to Port S/4HANA to AnyDB?
HANA is 100% columnar tables. | 0% | HANA does not run entirely on columnar tables; as much as one third of the database consists of row-oriented tables. | Why Pressure SAP to Port S/4HANA to AnyDB?
S/4HANA eliminates reconciliation. | 0% | S/4HANA does not eliminate reconciliation or reduce the time required for reconciliation to any significant degree. | Does HANA Have a Simplified Data Model and Faster Reconciliation?
HANA outperforms all other databases. | 0% | Our research shows that competing databases not only can do more than HANA but are also a better fit for ERP systems. | How to Understand the Mismatch Between HANA and S/4HANA and ECC

The Problem: A Lack of Fact-Checking of HANA

There are two fundamental problems around HANA. The first is the exaggeration of HANA's capabilities, which means that companies that purchased HANA end up getting far less than they were promised. The second is that the SAP consulting companies simply repeat whatever SAP says. This means that at virtually all accounts, there is no independent entity that can contradict statements by SAP.

This is an example of a statement by SAP designed to present as a positive feature what is in fact a bug and design issue, one that appeared very early in HANA's development and which SAP never addressed (or assumed it could address simply by adding more memory). Other databases do not have this problem; we have never heard of an open-source database with it. It is simply a very odd problem. HANA projects and implementations run into issues for which they cannot get straight answers from SAP or from their SAP consulting firm.

The Necessity of Fact Checking

SAP and the consulting firms rely on there being no fact-checking entity to contradict the information they provide. This is how companies end up paying for a database that is exorbitantly priced, exorbitantly expensive to implement, and exorbitantly expensive to maintain. When SAP or their consulting firm is asked to explain these discrepancies, we have found that they lie further to the customer and often turn the issue around on the account, as we covered in the article How SAP Will Gaslight You When Their Software Does Not Work as Promised.

We ask a question that anyone working in enterprise software should ask.

Should decisions be made based on sales information provided by 100% financially biased parties, such as consulting firms, IT analysts, and vendors, to companies that do not specialize in fact-checking?

If the answer is “No,” then perhaps there should be a change to the present approach to IT decision making.

In a market where inaccurate information is commonplace, our research concludes that software project problems and failures correlate with a lack of fact-checking of the claims made by vendors and consulting firms. If you are worried that you do not have the real story from your current sources, we offer a solution.

Financial Disclosure

Financial Bias Disclosure

Neither this article nor any other article on the Brightwork website is paid for by a software vendor, including Oracle, SAP, or their competitors. As part of our commitment to publishing independent, unbiased research, we do not allow paid media placements, commissions, or incentives of any nature.



The Risk Estimation Book

 

Rethinking Enterprise Software Risk: Controlling the Main Risk Factors on IT Projects

Better Managing Software Risk

Software implementation is a risky business, and success is not a certainty. But you can reduce risk with the strategies in this book. Undertaking software selection and implementation without approximating the project's risk is a poor way to make decisions about either projects or software. But that is the way many companies do business, even though 50 percent of IT implementations are deemed failures.

Finding What Works and What Doesn’t

In this book, you will review the strategies commonly used by most companies for mitigating software project risk, learn why these plans do not work, and then acquire practical and realistic strategies that will help you maximize success on your software implementation.

Chapters

Chapter 1: Introduction
Chapter 2: Enterprise Software Risk Management
Chapter 3: The Basics of Enterprise Software Risk Management
Chapter 4: Understanding the Enterprise Software Market
Chapter 5: Software Sell-ability versus Implementability
Chapter 6: Selecting the Right IT Consultant
Chapter 7: How to Use the Reports of Analysts Like Gartner
Chapter 8: How to Interpret Vendor-Provided Information to Reduce Project Risk
Chapter 9: Evaluating Implementation Preparedness
Chapter 10: Using TCO for Decision Making
Chapter 11: The Software Decisions’ Risk Component Model

Risk Estimation and Calculation

See our free project risk estimators, which are available per application. They provide a method of risk analysis that is not available from other sources.
