The Hidden Issue with the SD HANA Benchmarks

Executive Summary

  • SAP has used the SD benchmarks for years for other databases.
  • Four years into S/4HANA, SAP has no benchmarks for S/4HANA SD or ECC SD on HANA.

Introduction

SAP has used the SD benchmark for years to test transaction processing performance, but something peculiar happened when SAP released HANA and the benchmark was to be performed for S/4HANA. SAP has been playing tricks with the SD benchmark to promote HANA and S/4HANA: it stopped using the benchmark and proposed that it is no longer valid, because HANA would not have performed well in it, as SD is a transaction processing benchmark. In this article, you will learn about the hidden issues with SAP's SD benchmarks.

Our References for This Article

If you want to see our references for this article and other related Brightwork articles, see this link.

Notice of Lack of Financial Bias: We have no financial ties to SAP or any other entity mentioned in this article.

  • This is published by a research entity, not some lowbrow entity that is part of the SAP ecosystem. 
  • Second, no one paid for this article to be written, and it is not pretending to inform you while being rigged to sell you software or consulting services. Unlike nearly every other article you will find from Google on this topic, it has had no input from any company's marketing or sales department. As you are reading this article, consider how rare this is. The vast majority of information on the Internet on SAP is provided by SAP itself, which is filled with false claims, and by sleazy consulting companies and SAP consultants who will tell any lie for personal benefit. Furthermore, SAP pays off all IT analysts -- who have the same concern for accuracy as SAP does. Not one of these entities will disclose their pro-SAP financial bias to their readers.

The SD Benchmark for HANA

Here are the SD benchmarks listed on the SAP website.

This looks like an impressive list of benchmarks for different hardware and different operating systems and database releases. At the time we reviewed this data, we counted 1066 benchmarks.
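
The counts in the sections that follow are simple row tallies of that published listing. Below is a minimal sketch in Python of how such a tally could be reproduced, assuming the SD two-tier benchmark listing has been exported to a CSV file; the file name and column header are hypothetical and would need to match whatever the actual export provides.

  # A minimal sketch of reproducing the tallies in this article, assuming the
  # SAP SD two-tier benchmark listing has been exported to a CSV file. The file
  # name and column header are hypothetical -- adjust them to the real export.
  import csv
  from collections import Counter

  def tally(path: str, column: str) -> Counter:
      """Count benchmark rows grouped by the values in one column."""
      with open(path, newline="", encoding="utf-8") as f:
          rows = list(csv.DictReader(f))
      return Counter(row[column] for row in rows)

  if __name__ == "__main__":
      by_database = tally("sd_two_tier_benchmarks.csv", "Database")
      print("Total benchmarks:", sum(by_database.values()))
      for database, count in by_database.most_common():
          print(f"{database}: {count} benchmarks")

The same function could be pointed at an operating system, hardware vendor, or ECC version column to produce the other breakdowns discussed below.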

The Databases

Now, notice the databases that were used for the benchmarks.

Each benchmark typically covers a single database version or a variant of that version, and there are many versions and variants. For example, Oracle 10g and Oracle 10g with Real Application Clusters are benchmarked separately.

The databases are as follows.

  1. DB2: (41 Benchmarks)
  2. Adabas: (1 Benchmark)
  3. SQL Server: (17 Benchmarks)
  4. SAP ASE: (4 Benchmarks)
  5. Informix: (7 Benchmarks)
  6. MaxDB: (4 Benchmarks)
  7. SAP DB: (4 Benchmarks)
  8. Oracle: (26 Benchmarks)

The Operating Systems

  1. IBM AIX
  2. Red Hat Enterprise
  3. IBM OS
  4. Solaris
  5. SUSE Linux
  6. Windows (2000, .NET, NT, Enterprise Server, Enterprise Edition, etc.)

Windows Enterprise Server 2003 had the most benchmarks, with 135.

ECC/R/3 Versions

The benchmarks span 52 different versions of ECC. The most frequently benchmarked ECC/R/3 versions were the following.

  1. ECC EHP 5 for ERP 6.0: (242 Benchmarks)
  2. R/3 4.6C: (124 Benchmarks)
  3. SAP EHP 4 for ERP 6.0: (118 Benchmarks)

Hardware Environment

  1. Bare Metal: (984 Benchmarks)
  2. Cloud: (44 Benchmarks)
  3. Virtualized: (36 Benchmarks)

This means 92% of the benchmarks were run on bare metal, on premises.
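
For readers who want to verify that figure, here is a quick check in Python using the counts listed above and the 1066 total from the SAP listing.

  # A quick check of the environment shares, using the counts listed above
  # and the 1066 total from the SAP listing.
  total = 1066
  for environment, count in [("Bare Metal", 984), ("Cloud", 44), ("Virtualized", 36)]:
      print(f"{environment}: {count / total:.1%}")
  # Prints: Bare Metal: 92.3%, Cloud: 4.1%, Virtualized: 3.4%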

Hardware Vendors

The benchmarks used hardware from 28 different hardware vendors plus AWS (not a hardware vendor but a cloud services provider that uses its own openly specified hardware, as we covered in How the Open Compute Project Reduced the Power of Proprietary Vendors).

The hardware vendors with the largest numbers of benchmarks were as follows:

  1. Dell: (86 Benchmarks)
  2. Fujitsu: (57 Benchmarks)
  3. Fujitsu Siemens: (106 Benchmarks)
  4. Hitachi: (49 Benchmarks)
  5. HP/HPE: (209 Benchmarks)
  6. IBM: (201 Benchmarks)
  7. NEC: (48 Benchmarks)
  8. Sun Microsystems: (64 Benchmarks)
  9. Cisco Systems: (37 Benchmarks)

AWS only had 14 benchmarks, but the list of SD benchmarks goes back many years: the first benchmark was performed in 1995, and the listing runs up to 2019. AWS did not get its first published benchmark until 2013.

For example, among recent years, 32 benchmarks were performed in 2018.

The Natural Question That Arises

S/4HANA was introduced in 2015 and just recently had its fourth anniversary. HANA is going on its eighth anniversary. Within that context, let us take note of the following facts about SAP’s benchmarking.

  • There is not a single benchmark for S/4HANA SD.
  • HANA has no benchmark for ERP or any other transaction processing system.
  • SAP has only published HANA benchmarks for BW (both the BWH and the BWAML benchmarks).

An SD benchmark on HANA could have been run without S/4HANA, as HANA supports ECC in the Suite on HANA configuration. However, this setup was never benchmarked.

Conclusion

SAP made enormous claims for both HANA and S/4HANA. However, it has published zero transaction processing benchmarks for HANA (with ECC) or for S/4HANA. The only published HANA benchmark that included a comparison was performed by Lenovo; its problems are covered in the article The Problems with the Strange Lenovo HANA Benchmark.

Information reported to us from the field, covered in the article HANA as a Mismatch for S/4HANA and ERP, illustrates that HANA is a weak performer in transaction processing. Our information shows that HANA underperforms the databases previously used for ECC in transaction processing, which is the dominant processing type in ERP systems.

It seems quite likely that SAP has published no ERP benchmarks for HANA because such benchmarks would show that SAP's statements about being able to master both OLTP and OLAP with one database are false.