How Accurate Was John Appleby on His 2015 HANA Predictions?

Executive Summary

  • John Appleby made bold predictions on HANA.
  • How accurate were his predictions for HANA in 2015?

Introduction

John Appleby’s article on the SAP HANA blog was titled 10 SAP HANA Predictions for 2015 and was published Dec 15, 2014.

The Quotations

Prediction #1: Large-Scale Suite on HANA adoption? 

“Hasso described very nicely in a comment to the linked blog how he believes he will see large-scale customers adopt Suite on HANA.

A company with a 100 tera byte ERP system on AnyDB will need

– 20 tera bytes using ERP on HANA,

– 10 tera bytes using sERP (no aggregates, no redundant tables, less db indices),

– 2-4 tera for actual data and 8-6 tera for historical data with the business logic based partitioning.

I believe that we will see such systems in 2015, and this will give the typical SAP customer with 5-10TB of ERP the confidence that HANA works for them too.”

This did not happen in 2015. In fact, SAP stopped reporting HANA numbers that year because growth leveled out, as we covered in Why Did SAP Stop Reporting HANA Numbers After 2015?

Suite on HANA was only moderately implemented.

Prediction #2: HANA for Consolidated Workloads?

“Most HANA deployments today are a single database on a single appliance. Because HANA is so high-value, this has been acceptable, but this will change in 2015.

In HANA SPS09 we have the capability to consolidate databases into a single appliance using containers, which are logically separated and quite secure. Less well-known is that the inter-database security mechanisms have already been built. There are two main scenarios for this:

1) Regional systems with consolidated reporting. You can have regional BW databases on one appliance, and do group reporting on top of the regional systems, with no duplication of data.

2) Consolidation of Enterprise Systems. You can put ERP, CRM, SCM, BW all on one HANA appliance but in separate databases, to retain segregation for security purposes. Already you don’t need the BW PSA, because that is duplication. You can remove some BW data for operational reporting, and federate directly into the other databases.

Neither of these are supported configurations yet, but we will see them in 2015.”

This did not change in 2015 as predicted. Most companies stopped at implementing HANA for BW.

Prediction #3: HANA Has Done an Incredible Job Winning Over the SAP Installed Base and Acquired Tons of Good IP from Sybase?

“SAP HANA has done an incredible job of winning over the SAP install base, with over half of the 3000+ customers, running SAP BW (that’s over 20% of the BW install-base, in 2 years). There have been big wins in startups too, with the Startup Focus program attracting around 1500 startups.

However, the focus on the Business Suite has meant that the tools to develop standalone applications haven’t had as much attention. SAP acquired a ton of great IP for this from Sybase and this hasn’t been integrated into SAP HANA yet, including database tooling like bulk loads, and stored procedure languages.”

SAP exaggerated its customer uptake, particularly up to 2015, as we covered in Why Did SAP Stop Reporting HANA Numbers After 2015?

It is true that most HANA implementations have been for BW, and that has continued up until the present day (2019).

The Sybase acquisition was really a failure. Sybase databases have been in severe decline since the purchase, as we covered in How Accurate Was SAP on the Sybase Acquisition?

Transaction Processing is 2x Competing DBs — Out of the Box?

“Progress has been made in the Business Suite around optimization – we find that out the box, response times are 2x faster on HANA than other databases for transactional tests.”

No. Even in 2019, our data points contradict this.

Only a minority of respondents to our HANA survey reported transaction processing on HANA as faster than on competing databases.

HANA is the Most Comprehensive Data Platform in the World?

“HANA is the most comprehensive data platform in the world by now, and it requires a lot of documentation. When each SPS is released, I make a point to download it all and consume it, so I have all of the documentation for each release. In SPS06, it was 27 documents and 120MB in size. In SPS09 this has grown to an incredible 89 documents and 375MB. The biggest PDFs are 896, 884 and 828 pages long.

In particular, it can be tricky to figure out the details of what is new in the details of each component. Part of the reason for this is because HANA does so much, but there does need to be a move to simplify the documentation so it is easier to consume.”

HANA is not a data platform. It is a database.

One of the major complaints about HANA is its lack of documentation. So while Appleby points to voluminous documentation, the common commentary is that many things in HANA are not documented.

HANA does not do more than other databases. The issue is that the documentation is confusing and that HANA, particularly in 2014 but even recently, keeps changing, with new components switching in and out.

Prediction #4: BI Suite Integration?

“In 2014, we saw integration of the BI Suite into HANA, Lumira Server, Design Studio for HANA. In 2015 I believe that BI and HANA will become synonymous. There was a HANA version of Explorer in Lumira Server 1.14, but it was removed, and I think we will see something like this resurrected.

We will also see the maturity of EIM, Master Data Management and a full version of the BI Suite embedded into HANA – codenamed Project WHO. For Business Suite on HANA customers, this will mean that BI is embedded into the HANA platform.”

The BI suite on HANA is BW. Lumira is a discontinued product, as we covered in Why Does S/4HANA Need B4HANA for Analytics?

Design Studio is now barely used. BI is not synonymous with HANA, as most BI/BW implementations are still on Oracle. Master Data Management is a dead product. The last parts of the quote are difficult to validate.

Prediction #5: Overhaul of the HANA Modeler?

“The HANA Modeler has been more-or-less static since SPS04 – Attribute Views, Analytic Views and Calculation Views. It has had enhancements, especially to performance, but it hasn’t had a major upgrade. We will see this in 2015, I believe with a single unified modeler based on the existing Calculation View designer. Analytic and Attribute views will be deprecated.

This will come with the deprecation of CE Functions, and ways to call out to SQLScript, integration direct with Predictive Analysis functions and Business Functions and much more advanced OLAP features. It will provide functionality equivalent to Business Objects universes.”

Well, the number of views increased, but HANA’s predictive analytics were either not delivered or never became a relevant factor on HANA projects.

Prediction #6: The End of the Row Store?

“HANA SPS08 quietly introduced a row buffer to the column store, which is one of the reasons why HANA has great OLTP performance. High-frequency updates are inserted into a row buffer, which are then inserted into the delta store in bulk. There are currently very few reasons to use the row store, and SAP often hides it in marketing literature. The two main scenarios are for queues, and configuration tables.

In 2015, we will see the end of the row store, with a unified table construct that will choose the right table format based on data. With this, the CREATE COLUMN TABLE syntax will be deprecated.”

HANA has never had “great OLTP performance.” Earlier, Appleby stated that HANA was not developed to improve OLTP; now, according to Appleby, it was. However, Hasso Plattner’s original proposal was that the entire database would be column-oriented. Why, then, was a row buffer added in SPS08? It means the design evolved over time, and it contradicts the initial claims about HANA. Naturally, without a row-oriented design, HANA would be extremely poor at transaction processing/OLTP.

The row store did not end. HANA still has row-oriented tables.
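The mechanism the quoted passage describes, high-frequency writes staged in a row-format buffer and then flushed to a columnar delta store in bulk, can be sketched as a toy model. This is purely illustrative and not HANA’s actual implementation; the class name, flush threshold, and data layout are all assumptions:

```python
class ColumnStoreSketch:
    """Toy model of a column store with a row-format write buffer.

    High-frequency inserts land in `row_buffer` (cheap appends). Once
    the buffer reaches `flush_threshold`, the rows are transposed into
    column-oriented lists in one bulk pass, loosely mimicking a delta
    merge. All names and thresholds here are invented for illustration.
    """

    def __init__(self, columns, flush_threshold=3):
        self.columns = columns
        self.flush_threshold = flush_threshold
        self.row_buffer = []                     # row-oriented staging area
        self.delta = {c: [] for c in columns}    # column-oriented delta store

    def insert(self, row):
        self.row_buffer.append(row)
        if len(self.row_buffer) >= self.flush_threshold:
            self._flush()

    def _flush(self):
        # Bulk transpose: one pass converts buffered rows into columns.
        for row in self.row_buffer:
            for col, value in zip(self.columns, row):
                self.delta[col].append(value)
        self.row_buffer.clear()

store = ColumnStoreSketch(["id", "amount"])
store.insert((1, 100))
store.insert((2, 250))            # two rows still staged in the row buffer
store.insert((3, 75))             # third insert triggers a bulk flush
print(store.delta["amount"])      # [100, 250, 75]
```

The point of the sketch is simply that a pure column store pays a per-insert transposition cost, which is why a row-format staging area matters for OLTP workloads.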

Multiple Predictions: SAP HANA Predictions for 2015

If you look at the HANA roadmap slides, the “Future Themes” aspect is very telling.

“Continued Big Data & IoT platform enhancements, integration, capabilities and optimization

Continued cloud deployment capabilities & options

New and enhanced advanced analytic capabilities, features, & application services

Continued third-party tool & application support

Continued transformation of SAP Business Suite processes

It’s also telling if you read my SAP HANA Predictions for 2014 post – last year, I predicted a ton of new things that didn’t exist. This year, I predict a refactor of existing functionality, and doing existing things better.

I also believe that 2015 will be a watershed year for SAP HANA. In the last month, I did a pilot of the Business Suite on HANA in a customer. In one week, we migrated a 7TB MSSQL database onto HANA (1.2TB of data in HANA). In another week, we transformed finance and payroll business process. There was no muss, no fuss and no problems. Transactions were 2x faster and analytics were 100x faster and it lays the foundation a transformational change program.

Once customers realize that this is the case, and that HANA just works, and it provides consolidation, TCO reduction and all the new functionality in their Business Suite applications, they will move. I believe the number of customers will double in 2015. Let’s wait and see!”

Appleby was in constant communication with SAP, so he would have been apprised of new things coming down the pike. Many of these are not so much predictions as information about what SAP was working on at the time.

Appleby and Bluefin Solutions certainly did not migrate a 7TB database in one week or transform finance and payroll in a second week. The actual implementation times for these tasks are much longer. This matches the type of sales claim made by people who do not spend time on projects and are in sales mode.

Transactions may have been faster, but Appleby leaves out any effect of hardware, as we covered in How Much of Performance is HANA? The hardware the HANA database ran on would have been much larger than that used for SQL Server.
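To see why omitting hardware matters, consider a back-of-the-envelope normalization. All numbers here are hypothetical and invented for illustration: if the HANA appliance alone were roughly twice as capable as the SQL Server machine, a raw 2x transactional speedup would imply no advantage from the database software itself.

```python
def hardware_adjusted_speedup(raw_speedup, hardware_factor):
    """Crude normalization: divide the observed speedup by an estimate
    of how much faster the new hardware alone should make things.
    Both inputs are illustrative guesses, not measured values."""
    return raw_speedup / hardware_factor

# Appleby's claimed 2x transactional speedup, assuming (hypothetically)
# that the larger HANA appliance alone accounts for a 2x improvement:
print(hardware_adjusted_speedup(2.0, 2.0))  # 1.0, i.e. no software advantage
```

The real attribution problem is of course messier than a single divisor, but the sketch shows why a speedup quoted without a hardware baseline is not evidence about the database.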

Analytics were listed as 100x faster. But this is inconsistent with the messaging from Bill McDermott, who promised a 100,000x performance increase, as we covered in How Accurate Was SAP About HANA Being 100,000x Faster Than Any Other Technology in the World? Even the reported number is likely inaccurate, as the same hardware issue applies and goes unmentioned by Appleby.

The database compression proposed by Appleby amounts to a reduction of roughly 83% (1.2 TB from 7 TB). However, information from the field is that compression is closer to 30%. Second, to get a larger size reduction, one needs to perform extensive archiving, which would be difficult to do in a week.
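The arithmetic behind these figures can be checked directly. The function below is just an illustration of the percentages involved, using the sizes quoted above and the roughly 30% field-reported compression figure:

```python
def size_reduction(original_tb, compressed_tb):
    """Return the percentage reduction in database size."""
    return (1 - compressed_tb / original_tb) * 100

# Appleby's pilot claim: a 7 TB MSSQL database becomes 1.2 TB in HANA.
print(round(size_reduction(7.0, 1.2), 1))   # 82.9, the ~83% implied above

# Field-reported compression of ~30% would shrink a 7 TB system
# only to about 4.9 TB, far from 1.2 TB:
print(round(7.0 * (1 - 0.30), 1))           # 4.9
```

The gap between 1.2 TB and roughly 4.9 TB is what would have to be closed by archiving and data removal rather than by compression alone.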

It is also unlikely that there was “no muss and no fuss.” All HANA implementations have technical issues as well as long-term high maintenance overhead.

Conclusion

This article scores a 3 out of 10 for accuracy.

SAP’s Inaccurate Messaging on HANA as Communicated in SAP Videos

Fact-Checking SAP’s HANA Information

This video is filled with extensive falsehoods. We will address them in the sequence in which they are stated in the video.

SAP Video Accuracy Measurement

  • Statement: HANA is a Platform. Accuracy: 0%. Fact check: HANA is not a platform; it is a database. Analysis: How to Deflect You Were Wrong About HANA.
  • Statement: HANA runs more "in-memory" than other databases. Accuracy: 10%. Fact check: HANA uses a lot of memory, but the entire database is not loaded into memory. Analysis: How to Understand the In-Memory Myth.
  • Statement: S/4HANA simplifies the data model. Accuracy: 0%. Fact check: HANA does not simplify the data model from ECC, and there are significant questions as to the benefit of the S/4HANA data model over ECC. Analysis: Does HANA Have a Simplified Data Model?
  • Statement: Databases that are not HANA are legacy. Accuracy: 0%. Fact check: There is zero basis for SAP to call all databases that are not HANA legacy. Analysis: SAP Calling All Non-HANA DBs Legacy.
  • Statement: Aggregates should be removed and replaced with real-time recalculation. Accuracy: 0%. Fact check: Aggregates are very valuable, all RDBMSs have them (including HANA), and they should not be removed or minimized in importance. Analysis: Is Hasso Plattner Correct on Database Aggregates?
  • Statement: Reducing the number of tables reduces database complexity. Accuracy: 0%. Fact check: Reducing the number of tables does not necessarily decrease the complexity of a database; the fewer tables in HANA are more complicated than the larger number of tables pre-HANA. Analysis: Why Pressure SAP to Port S/4HANA to AnyDB?
  • Statement: HANA is 100% columnar tables. Accuracy: 0%. Fact check: HANA does not run entirely with columnar tables; it has many row-oriented tables, as much as 1/3 of the database. Analysis: Why Pressure SAP to Port S/4HANA to AnyDB?
  • Statement: S/4HANA eliminates reconciliation. Accuracy: 0%. Fact check: S/4HANA does not eliminate reconciliation or reduce the time to perform reconciliation to any significant degree. Analysis: Does HANA Have a Simplified Data Model and Faster Reconciliation?
  • Statement: HANA outperforms all other databases. Accuracy: 0%. Fact check: Our research shows that not only can competing databases do more than HANA, but they are also a better fit for ERP systems. Analysis: How to Understand the Mismatch Between HANA and S/4HANA and ECC.

The Problem: A Lack of Fact-Checking of HANA

There are two fundamental problems around HANA. The first is the exaggeration of HANA, which means that companies that purchased HANA end up getting far less than they were promised. The second is that the SAP consulting companies simply repeat whatever SAP says. This means that on virtually all accounts there is no independent entity that can contradict statements by SAP.

Being Part of the Solution: What to Do About HANA

We can provide feedback from multiple HANA accounts that provide realistic information around HANA — and this reduces the dependence on biased entities like SAP and all of the large SAP consulting firms that parrot what SAP says. We offer fact-checking services that are entirely research-based and that can stop inaccurate information dead in its tracks. SAP and the consulting firms rely on providing information without any fact-checking entity to contradict the information they provide. This is how companies end up paying for a database which is exorbitantly priced, exorbitantly expensive to implement and exorbitantly expensive to maintain. When SAP or their consulting firm are asked to explain these discrepancies, we have found that they further lie to the customer/client and often turn the issue around on the account, as we covered in the article How SAP Will Gaslight You When Their Software Does Not Work as Promised.

If you need independent advice and fact-checking that is outside of the SAP and SAP consulting system, reach out to us with the form below or with the messenger to the bottom right of the page.

The major problem with companies that bought HANA is that they made the investment without consulting any entity independent of SAP. SAP does not pay Gartner and Forrester the amounts of money it does so that these entities can be independent, as we covered in the article How Accurate Was The Forrester HANA TCO Study?


Inaccurate Messaging on HANA as Communicated in SAP Consulting Firm Videos

For those interested in the accuracy level of information communicated by consulting firms on HANA, see our analysis of the following video by IBM. SAP consulting firms are unreliable sources of information about SAP and primarily serve to simply repeat what SAP says, without any concern for accuracy. The lying in this video is brazen and shows that as a matter of normal course, the consulting firms are happy to provide false information around SAP.

SAP Video Accuracy Measurement

  • Statement: HANA runs more "in-memory" than other databases. Accuracy: 10%. Fact check: HANA uses a lot of memory, but the entire database is not loaded into memory. Analysis: How to Understand the In-Memory Myth.
  • Statement: HANA is orders of magnitude faster than other databases. Accuracy: 0%. Fact check: Our research shows that not only can competing databases do more than HANA, but they are also a better fit for ERP systems. Analysis: How to Understand the Mismatch Between HANA and S/4HANA and ECC.
  • Statement: HANA runs faster because it does not use disks like other databases. Accuracy: 0%. Fact check: Other databases also use SSDs in addition to disk. Analysis: Why Did SAP Pivot the Explanation of HANA In Memory?
  • Statement: HANA holds "business data," "UX data," "mobile data," "machine learning data," and "IoT data." Accuracy: 0%. Fact check: HANA is not a unifying database. It is only a database that supports a particular application; it is not for supporting data lakes.
  • Statement: SRM and CRM are part of S/4HANA. Accuracy: 0%. Fact check: SRM and CRM are not part of S/4HANA; they are separate and separately sold applications, and SAP C/4HANA is not yet ready for sale. Analysis: How Accurate Was Bluefin Solutions on C-4HANA?
  • Statement: Netweaver is critical as a platform and is related to HANA. Accuracy: 0%. Fact check: Netweaver is not relevant to this discussion, and it is not an efficient environment from which to develop.
  • Statement: HANA works with Business Objects. Accuracy: 10%. Fact check: It is very rare to even hear about HANA and Business Objects; there are few Business Objects implementations that use HANA. Analysis: SAP Business Objects Rating.
  • Statement: Leonardo is an important application on SAP accounts. Accuracy: 0%. Fact check: Leonardo is dead; its discussion here is both misleading and irrelevant. Analysis: Our 2019 Observation: SAP Leonardo is Dead.
  • Statement: IBM Watson is an important application on SAP accounts. Accuracy: 0%. Fact check: Watson is dead; its discussion here is both misleading and irrelevant. Analysis: How IBM is Distracting from the Watson Failure to Sell More AI and Machine Learning.
  • Statement: Digital Boardroom is an important application on SAP accounts. Accuracy: 0%. Fact check: SAP Digital Boardroom is another SAP item that has rarely been implemented anywhere.

Financial Disclosure

Financial Bias Disclosure

Neither this article nor any other article on the Brightwork website is paid for by a software vendor, including Oracle, SAP or their competitors. As part of our commitment to publishing independent, unbiased research, no paid media placements, commissions or incentives of any nature are allowed.


References

https://blogs.saphana.com/2014/12/15/10-sap-hana-predictions-2015/

TCO Book

 


Enterprise Software TCO: Calculating and Using Total Cost of Ownership for Decision Making

Getting to the Detail of TCO

One aspect of making a software purchasing decision is to compare the Total Cost of Ownership, or TCO, of the applications under consideration: what will the software cost you over its lifespan? But most companies don’t understand what dollar amounts to include in the TCO analysis or where to source these figures, or, if using TCO studies produced by consulting and IT analyst firms, how the TCO amounts were calculated and how to compare TCO across applications.

The Mechanics of TCO

Not only will this book help you appreciate the mechanics of TCO, but you will also gain insight as to the importance of TCO and understand how to strip away the biases and outside influences to make a real TCO comparison between applications.
By reading this book you will:
  • Understand why you need to look at TCO and not just ROI when making your purchasing decision.
  • Discover how an application, which at first glance may seem inexpensive when compared to its competition, could end up being more costly in the long run.
  • Gain an in-depth understanding of the cost categories to include in an accurate and complete TCO analysis.
  • Learn why ERP systems are not a significant investment, based on their TCO.
  • Find out how to recognize and avoid superficial, incomplete or incorrect TCO analyses that could negatively impact your software purchase decision.
  • Appreciate the importance and cost-effectiveness of a TCO audit.
  • Learn how SCM Focus can provide you with unbiased and well-researched TCO analyses to assist you in your software selection.
Chapters
  • Chapter 1:  Introduction
  • Chapter 2:  The Basics of TCO
  • Chapter 3:  The State of Enterprise TCO
  • Chapter 4:  ERP: The Multi-Billion Dollar TCO Analysis Failure
  • Chapter 5:  The TCO Method Used by Software Decisions
  • Chapter 6:  Using TCO for Better Decision Making