SAP HANA

Getting Started with In-Memory Business Data


The concept of managing a company's entire business data in memory has been researched and studied for many years. But it is only in the past few years, with increasing memory reliability and decreasing prices along with the availability of 64-bit operating systems, that the technological restrictions on in-memory databases have been removed. In-memory business data management is about to become available for the many applications that are in dire need of high performance: existing applications will be made better, and entirely new types of applications, impossible in the past, will be developed and provided to our customers.

The in-memory data layer will revolutionize the way "traditional" OLTP applications work. Not only will they become 10, 100, or even 1,000 times faster; by eliminating redundantly stored summary data, we will also see dramatically improved change management and extensibility. Find out how we have designed the new layer, how we integrated in-memory technologies into our products, and what SAP plans for future delivery.
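
To make the summary-data point concrete, here is a minimal Python sketch, with a hypothetical record layout and made-up names, of computing account totals on the fly from the base line items instead of keeping a redundant totals table in sync with them:

from collections import defaultdict

# Hypothetical in-memory fact records: (account, amount) line items.
line_items = [
    ("4000", 120.0),
    ("4000", 80.5),
    ("5100", 300.0),
    ("5100", 42.25),
]

def account_totals(items):
    # Aggregate totals on the fly by scanning the base line items.
    # With the data held in memory, this scan is fast enough that no
    # redundantly stored summary table has to be maintained alongside it.
    totals = defaultdict(float)
    for account, amount in items:
        totals[account] += amount
    return dict(totals)

print(account_totals(line_items))  # {'4000': 200.5, '5100': 342.25}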

Real-Real Time Computing - The Motivation

Ever wondered how information can be accessed before a customer even leaves the store? With real-real time computing, companies can make smarter, faster decisions every minute of the day. Take a look at this video to see how it works.

Fooling the Speed Limit

As early as 2002, CPUs hit a frequency wall at around 2.93 gigahertz. Since then, chip manufacturers have bet their future on multi-core architectures to satisfy the sustained need for speed. A multi-core processor combines two or more CPU cores in a single processor, where the independent cores share the processing load. This represents a concept commonly known in computer science as parallelization, a concept Hasso believes is only now starting to be exploited.
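
To illustrate the parallelization idea, the following Python sketch (illustrative only, not SAP code) splits a CPU-bound aggregation into one chunk per core, lets the cores work on their chunks independently, and then combines the partial results:

from multiprocessing import Pool
from os import cpu_count

def partial_sum(chunk):
    # Work unit executed independently on one core.
    return sum(chunk)

if __name__ == "__main__":
    values = list(range(1_000_000))
    cores = cpu_count() or 2

    # One chunk per core: the independent cores share the processing load.
    size = len(values) // cores + 1
    chunks = [values[i:i + size] for i in range(0, len(values), size)]

    with Pool(cores) as pool:
        total = sum(pool.map(partial_sum, chunks))

    print(total == sum(values))  # True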


Endless Stores
While processing speeds have increased, the cost of main memory has decreased. Main memory (as opposed to disk) allows simpler algorithms and requires fewer CPU instructions, which translates into better performance.

Hasso contends that main memory storage and multi-core processing packaged together in a single server yield unprecedented speeds, and customers can capitalize on this relatively inexpensive and incredibly fast hardware setup.

To illustrate the power of such a system, Hasso showed the latest Nehalem EP server, equipped with two CPUs (each with four cores running at 2.93 GHz) and slots for up to 144 gigabytes of DDR3 RAM. Fully loaded with memory, this server would cost approximately USD 22,500, yet with column-store compression it could run the financials of roughly 70 companies.
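
Column-store compression relies heavily on techniques such as dictionary encoding. The sketch below (illustrative Python with made-up data) shows the basic idea: each distinct value of a column is stored only once, and the column itself shrinks to a vector of small integer codes, which is what lets so much business data fit into main memory:

def dictionary_encode(column):
    # Store each distinct value once; replace the column by integer codes.
    # Business columns (country, currency, status, ...) usually have few
    # distinct values, so the code vector is far smaller than the raw column.
    dictionary = sorted(set(column))
    code_of = {value: code for code, value in enumerate(dictionary)}
    codes = [code_of[value] for value in column]
    return dictionary, codes

country_column = ["DE", "US", "DE", "FR", "DE", "US", "DE"]
dictionary, codes = dictionary_encode(country_column)
print(dictionary)  # ['DE', 'FR', 'US']
print(codes)       # [0, 2, 0, 1, 0, 2, 0]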


The Software that Exploits the Hardware

Moving from hardware to software, SAP demonstrates the productized version of a vision we had just a year ago: SAP BusinessObjects Explorer. Officially introduced the day before, SAP BusinessObjects Explorer combines intuitive information search and exploration functionality with the high performance and scalability of SAP NetWeaver Business Warehouse Accelerator. The software scales very well with enhanced hardware: it handles more queries per second as the number of cores per server increases, and the more servers are added, the more queries per hour can be processed.

The increase in hardware speed and inexpensive memory holds promise for other applications, not just analytics. SAP also issued a press release for SAP BusinessObjects Explorer showing the new technology as part of the solution.

What It Means for Customers
As the price of performance continues to drop, so, too, will total cost of ownership. The new types of hardware systems will mean less data to manage and simpler systems to maintain and upgrade, and that means lower total cost of ownership.

What customers will notice most is speed. No longer will executives have to wait minutes or hours to get the answers they need to effectively run their business. Information at your fingertips now means milliseconds, not next Wednesday.

With immediate information, executives can anticipate and react in real time to changing market conditions together with key stakeholders. We live in a new world where leaders of organizations cannot wait weeks for reports to be generated to provide visibility into dynamic changes in the marketplace.

Publications and Whitepapers

Enterprise Data Management in Mixed Workload Environments

A Hybrid Row-Column OLTP Database Architecture for Operational Reporting

A Composite Benchmark for Online Transaction Processing and Operational Reporting

A Database Engine for Flexible Real-Time Available-to-Promise

A Common Database Approach for OLTP and OLAP Using an In-Memory Column Database (Hasso Plattner)

Enterprise Applications - OLTP and OLAP - Share One Database Architecture (Hasso Plattner)


Hasso Plattner Statements On In-Memory Computing

Hasso Plattner often talks about in-memory technology at events, about future trends, and about his vision for data storage. Some of his thoughts are available as podcasts. Please watch the videos on the HPI web page for in-memory data management.
