Four next-generation capabilities

Four must-have features for next-generation in-memory computing

Today’s organizations want everything now, and that includes business answers that can be used to gain a competitive edge, lower costs, identify risks and more. OLAP analytics and reporting used to be a wait-for-results proposition. In-memory computing changes all this with its ability to run reports rapidly on massive data sets while eliminating the latency of moving data across slower disk storage. IBM is thinking bigger by offering in-memory speed without the limitations. No more waiting for the answers your business needs.

Four key capabilities make BLU Acceleration a next-generation solution for in-memory computing:

1. BLU Acceleration processes data at lightning-fast speeds without requiring the entire dataset to fit in memory.

Instead, BLU Acceleration uses a series of patented algorithms that nimbly handle in-memory data processing. These include the ability to anticipate and “prefetch” data just before it’s needed and to adapt automatically so that the data being worked on stays in or close to the CPU. Add further CPU acceleration techniques, and you get highly efficient in-memory computing at lightning speed.
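IBM does not publish the internals of these algorithms, but the general read-ahead idea is easy to picture. The short Python sketch below is purely illustrative: it loads the next chunk of a column on a background thread while the current chunk is being scanned. The chunk size, function names and scan logic are hypothetical stand-ins, not BLU Acceleration code.

    # Illustrative sketch only: a generic read-ahead ("prefetch") loop.
    from concurrent.futures import ThreadPoolExecutor

    CHUNK_ROWS = 100_000  # hypothetical chunk size, not a BLU parameter

    def load_chunk(chunk_no):
        """Stand-in for reading one chunk of a column from storage."""
        offset = chunk_no * CHUNK_ROWS
        return range(offset, offset + CHUNK_ROWS)

    def scan_column(total_chunks):
        """Scan every chunk, prefetching the next one while the current one is processed."""
        running_total = 0
        with ThreadPoolExecutor(max_workers=1) as pool:
            pending = pool.submit(load_chunk, 0)          # start fetching the first chunk
            for chunk_no in range(total_chunks):
                chunk = pending.result()                  # the prefetched chunk is (ideally) ready
                if chunk_no + 1 < total_chunks:
                    pending = pool.submit(load_chunk, chunk_no + 1)  # read ahead
                running_total += sum(chunk)               # scan while the next read is in flight
        return running_total

    print(scan_column(4))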

2. BLU Acceleration works on compressed data, saving time and money.
Why waste time and CPU resources on decompressing data, analyzing it and recompressing it? Instead of all these extra steps, BLU Acceleration preserves the order of data and performs a broad range of operations—including joins and predicate evaluations—on compressed data without the need for decompression. This is another next-generation technique to speed processing, skip resource-intensive steps and add agility.
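The encoding BLU Acceleration uses is proprietary, so the Python sketch below only illustrates the general principle of order-preserving compression: each value is replaced by a small code whose sort order matches the original value, which lets a predicate such as “greater than 500” be evaluated by comparing codes, with no decompression step. The column values and the dictionary scheme are invented for the example.

    # Illustrative sketch only: an order-preserving dictionary encoding, not BLU's actual format.
    values = [120, 500, 75, 980, 500, 300]

    # Build a dictionary whose codes preserve the sort order of the values.
    dictionary = {v: code for code, v in enumerate(sorted(set(values)))}
    encoded = [dictionary[v] for v in values]          # the "compressed" column

    # Evaluate the predicate "value > 500" directly on codes: encode the literal once,
    # then compare codes instead of decompressing each row.
    threshold_code = dictionary[500]
    matches = [i for i, code in enumerate(encoded) if code > threshold_code]

    print(matches)  # row positions where the original value exceeds 500 -> [3]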

3. BLU Acceleration intelligently skips data it doesn’t need to get the answers you want.
With a massive data set, chances are good that you don’t need all of the data to answer a particular query. BLU Acceleration employs a series of metadata management techniques to automatically determine which data does not qualify for a particular query, enabling large chunks of data to be skipped. The result is more agile computing, along with storage savings and more efficient use of system hardware. This metadata is kept up to date in real time, so data changes are continually reflected in the analytics. Less data to analyze in the first place means faster, simpler and more agile in-memory computing. We call this data skipping.
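DB2 maintains this metadata internally, but the principle resembles a min/max “zone map”: keep the minimum and maximum value for each block of rows, and skip any block whose range shows it cannot contain qualifying rows. The Python sketch below is an illustration under that assumption; the block size and sample column are made up, not DB2’s actual synopsis format.

    # Illustrative sketch only: min/max metadata per block ("zone map" style data skipping).
    BLOCK_SIZE = 4
    order_years = [2015, 2015, 2016, 2016,   # block 0: min 2015, max 2016
                   2017, 2017, 2017, 2018,   # block 1: min 2017, max 2018
                   2019, 2019, 2020, 2020]   # block 2: min 2019, max 2020

    blocks = [order_years[i:i + BLOCK_SIZE] for i in range(0, len(order_years), BLOCK_SIZE)]
    synopsis = [(min(b), max(b)) for b in blocks]   # metadata kept up to date as data changes

    def scan_years_after(cutoff):
        """Return qualifying values, touching only blocks whose max exceeds the cutoff."""
        hits = []
        for block, (lo, hi) in zip(blocks, synopsis):
            if hi <= cutoff:      # the whole block is guaranteed not to qualify
                continue          # ...so it is skipped without reading its rows
            hits.extend(v for v in block if v > cutoff)
        return hits

    print(scan_years_after(2018))  # scans only block 2 -> [2019, 2019, 2020, 2020]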

4. BLU Acceleration is simple to use.
As your business users demand more analytics faster, you need in-memory computing that keeps pace. BLU Acceleration delivers strong performance out of the box, with no need for indexes, tuning or time-consuming configuration efforts. You simply convert your row-based data to columns and run your queries. Because BLU Acceleration is seamlessly integrated with DB2, you can manage both row-based and column-based data from a single proven system, reducing complexity. This helps free the technical team to deliver value to the business: less routine maintenance and more innovation.
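In DB2 itself the conversion is handled by the product’s own tooling, so the Python sketch below only illustrates what “converting rows to columns” means conceptually: the same table stored as one array per column, so a query that touches a single column reads only that column. The table and column names are invented for the example.

    # Illustrative sketch only: row-organized versus column-organized storage.
    rows = [                       # row-organized: each record kept together
        {"order_id": 1, "region": "EMEA", "amount": 120.0},
        {"order_id": 2, "region": "APAC", "amount": 980.0},
        {"order_id": 3, "region": "EMEA", "amount": 300.0},
    ]

    # "Convert" to column organization: one array per column.
    columns = {name: [row[name] for row in rows] for name in rows[0]}

    # A query such as SUM(amount) now touches a single contiguous array
    # instead of pulling every field of every row.
    print(sum(columns["amount"]))  # 1400.0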

Fast, simple, agile

BLU Acceleration delivers the next generation of in-memory computing so that your organization is equipped to use actionable data to grow revenue, identify new efficiencies, spot opportunities and pinpoint risks. DB2 clients can take advantage of BLU Acceleration simply by upgrading to DB2 10.5, while BLU Acceleration for Cloud is another flexible option for getting started.

Learn about BLU Acceleration in depth