In memory analytics: Six factors spurring adoption

Sandeep Raut

Companies are trying to explore new markets while retaining their existing customers. This effort has acquired a new dimension with the explosion of big data and social media. In order to strategize faster and respond at real-time or near real-time levels, swift analysis has become crucial. There are several options for quick analysis: in-memory analytics, in-database analytics and data warehouse appliances.

While data warehouse appliances have mature players such as Netezza, Greenplum, Kognitio and Vertica, the other two technologies (in-memory analytics and in-database analytics) are relatively new. Although in-database analytics is picking up pace, it remains an expensive technology. In-memory analytics, by contrast, is inexpensive and is therefore a rising favorite for exploiting the additional memory now available.

Numerous factors are driving the adoption of in-memory analytics. Let us examine some of them.

1) 64-bit operating systems

In the 32-bit era, an operating system could address only up to 4 GB of RAM, so adding more memory chips brought no benefit beyond that limit. Today's widely used 64-bit operating systems can address terabytes (TB) of RAM, and querying data held in that memory is much faster than reading it from hard drives. Servers can therefore be fitted with memory well beyond 1 TB and use it for in-memory processing.
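To make the speed gap concrete, here is a minimal Python sketch (illustrative only, not from the article) that runs the same aggregation once on an array already held in RAM and once after re-reading the data from disk. The absolute timings depend entirely on the hardware; the point is that the in-memory path avoids disk I/O altogether.

    # Illustrative sketch: the same sum computed on data held in RAM
    # versus data re-read from disk for every query.
    import time
    import numpy as np

    values = np.random.rand(10_000_000)      # roughly 80 MB resident in RAM
    np.save("values.npy", values)            # the same data persisted to disk

    start = time.perf_counter()
    values.sum()                             # operates directly on memory
    print("in-memory:", time.perf_counter() - start, "seconds")

    start = time.perf_counter()
    np.load("values.npy").sum()              # pays disk I/O before computing
    print("from disk:", time.perf_counter() - start, "seconds")

On a 32-bit system, an array much larger than the 4 GB address limit could not even be allocated; a 64-bit OS removes that ceiling.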

2) Affordability 

RAM becomes a temporary workspace for the system's processing: the more RAM available, the more work can be done simultaneously, which improves performance. An 8 GB RAM upgrade that earlier cost around $100 to $150 is now available for $50 to $70 (approximately Rs 2,600 to Rs 3,600). With memory this cheap, companies can create RAM disks and respond faster to the challenging demands of real-time analytics.
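As a hedged illustration of the RAM disk idea: on most Linux systems, /dev/shm is a RAM-backed tmpfs filesystem, so files staged there are served from memory rather than from a hard drive. The path, table and figures below are assumptions made for the sketch, not details from the article.

    # Minimal sketch (assumes a Linux host where /dev/shm is a tmpfs RAM disk).
    import numpy as np
    import pandas as pd

    df = pd.DataFrame({"order_id": np.arange(1_000_000),
                       "amount": np.random.rand(1_000_000)})

    ram_path = "/dev/shm/orders.pkl"     # RAM-backed location, not a hard drive
    df.to_pickle(ram_path)               # stage the working set in memory

    hot_copy = pd.read_pickle(ram_path)  # subsequent reads come from RAM
    print(hot_copy["amount"].sum())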

With a drop in the cost of memory hardware and chips, in-memory analytics has become feasible for business intelligence (BI) and analytics environments. According to research firm Gartner, by 2012, 70 percent of all Global 1000 organizations will load detailed data into memory as the primary method of optimizing the performance of their BI applications.

3) Real-time analytics

In traditional data warehousing scenarios, the differing physical architectures meant that extracting, loading and accessing data for querying took a long time. Another drawback was that the data in the warehouse lagged the operational systems by a few hours or even a day. In-memory analytics removes this latency and lets decision makers use operational and transactional information in real time.
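The latency argument can be sketched with Python's built-in sqlite3 module and an in-memory database. The table and figures are invented for illustration, but the pattern of appending transactions as they arrive and querying them immediately is what removes the load-window delay described above.

    # Minimal sketch: transactions are queryable the moment they arrive,
    # with no wait for a nightly extract-and-load into a warehouse.
    import sqlite3

    conn = sqlite3.connect(":memory:")            # the store lives in RAM
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

    # New operational records are appended as they happen...
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [("North", 120.0), ("South", 75.5), ("North", 42.0)])

    # ...and decision makers query them with zero load latency.
    for region, total in conn.execute(
            "SELECT region, SUM(amount) FROM sales GROUP BY region"):
        print(region, total)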

4) Growing data volumes

It is said that 15 petabytes of data are generated every day. With data volumes at an all-time high, and arriving with tremendous variety and velocity, analyzing this data quickly has become essential. However, building a data warehouse takes a long time. As speed of response to market dynamics is critical, more and more companies will need ways to process data and respond faster, and in-memory analytics is one of the options available to them. Today, columnar databases such as Vertica, Greenplum and Aster Data provide compression techniques that allow more data to be held in memory for analytics.
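The compression point can be illustrated with a small pandas sketch: dictionary-encoding a low-cardinality column, the same idea columnar engines apply, shrinks its memory footprint so that more rows fit in the RAM available for analysis. The column and sizes are illustrative assumptions.

    # Minimal sketch: dictionary-encode a repetitive column so more of the
    # dataset fits in memory, mirroring what columnar databases do.
    import numpy as np
    import pandas as pd

    regions = np.random.choice(["North", "South", "East", "West"], size=2_000_000)
    df = pd.DataFrame({"region": regions})

    plain_bytes = df["region"].memory_usage(deep=True)
    df["region"] = df["region"].astype("category")   # dictionary encoding
    encoded_bytes = df["region"].memory_usage(deep=True)

    print(f"plain column:       {plain_bytes / 1e6:.1f} MB")
    print(f"dictionary-encoded: {encoded_bytes / 1e6:.1f} MB")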

5) Rapid exploratory analysis

Visual data mining has become popular. Users can see all the data on a canvas, filter down to the elements they need, and gain insights through quick analysis. With in-memory analytics, users can view the complete dataset in one go. They can chart, plot and add filters to visualize patterns and trends, spot anomalies in the data, and quickly perform what-if analysis.

One obvious advantage of in-memory analytics is the improved performance of analytical applications. As queries and the related data reside in the server's memory, analysis requires no network access or disk I/O. In-memory analytics therefore shortens the time to value in decision making. Tools like Tableau and Spotfire allow entire datasets to be loaded into memory so that users can do exploratory analysis quickly.
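The exploratory loop that such tools automate can be sketched in a few lines of pandas: load the full dataset into memory once, then filter, aggregate and run a what-if adjustment without any further disk access. The file name, columns and 5 percent scenario are assumptions for the example.

    # Minimal sketch of in-memory exploration: one load, then iterate freely.
    import pandas as pd

    sales = pd.read_csv("sales.csv")                    # single load into RAM

    west = sales[sales["region"] == "West"]             # interactive filter
    by_month = west.groupby("month")["revenue"].sum()   # quick aggregation

    # What-if: how would a 5 percent price increase have changed revenue?
    print(pd.DataFrame({"actual": by_month, "plus_5_pct": by_month * 1.05}))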

6) Lowered IT costs

Because data movement from warehouses and databases is reduced, in-memory analytics reduces dependence on IT personnel. It also simplifies analysis in tools (like QlikView) that are not full-fledged analytical tools or lack data mining capabilities. In this way, in-memory analytics makes it easier for business users to do what-if or exploratory analysis themselves instead of waiting for IT personnel to provide the required data and dashboards.

Emerging trends

According to Gartner's top 10 strategic technology trends for 2012, in-memory computing will see widespread use of flash memory in consumer devices, entertainment devices and other embedded IT systems. Unlike RAM (the main memory in servers and PCs), flash memory is persistent even when power is removed. In that respect it resembles disk drives, where we keep information that must survive power-downs and reboots, yet it offers much of the speed of memory and is far faster than a disk drive. As lower-cost, lower-quality flash makes its way into the data center, software that optimizes flash use and minimizes endurance cycles becomes critical. Users and IT providers should therefore treat in-memory analytics as a long-term technology trend.


About the author: Sandeep Raut has more than 22 years of IT consulting experience, with over 14 years in the BI space, and has established BI CoEs. He is currently working at Syntel as the Director for BI and analytics.