Gartner Inc. says in-memory computing is speeding its way toward mainstream adoption; the consulting and market research company predicts that at least 35% of large and midsize organizations will adopt in-memory technology by 2015, up from less than 10% in 2012. That's partly because of big pushes by SAP and Oracle to sell in-memory appliance systems -- HANA in SAP's case, Exalytics in Oracle's. Meanwhile, vendors of self-service business intelligence tools that support in-memory applications -- QlikTech, Tableau, Tibco -- have taken up residence in the "leaders" box in Gartner's Magic Quadrant for BI platforms.
Enter another technology trend that's attracting, oh, a fair amount of attention: big data. Put in-memory analytics tools and big data together, and what do you have? A combination that might help speed up the process of analyzing all the structured and unstructured data that organizations are creating, collecting, storing and hoping to capitalize on.
SearchBusinessAnalytics recently published a series of stories offering insight and advice from experienced BI managers and consultants on getting the in-memory and big data mix right. In one, we explore the possibilities of using in-memory applications to analyze big data and the issues that need to be considered. In another, we look at the IT requirements to keep in mind for building effective in-memory big data systems. And in a third, we interview Michael Minelli, co-author of a book about big data analytics, on managing in-memory and big data projects.
Craig Stedman is executive editor of SearchBusinessAnalytics. Email him at email@example.com.
Follow us on Twitter: @BizAnalyticsTT.
This was first published in June 2013