
Big Data Analytics Defines Top Performers

A survey of more than 1,100 executives by the IBM Center for Applied Insights showed that organizations making extensive use of analytics experienced up to 1.6x the revenue growth, 2.0x the EBITDA growth, and 2.5x the stock price appreciation of their peers. And what they are analyzing is Big Data, a combination of structured data found in conventional relational databases and unstructured data pouring in from widely varied sources.

Big Data is growing fast. By 2015 the digital universe, as forecast by IDC, will hit 8 zettabytes (ZB); 1 ZB = 10^21 bytes, one sextillion bytes. Adding to the sheer volume is the remarkable velocity at which data is created. Every minute 600 new blog posts are published and 34,000 tweets are sent. If some of that data is about your organization, brand, products, customers, competitors, or employees, wouldn't you want to know?
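As a back-of-the-envelope illustration of that velocity, the per-minute figures above scale up quickly over a single day:

```python
# Back-of-the-envelope arithmetic using the per-minute figures cited above.
BLOG_POSTS_PER_MIN = 600
TWEETS_PER_MIN = 34_000
MINUTES_PER_DAY = 24 * 60  # 1,440

blog_posts_per_day = BLOG_POSTS_PER_MIN * MINUTES_PER_DAY
tweets_per_day = TWEETS_PER_MIN * MINUTES_PER_DAY

print(f"{blog_posts_per_day:,} blog posts/day")  # 864,000 blog posts/day
print(f"{tweets_per_day:,} tweets/day")          # 48,960,000 tweets/day
```

Nearly 49 million tweets a day, from one source alone, gives a sense of the firehose organizations are trying to monitor.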

Big Data involves both structured and unstructured data. Traditional systems contain predominantly structured data. Unstructured data comes from general files; from smartphones and mobile devices; from social media like Twitter and Facebook; from RFID tags and other sensors and meters; and even from video cameras. All of it can be valuable to organizations in particular contexts.

Large organizations, of course, can benefit from Big Data, but midsize and small businesses can too. A small chain of pizza shops needs to know the consumer buzz about its pizza as much as Domino's does.

IBM describes a 4-step process for tapping the value of Big Data: align, anticipate, act, and learn. The goal is to make the right decision at the point of maximum impact. That might be when the customer is on the phone with a sales agent or when the CFO is about to negotiate the details of an acquisition.

Align addresses the need to identify your data sources and plan how you are going to collect and organize the data. It involves your structured databases as well as the wide range of enterprise content from unstructured sources. Anticipate addresses data analytics and business intelligence with the goal of predicting and shaping outcomes. It focuses on identifying and analyzing trends, forming hypotheses, and testing predictions. Act is the part where you put the data into action, whether that is making the best decision or taking advantage of a new pattern you have uncovered. But it doesn't stop there. Another payoff from Big Data comes from the ability to learn: refining your analytics and identifying new patterns based on subsequent data.
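The four-step cycle can be sketched in code. This is a minimal, hypothetical illustration of the align/anticipate/act/learn flow, not IBM's Smarter Analytics implementation; the data source, the churn hypothesis, and the retention action are all placeholders:

```python
# Hypothetical sketch of the align/anticipate/act/learn cycle described above.
# Data, model, and actions are illustrative placeholders only.

def align(sources):
    """Collect and organize records from structured and unstructured sources."""
    return [record for source in sources for record in source]

def anticipate(records):
    """Form a simple hypothesis: flag customers whose activity is declining."""
    return [r for r in records if r["activity"] < r["baseline"]]

def act(at_risk):
    """Decide at the point of impact: route each flagged customer to retention."""
    return {r["customer"]: "retention-offer" for r in at_risk}

def learn(decisions, outcomes):
    """Refine: measure which decisions worked so the next cycle improves."""
    return sum(outcomes.get(c, False) for c in decisions) / max(len(decisions), 1)

# One pass through the cycle on toy data.
crm = [{"customer": "A", "activity": 2, "baseline": 10},
       {"customer": "B", "activity": 12, "baseline": 10}]
decisions = act(anticipate(align([crm])))
success_rate = learn(decisions, {"A": True})
print(decisions, success_rate)  # {'A': 'retention-offer'} 1.0
```

The point of the loop is that the `learn` step feeds back into the next `anticipate`, so the analytics improve with each pass.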

Big Data needs to be accompanied by appropriate tools and technology. Earlier this month, IBM introduced three task-specific Smarter Analytics Signature Solutions. The first addresses fraud, waste, and abuse by using sophisticated analytics to recommend the most effective remedy for each case. For example, it might recommend a different letter requesting payment in one case but suggest a full criminal investigation in another.

The second Signature Solution focuses on next-best-action. It applies real-time analytics to varied data to predict customer behavior and preferences and recommend the next best action to take with a given customer, such as reducing churn or up-selling.
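At its core, next-best-action means scoring candidate actions for a customer and picking the one with the highest expected value. A toy sketch, with made-up actions and hand-set weights (a real system like IBM's would learn these from historical response data):

```python
# Illustrative next-best-action chooser. Actions and scoring rules are
# invented for this sketch, not drawn from any IBM product.

ACTIONS = {
    "retention-offer": lambda c: 0.8 if c["churn_risk"] > 0.5 else 0.1,
    "upsell-premium":  lambda c: 0.6 if c["monthly_spend"] > 100 else 0.2,
    "do-nothing":      lambda c: 0.3,
}

def next_best_action(customer):
    """Return the candidate action with the highest score for this customer."""
    return max(ACTIONS, key=lambda action: ACTIONS[action](customer))

print(next_best_action({"churn_risk": 0.7, "monthly_spend": 40}))   # retention-offer
print(next_best_action({"churn_risk": 0.1, "monthly_spend": 250}))  # upsell-premium
```

The same structure scales up: swap the hand-set lambdas for predictive models and the recommendation engine falls out of the `max` over expected values.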

The third Signature Solution, dubbed CFO Performance Insight, works on a collection of complex and cross-referenced internal and external data sets using predictive analytics to deliver increased visibility and control of financial performance along with predictive insights and root-cause analyses. These are delivered via an executive-style dashboard.

IBM isn't the only vendor to jump on the Big Data bandwagon. EMC has put a stake into this market. Oracle, which has been stalking IBM for years, also latched onto Big Data through Exalytics, its in-memory analytics product similar to IBM's Netezza. Of course, smaller players like Cloudera, which early on staked out Hadoop, the key open source component of Big Data, also offer related products and services.

Big Data analytics will continue as an important issue for some years to come. This blog will return to it time and again.



Predictive Analysis on the Mainframe

Real-time, high-volume predictive analysis has become a hot topic with the growing interest in Big Data. The consulting firm McKinsey addresses the growth of Big Data here. McKinsey's gurus note that advancing technologies and their swift adoption are upending traditional business models. BottomlineIT also took up Big Data in October 2011.

With that in mind, IBM has been positioning the zEnterprise, its latest mainframe, for a key role in data analysis.  To that end, it acquired SPSS and Cognos and made sure they ran on the mainframe. The growing interest in Big Data and real-time data analytics fueled by reports like that above from McKinsey only affirmed IBM’s belief that as far as data analytics goes the zEnterprise is poised to take the spotlight. This is not completely new; BottomlineIT’s sister blog, DancingDinosaur, addressed it back in October 2009.

Over the last several decades, people would laugh if a CIO suggested a mainframe for data analysis beyond standard canned system reporting. For ad-hoc querying, multi-dimensional analysis, and data visualization you needed distributed systems running a variety of specialized GUI tools. Even then, the resulting queries could take days to run.

In a recent analyst briefing, Alan Meyer, IBM’s senior manager for Data Warehousing on the System z, built the case for a different style of data analysis on the zEnterprise. He drew a picture of companies needing to make better informed decisions at the point of engagement while applications and business users increasingly are demanding the latest data faster than ever. At the same time there is no letup in pressure to lower cost, reduce complexity, and improve efficiency.

So what’s stopping companies from doing near real-time analytics and the big data thing? The culprits, according to Meyer, are duplicate data infrastructures, the complexity of integrating multiple IT environments, inconsistent security, and insufficient processing power, especially when having to handle large volumes of data fast. The old approach clearly is too slow and costly.

The zEnterprise, it turns out, is an ideal vehicle for today's demanding analytics. It is architected for on-demand processing through pre-installed capacity that is paid for only when activated, allowing processors, disk, and memory to be added without taking the system offline. Virtualized top to bottom, the zEnterprise delivers the desired isolation, while prioritization controls let you identify the most critical queries and workloads. Its industry-leading processors ensure that the most complex queries run fast, and low latency enables near real-time analysis. Finally, multiple deployment options mean you can start with a low-end z114 and grow to a fully configured z196 combined with a zBX loaded with blades.

Last October the company unveiled the IBM DB2 Analytics Accelerator (IDAA), a revamped version of the Smart Analytics Optimizer available only for the zEnterprise, along with a host of other analytics tools under the smarter computing banner. But the IDAA is IBM's analytics crown jewel. It incorporates Netezza, an analytics engine that speeds complex analytics through in-memory processing combined with a highly intelligent query optimizer. When run in conjunction with DB2, also residing on the zEnterprise, the results can be astonishing: queries that normally require a few hours complete in just a few seconds, 1,000 times faster according to some early users.
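The hours-to-seconds claim is easy to sanity-check. Taking three hours as an illustrative stand-in for "a few hours" (the article does not give an exact baseline):

```python
# Sanity-check the claimed 1,000x speedup. The 3-hour baseline is an
# assumption standing in for the article's "a few hours".
baseline_seconds = 3 * 60 * 60   # 10,800 s
speedup = 1_000
accelerated_seconds = baseline_seconds / speedup
print(f"{accelerated_seconds:.1f} seconds")  # 10.8 seconds
```

A 1,000x speedup does indeed turn an hours-long query into a seconds-long one, consistent with the early-user reports.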

Netezza, when deployed as an appliance, streamlines database performance through hardware acceleration and optimization for deep analytics, multifaceted reporting, and complex queries. When embedded in the zEnterprise, it delivers the same kind of performance for mixed workloads—operational transaction systems, data warehouse, operational data stores, and consolidated data marts—but with the z’s extremely high availability, security, and recoverability. As a natural extension of the zEnterprise, where the data already resides in DB2 and the OLTP systems, IDAA is able to deliver pervasive analytics across the organization while further speeding performance and ease of deployment and administration.

Of course, IT has other options: EMC now offers its Greenplum data analytics appliance and Oracle just released its Big Data Appliance. Neither requires a mainframe. But when the price for the latest general-purpose mainframe starts at $75,000 and the machine can do much more than any appliance, maybe it's time to consider one. The value of the predictive, near real-time business analytics alone could justify it.


Top performers are analytics-driven

Does your organization take advantage of data analytics? A study from the MIT Sloan School of Management and IBM's Institute for Business Value late last year found that top-performing companies are three times more likely to be capitalizing on data analytics.

The study, based on a sample of nearly 3,000 executives and business analysts from 108 countries and 30 industries, found a clear connection between the use of analytics technology and the ability to achieve competitive differentiation and performance. For example, top performers are five times more likely to apply analytics rather than intuition across the widest possible range of decisions. In finance and budgeting, top performers were nearly four times more likely than others to apply analytics. So much for making decisions based on what your gut says.

A more recent study released this March,  again by the IBM Institute for Business Value, surveyed several hundred leading banks on banking success factors and also found data analytics to be critically important. Specifically, 90% of bankers believe they need to transform from the status quo for future profitability, and the successful banks will be the ones that invest in building sophisticated insights based on powerful analytics. Top performers, the researchers found, will use insights gained through analytics to focus their operations.

These studies hit at a time when organizations are being inundated with data, often faster than their people, and even their systems and processes, can effectively capture, assess, and act on it. And it is not just the volume of data but the speed at which it pours in and the variety of the information that makes analytics so difficult yet so important.

IT people excel at capturing data, processing it, and reporting on it. More recently IT latched onto business intelligence (BI) to extract value from data and has deployed an array of BI tools, such as Cognos or Business Objects. IT, however, has been slower to put sophisticated, automated, high-speed analytics at the disposal of the organization's line-of-business managers.

Many organizations pretty much give up, rarely moving beyond, say, some multi-dimensional data arrays that they can slice and dice in a handful of different ways. Even BI as it is currently deployed does not help companies become true top performers.

According to the MIT-IBM study, top performers are twice as likely to shape future business strategies, as well as guide day-to-day operations, based on analytics. The study found that despite popular complaints about the overwhelming amount of information, organizations today are far less concerned about data deluge issues; instead they feel hindered by traditional management practices.

Those management practices fall into three areas:

  1. Lack of understanding about how to apply analytics to improve their business
  2. Lack of bandwidth due to competing priorities
  3. Lack of skills in the lines of business

This becomes most apparent when organizations attempt some analysis but can’t translate the resulting insights into effective action. The MIT-IBM study, however, suggests how to get past these initial hurdles:

Tackle biggest challenges first: Don’t wait for complete data or perfect skills before you try to apply analytics to high-value opportunities.

Flip the data to insights equation: Instead of data before insights, recognize the specific insights needed and focus on getting only the data required for answers.

Adopt techniques and tools best suited to your management: New tools like data visualization and simulation can help executives and managers alike anticipate the consequences of their decisions, explore alternative approaches, and confront the tradeoffs. These do not require unusual skills and can be applied by business leaders anywhere in the organization.

IT is perfectly positioned to work with forward-thinking business managers to deliver the data and tools they need to turn their companies into top performers. IBM, which funded both of these studies, has also taken the lead in delivering the necessary tools with its Smart Analytics toolset. Oracle, HP, and others are scrambling to get into this space.

