Posts Tagged predictive analytics

Fresh Toys Help IT Open New Business Opportunities in 2014

Much of what to expect from IT in 2014 you have already glimpsed right here in BottomlineIT, although some of it will be startlingly new. The new stuff will, in many cases, address opportunities that businesses are just starting to consider. Some of these, like 3D-printing or the e-wallet, have the potential to radically change the way business operates.

Let’s start with what you already know: 2014 will be cloud everything. The cloud is being steadily absorbed into business DNA as it evolves into the predominant way companies go to market, relate to their customers and partners, find employees, and deliver increasing aspects of their products as online services.

Also expect the continued hyping of big data, especially the unstructured data found everywhere, and of analytics, which is necessary to make sense of all that data. In 2014 analytics will be augmented by real-time analytics and predictive analytics, both of which can indeed deliver measurable business value.

In 2014 everything will be virtualized. In the process it will become defined by software. That means it will be programmable, allowing you to change its capabilities almost at will. Virtualized, software-defined capabilities will be in the products you acquire and the appliances you buy. Your next car will be software-defined and Internet (cloud) connected. Your video-enabled car will be able to park in a tighter space than you could park it yourself.

Mobile, in the form of smartphones and tablets, will be the device of choice for more and more people worldwide. Your mobile device will increasingly handle your communications, shopping, purchasing, socializing, entertainment, and work tasks even as it takes over more of the functions of your wallet. Eventually, as security gets bolstered, the e-wallet will contain your identification, memberships, subscriptions, and credit and debit cards.

On to the completely new: business drones are coming, mainly in the form of smart, software-defined, programmable devices that can run errands. Basically, they take robotics to a new level. Amazon hopes to use them to deliver items to your doorstep within hours of your purchase. What might your business do with a capability like this?

3D-printing is BottomlineIT’s favorite. Where the Internet disintermediated much of the traditional supply chain and distribution channel, 3D-printing can disintermediate manufacturers by producing the physical product at your desk. Now software-defined, customizable products can be cost-effectively manufactured at scale for a market of just one. With 3D-printing you can deliver a customized version of your widget to a customer as readily as you send a fax. Can you make some money with that capability?

Finally, there are smart, wearable, cloud-connected computers in the form of wrist watches (remember the old Dick Tracy comics?) and eyewear. Google Glass will become increasingly commonplace, though exactly what its business value will be remains unclear. Right now you buy it for the extreme cool factor.

So expect new IT goodies around the digital Xmas tree, starting to arrive this year and in quantity by the end of 2014. Some may be a bust; others may be late in coming. As CIO, your job is to figure out which of these can help you meet your organization’s business goals. Best wishes for 2014.


Sorting Out the Data Analytics IT Options

An article from McKinsey & Company, a leading management consulting and research firm, declares: “By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep [data] analytical skills as well as [a shortage of] 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.” Might this be your shop?

Many companies today are scrambling to assemble an IT infrastructure to support data analytics. But before they can even begin, they have to figure out what kinds of analytics the organization will want to deploy. Big data is just one of many possibilities, and the infrastructure that works for some types of data analytics won’t work for others.

Just off the top of the head this blogger can list a dozen types of data analytics in play: OLAP, business intelligence (BI), business analytics, predictive analytics, real-time analytics, big data analytics, social analytics, web analytics, click stream analytics, mobile analytics, brand/reputation analysis, and competitive intelligence. You probably have a few of these already.

As advanced analytics picks up momentum, data center managers will be left trying to cobble together an appropriate IT infrastructure for whatever flavors of analytics the organization intends to pursue. Unless you have a very generous budget, you can’t do it all.

For example, big data is unbelievably hot right now, so maybe it makes sense to build an infrastructure to support big data analytics. But predictive analytics, the up-and-coming superstar of business analytics, is an equally hot capability due to its ability to counter fraud or boost online conversion immediately, while the criminal or customer is still online.
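To make the “while the customer is still online” point concrete, here is a minimal, hypothetical sketch of real-time predictive scoring. The model, its weights, and the feature names are all invented for illustration; real fraud systems train far richer models, but the runtime pattern is the same: a pre-trained model scores each transaction in milliseconds and drives an immediate decision.

```python
import math

# Hypothetical pre-trained weights for a toy fraud model (illustrative only).
# Features: [amount z-score, foreign-IP flag, rapid-repeat-transaction signal]
WEIGHTS = [1.8, 2.4, 1.1]
BIAS = -4.0

def fraud_score(features):
    """Return the probability a transaction is fraudulent (logistic model)."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))

def decide(features, threshold=0.5):
    """Score while the customer is still online and act immediately."""
    return "block" if fraud_score(features) >= threshold else "approve"

# A large foreign transaction made seconds after the previous one scores high:
print(decide([3.0, 1.0, 0.9]))   # "block" under these toy weights
print(decide([0.1, 0.0, 0.1]))   # an ordinary transaction is approved
```

The training happens offline; only the cheap scoring function sits in the transaction path, which is what makes the real-time use case feasible.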

BI, however, has been the analytics workhorse for many organizations for a decade or more, along with OLAP, and companies already have a working infrastructure for it: a data warehouse with relational databases and common query, reporting, and cubing tools. That IT infrastructure, for the most part, is already in place and working.

On the other hand, if top management now wants big data analytics, real-time analytics, or predictive analytics, you may need a different information architecture and design, different tools, and possibly even different underlying technologies. Big data, for example, relies on Hadoop, a batch-processing framework that does not natively use SQL. (Vendors are making a valiant effort to graft SQL-like interfaces onto Hadoop, with varying degrees of success.)
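For readers unfamiliar with why Hadoop is a batch affair, its MapReduce model boils down to three steps: map raw records into key/value pairs, shuffle them by key, then reduce each group. Below is a stripped-down, single-machine sketch of that pipeline in plain Python; the toy web-log records are invented, and a real Hadoop job would distribute each phase across a cluster, which is exactly why results arrive in batches rather than interactively.

```python
from collections import defaultdict
from itertools import chain

# Toy web-log records: (url, bytes_sent) pairs standing in for raw input files.
records = [("/home", 200), ("/buy", 500), ("/home", 150), ("/buy", 900)]

def map_phase(record):
    """Map: turn each raw record into key/value pairs."""
    url, nbytes = record
    yield (url, nbytes)

def shuffle(pairs):
    """Shuffle: group all values by key, as Hadoop does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: aggregate each key's values (total bytes per URL here)."""
    return (key, sum(values))

pairs = chain.from_iterable(map_phase(r) for r in records)
results = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(results)  # {'/home': 350, '/buy': 1400}
```

Note there is no SQL anywhere: the programmer writes map and reduce functions directly, which is the gap the SQL-on-Hadoop vendors are trying to paper over.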

Real-time analytics is just that—real-time—basically the opposite of Hadoop. It works best using in-memory data and logic processing to return the results of analytic queries in seconds or even milliseconds. Data is stored on flash storage or in large amounts of cache memory, as close to the processing as it can get.
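The architectural trick behind this speed can be sketched simply: instead of scanning historical records at query time (the batch approach), an in-memory system updates its aggregates as each event arrives, so a query is a constant-time lookup. The class and metric names below are illustrative, not any vendor’s API.

```python
from collections import defaultdict

class RunningStats:
    """Keep aggregates in memory so queries answer instantly,
    instead of re-scanning raw data in a batch job."""

    def __init__(self):
        self.count = defaultdict(int)
        self.total = defaultdict(float)

    def ingest(self, key, value):
        # Update aggregates as each event arrives (the "real-time" part).
        self.count[key] += 1
        self.total[key] += value

    def average(self, key):
        # O(1) lookup: no scan over historical records at query time.
        return self.total[key] / self.count[key]

stats = RunningStats()
for key, value in [("checkout", 2.0), ("checkout", 4.0), ("search", 1.0)]:
    stats.ingest(key, value)

print(stats.average("checkout"))  # 3.0, computed without touching raw history
```

The trade-off is that you must decide which aggregates to maintain up front, whereas a batch system can answer arbitrary questions after the fact—one more reason the two infrastructures don’t interchange.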

A data architecture that is optimized for big data’s unstructured batch processing cannot also be used for real-time analytics, and the traditional BI data warehouse infrastructure probably isn’t optimized for either of them. The solution calls for either extending your existing data management infrastructure to encompass the latest analytics management wants or designing and building yet another IT data infrastructure. Over the past year, however, the cloud has emerged as another place where organizations can run analytics, provided providers can overcome the latencies inherent in the cloud.


Predictive Analysis on the Mainframe

Real time, high volume predictive analysis has become a hot topic with the growing interest in Big Data. The consulting firm McKinsey addresses the growth of Big Data here. McKinsey’s gurus note that advancing technologies and their swift adoption are upending traditional business models. BottomlineIT also took up Big Data in October 2011.

With that in mind, IBM has been positioning the zEnterprise, its latest mainframe, for a key role in data analysis. To that end, it acquired SPSS and Cognos and made sure they ran on the mainframe. The growing interest in Big Data and real-time data analytics fueled by reports like that above from McKinsey only affirmed IBM’s belief that, as far as data analytics goes, the zEnterprise is poised to take the spotlight. This is not completely new; BottomlineIT’s sister blog, DancingDinosaur, addressed it back in October 2009.

Over the past few decades, people would have laughed if a CIO suggested a mainframe for data analysis beyond standard canned system reporting. For ad-hoc querying, multi-dimensional analysis, and data visualization you needed distributed systems running a variety of specialized GUI tools. Even then, the resulting queries could take days to run.

In a recent analyst briefing, Alan Meyer, IBM’s senior manager for Data Warehousing on the System z, built the case for a different style of data analysis on the zEnterprise. He drew a picture of companies needing to make better informed decisions at the point of engagement while applications and business users increasingly are demanding the latest data faster than ever. At the same time there is no letup in pressure to lower cost, reduce complexity, and improve efficiency.

So what’s stopping companies from doing near real-time analytics and the big data thing? The culprits, according to Meyer, are duplicate data infrastructures, the complexity of integrating multiple IT environments, inconsistent security, and insufficient processing power, especially when having to handle large volumes of data fast. The old approach clearly is too slow and costly.

The zEnterprise, it turns out, is an ideal vehicle for today’s demanding analytics. It is architected for on-demand processing through pre-installed capacity that is paid for only when activated, allowing the addition of processors, disk, and memory without taking the system offline. Virtualized top to bottom, the zEnterprise delivers the desired isolation, while prioritization controls let you identify the most critical queries and workloads. Its industry-leading processors ensure that the most complex queries run fast, and low latency enables near real-time analysis. Finally, multiple deployment options mean you can start with a low-end z114 and grow to a fully configured z196 combined with a zBX loaded with blades.

Last October the company unveiled the IBM DB2 Analytics Accelerator (IDAA), a revamped version of the Smart Analytics Optimizer available only for the zEnterprise, along with a host of other analytics tools under the smarter computing banner. But the IDAA is IBM’s analytics crown jewel. The IDAA incorporates Netezza, an analytics engine that speeds complex analytics through in-memory processing combined with a highly intelligent query optimizer. When run in conjunction with DB2, also residing on the zEnterprise, the results can be astonishing: queries that normally require a few hours complete in just a few seconds, 1,000 times faster according to some early users.

Netezza, when deployed as an appliance, streamlines database performance through hardware acceleration and optimization for deep analytics, multifaceted reporting, and complex queries. When embedded in the zEnterprise, it delivers the same kind of performance for mixed workloads—operational transaction systems, data warehouse, operational data stores, and consolidated data marts—but with the z’s extremely high availability, security, and recoverability. As a natural extension of the zEnterprise, where the data already resides in DB2 and the OLTP systems, IDAA is able to deliver pervasive analytics across the organization while further speeding performance and ease of deployment and administration.

Of course, IT has other options: EMC now offers its Greenplum data analytics appliance and Oracle just released its Big Data Appliance. Neither requires a mainframe. But when the price of the latest general-purpose mainframe starts at $75,000 and the machine can do much more than any appliance, maybe it’s time to consider one. The value of the predictive, near real-time business analytics alone could justify it.
