Archive for March, 2012

Big Data Analytics Defines Top Performers

A survey of over 1,100 executives by the IBM Center for Applied Insights showed that organizations making extensive use of analytics experienced up to 1.6x the revenue growth, 2.0x the EBITDA growth, and 2.5x the stock price appreciation of their peers. And what they are analyzing is Big Data, a combination of structured data found in conventional relational databases and unstructured data pouring in from widely varied sources.

Big Data is growing fast. By 2015 the digital universe, as forecast by IDC, will hit 8 zettabytes (ZB). (1 ZB = 10²¹ bytes, one sextillion bytes.) Adding to the sheer volume is the remarkable velocity at which data is created. Every minute, 600 new blog posts are published and 34,000 tweets are sent. If some of that data is about your organization, brand, products, customers, competitors, or employees, wouldn’t you want to know?

Big Data involves both structured and unstructured data. Traditional systems contain predominantly structured data. Unstructured data comes from general files; from smartphones and mobile devices; from social media like Twitter and Facebook; from RFID tags and other sensors and meters; and even from video cameras. All of it can be valuable to organizations in particular contexts.

Large organizations, of course, can benefit from Big Data, but midsize and small businesses can too. A small chain of pizza shops needs to know the consumer buzz about its pizza as much as Domino’s does.

IBM describes a 4-step process for tapping the value of Big Data: align, anticipate, act, and learn. The goal is to make the right decision at the point of maximum impact. That might be when the customer is on the phone with a sales agent or when the CFO is about to negotiate the details of an acquisition.

Align addresses the need to identify your data sources and plan how you are going to collect and organize the data. It involves your structured databases as well as the wide range of enterprise content from unstructured sources. Anticipate addresses data analytics and business intelligence with the goal of predicting and shaping outcomes. It focuses on identifying and analyzing trends, forming hypotheses, and testing predictions. Act is the part where you put the data to work, whether that is making the best decision or taking advantage of a new pattern you have uncovered. But it doesn’t stop there. Another payoff from Big Data comes from the ability to learn, refining your analytics and identifying new patterns based on subsequent data.
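
To make the cycle concrete, here is a minimal Python sketch of what an align-anticipate-act-learn loop might look like. Everything in it, from the toy linear scoring to the feature names and update rule, is an illustrative assumption, not IBM’s implementation.

    # Hypothetical align / anticipate / act / learn loop; all names are illustrative.

    def align(sources):
        # Gather records from structured and unstructured feeds into one list.
        return [record for feed in sources for record in feed]

    def anticipate(records, weights):
        # Score each record with a toy linear model (a stand-in for real analytics).
        return [(r, sum(weights.get(k, 0) * v for k, v in r.items())) for r in records]

    def act(scored, threshold=0.5):
        # Act only where the predicted impact clears the threshold.
        return [r for r, score in scored if score > threshold]

    def learn(weights, acted, outcome):
        # Nudge the model toward features that paid off (a crude online update).
        for record in acted:
            for k in record:
                weights[k] = weights.get(k, 0) + (0.1 if outcome else -0.1)
        return weights

    # One pass through the loop with toy data:
    sources = [[{"recent_purchases": 1}], [{"support_tickets": 1}]]
    weights = {"recent_purchases": 0.7, "support_tickets": 0.2}
    acted = act(anticipate(align(sources), weights))
    weights = learn(weights, acted, outcome=True)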

Big Data needs to be accompanied by appropriate tools and technology. Earlier this month, IBM introduced three task-specific Smarter Analytics Signature Solutions. The first, aimed at fraud, waste, and abuse, uses sophisticated analytics to recommend the most effective remedy for each case. For example, it might recommend a letter requesting payment in one case but suggest a full criminal investigation in another.

The second Signature Solution focuses on next-best-action. It applies real-time analytics to varied data to predict customer behavior and preferences and to recommend the next best action to take with a given customer, such as an offer to reduce churn or an up-sell.
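
In code, next-best-action often reduces to choosing the action with the highest expected value for a given customer. A hypothetical Python sketch, with made-up probabilities and values rather than anything from IBM’s actual solution:

    # Pick the action maximizing P(accept | customer, action) * value(action).
    def next_best_action(customer, actions):
        return max(actions, key=lambda a: a["p_accept"](customer) * a["value"])

    actions = [
        {"name": "retention_discount", "value": 120.0,
         "p_accept": lambda c: 0.6 if c["churn_risk"] > 0.5 else 0.1},
        {"name": "upsell_premium", "value": 300.0,
         "p_accept": lambda c: 0.3 if c["churn_risk"] < 0.3 else 0.05},
    ]

    # A high-churn-risk customer gets the retention offer, not the up-sell.
    print(next_best_action({"churn_risk": 0.7}, actions)["name"])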

The third Signature Solution, dubbed CFO Performance Insight, works on a collection of complex and cross-referenced internal and external data sets using predictive analytics to deliver increased visibility and control of financial performance along with predictive insights and root-cause analyses. These are delivered via an executive-style dashboard.

IBM isn’t the only vendor to jump on the Big Data bandwagon. EMC has put a stake into this market. Oracle, which has been stalking IBM for years, also latched onto Big Data through Exalytics, its in-memory analytics product similar to IBM’s Netezza. Of course, small players like Cloudera, which early on staked out Hadoop, the key open source component of Big Data, also offer related products and services.

Big Data analytics will continue as an important issue for some years to come. This blog will return to it time and again.


Open Source Virtualization Saves Money

Virtualization is a powerful technology that enables numerous benefits detailed here, particularly saving money. The savings come mainly through IT resource consolidation. When you add open source to the virtualization equation, it creates another avenue to savings.

Virtualization technology, in the form of the hypervisor, is not exactly cheap. VMware, the industry’s 900-pound hypervisor gorilla, commands significant license fees. With its latest pricing plan, introduced last year, the standard license starts at about $1,000. Enterprise costs are based on processor sockets and memory and, given how the entitlements are calculated, VMware can require four times as many licenses as previously needed, which dramatically increases the cost. Here’s VMware’s FAQ on pricing. Depending on the amount of memory, licensing costs could run into the tens of thousands of dollars.
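
To see how the memory component bites, consider a back-of-envelope calculation. It assumes the vSphere 5-era model of per-socket licenses that each carry a vRAM entitlement; the 32GB entitlement and $1,000 price here are illustrative stand-ins, and actual figures varied by edition and changed over time.

    import math

    def licenses_needed(sockets, vram_gb, entitlement_gb=32):
        # At least one license per socket, but memory-heavy hosts can need
        # extra licenses to cover all of the configured vRAM.
        return max(sockets, math.ceil(vram_gb / entitlement_gb))

    # A 4-socket host configured with 512GB of vRAM:
    n = licenses_needed(sockets=4, vram_gb=512)
    print(f"{n} licenses, roughly ${n * 1000:,}")  # 16 licenses, roughly $16,000

Under those assumptions the host needs 16 licenses instead of 4, exactly the kind of multiplier described above.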

Open source virtualization, noted Jean Staten Healy, IBM’s worldwide Cross-IBM Linux and Open Virtualization Director, presents opportunities to reduce virtualization costs in numerous ways. For example, the inclusion of open source KVM in enterprise Linux distributions reduces the need for additional hypervisors, enabling the organization to avoid buying more VMware licenses. KVM also enables higher virtual machine density for more savings. IDC’s Al Gillen and Gary Chen put out a white paper detailing the recent KVM advances.

The ability to manage mixed KVM-VMware virtualization through a single tool further increases the cost efficiency of open source virtualization. IBM’s Systems Director VMControl is one of the few tools providing such mixed-hypervisor, cross-platform management. For general hypervisor management, Linux and KVM standardized on libvirt and libguestfs as the base APIs for managing virtualization and images. These APIs work with other Linux hypervisors beyond KVM; higher-level tools, such as virsh and virt-manager, are built on top of libvirt.
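
For a flavor of the libvirt API, here is a short Python example using the libvirt-python bindings to enumerate the guests on a local KVM host; virsh list --all does much the same from the shell.

    import libvirt

    # Read-only connection to the local KVM/QEMU hypervisor.
    conn = libvirt.openReadOnly("qemu:///system")
    for dom in conn.listAllDomains():
        state, max_mem_kb, mem_kb, vcpus, cpu_time = dom.info()
        print("%-20s active=%s vcpus=%d mem=%dMB"
              % (dom.name(), bool(dom.isActive()), vcpus, mem_kb // 1024))
    conn.close()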

The combination of KVM’s technical advances, the slow but steadily increasing adoption of KVM, and the inclusion of KVM as a core feature of the Linux operating system is driving more enterprises to deploy KVM alongside VMware. Of course, the fact that KVM now comes free as part of the Linux kernel means you can try it at no cost and minimal risk.

Enterprise Linux users are now using KVM where they previously would not have bothered to virtualize a particular workload because of the cost. This makes sense for several reasons, free being just one of them. Others include the integration of the KVM toolset with the Linux toolset and the fact that Linux admins already know how to use it.

One large bank used Linux and KVM as a development and test resource in a private cloud. Normally, they would have needed to request more budget for VMware licenses, but since they had Linux with KVM, they could just add Windows virtual machines. And by developing in Java, they could roll out prototypes fast. In the process, the bank achieved high virtual machine density at minimal cost.

Another financial services firm set up KVM virtual machines to monitor its Linux hosts for under-utilization and then deployed virtual machine images to those hosts as warranted. The result: an ad-hoc grid of KVM virtual machines with high utilization, again at minimal cost.
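
The pattern is simple enough to sketch. The hypothetical Python fragment below polls a set of Linux hosts for CPU utilization over SSH and provisions a KVM guest via virt-install on an under-utilized one; the host names, threshold, and VM sizing are all assumptions.

    import subprocess

    HOSTS = ["linux-host-1", "linux-host-2", "linux-host-3"]
    CPU_THRESHOLD = 0.40  # treat a host as under-utilized below 40% CPU

    def cpu_utilization(host):
        # One-minute load average normalized by CPU count, fetched over SSH.
        out = subprocess.check_output(
            ["ssh", host, "cat /proc/loadavg; nproc"], text=True).split()
        return float(out[0]) / int(out[-1])

    def place_vm(image, name):
        # Deploy the image to the least-loaded host if it has headroom.
        util, host = min((cpu_utilization(h), h) for h in HOSTS)
        if util < CPU_THRESHOLD:
            subprocess.run(["ssh", host,
                            "virt-install --import --name " + name +
                            " --memory 2048 --disk " + image + " --noautoconsole"])
            return host
        return None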

KVM is a natural for private clouds. IBM reports private clouds being built using Moab with xCAT and KVM. The resulting private cloud handles both VMware and KVM equally well, making them plug-compatible.  With this approach, organizations can gradually expand their use of KVM and reduce, or at least delay, the need to buy more VMware licenses, again saving money.

KVM is also being exercised in a big way as the hypervisor behind IBM’s public SmartCloud Enterprise, demonstrating how enterprise-capable this free, open source hypervisor is.

BottomlineIT expects VMware will remain the dominant x86 virtualization platform going forward. However, it makes sense to grab every opportunity to use KVM for enterprise-class, multi-platform virtualization and save money wherever you can.


The Return of ERP—Not Your Father’s ERP

Four years ago, Eric Berridge declared in his book Iterate or Die that traditional ERP systems were dinosaurs headed for extinction. He was right, but before ERP disappeared completely, some products began to morph into something else. It took a few years, and now we are witnessing the return of ERP. But this is not your father’s ERP.

As Gartner explains: Business applications are undergoing many changes. The realities of cloud computing, the accessibility provided by mobile technologies, and the impact of social paradigms are affecting the business application environment. And ERP is the business application most affected.

The recent recession and painstakingly slow recovery hurt the ERP business. Even worse, the costly, cumbersome, inflexible, outdated ERP systems the vendors were hawking hurt the businesses saddled with them. Now the survivors are creeping back and finding a radically changed ERP landscape.

ERP systems run many companies. Your company probably has one or, more likely, several. They combine business process with software and integrate numerous critical back-office functions across a company. The tight integration of the various functions and the use of a common database give ERP systems their power to coordinate the organization’s core activities based on a single set of shared, consistent data. Having multiple ERP systems certainly complicates things, but decentralized organizations, and those that inherited extra ERP systems through acquisitions and never got around to standardizing, are left with a mess many, sadly, continue to struggle with.

The ERP idea is still good, but the old implementations were too costly, too slow to implement, too rigid, and too hard to use. Rather than help the business, they prevented organizations from changing quickly. Today, the inability to change fast is disastrous. That’s why Berridge titled his book Iterate or Die. Many businesses that failed to heed his words did just that.

The latest twist to the ERP saga has been the marriage of ERP with social networking. As one ERP observer put it, social media and ERP make good financial sense. Not only is social media being taken seriously as a proper business tool, but useful new tools, especially for engaging in conversation with customers, are emerging. For example, Salesforce.com acquired Radian6, which focuses on B2C needs, effectively allowing companies to participate in consumer conversations on the social web. Expect to see a greater convergence of CRM and social media. The integration of enterprise CRM with Twitter, Facebook, and LinkedIn will allow organizations to bolster both internal and external customer relationship functions.

When Berridge wrote his book, he mainly had in mind Software-as-a-Service (SaaS). SaaS has moved into the mainstream for ERP, CRM, supply chain management, and financial systems. Roger Borek of Borek Business Solutions, a Microsoft ERP provider, notes that the growth of SaaS ERP software is accelerating and that analysts predict continued growth through 2012. For most companies, this hosted delivery model requires no initial cash outlay for IT resources while enabling faster software implementation, on-demand scalability, and improved ROI. These factors collectively reduce the total cost of ownership (TCO) and accelerate time-to-market benefits.

Typical of the new breed of ERP are Plex Online, an ERP for manufacturers, and NetSuite, an integrated collection of SaaS applications that includes ERP.

Compare that to the ERP of the past, which took armies of consultants several years to implement and tailor to the company’s needs at a cost of millions of dollars. And even then, many systems went largely unused, giving rise to the pejorative term shelfware.

The new ERP ideally will incorporate social networking, add gamification to make it easy to use and to measure, and be delivered as a SaaS offering to make it fast and cost efficient. Gamification, notes Gartner, will drive process innovation, something you would never associate with the old ERP. The result certainly won’t be your father’s ERP.


Amid Poor Earnings HP Launches Gen8 Servers

The bad decisions HP has made, especially its announcements to kill WebOS and its tablet devices and its decision to get out of the PC business, have finally hit home. The company’s 1Q2012 financials were dismal. Revenue was down 7% while earnings per share dropped 32%.

Still, don’t write off HP so quickly. Just last month HP announced a new line of servers, the HP ProLiant Generation 8 (Gen8). These servers represent an effort to redefine data center economics by automating every aspect of the server life cycle, and they introduce a new systems architecture, HP ProActive Insight, which will span the entire HP Converged Infrastructure. HP clearly continues to play the game.

In fact, HP is building features into the Gen8 servers such as integrated lifecycle automation, which it estimates can save 30 days of admin time per admin each year; dynamic workload acceleration, which can boost performance 7x; and automated energy optimization, which HP promises will nearly double compute-per-watt capacity, saving an estimated $7 million in energy costs in a typical data center over three years.

Compared to the HP results, IBM had a good quarter, announcing fourth-quarter 2011 diluted earnings of $4.62 per share, compared with diluted earnings of $4.18 per share in the fourth quarter of 2010, an increase of 11%. Fourth-quarter net income was $5.5 billion compared with $5.3 billion in the fourth quarter of 2010, an increase of 4%. Operating (non-GAAP) net income was $5.6 billion compared with $5.4 billion in the fourth quarter of 2010, an increase of 5%. All this despite a weak quarter for its hardware group, which reported revenues of $5.8 billion for the quarter, down 8% from Q4 2010. The group’s pre-tax income was $790 million, a decrease of 33% due mainly to unexpectedly weak mainframe sales following a streak of record setting mainframe quarterly gains.

Still, Gartner found IBM tops among all server vendors in Q4 2011 and #1 in the market for UNIX servers, with 52.8% market share in that quarter. IBM increased quarterly revenues by 17% year over year with IBM Power Systems and improved its share by 10.9% in comparison to Q4 2010. For the full year 2011, IBM led the UNIX server market with 45.9% market share, a gain of 6.9 points over 2010. IBM grew UNIX revenues by 23% over 2010, according to Gartner.

IBM also led the market for servers costing more than $250,000, attaining 69.4% factory revenue share in the fourth quarter with IBM System z mainframes and Power Systems. IBM also led this market for the full year of 2011 with 8% revenue growth over 2010, capturing 63.7% market share.

Meanwhile, IBM announced 570 competitive displacements in 4Q 2011 alone and nearly 2,400 competitive displacements in 2011 for its servers and storage systems. For Power, it had more than 350 competitive displacements in 4Q alone, which resulted in over $350 million of business. Roughly 60% of the displacements by Power came from HP. Overall, almost 40% of the 2,400 displacements came from HP and more than 25% came from Oracle/Sun, another company that has struggled to get its product strategy on track. IBM reports the competitive displacements in 2011 generated over $1 billion of business.

IBM spent much of 2010 optimizing its Power Systems lineup, the latest of which is optimized for data-intensive workloads, and buyers responded. The POWER7 processor offers 4, 6, or 8 cores per socket and up to four threads per core. With a 4.25 GHz top processor speed and an integrated eDRAM L3 cache, these systems can fly. In fact, IBM reports Power revenue grew 6%, the fifteenth consecutive quarter of share gains for Power.

In other achievements from IBM’s Systems Group, its x86 machines, the System x line, scored a benchmark success with a world-record four-processor result for Linux on the two-tier SAP Sales and Distribution (SD) standard application benchmark. This was achieved with an IBM System x3850 X5 running IBM DB2 9.7, Red Hat Enterprise Linux 6.2, and SAP enhancement package 4 for the SAP ERP application Release 6.0. Specifically, the x3850 X5 achieved 12,560 SAP SD benchmark users with 0.99 seconds average dialog response time, 68,580 SAPS (a measured throughput of 4,115,000 dialog steps per hour, or 1,371,670 fully processed line items per hour), and an average CPU utilization of 98% for the central server.
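
Those figures hang together under SAP’s definition of the SAPS unit, in which 100 SAPS equals 2,000 fully processed order line items per hour, achieved by 6,000 dialog steps per hour (three steps per line item). A quick arithmetic check in Python:

    # Consistency check on the reported benchmark numbers.
    line_items_per_hour = 4_115_000 / 3   # 3 dialog steps per fully processed line item
    saps = line_items_per_hour / 20       # 100 SAPS per 2,000 line items/hour
    print(round(line_items_per_hour))     # ~1,371,667 (reported: 1,371,670)
    print(round(saps))                    # ~68,583   (reported: 68,580)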

Ironically, the previous best four-processor result, 12,204 SAP SD benchmark users on Linux, was achieved by the HP ProLiant DL580 G7. The new benchmark-winning IBM x3850 X5 was configured with four Intel Xeon E7-8870 processors at 2.40GHz with 30MB shared L3 cache per processor (4 processors/40 cores/80 threads), 512GB of memory, 64-bit DB2 9.7, Red Hat Enterprise Linux 6.2, and SAP enhancement package 4 for SAP ERP 6.0.

IBM is not hesitating to press its advantage.  Besides its Migration Factory to facilitate migration to IBM platforms, it announced services designed to help companies upgrade IT infrastructures in the face of technology challenges like exponentially larger data volumes, server sprawl, increasingly complex infrastructures, and flat budgets. These include new financing options for those wanting to migrate from HP or Oracle/Sun technologies, including 0% financing on two key IBM systems families: IBM Power Systems and IBM System Storage. Specifically, through March 2012, organizations in the US or Canada can finance (12-month full pay-out lease) between $5,000 and $1 million in Power Systems and/or System Storage technologies at 0%.

As BottomlineIT sees it, this kind of competition is only good for companies that depend on IT.

