Posts Tagged IBM

Best TCO—System z vs. x86 vs. Public Cloud

IBM recently analyzed various likely customer workload scenarios and found that the System z as an enterprise Linux server could consistently beat x86 machines and even public cloud providers like AWS in terms of TCO.  The analysis was reasonably evenhanded although, like automobile mileage ratings, your actual results may vary.

This blogger has long contended that the z Enterprise Linux Server acquired under the deeply discounted IBM System z Solution Edition program could beat comparable x86 systems not only in terms of TCO but even TCA. Algar, a Brazilian telecom, acquired its initial zEnterprise Linux server to consolidate a slew of x86 systems and lay a foundation for scalable growth. It reports cutting data center costs by 70%. Nationwide Insurance, no newcomer to mainframe computing, used the zEnterprise to consolidate Linux servers, achieving $46 million in savings.

The point: the latest IBM TCO analyses confirm what IBM has been saying. The TCO advantage, IBM found, switches to the z Enterprise Linux Server at around 200 virtual servers compared with the public cloud, and at somewhat more VMs compared with x86 machines. View the IBM z TCO presentation here.

IBM further advanced its cause in the TCO/TCA battle with the recent introduction of the IBM Enterprise Cloud System. This is a factory-built and integrated system—processor, memory, network, IFLs, virtualization management, cloud management, hypervisor, disk orchestration, Linux OS—priced (discounted) as a single solution. IBM promises to deliver it in 45 days and have it production ready within hours of hitting the organization’s loading dock. Of course, it comes with the scalability, availability, security, manageability, etc. long associated with the z, and IBM reports it can scale to 6000 VMs. Not sure how this compares in price to a Solution Edition Enterprise Linux Server.

The IBM TCO analysis compared the public cloud, x86 cloud, and the Enterprise Cloud System in terms of power and space, labor, software/middleware, and hardware costs when running 48 diverse (a range of low, medium, and high I/O) workloads. In general it found an advantage for the z Enterprise Cloud System of 34-73%. The z cost considerably more in terms of hardware but more than made up for it in terms of software, labor, and power. Overall, the TCO analysis examined more than 30 cost variables, ranging from blade/IFL/memory/storage amounts to hypervisor/cloud management/middleware maintenance.

In terms of hardware, the z included the Enterprise Linux Server, storage, z/VM, and IBM Wave for z/VM. Software included WebSphere Application Server middleware, Cloud Management Suite for z, and Tivoli for z/VM. The x86 cloud included HP hardware with a hypervisor, WebSphere Application Server, SmartCloud Orchestrator, SmartCloud Monitoring, and Tivoli Storage Manager EE. Both analyses included labor to manage both hardware and VMs, power and space costs, and SUSE Linux.

The public cloud assumptions were a little different. Each workload was deployed as a separate instance. The pricing model was for reserved instances. Hardware costs were based on instances in the US East region with SUSE, EBS volumes, data in/out, enterprise support, and free and reserved tier discounts applied. Software costs included WebSphere Application Server ND (middleware) costs for the instances. A labor cost was included for managing the instances.

When IBM applied its analysis to 398 I/O-diverse workloads the results were similar: 49-75% lower cost with the Cloud System on z. Again, z hardware was considerably more costly than either x86 or the public cloud, but z software and labor costs were far less than the others. In terms of 3-year TCO, the public cloud was the highest at $37 M, x86 came in at $18.3 M, and the Cloud on z cost $9.4 M. With 48 workloads, the z again came in with the lowest TCO at $1 M compared to $1.6 M for x86 systems and $3.9 M for the public cloud.

IBM tried to keep the assumptions equivalent across the platforms. If you make different software or middleware choices or a different mix of high-mid-low I/O workloads your results will be different but the rankings probably won’t change all that much.
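For readers who want to check the arithmetic, here is a quick sketch using the 3-year figures reported above; the percentage calculation is a simple illustration, not IBM's actual cost model.

```python
# Rough 3-year TCO comparison using the per-platform totals reported
# in the article. The helper computes the percentage cost advantage
# of the z-based Cloud System over a rival platform.

def tco_advantage(z_cost, rival_cost):
    """Return the z Cloud System's cost advantage as a percentage."""
    return round((rival_cost - z_cost) / rival_cost * 100, 1)

# 398-workload scenario, 3-year TCO in millions of dollars
three_year = {"public cloud": 37.0, "x86": 18.3, "z Cloud System": 9.4}

for platform in ("public cloud", "x86"):
    adv = tco_advantage(three_year["z Cloud System"], three_year[platform])
    print(f"z advantage vs. {platform}: {adv}%")
```

Run against the reported totals, this lands at roughly 48.6% versus x86 and 74.6% versus the public cloud, consistent with the 49-75% range IBM cites.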

Also, there still is time to register for IBM Edge2014 in Las Vegas. This blogger will be there hanging around the bloggers lounge when not attending sessions. Please join me there.

Follow Alan Radding/BottomlineIT on Twitter: @mainframeblog


The Internet of Things Gains Traction

The Internet of Things (IoT) appears finally to be gaining real traction with both Gartner and IDC putting out reports on it. The opportunity, however, can best be understood in terms of vertical applications because the value of IoT is based on individual use cases across all verticals. “Successful sales and marketing efforts by vendors will be based on understanding the most lucrative verticals that offer current growth and future potential and then creating solutions for specific use cases that address industry-specific business processes,” said Scott Tiazkun, senior research analyst, IDC’s Global Technology and Industry Research Organization. Similarly, enterprise IT needs to understand which vertical use cases will benefit first and most.

Tiazkun was referring to IDC’s latest Worldwide Internet of Things Spending by Vertical Market 2014-2017 Forecast. To tap that market, IDC advises consultants to focus on the individual vertical opportunities that arise from IoT already in play. Here is where an IT exec with vertical business savvy can win. As IDC noted, realizing the existence of the vertical opportunity is the first step to understanding the impact and, therefore, to understanding an IoT market opportunity that exists for enterprises, IT vendors, and consultants.

The idea of IoT has been kicking around for years. BottomlineIT wrote about it early in 2011 here. It refers to embedding intelligence into things in the form of computer processors and making them IP addressable. Linking them together over a network gives you IoT. The idea encompasses almost anything from the supply chain to consumer interests. Smart appliances, devices, and things of all sorts can participate in IoT. RFID, all manner of sensors and monitors, big data, and real-time analytics play into IoT.
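To make the idea concrete, here is a minimal, hypothetical sketch of what an instrumented "thing" might report over the network; the device ID and field names are invented for illustration, not any vendor's schema.

```python
# A minimal sketch of the IoT pattern described above: an addressable
# "thing" reports instrumented readings as structured data that
# real-time analytics downstream can consume.
import json
import time

def sensor_reading(device_id, metric, value):
    """Package one sensor measurement as an IP-transportable JSON message."""
    return json.dumps({
        "device": device_id,       # addressable identity of the "thing"
        "metric": metric,          # e.g., temperature, vibration, RFID scan
        "value": value,
        "ts": int(time.time()),    # timestamp for real-time analytics
    })

msg = sensor_reading("warehouse-42/pallet-7", "temperature_c", 3.9)
print(msg)
```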

In terms of dollars, IoT is huge. Specifically, IDC has found:

  • The technology and services revenue from the components, processes, and IT support for IoT will expand from $4.8 trillion in 2012 to $7.3 trillion by 2017, an 8.8% compound annual growth rate (CAGR), with the greatest opportunity initially in the consumer, discrete manufacturing, and government vertical industries.
  • The IoT/machine-to-machine (M2M) market is growing quickly, but the development of this market will not be consistent across all vertical markets. Industries that already understand IoT will see the most immediate growth, such as industrial production/automotive, transportation, and energy/utilities. However, all verticals eventually will reflect great opportunity.
  • IoT is a derivative market containing many elements, including horizontal IT components as well as vertical and industry-specific IT elements. It is these vertical components where IT consultants and vendors will want to distinguish themselves to address industry-specific IoT needs.
  • IoT also opens IT consultants and vendors to the consumer market by providing business-to-business-to-consumer (B2B2C) services to connect and run homes and automobiles, all places where electronic devices increasingly will have networking capability.

Already, leading vendors are positioning themselves for the IoT market. To Oracle, IoT brings tremendous promise to integrate every smart thing in this world. Cisco, too, jumped early on the IoT bandwagon, dubbing it the Internet of Everything.

IBM gets almost cosmic about IoT, which it describes as the emergence of a kind of global data field. The planet itself, with its natural systems, human systems, and physical objects, has always generated an enormous amount of data, but until recent decades we weren’t able to hear, see, or capture it. Now we can because all of these things have been instrumented with microchips, UPC codes, and other technologies. And they’re all interconnected, so now we can actually access the data. Of course, this dovetails with IBM’s Smarter Planet marketing theme.

Enterprise IT needs to pay close attention to IoT too. First, it will change the dynamics of your network, affecting everything from network architecture to bandwidth to security. Second, once IT starts connecting the various pieces together, it opens interesting new possibilities for using IT to advance business objectives and even generate revenue. It can help you radically reshape the supply chain, the various sales channels, partner channels, and more. It presents another opportunity for IT to contribute to the business in substantive business terms.

IDC may have laid out the best roadmap to IoT for enterprise IT. According to IDC, the first step will be to understand the components of IoT/M2M IT ecosphere. Because this is a derivative market, there are many opportunities for vendors and consultants to offer pieces, product suites, and services that cover the needed IoT technology set. Just make sure this isn’t just about products. Make sure services, strategies, integration, and business execution are foremost. That’s how you’ll make it all pay off.

The promise of IoT seems open ended. Says Tiazkun: “The IoT solutions space will expand exponentially and will offer every business endless IoT-focused solutions. The initial strategy of enterprise IT should be to avoid choosing IoT-based solutions that will solve only immediate concerns and lack staying power.” OK, you’ve been alerted.

Follow BottomlineIT on Twitter: @mainframeblog


Fueled by SMAC, Tech M&A Activity to Heat Up

Corporate professional services firm BDO USA polled approximately 100 executives of U.S. tech outfits for its 2014 Technology Outlook Survey and found them firm in the belief that mergers and acquisitions in tech would either stay at the same rate (40%) or increase over last year (43%). And this isn’t a recent phenomenon.

M&A has been widely adopted across a range of technology segments as not only the vehicle to drive growth but, more importantly, to remain at the leading edge in a rapidly changing business and technology environment that is being spurred by cloud and mobile computing. And fueling this M&A wave is SMAC (Social, Mobile, Analytics, Cloud).

SMAC appears to be triggering a scramble among large, established blue chip companies like IBM, EMC, HP, Oracle, and more to acquire almost any promising upstart out there. Their fear: becoming irrelevant, especially among the young, most highly sought demographics.  SMAC has become the code word (code acronym, anyway) for the future.

EMC, for example, has evolved from a leading storage infrastructure player to a broad-based technology giant driven by 70 acquisitions over the past 10 years. Since this past August IBM has been involved in a variety of acquisitions amounting to billions of dollars. These acquisitions touch on everything from mobile networks for big data analytics and mobile device management to cloud services integration.

Google, however, probably should be considered the poster child for technology M&A. According to published reports, Google has been acquiring, on average, more than one company per week since 2010. The giant search engine and services company’s biggest acquisition to date has been the purchase of Motorola Mobility, a mobile device (hardware) manufacturing company, for $12.5 billion. The company also purchased the Israeli startup Waze in June 2013 for almost $1 billion. Waze is a GPS-based application for mobile phones and has brought Google a strong position in the mobile phone navigation business, even besting Apple’s iPhone for navigation.

Top management has embraced SMAC-driven M&A as the fastest, easiest, and cheapest way to achieve strategic advantage through new capabilities and the talent that developed those capabilities. Sure, the companies could recruit and build those capabilities on their own but it could take years to bring a given feature to market that way and by then, in today’s fast moving competitive markets, the company would be doomed to forever playing catch up.

Even with the billion-dollar and multi-billion-dollar price tags some of these upstarts are commanding, strategic acquisitions like Waze, IBM’s SoftLayer, or EMC’s XtremIO have the potential to be game changers. That’s the hope, of course. But it can be risky, although risk can be managed.

And the best way to manage SMAC merger risk is to have a flexible IT platform that can quickly absorb those acquisitions and integrate and share the information and, of course, a coherent strategy for leveraging the new acquisitions. What you need to avoid is ending up with a bunch of SMAC piece parts that don’t fit together.


Change-proof Your Organization

Many organizations are being whiplashed by IT infrastructure change: costly, disruptive, never-ending changes that hinder IT and the organization. You know the drivers: demand for cloud computing, mobile, social, big data, real-time analytics, and collaboration. Add to them soaring transaction volumes, escalating amounts of data, 24x7x365 processing, new types of data, proliferating forms of storage, incessant compliance mandates, and more, all of which keep driving change. And there is no letup in sight.

IBM started to articulate this in a blog post, Infrastructure Matters. IBM was focusing on cloud and data, but the issues go even further. It is really about change-proofing, not just IT but the business itself.

All of these trends put great pressure on the organization, which forces IT to repeatedly tweak the infrastructure or otherwise revamp systems. This is costly and disruptive not just to IT but to the organization.

In short, you need to change-proof your IT infrastructure and your organization. And you have to do it economically and in a way you can efficiently sustain over time. The trick is to leverage some of the very same technology trends creating change to design an IT infrastructure that can smoothly accommodate changes both known and unknown. Many of these we have discussed in BottomlineIT previously:

  • Cloud computing
  • Virtualization
  • Software defined everything
  • Open standards
  • Open APIs
  • Hybrid computing
  • Embedded intelligence

These technologies will allow you to change your infrastructure at will, changing your systems in any variety of ways, often with just a few clicks or tweaks to code.  In the process, you can eliminate vendor lock-in and obsolete, rigid hardware and software that has distorted your IT budget, constrained your options, and increased your risks.

Let’s start by looking at just the first three listed above. As noted above, all of these have been discussed in BottomlineIT and you can be sure they will come up again.

You probably are using aspects of cloud computing to one extent or another. There are numerous benefits to cloud computing but for the purposes of infrastructure change-proofing only three matter:  1) the ability to access IT resources on demand, 2) the ability to change and remove those resources as needed, and 3) flexible pricing models that eliminate the upfront capital investment in favor of paying for resources as you use them.

Yes, there are drawbacks to cloud computing. Security remains a concern although increasingly it is becoming just another manageable risk. Service delivery reliability remains a concern although this too is a manageable risk as organizations learn to work with multiple service providers and arrange for multiple links and access points to those providers.

Virtualization remains the foundational technology behind the cloud. Virtualization makes it possible to deploy multiple images of systems and applications quickly and easily as needed, often in response to widely varying levels of service demand.

Software defined everything also makes extensive use of virtualization. It inserts a virtualization layer between the applications and the underlying infrastructure hardware.  Through this layer the organization gains programmatic control of the software defined components. Most frequently we hear about software defined networks that you can control, manage, and reconfigure through software running on a console regardless of which networking equipment is in place.  Software defined storage gives you similar control over storage, again generally independent of the underlying storage array or device.
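The programmatic-control idea can be sketched in a few lines. The `SdnController` class and its calls below are purely illustrative, invented for this post rather than taken from any vendor's actual API.

```python
# Sketch of the "software defined" idea: the application talks to a
# control layer, not to the underlying hardware. Reconfiguration
# becomes a software call, independent of the equipment underneath.

class SdnController:
    """Toy control layer: reconfigure the network in software."""
    def __init__(self):
        self.routes = {}

    def set_route(self, src, dst, bandwidth_mbps):
        # In a real software defined network this would push flow rules
        # down to whatever switches sit underneath; here we just record
        # the declared intent.
        self.routes[(src, dst)] = bandwidth_mbps

ctl = SdnController()
ctl.set_route("app-tier", "db-tier", 1000)   # reconfigure with one call
print(ctl.routes)
```

The point of the pattern is exactly what the paragraph above describes: the console (or script) holds the control logic, and the hardware becomes interchangeable beneath it.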

All these technologies exist today at different stages of maturity. Start planning how to use them to take control of IT infrastructure change. The world keeps changing and the IT infrastructures of many enterprises are groaning under the pressure. Change-proofing your IT infrastructure is your best chance of keeping up.


IBM on How Computing Will Change Us in 5 Years

Since 2006, at about this time each year, IBM has come out with five predictions, dubbed 5-in-5, about how technology will affect the world within five years. Each year the predictions look at how technology innovations will change the way people work, live, and play within the next five years. They are based on market and social trends combined with ideas from the thousands of biologists, engineers, mathematicians, and medical physicians in IBM research labs around the world.

Last year the 5-in-5 predictions focused on how systems would augment human senses. It looked at sight, hearing, smell, touch, and taste. For example, a machine that experiences flavor could determine the precise chemical structure of food and why people like it. Or, computers might smell for chemicals in urban environments to monitor pollution or analyze the soil.

IBM’s 5-in-5 predictions for 2013 go in a different direction. This year the researchers looked at how innovations in computing allow us to interact with the meaning that lies in data. The researchers, taking a distinctly benign view, suggest that systems will emerge that treat us as individuals, adapt to us, and look out for our interests. Others, of course, might see this as the tyranny of Big Brother.

Here is this year’s 5-in-5:

  1. The classroom will learn you. Teachers will work on devices that can monitor and interact with the student and ultimately create a unique persona for each student. Teachers will use that persona, which changes over time, to guide the student on his or her learning path. They will know, through the student’s device, what the particular student is struggling to learn and will provide the right help at the right time.
  2. Buying local beats online.  The combination of cloud technology and in-store display will enable local stores to act as a showroom for the wide variety of products available online and enable customers to interact with a product. Of course the store will recognize you and know your preferences. In short, IBM is predicting the convergence of online and brick and mortar retail.
  3. Doctors will use your DNA to keep you well. This already is happening now. But it goes beyond DNA to using the data analytic power of computers to diagnose patient ills and guide doctors in treatment. IBM’s Watson is doing some of this today. How quickly this will evolve remains to be seen; healthcare is a minefield of conflicting interests, most of which have nothing to do with patient care and successful outcomes. You can, for instance, have your personal genome assessed and analyzed today, but few have opted to do so. Do you want to know you have a strong genetic inclination toward a disease for which there is no cure?
  4. Your city will help you live in it. Sitting at consoles in operations centers connected to myriad sensors generating massive amounts of real-time data, city administrators will be able to, say, manage traffic lights interactively as traffic flows or dynamically adjust the arrival and departure of various transportation options. All things we as citizens probably want. The city also could become a huge social network where policies are developed based on clicking likes. Big Brother, anyone?
  5. A digital guardian will protect you online. The retailer Target just had tens of millions of personal identifications compromised at the end of the year. We truly need an effective digital guardian. As IBM notes, this requires continuous, sophisticated analytics to identify whatever activity in your digital life varies from the norm and flag any sudden deviation of behavior. This guardian needs to shut down bad things proactively before they reach you and also provide a private, safe fortress for your data and online persona. As one whose email was recently hacked, this blogger is ready to sign up today. BTW: my apologies to any readers who received a desperate message purportedly from me saying I was stranded in Turkey, my wallet destroyed, and in immediate need of money to get home. Hope you didn’t send any.
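The guardian idea in the last prediction reduces to spotting behavior that deviates from your norm. Here is a toy sketch using a simple z-score test on, say, daily login counts; the data and threshold are invented for illustration and real guardians would use far richer analytics.

```python
# Flag activity that sits unusually far from a user's historical norm,
# the core move behind the "digital guardian" prediction above.
from statistics import mean, stdev

def is_anomalous(history, today, threshold=3.0):
    """Flag today's value if it lies more than `threshold` standard
    deviations from the historical mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > threshold

logins = [4, 5, 3, 6, 4, 5, 4, 5]        # a typical stretch of days
print(is_anomalous(logins, 5))           # ordinary day -> False
print(is_anomalous(logins, 40))          # sudden spike -> True
```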

Best wishes for a peaceful and prosperous New Year.

New Year Resolution #1: Follow this blogger on Twitter: @mainframeblog


Big Data and Analytics as Game Changing Technology

If you ever doubted that big data was going to become important, there should be no doubt anymore. Recent headlines about the government capturing and analyzing massive amounts of daily phone call data should convince you.

That this report was shortly followed by more reports of the government tapping the big online data websites like Google, Yahoo, and such for even more data should alert you to three things:

1—There is a massive amount of data out there that can be collected and analyzed.

2—Companies are amassing incredible volumes of data in the normal course of serving people who readily and knowingly give their data to these organizations. (This blogger is one of those tens of millions.)

3—The tools and capabilities are mature enough for someone to sort through that data and connect the dots to deliver meaningful insights.

Particularly with regard to the last point, this blogger thought the industry was still five years away from generating meaningful results from that amount of data coming in at that velocity. Sure, marketers have been sorting and correlating large amounts of data for years, but it was mostly structured data and nothing near this volume. BTW, your blogger has been writing about big data for some time.

If the news reports weren’t enough, it became clear at IBM Edge 2013, wrapping up in Las Vegas this week, that big data analytics is happening and that familiar companies are succeeding at it now. It also is clear that there is sufficient commercial off-the-shelf computing power from companies like IBM and others, and analytics tools from a growing number of vendors, to sort through massive amounts of data and make sense of it fast.

An interesting point came up in one of the many discussions at Edge 2013 touching on big data. Every person’s data footprint is as unique as a fingerprint or other biometrics. We all visit different websites and interact with social media and use our credit and debit cards in highly individual ways. Again, marketers have sensed this at some level for years, but they haven’t yet really honed it down to the actual individual on a mass scale, although there is no technical reason one couldn’t. You now can, in effect, market to a demographic of one.

A related conference is coming up Oct. 21-25 in Orlando, FL, called Enterprise Systems 2013. It will combine the System z and the Power System Technical University along with a new executive-focused Enterprise Systems event. It will include new announcements, peeks into trends and directions, over 500 expert technical sessions across 10 tracks, and a comprehensive solution center. This blogger has already put it on his calendar.

There was much more interesting information at Edge 2013, such as using data analytics and cognitive computing to protect IT systems.  Perimeter defense, anti-virus, and ID management are no longer sufficient. Stay tuned.


Where Have All the Enterprise IT Hardware Vendors Gone?

Remember that song asking where all the flowers had gone? In a few years you might be asking the same of many of today’s enterprise hardware vendors.  The answer is important as you plan your data center 3-5 years out.  Where will you get your servers from and at what cost? Will you even need servers in your data center?  And what will they look like, maybe massive collections of ARM processors?

As reported in The Register (Amazon cloud threatens the entire IT ecosystem), Amazon’s cloud poses a major threat to most of the traditional IT ecosystem, a team of 25 Morgan Stanley analysts write in a report, Amazon Web Services: Making Waves in the IT Pond, released recently. The Morgan Stanley researchers cite Brocade, NetApp, QLogic, EMC, and VMware as facing the greatest challenges from the growth of AWS. The threat takes the form of AWS’s exceedingly low cost per virtual machine instance.

Beyond the price threat, the vendors are scrambling to respond to the challenges of cloud, mobile, and big data/analytics. Even Intel, the leading chip maker, just introduced the 4th generation Intel® Core™ processor family to address these challenges.  The new chip promises optimized experiences personalized for end-users’ specific needs and offers double the battery life and breakthrough graphics targeted to new low cost devices such as mobile tablets and all-in-one systems.

The Wall Street Journal online covered related ground from a different perspective when it wrote: PC makers unveiled a range of unconventional devices on the eve of Asia’s biggest computer trade show as they seek to revive (the) flagging industry and stay relevant amid stiff competition. Driven by the cloud and the explosion of mobile devices in a variety of forms the enterprise IT industry doesn’t seem to know what the next device should even be.

Readers once chastised this blogger for suggesting that their next PC might be a mobile phone. Then came smartphones, quickly followed by tablets. Today PC sales are dropping fast, according to IDC.

The next rev of your data center may be based on ARM processors (tiny, stingy with power, cheap, cool, and remarkably fast), essentially mobile phone chips. They could be ganged together in large quantities to deliver mainframe-like power, scalability, and reliability at a fraction of the cost.

IBM has shifted its focus and is targeting cloud computing, mobile, and big data/analytics, even directing its acquisitions toward these areas as witnessed by yesterday’s SoftLayer acquisition. HP, Oracle, most of the other vendors are pursuing variations of the same strategy.  Oracle, for example, acquired Tekelec, a smart device signaling company.

But as the Morgan Stanley analysts noted, it really is Amazon using its cloud scale to savage the traditional enterprise IT vendor hardware strategies and it is no secret why:

  • No upfront investment
  • Pay for Only What You Use (with a caveat or two)
  • Price Transparency
  • Faster Time to Market
  • Near-infinite Scalability and Global Reach

And the more AWS grows, the more its prices drop due to the efficiency of cloud scaling.  It is not clear how the enterprise IT vendors will respond.

What will your management say when they get a whiff of AWS pricing? An extra large, high memory SQL Server database instance lists for $0.74 per hour (check the fine print). What does your Oracle database cost you per hour running on your on-premise enterprise server? That’s what the traditional enterprise IT vendors are facing.
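A quick back-of-envelope on that $0.74 per hour, assuming an always-on instance at list price and ignoring the reserved-tier and free-tier discounts buried in the fine print:

```python
# What a single always-on instance at the quoted list price works out
# to per month and per year, before any discounts.
HOURLY = 0.74
HOURS_PER_MONTH = 730          # common cloud-billing approximation

monthly = HOURLY * HOURS_PER_MONTH
yearly = monthly * 12
print(f"~${monthly:,.0f}/month, ~${yearly:,.0f}/year")
```

Even at list price that is a number most managers can compare against an on-premise database bill in seconds, which is precisely the problem for the traditional vendors.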

