
Best TCO—System z vs. x86 vs. Public Cloud

IBM recently analyzed various likely customer workload scenarios and found that the System z as an enterprise Linux server could consistently beat x86 machines and even public cloud providers like AWS in terms of TCO.  The analysis was reasonably evenhanded although, like automobile mileage ratings, your actual results may vary.

This blogger has long contended that the z Enterprise Linux Server acquired under the deeply discounted IBM System z Solution Edition program could beat comparable x86 systems not only in terms of TCO but even TCA. Algar, a Brazilian telecom, acquired its initial zEnterprise Linux server to consolidate a slew of x86 systems and lay a foundation for scalable growth. It reports cutting data center costs by 70%. Nationwide Insurance, no newcomer to mainframe computing, used the zEnterprise to consolidate Linux servers, achieving $46 million in savings.

The point: the latest IBM TCO analyses confirm what IBM has been saying. The TCO advantage, IBM found, switches to the z Enterprise Linux Server at around 200 virtual servers when compared to the public cloud, and at a slightly higher VM count when compared to x86 machines. View the IBM z TCO presentation here.

IBM further advanced its cause in the TCO/TCA battle with the recent introduction of the IBM Enterprise Cloud System. This is a factory-built and integrated system—processor, memory, network, IFLs, virtualization management, cloud management, hypervisor, disk orchestration, Linux OS—priced (discounted) as a single solution. IBM promises to deliver it in 45 days and have it production ready within hours of hitting the organization’s loading dock. Of course, it comes with the scalability, availability, security, manageability, etc. long associated with the z, and IBM reports it can scale to 6000 VMs. Not sure how this compares in price to a Solution Edition Enterprise Linux Server.

The IBM TCO analysis compared the public cloud, x86 cloud, and the Enterprise Cloud System in terms of power and space, labor, software/middleware, and hardware costs when running 48 diverse (a range of low, medium, and high I/O) workloads. In general it found an advantage for the z Enterprise Cloud System of 34-73%. The z cost considerably more in terms of hardware but more than made up for it in terms of software, labor, and power. Overall, the TCO analysis examined more than 30 cost variables, ranging from blade/IFL/memory/storage amounts to hypervisor/cloud management/middleware maintenance.
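
To make the mechanics concrete, here is a minimal sketch of how a multi-category comparison like this rolls up into a TCO figure. The platform names mirror the analysis, but the per-year dollar amounts are made-up placeholders, not IBM's numbers, which span 30-plus cost variables.

```python
# Toy TCO roll-up: sum annual costs per category, then project over three years.
# All figures below are illustrative placeholders, not IBM's data.

YEARS = 3

platforms = {
    "public cloud": {"hardware": 0,       "software": 600_000, "labor": 250_000, "power/space": 0},
    "x86 cloud":    {"hardware": 150_000, "software": 300_000, "labor": 120_000, "power/space": 40_000},
    "cloud on z":   {"hardware": 220_000, "software": 60_000,  "labor": 30_000,  "power/space": 10_000},
}

def three_year_tco(annual_costs):
    """Project the sum of all annual cost categories over the analysis period."""
    return YEARS * sum(annual_costs.values())

for name, costs in platforms.items():
    print(f"{name:12s} 3-year TCO: ${three_year_tco(costs):,}")
```

The pattern, if not the figures, mirrors what the analysis found: the z carries the largest hardware line but wins on software, labor, and power.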

In terms of hardware, the z included the Enterprise Linux Server, storage, z/VM, and IBM Wave for z/VM. Software included WebSphere Application Server middleware, Cloud Management Suite for z, and Tivoli for z/VM. The x86 cloud included HP hardware with a hypervisor, WebSphere Application Server, SmartCloud Orchestrator, SmartCloud Monitoring, and Tivoli Storage Manager EE. Both analyses included labor to manage both hardware and VMs, power and space costs, and SUSE Linux.

The public cloud assumptions were a little different. Each workload was deployed as a separate instance. The pricing model was for reserved instances. Hardware costs were based on instances in the US East region with SUSE, EBS volumes, data in/out, enterprise support, and free and reserved tier discounts applied. Software costs included WebSphere Application Server ND (middleware) costs for the instances. A labor cost was included for managing the instances.

When IBM applied its analysis to 398 I/O-diverse workloads the results were similar: 49-75% lower cost with the Cloud System on z. Again, z hardware was considerably more costly than either x86 or the public cloud, but z software and labor costs were far less than the others. In terms of 3-year TCO, the public cloud was the highest at $37M, x86 came in at $18.3M, and the Cloud on z cost $9.4M. With 48 workloads, the z again came in with the lowest TCO at $1M, compared to $1.6M for x86 systems and $3.9M for the public cloud.

IBM tried to keep the assumptions equivalent across the platforms. If you make different software or middleware choices, or use a different mix of high-, mid-, and low-I/O workloads, your results will be different, but the rankings probably won't change all that much.

Also, there still is time to register for IBM Edge2014 in Las Vegas. This blogger will be there hanging around the bloggers lounge when not attending sessions. Please join me there.

Follow Alan Radding/BottomlineIT on Twitter: @mainframeblog


Fueled by SMAC, Tech M&A Activity to Heat Up

Corporate professional services firm BDO USA  polled approximately 100 executives of U.S. tech outfits for its 2014 Technology Outlook Survey and found them firm in the belief that mergers and acquisitions in tech would either stay at the same rate (40%) or increase over last year (43%). And this isn’t a recent phenomenon.

M&A has been widely adopted across a range of technology segments as not only the vehicle to drive growth but, more importantly, to remain at the leading edge in a rapidly changing business and technology environment that is being spurred by cloud and mobile computing. And fueling this M&A wave is SMAC (Social, Mobile, Analytics, Cloud).

SMAC appears to be triggering a scramble among large, established blue chip companies like IBM, EMC, HP, Oracle, and more to acquire almost any promising upstart out there. Their fear: becoming irrelevant, especially among the young, most highly sought demographics.  SMAC has become the code word (code acronym, anyway) for the future.

EMC, for example, has evolved from a leading storage infrastructure player to a broad-based technology giant driven by 70 acquisitions over the past 10 years. Since this past August IBM has been involved in a variety of acquisitions amounting to billions of dollars. These acquisitions touch on everything from mobile networks for big data analytics and mobile device management to cloud services integration.

Google, however, probably should be considered the poster child for technology M&A. According to published reports, Google has been acquiring, on average, more than one company per week since 2010. The giant search and services company's biggest acquisition to date has been the purchase of Motorola Mobility, a mobile device (hardware) manufacturer, for $12.5 billion. The company also purchased the Israeli startup Waze in June 2013 for almost $1 billion. Waze is a GPS-based application for mobile phones and has brought Google a strong position in the mobile navigation business, even besting Apple's iPhone for navigation.

Top management has embraced SMAC-driven M&A as the fastest, easiest, and cheapest way to achieve strategic advantage through new capabilities and the talent that developed those capabilities. Sure, the companies could recruit and build those capabilities on their own but it could take years to bring a given feature to market that way and by then, in today’s fast moving competitive markets, the company would be doomed to forever playing catch up.

Even with the billion-dollar and multi-billion-dollar price tags some of these upstarts are commanding, strategic acquisitions like Waze, IBM's SoftLayer, or EMC's XtremIO have the potential to be game changers. That's the hope, of course. But it can be risky, although risk can be managed.

And the best way to manage SMAC merger risk is to have a flexible IT platform that can quickly absorb those acquisitions and integrate and share the information and, of course, a coherent strategy for leveraging the new acquisitions. What you need to avoid is ending up with a bunch of SMAC piece parts that don’t fit together.


Change-proof Your Organization

Many organizations are being whiplashed by IT infrastructure change—costly, disruptive, never-ending changes that hinder IT and the organization. You know the drivers: demand for cloud computing, mobile, social, big data, real-time analytics, and collaboration. Add soaring transaction volumes, escalating amounts of data, 24x7x365 processing, new types of data, proliferating forms of storage, incessant compliance mandates, and more, all of which keep driving change. And there is no letup in sight.

IBM started to articulate this in a blog post, Infrastructure Matters. IBM was focusing on cloud and data, but the issues go even further. It is really about change-proofing, not just IT but the business itself.

All of these trends put great pressure on the organization, which forces IT to repeatedly tweak the infrastructure or otherwise revamp systems. This is costly and disruptive not just to IT but to the organization.

In short, you need to change-proof your IT infrastructure and your organization.  And you have to do it economically and in a way you can efficiently sustain over time. The trick is to leverage some of the very same  technology trends creating change to design an IT infrastructure that can smoothly accommodate changes both known and unknown. Many of these we have discussed in BottomlineIT previously:

  • Cloud computing
  • Virtualization
  • Software defined everything
  • Open standards
  • Open APIs
  • Hybrid computing
  • Embedded intelligence

These technologies will allow you to change your infrastructure at will, changing your systems in any variety of ways, often with just a few clicks or tweaks to code.  In the process, you can eliminate vendor lock-in and obsolete, rigid hardware and software that has distorted your IT budget, constrained your options, and increased your risks.

Let’s start by looking at just the first three listed above. As noted above, all of these have been discussed in BottomlineIT and you can be sure they will come up again.

You probably are using aspects of cloud computing to one extent or another. There are numerous benefits to cloud computing but for the purposes of infrastructure change-proofing only three matter:  1) the ability to access IT resources on demand, 2) the ability to change and remove those resources as needed, and 3) flexible pricing models that eliminate the upfront capital investment in favor of paying for resources as you use them.
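
The first two benefits are easy to picture in code. Below is a minimal sketch using AWS's boto3 SDK to provision a server on demand and remove it when it is no longer needed; the AMI ID and instance type are placeholders, and a real deployment would add networking, tagging, and error handling.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# 1) Access IT resources on demand: launch a server when the workload needs it.
resp = ec2.run_instances(
    ImageId="ami-12345678",    # placeholder AMI ID
    InstanceType="t3.medium",  # placeholder instance type
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")

# 2) Change or remove those resources as needed: tear it down when demand drops,
#    and with usage-based pricing the meter stops with it.
ec2.terminate_instances(InstanceIds=[instance_id])
```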

Yes, there are drawbacks to cloud computing. Security remains a concern although increasingly it is becoming just another manageable risk. Service delivery reliability remains a concern although this too is a manageable risk as organizations learn to work with multiple service providers and arrange for multiple links and access points to those providers.

Virtualization remains the foundational technology behind the cloud. Virtualization makes it possible to deploy multiple images of systems and applications quickly and easily as needed, often in response to widely varying levels of service demand.

Software defined everything also makes extensive use of virtualization. It inserts a virtualization layer between the applications and the underlying infrastructure hardware.  Through this layer the organization gains programmatic control of the software defined components. Most frequently we hear about software defined networks that you can control, manage, and reconfigure through software running on a console regardless of which networking equipment is in place.  Software defined storage gives you similar control over storage, again generally independent of the underlying storage array or device.
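
What "programmatic control" means in practice is that reconfiguration becomes an API call rather than a trip to the wiring closet. The sketch below assumes a hypothetical REST endpoint on a software-defined network controller; actual products each expose their own APIs, but the shape of the interaction is similar.

```python
import requests

# Hypothetical SDN controller endpoint and token, stand-ins for whatever
# API your software-defined networking or storage product actually exposes.
CONTROLLER = "https://sdn-controller.example.com/api/v1"
TOKEN = "example-token"

def reroute_segment(segment: str, bandwidth_mbps: int) -> None:
    """Reconfigure a network segment in software, independent of the gear underneath."""
    resp = requests.put(
        f"{CONTROLLER}/segments/{segment}",
        json={"bandwidth_mbps": bandwidth_mbps, "qos_class": "gold"},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()

# Shift more bandwidth to the analytics segment ahead of a nightly batch run.
reroute_segment("analytics-east", bandwidth_mbps=2000)
```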

All these technologies exist today at different stages of maturity. Start planning how to use them to take control of IT infrastructure change. The world keeps changing and the IT infrastructures of many enterprises are groaning under the pressure. Change-proofing your IT infrastructure is your best chance of keeping up.


IBM on How Computing Will Change Us in 5 Years

Since 2006, at about this time of year, IBM has come out with five predictions, dubbed 5-in-5, about how technology will affect the world within five years. Each year the predictions look at how technology innovations will change the way people work, live, and play within the next five years. They are based on market and social trends combined with ideas from the thousands of biologists, engineers, mathematicians, and physicians in IBM research labs around the world.

Last year the 5-in-5 predictions focused on how systems would augment human senses. It looked at sight, hearing, smell, touch, and taste. For example, a machine that experiences flavor could determine the precise chemical structure of food and why people like it. Or, computers might smell for chemicals in urban environments to monitor pollution or analyze the soil.

IBM’s 5-in-5 predictions for 2013 go in a different direction. This year the researchers looked at how innovations in computing allow us to interact with the meaning that lies in data. The researchers, taking a distinctly benign view, suggest that systems will emerge that treat us as individuals, adapt to us, and look out for our interests. Others, of course, might see this as the tyranny of Big Brother.

Here is this year’s 5-in-5:

  1. The classroom will learn you. Teachers will work with devices that can monitor and interact with students and ultimately create a unique persona for each student. Teachers will use that persona, which changes over time, to guide the student on his or her learning path. They will know, through the student’s device, what the particular student is struggling to learn and will provide the right help at the right time.
  2. Buying local beats online.  The combination of cloud technology and in-store display will enable local stores to act as a showroom for the wide variety of products available online and enable customers to interact with a product. Of course the store will recognize you and know your preferences. In short, IBM is predicting the convergence of online and brick and mortar retail.
  3. Doctors will use your DNA to keep you well. This already is happening now. But it goes beyond DNA to using the data analytic power of computers to diagnose patient ills and guide doctors in treatment. IBM’s Watson is doing some of this today. How quickly this will evolve remains to be seen; healthcare is a minefield of conflicting interests, most of which have nothing to do with patient care and successful outcomes. You can, for instance, have your personal genome assessed and analyzed today, but few have opted to do so. Do you want to know you have a strong genetic inclination toward a disease for which there is no cure?
  4. Your city will help you live in it. Sitting at consoles in operations centers connected to myriad sensors generating massive amounts of real-time data, city administrators will be able to, say, manage traffic lights interactively as traffic flows or dynamically adjust the arrival and departure of various forms of transportation. All things we as citizens probably want. The city also could become a huge social network where policies are developed based on clicking likes. Big Brother, anyone?
  5. A digital guardian will protect you online. The retailer Target just had tens of millions of customers’ personal identifications compromised at the end of the year. We truly need an effective digital guardian. As IBM notes, this requires continuous, sophisticated analytics to identify whatever activity in your digital life varies from the norm and flag any sudden deviation in behavior (see the sketch following this list). This guardian needs to shut down bad things proactively before they reach you and also provide a private, safe fortress for your data and online persona. As one whose email was recently hacked, this blogger is ready to sign up today. BTW: my apologies to any readers who received a desperate message purportedly from me saying I was stranded in Turkey, my wallet destroyed, and in immediate need of money to get home. Hope you didn’t send any.
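
As a rough illustration of the kind of analytics such a guardian would run, here is a toy sketch that flags a login whose hour of day deviates sharply from a user's own history. The data and threshold are made up; a real guardian would watch far more signals, continuously.

```python
import statistics

def is_anomalous(history_hours, new_hour, z_cutoff=2.5):
    """Flag behavior that deviates sharply from this user's own norm."""
    mean = statistics.mean(history_hours)
    stdev = statistics.pstdev(history_hours) or 1.0  # guard against zero spread
    return abs(new_hour - mean) / stdev > z_cutoff

usual_logins = [8, 9, 9, 10, 8, 9, 10, 9]   # typical morning login hours
print(is_anomalous(usual_logins, 9))         # False: in line with the norm
print(is_anomalous(usual_logins, 3))         # True: a 3 a.m. login looks suspicious
```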

Best wishes for a peaceful and prosperous New Year.

New Year Resolution #1: Follow this blogger on Twitter: @mainframeblog


Five Reasons Businesses Use the Cloud that IT Can Live With

By 2016, cloud will matter more to business leaders than to IT, according to the IBM Center for Applied Insights. In fact, cloud’s strategic importance to business leaders is poised to double from 34% to 72%. That’s more than among their IT counterparts, of whom only 58% acknowledge its strategic importance.

This shouldn’t be surprising. Once business leaders got comfortable with the security of the cloud it was just a matter of figuring out how to use it to lower costs or, better yet, generate more revenue faster. IT, on the other hand, recognized the cloud early on as a new form of IT outsourcing and saw it as a direct threat, which understandably dampened their enthusiasm.

IBM’s research—involving more than 800 cloud decision makers and users—painted a more business-friendly picture that showed the cloud able to deliver more than just efficiency, especially IT efficiency. Pacesetting organizations, according to IBM, are using cloud to gain competitive advantage through strategic business reinvention, better decision making, and deeper collaboration. And now the business results to prove it are starting to roll in. You can access the study here.

IT, however, needn’t worry about being displaced by the cloud. Business managers still lack the technical perspective to evaluate and operationally manage cloud providers. In addition, there will always be certain functions that best remain on premise. These range from conformance with compliance mandates to issues with cloud latency to the need to maintain multiple sources of IT proficiency and capability to ensure business continuance. Finally, there is the need to assemble, maintain, and manage an entire ecosystem of cloud providers (IaaS, PaaS, SaaS, and others) and services like content distribution, network acceleration, and more.  So, rest assured; if you know your stuff, do it well, and don’t get greedy the cloud is no threat.

From the study came five business reasons to use the cloud:

1) Better insight and visibility—this is the analytics story; 54% use analytics to derive insights from big data, 59% use it to share data, and 59% intend to use cloud to access and manage big data in the future

2) Easy collaboration—cloud facilitates and expedites cross-functional collaboration, which drives innovation and boosts productivity

3) Support for a variety of business needs—by forging a tighter link between business outcomes and technology in areas like messaging, storage, and office productivity suites; you should also add compute and business agility

4) Rapid development of new products and services—with 52% using the cloud to innovate products and services fast and 24% using it to offer additional products and services; anything you can digitize, anything with an information component, can be marketed, sold, and delivered via the cloud

5) Proven results—25% reported a reduction in IT costs due to the cloud, 53% saw an increase in efficiency, and 49% saw improvement in employee mobility.

This last point about mobility is particularly important. With the advent of the cloud geography is no longer a constraining business factor. You can hire people anywhere and have them work anywhere. You can service customers anywhere. You can source almost any goods and services from anywhere. And IT can locate data centers anywhere too.

Yes, there are things for which direct, physical interaction is preferred. Despite the advances in telemedicine, most people still prefer an actual visit to the doctor; that is unless a doctor simply is not accessible. Or take the great strides being made in online learning; in a generation or two the traditional ivy covered college campus may be superfluous except, maybe, to host pep rallies and football games. But even if the ivy halls aren’t needed, the demand for the IT capabilities that make learning possible and enable colleges to function will only increase.

As BottomlineIT has noted many times, the cloud is just one component of your organization’s overall IT and business strategy. Use it where it makes sense and when it makes sense, but be prepared to alter your use of the cloud as changing conditions dictate. Change is one of the things the cloud handles best.


Make Your Digital Presence a Valuable Asset

Do you recognize your organization’s digital presence as a valuable asset?  You probably are familiar with some aspects of it, less familiar with others. The organization’s website forms a core component of your digital presence. So do any information or blog portals your organization deploys. Do you conduct webcasts to educate customers or prospects? Webcasts are part of your digital presence too. Your digital presence, in short, is all you do in the digital sphere.

IT probably didn’t initiate the organization’s digital presence way back when the scramble was on to stake out a web presence. Probably marketing agitated for it and IT assigned someone as webmaster. Things have changed dramatically in the decade or two since.

 “Digital is the future and a critical component of business strategy in many industries,” notes Howard Tiersky, CEO of Moving Interactive, which specializes in digital innovation consulting.  In other words, Tiersky tries to increase the value of companies’ digital presence, whatever pieces it may include.

To Tiersky, digital represents the largest transformation the media world has seen in decades—the old rules and ways of launching new products no longer apply. But your digital presence probably extends far beyond the digital media world.

According to Kennedy Consulting, “digital strategy, the integration of digital technologies into companies’ strategies and operations in ways that fundamentally alter the value chain, is emerging as a significant source of competitive advantage.” It is driving dramatic changes in the products and services companies bring to market, as well as how they do business. What we really mean when talking about digital is the entire digital landscape: the Internet, Web (World Wide Web), the Cloud, and all they contain; mobile even plays a key part of it.

Every organization today operates in this rapidly expanding digital landscape. Some have a small digital presence there, maybe just a website that is little more than a static information portal or electronic brochure. Others digitally engage their customers, partners, and other stakeholders much more extensively through social business, online collaboration, webcasts, video, and more.

At this point, the extent of an organization’s involvement in the digital landscape generally mirrors its industry. “In some industries, digital has become the primary way to interact with customers,” says Tiersky. For customers in media, entertainment, travel, and financial services an effective digital strategy is a critical requirement. In other industries the need is less urgent right now, but before too long every company in every industry will need a digital strategy that shapes its digital presence.

Most companies began a decade or two ago with a simple static website. Marketing usually was driving the bus with IT lending technical support as needed.  Over the years it grew and expanded; IT increasingly became involved, often reluctantly.

The budget for these kinds of digital initiatives also grew, and the recipient of the budget began to shift. According to Gartner, marketing is purchasing significant marketing-related technology and services from its own capital and expense budgets, both outside the control of the internal IT organization and in conjunction with it. The upshot: Gartner predicts that by 2017 the CMO will spend more on IT than the CIO. And the volume and value of transactions being generated through the organization’s digital presence has likely become substantial.

The digital landscape, and the performance of the organization’s digital presence within it, has grown to such an extent, as reflected by the increasing amounts of budget allocated to it, that neither IT nor marketing can handle it alone. The scope and complexity of the digital landscape and its many disparate elements have evolved and expanded fast. In addition, the importance of the organization’s digital presence has grown even faster; that’s why every organization needs outside help.

And this is why digital consultants, content delivery networks, and cloud-based services providers of all sorts are in demand.  It is time for you as CIO to sit down with the CMO and put together a team that can efficiently optimize your digital presence as a valuable asset going forward.

 The digital landscape is not going away. “We are going through a multi-decade transformation process; every business will shift significantly into digital world,” says Tiersky.  As that happens you want to make sure IT is playing a key role.


Big Data and Analytics as Game Changing Technology

If you ever doubted that big data was going to become important, there should be no doubt anymore. Recent headlines from the past couple of weeks about the government capturing and analyzing massive amounts of daily phone call data should convince you.

That this report was shortly followed by more reports of the government tapping the big online data websites like Google, Yahoo, and such for even more data should alert you to three things:

1—There is a massive amount of data out there that can be collected and analyzed.

2—Companies are amassing incredible volumes of data in the normal course of serving people who readily and knowingly give their data to these organizations. (This blogger is one of those tens of millions.)

3—The tools and capabilities are mature enough for someone to sort through that data and connect the dots to deliver meaningful insights.

Particularly with regard to the last point, this blogger thought the industry was still five years away from generating meaningful results from that amount of data coming in at that velocity. Sure, marketers have been sorting and correlating large amounts of data for years, but it was mostly structured data and nowhere near this much. BTW, your blogger has been writing about big data for some time.

If the news reports weren’t enough, it became clear at IBM Edge 2013, wrapping up in Las Vegas this week, that big data analytics is happening and familiar companies are succeeding at it now. It also is clear that there is sufficient commercial off-the-shelf computing power from companies like IBM and others, and analytics tools from a growing number of vendors, to sort through massive amounts of data and make sense of it fast.

An interesting point came up in one of the many discussions at Edge 2013 touching on big data. Every person’s data footprint is as unique as a fingerprint or other biometrics. We all visit different websites, interact with social media, and use our credit and debit cards in highly individual ways. Again, marketers have sensed this at some level for years, but they haven’t yet really honed it down to the actual individual on a mass scale, although there is no technical reason one couldn’t. You now can, in effect, market to a demographic of one.

A related conference is coming up Oct. 21-25 in Orlando, FL, called Enterprise Systems 2013. It will combine the System z and Power Systems Technical Universities along with a new executive-focused Enterprise Systems event. It will include new announcements, peeks into trends and directions, over 500 expert technical sessions across 10 tracks, and a comprehensive solution center. This blogger has already put it on his calendar.

There was much more interesting information at Edge 2013, such as using data analytics and cognitive computing to protect IT systems.  Perimeter defense, anti-virus, and ID management are no longer sufficient. Stay tuned.
