
New Enterprise Systems Maturity Model

Does your shop use maturity models to measure where you stand, and where you should be going, against industry trends and directions? Savvy IT managers have long used such models to pinpoint where their organization stands on a particular issue as part of a pitch for an increased budget to hire more people or acquire newer, faster, greater IT capabilities.

Maturity models are still around today, but they have become more specialized. There are, for instance, maturity models for security and for IT management. Don’t be surprised to see maturity models for cloud or mobile computing, if they are not here already.

Earlier this year, Compuware introduced a new maturity model for the enterprise data center. You can access it here. Compuware describes it as a model that helps organizations assess and improve their processes for managing application performance and costs as distributed and mainframe systems converge.

Why now? The data center and even the mainframe have been changing fast with the advent of cloud computing, mobile, and big data/analytics. Did your enterprise data center team ever think they would be processing transactions from mobile phones or running analytic applications against unstructured social media data? Did they ever imagine they would be handling compound workloads across mainframes and multiple distributed systems running both Windows and Linux?  Welcome to the new enterprise data center normal.

Maybe the first difference you’ll notice in the new maturity model is the new mix of people populating the enterprise data center. Now you need to accommodate distributed and open systems along with the traditional mainframe environment, which requires bringing together completely different teams and integrating them. Throw in mobile, big data, analytics, and social, and you have a vastly different reality than you had even a year ago. And with that comes the need to bridge the gap that has long existed between the enterprise (mainframe) and distributed data center teams. This is a cultural divide that will have to be navigated, and the new enterprise IT maturity model can help.

The new data center normal, however, hasn’t changed data center economics, except maybe to exacerbate the situation. The data center has always been under pressure to rein in costs and to use resources, both CPU and MIPS, efficiently. Those pressures are still there, only more so, because the business is relying on the data center more than ever as IT becomes increasingly central to the organization’s mission.

Similarly, the demand for high levels of quality of service (QoS) not only continues unabated but is expanding. The demand for enterprise-class QoS now extends to compound workloads that cross mainframe and distributed environments, leaving the data center scrambling to meet new end user experience (EUE) expectations even as it pieces together distributed system QoS work-arounds. The new enterprise IT maturity model will help blend these two worlds and address the more expansive role IT is playing today.

To do this the model combines distributed open systems environments with the mainframe and recognizes different workloads, approaches, processes, and tooling. It defines five levels of maturity: 1) ad hoc, 2) technology-centric, 3) internal services-centric, 4) external services-centric, and 5) business-revenue centric.

Organizations at the ad hoc level, for example, primarily use the enterprise systems to run core systems and may still employ a green screen approach to application development. At the technology-centric level, there’s an emphasis on infrastructure monitoring to support increasing volumes, higher capacity, complex workload and transaction processing along with greater MIPS usage. As organizations progress from internal services-focused to external services-focused, mainframe and distributed systems converge and EUE and external SLAs assume a greater priority.

Finally, at the fifth or business centric level, the emphasis shifts to business transaction monitoring where business needs and EUE are addressed through interoperability of the distributed systems and mainframes with mobile and cloud systems. Here technologies provide real-time transaction visibility across the whole delivery chain, and IT is viewed as a revenue generator. That’s the new enterprise data center normal.

In short, the new enterprise maturity model requires that enterprise and distributed computing come together, that all staff work closely together, and that proprietary and open systems interoperate seamlessly. And there is no time for delay. Already DevOps, machine-to-machine computing (the Internet of Things), and other IT strategies, descendants of agile computing, are gaining traction while smart mobile technologies drive the next wave of enterprise computing.


Fueled by SMAC Tech M&A Activity to Heat Up

Corporate professional services firm BDO USA polled approximately 100 executives of U.S. tech outfits for its 2014 Technology Outlook Survey and found them firm in the belief that mergers and acquisitions in tech would either stay at the same rate (40%) or increase over last year (43%). And this isn’t a recent phenomenon.

M&A has been widely adopted across a range of technology segments as not only the vehicle to drive growth but, more importantly, to remain at the leading edge in a rapidly changing business and technology environment that is being spurred by cloud and mobile computing. And fueling this M&A wave is SMAC (Social, Mobile, Analytics, Cloud).

SMAC appears to be triggering a scramble among large, established blue chip companies like IBM, EMC, HP, Oracle, and more to acquire almost any promising upstart out there. Their fear: becoming irrelevant, especially among the young, most highly sought demographics.  SMAC has become the code word (code acronym, anyway) for the future.

EMC, for example, has evolved from a leading storage infrastructure player to a broad-based technology giant driven by 70 acquisitions over the past 10 years. Since this past August IBM has been involved in a variety of acquisitions amounting to billions of dollars. These acquisitions touch on everything from mobile networks for big data analytics and mobile device management to cloud services integration.

Google, however, probably should be considered the poster child for technology M&A. According to published reports, Google has been acquiring, on average, more than one company per week since 2010. The giant search engine and services company’s biggest acquisition to date has been the purchase of Motorola Mobility, a mobile device (hardware) manufacturing company, for $12.5 billion. The company also purchased the Israeli startup Waze in June 2013 for almost $1 billion. Waze is a GPS-based navigation application for mobile phones that has given Google a strong position in the mobile navigation business, even besting Apple’s iPhone for navigation.

Top management has embraced SMAC-driven M&A as the fastest, easiest, and cheapest way to achieve strategic advantage through new capabilities and the talent that developed those capabilities. Sure, the companies could recruit and build those capabilities on their own but it could take years to bring a given feature to market that way and by then, in today’s fast moving competitive markets, the company would be doomed to forever playing catch up.

Even with the billion-dollar and multi-billion-dollar price tags some of these upstarts are commanding, strategic acquisitions like Waze, IBM’s SoftLayer, or EMC’s XtremIO have the potential to be game changers. That’s the hope, of course. But it can be risky, although risk can be managed.

And the best way to manage SMAC merger risk is to have a flexible IT platform that can quickly absorb those acquisitions and integrate and share the information and, of course, a coherent strategy for leveraging the new acquisitions. What you need to avoid is ending up with a bunch of SMAC piece parts that don’t fit together.


Change-proof Your Organization

Many organizations are being whiplashed by IT infrastructure change—costly, disruptive, never-ending changes that hinder IT and the organization. You know the drivers: demand for cloud computing, mobile, social, big data, real-time analytics, and collaboration. Add soaring transaction volumes, escalating amounts of data, 24x7x365 processing, new types of data, proliferating forms of storage, incessant compliance mandates, and more, all of which keep driving change. And there is no letup in sight.

IBM started to articulate this in a blog post, Infrastructure Matters. IBM was focusing on cloud and data, but the issues go even further. It is really about change-proofing, not just IT but the business itself.

All of these trends put great pressure on the organization, which forces IT to repeatedly tweak the infrastructure or otherwise revamp systems. This is costly and disruptive not just to IT but to the organization.

In short, you need to change-proof your IT infrastructure and your organization. And you have to do it economically and in a way you can efficiently sustain over time. The trick is to leverage some of the very same technology trends creating change to design an IT infrastructure that can smoothly accommodate changes both known and unknown. Many of these we have discussed in BottomlineIT previously:

  • Cloud computing
  • Virtualization
  • Software defined everything
  • Open standards
  • Open APIs
  • Hybrid computing
  • Embedded intelligence

These technologies will allow you to change your infrastructure at will, changing your systems in any variety of ways, often with just a few clicks or tweaks to code.  In the process, you can eliminate vendor lock-in and obsolete, rigid hardware and software that has distorted your IT budget, constrained your options, and increased your risks.

Let’s start by looking at just the first three listed above. All of these have been discussed in BottomlineIT before, and you can be sure they will come up again.

You probably are using aspects of cloud computing to one extent or another. There are numerous benefits to cloud computing but for the purposes of infrastructure change-proofing only three matter:  1) the ability to access IT resources on demand, 2) the ability to change and remove those resources as needed, and 3) flexible pricing models that eliminate the upfront capital investment in favor of paying for resources as you use them.
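The pay-as-you-go point is easy to see with simple arithmetic. The sketch below compares owning hardware outright with renting equivalent capacity by the hour; all of the figures are illustrative assumptions, not vendor prices.

```python
# Back-of-the-envelope comparison of upfront capital purchase vs.
# pay-as-you-go cloud pricing. All figures are illustrative
# assumptions, not vendor quotes.

def upfront_cost(hardware_price, annual_maintenance, years):
    """Total cost of owning hardware: purchase price plus maintenance."""
    return hardware_price + annual_maintenance * years

def pay_as_you_go_cost(hourly_rate, hours_per_year, years):
    """Total cost of renting equivalent capacity by the hour."""
    return hourly_rate * hours_per_year * years

# Assumed numbers: a $50,000 server with $7,500/year maintenance,
# vs. a $1.50/hour cloud instance used 2,000 hours per year.
owned = upfront_cost(50_000, 7_500, 3)
rented = pay_as_you_go_cost(1.50, 2_000, 3)
print(f"Own for 3 years:  ${owned:,.0f}")
print(f"Rent for 3 years: ${rented:,.0f}")
```

The numbers flip, of course, if utilization runs near 24x7—which is exactly why the ability to add and remove resources on demand matters as much as the pricing model itself.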

Yes, there are drawbacks to cloud computing. Security remains a concern although increasingly it is becoming just another manageable risk. Service delivery reliability remains a concern although this too is a manageable risk as organizations learn to work with multiple service providers and arrange for multiple links and access points to those providers.

Virtualization remains the foundational technology behind the cloud. Virtualization makes it possible to deploy multiple images of systems and applications quickly and easily as needed, often in response to widely varying levels of service demand.

Software defined everything also makes extensive use of virtualization. It inserts a virtualization layer between the applications and the underlying infrastructure hardware.  Through this layer the organization gains programmatic control of the software defined components. Most frequently we hear about software defined networks that you can control, manage, and reconfigure through software running on a console regardless of which networking equipment is in place.  Software defined storage gives you similar control over storage, again generally independent of the underlying storage array or device.
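The programmatic control idea is easier to see in miniature. The sketch below expresses a desired network configuration as plain data and computes what a control layer would push to each device; the device names, fields, and `apply_config` helper are all hypothetical illustrations, not any real vendor API.

```python
# Sketch of the software-defined idea: desired configuration is
# expressed as data, and a control layer pushes it to whatever
# hardware sits underneath. All names and fields are hypothetical.

desired_network = {
    "vlan-app":    {"id": 100, "subnet": "10.0.1.0/24"},
    "vlan-backup": {"id": 200, "subnet": "10.0.2.0/24"},
}

def apply_config(device, desired):
    """Compare a device's current state to the desired state and
    return the changes the control layer would push."""
    changes = []
    for name, settings in desired.items():
        if device.get("state", {}).get(name) != settings:
            changes.append((device["name"], name, settings))
    return changes

# Two devices from different vendors, managed through the same layer.
switch_a = {"name": "switch-a",
            "state": {"vlan-app": {"id": 100, "subnet": "10.0.1.0/24"}}}
switch_b = {"name": "switch-b", "state": {}}

for device in (switch_a, switch_b):
    for dev_name, vlan, settings in apply_config(device, desired_network):
        print(f"{dev_name}: configure {vlan} -> {settings}")
```

The point of the sketch is the decoupling: the desired state never mentions which vendor’s box is underneath, which is what lets you reconfigure from a console regardless of the equipment in place.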

All these technologies exist today at different stages of maturity. Start planning how to use them to take control of IT infrastructure change. The world keeps changing and the IT infrastructures of many enterprises are groaning under the pressure. Change-proofing your IT infrastructure is your best chance of keeping up.


IBM on How Computing Will Change Us in 5 Years

Since 2006 at about this time IBM comes out with five predictions, dubbed 5-in-5, about how technology will affect the world within five years.  Each year the predictions look at how technology innovations will change the way people work, live, and play within the next five years. They are based on market and social trends combined with ideas from the thousands of biologists, engineers, mathematicians and medical physicians in IBM research labs around the world.

Last year the 5-in-5 predictions focused on how systems would augment human senses. It looked at sight, hearing, smell, touch, and taste. For example, a machine that experiences flavor could determine the precise chemical structure of food and why people like it. Or, computers might smell for chemicals in urban environments to monitor pollution or analyze the soil.

IBM’s 5-in-5 predictions for 2013 go in a different direction. This year the researchers looked at how innovations in computing allow us to interact with the meaning that lies in data. The researchers, taking a distinctly benign view, suggest that systems will emerge that treat us as individuals, adapt to us, and look out for our interests. Others, of course, might see this as the tyranny of Big Brother.

Here is this year’s 5-in-5:

  1. The classroom will learn you. Teachers will work with devices that can monitor and interact with the student and ultimately create a unique persona for each student. Teachers will use that persona, which changes over time, to guide the student on his or her learning path. They will know, through the student’s device, what the particular student is struggling to learn and will provide the right help at the right time.
  2. Buying local beats online.  The combination of cloud technology and in-store display will enable local stores to act as a showroom for the wide variety of products available online and enable customers to interact with a product. Of course the store will recognize you and know your preferences. In short, IBM is predicting the convergence of online and brick and mortar retail.
  3. Doctors will use your DNA to keep you well. This already is happening now. But it goes beyond DNA to using the data analytic power of computers to diagnose patient ills and guide doctors in treatment. IBM’s Watson is doing some of this today. How quickly this will evolve remains to be seen; healthcare is a minefield of conflicting interests, most of which have nothing to do with patient care and successful outcomes. You can, for instance, have your personal genome assessed and analyzed today, but few have opted to do so. Do you want to know you have a strong genetic inclination toward a disease for which there is no cure?
  4. Your city will help you live in it. Sitting at consoles in operations centers connected to myriad sensors generating massive amounts of real-time data, city administrators will be able to, say, manage traffic lights interactively as traffic flows or dynamically adjust the arrivals and departures of various modes of transportation. All things we as citizens probably want. The city also could become a huge social network where policies are developed based on clicking likes. Big Brother, anyone?
  5. A digital guardian will protect you online. A breach at the retailer Target compromised tens of millions of personal identities at the end of the year. We truly need an effective digital guardian. As IBM notes, this requires continuous, sophisticated analytics to identify whatever activity in your digital life varies from the norm and flag any sudden deviation in behavior. This guardian needs to shut down bad things proactively before they reach you and also provide a private, safe fortress for your data and online persona. As one whose email was recently hacked, this blogger is ready to sign up today. BTW: my apologies to any readers who received a desperate message purportedly from me saying I was stranded in Turkey, my wallet destroyed, and in immediate need of money to get home. Hope you didn’t send any.
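The deviation-from-the-norm idea behind the digital guardian can be sketched in a few lines. This toy example learns a baseline from historical activity and flags values that fall far outside it; the activity metric (daily login counts) and the threshold are illustrative assumptions, not anything IBM has described.

```python
# Minimal sketch of the "digital guardian" idea: learn a baseline of
# normal activity, then flag behavior that deviates sharply from it.
# The metric (daily login count) is an illustrative assumption.

from statistics import mean, stdev

def flag_anomalies(history, recent, threshold=3.0):
    """Flag recent values lying more than `threshold` standard
    deviations from the historical mean."""
    mu, sigma = mean(history), stdev(history)
    return [x for x in recent if abs(x - mu) > threshold * sigma]

# 30 days of normal daily logins, then a burst that should stand out.
normal_days = [4, 5, 3, 6, 4, 5, 5, 4, 6, 3, 5, 4, 5, 6, 4,
               5, 3, 4, 5, 6, 4, 5, 4, 3, 5, 6, 4, 5, 4, 5]
print(flag_anomalies(normal_days, [5, 4, 47]))  # the 47-login day is flagged
```

A real guardian would watch many signals at once and update its baseline continuously, but the core move is the same: model what normal looks like, then react to what isn’t.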

Best wishes for a peaceful and prosperous New Year.

New Year Resolution #1: Follow this blogger on Twitter: @mainframeblog


Fresh Toys Help IT Open New Business Opportunities in 2014

Much of what to expect for IT for 2014 you have already glimpsed right here in BottomlineIT although some will be startlingly new. The new stuff will address opportunities in many cases that businesses are just starting to consider. Some of these, like 3D-printing or e-wallet, have the potential to radically change the way business operates.

Let’s start with what you already know: 2014 will be cloud everything. The cloud is being steadily absorbed into business DNA as it evolves into the predominant way companies go to market, relate to their customers and partners, find employees, and deliver increasing aspects of their products as online services.

Also expect the continued hyping of big data, especially the unstructured data found everywhere, and analytics, which is necessary to make sense of the data. In 2014 analytics will be augmented by real-time analytics and predictive analytics, both of which can indeed deliver measurable business value.

In 2014 everything will be virtualized. In the process it will become defined by software. That means it will be programmable, allowing you to change its capabilities almost at will. Virtualized, software-defined capabilities will be in the products you acquire and the appliances you buy. Your next car will be software-defined and Internet (cloud) connected. Your video-enabled car will be able to park in a tighter space than you could park it yourself.

Mobile, in the form of smartphones and tablet devices, will be the devices of choice for more and more people worldwide. Your mobile device will increasingly handle your communications, shopping, purchasing, socializing, entertainment, and work tasks even as it takes over more of the functions of your wallet. Eventually the e-wallet will contain your identification, memberships, subscriptions, and credit and debit cards as security gets bolstered.

On to the completely new: business drones are coming, mainly in the form of smart, software-defined, programmable devices that can run errands. Basically they take robotics to a new level. Amazon hopes to use them to deliver items to your doorstep within hours of your purchase. What might your business do with a capability like this?

3D-printing is BottomlineIT’s favorite. Where the Internet disintermediated much of the traditional supply chain and distribution channel, 3D-printing can disintermediate manufacturers by producing the physical product at your desk. Now software-defined, customizable mass products can be cost-effectively manufactured at scale for a market of just one. With 3D-printing you can deliver a customized version of your widget to a customer as readily as you send a fax. Can you make some money with that capability?

Finally, smart, wearable, cloud-connected computers will arrive in the form of wrist watches (remember the old Dick Tracy comics?) and eyewear. Google Glass will become increasingly commonplace. Exactly what the business value of Google Glass will be remains unclear. Right now you buy it for the extreme cool factor.

So expect new IT goodies around the digital Xmas tree to start arriving this year and to come in quantity by the end of 2014. Some may be a bust; others may be late in coming. As CIO, your job is to figure out which of these can help you meet your organization’s business goals. Best wishes for 2014.


Five Reasons Businesses Use the Cloud that IT Can Live With

By 2016, cloud will matter more to business leaders than to IT, according to the IBM Center for Applied Insights. In fact, cloud’s strategic importance to business leaders is poised to double, from 34% to 72%. That outpaces their IT counterparts, of whom only 58% acknowledge its strategic importance.

This shouldn’t be surprising. Once business leaders got comfortable with the security of the cloud it was just a matter of figuring out how to use it to lower costs or, better yet, generate more revenue faster. IT, on the other hand, recognized the cloud early on as a new form of IT outsourcing and saw it as a direct threat, which understandably dampened their enthusiasm.

IBM’s research—involving more than 800 cloud decision makers and users—painted a more business-friendly picture that showed the cloud able to deliver more than just efficiency, especially IT efficiency. Pacesetting organizations, according to IBM, are using cloud to gain competitive advantage through strategic business reinvention, better decision making, and deeper collaboration. And now the business results to prove it are starting to roll in. You can access the study here.

IT, however, needn’t worry about being displaced by the cloud. Business managers still lack the technical perspective to evaluate and operationally manage cloud providers. In addition, there will always be certain functions that are best kept on premises. These range from conformance with compliance mandates to issues with cloud latency to the need to maintain multiple sources of IT proficiency and capability to ensure business continuance. Finally, there is the need to assemble, maintain, and manage an entire ecosystem of cloud providers (IaaS, PaaS, SaaS, and others) and services like content distribution, network acceleration, and more. So rest assured: if you know your stuff, do it well, and don’t get greedy, the cloud is no threat.

From the study came five business reasons to use the cloud:

  1. Better insight and visibility—this is the analytics story: 54% use analytics to derive insights from big data, 59% use it to share data, and 59% intend to use cloud to access and manage big data in the future.

  2. Easy collaboration—cloud facilitates and expedites cross-functional collaboration, which drives innovation and boosts productivity.

  3. Support for a variety of business needs—forging a tighter link between business outcomes and technology in areas like messaging, storage, and office productivity suites; you should also add business agility.

  4. Rapid development of new products and services—52% use the cloud to innovate products and services faster and 24% use it to offer additional products and services; anything you can digitize, anything with an information component, can be marketed, sold, and delivered via the cloud.

  5. Proven results—25% reported a reduction in IT costs due to the cloud, 53% saw an increase in efficiency, and 49% saw improvement in employee mobility.

This last point about mobility is particularly important. With the advent of the cloud geography is no longer a constraining business factor. You can hire people anywhere and have them work anywhere. You can service customers anywhere. You can source almost any goods and services from anywhere. And IT can locate data centers anywhere too.

Yes, there are things for which direct, physical interaction is preferred. Despite the advances in telemedicine, most people still prefer an actual visit to the doctor; that is unless a doctor simply is not accessible. Or take the great strides being made in online learning; in a generation or two the traditional ivy covered college campus may be superfluous except, maybe, to host pep rallies and football games. But even if the ivy halls aren’t needed, the demand for the IT capabilities that make learning possible and enable colleges to function will only increase.

As BottomlineIT has noted many times, the cloud is just one component of your organization’s overall IT and business strategy. Use it where it makes sense and when it makes sense, but be prepared to alter your use of the cloud as changing conditions dictate. Change is one of the things the cloud handles best.


Sorting Out the Data Analytics IT Options

An article from McKinsey & Company, a leading management consulting and research firm, declares: “By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep [data] analytical skills as well as [a shortage of] 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.” Might this be your shop?

Many companies today are scrambling to assemble an IT infrastructure to support data analytics. But before they can even begin, they have to figure out what kind of analytics the organization will want to deploy. Big data is just one of many possibilities, and the infrastructure that works for some types of data analytics won’t work for others.

Just off the top of the head this blogger can list a dozen types of data analytics in play: OLAP, business intelligence (BI), business analytics, predictive analytics, real-time analytics, big data analytics, social analytics, web analytics, click stream analytics, mobile analytics, brand/reputation analysis, and competitive intelligence. You probably have a few of these already.

As advanced analytics pick up momentum data center managers will be left trying to cobble together an appropriate IT infrastructure for whatever flavors of analytics the organization intends to pursue. Unless you have a very generous budget you can’t do it all.

For example, big data is unbelievably hot right now so maybe it makes sense to build an infrastructure to support big data analytics. But predictive analytics, the up and coming superstar of business analytics, is an equally hot capability due to its ability to counter fraud or boost online conversion immediately, while the criminal or customer is still online.

BI, however, has been the analytics workhorse for many organizations for a decade or more, along with OLAP, and companies already have a working infrastructure for that.  It consists of a data warehouse with relational databases and common query, reporting, and cubing tools. The IT infrastructure, for the most part, already is in place and working.

On the other hand, if top management now wants big data analytics, real-time analytics, or predictive analytics, you may need a different information architecture and design, different tools, and possibly even different underlying technologies. Big data, for example, relies on Hadoop, a batch processing framework that does not use SQL. (Vendors are making a valiant effort to graft a SQL-like interface onto Hadoop, with varying degrees of success.)
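Hadoop’s batch, non-SQL model boils down to a map step that emits key/value pairs and a reduce step that aggregates them. The sketch below simulates that pipeline in-process on a word count; a real Hadoop job would distribute these same two functions across many nodes via HDFS and a mechanism such as Hadoop Streaming.

```python
# In-process simulation of the MapReduce pattern Hadoop uses for
# batch processing of unstructured data: map emits (key, value)
# pairs, reduce aggregates them per key. No SQL anywhere.

from itertools import groupby
from operator import itemgetter

def map_step(line):
    """Emit (word, 1) for every word in a line of unstructured text."""
    for word in line.lower().split():
        yield (word, 1)

def reduce_step(pairs):
    """Sum the counts for each word; input is sorted so that all
    pairs for a given key arrive together, as Hadoop guarantees."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

lines = ["big data is batch", "batch data at scale"]
pairs = [pair for line in lines for pair in map_step(line)]
print(dict(reduce_step(pairs)))
```

The batch character is visible in the structure: nothing is answered until the whole data set has flowed through both steps, which is exactly why this model suits throughput, not interactive queries.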

Real-time analytics is just that—real-time—basically the opposite of Hadoop. It works best using in-memory data and logic processing to speed the results of analytic queries in seconds or even microseconds. Data will be stored on flash storage or in large amounts of cache memory as close to the processing as it can get.
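A rough sketch of the in-memory approach: each incoming event updates running aggregates the moment it arrives, so a query is answered from memory instead of by a batch scan. The event fields (customer, purchase amount) are illustrative assumptions.

```python
# Sketch of in-memory real-time analytics: every event updates
# running per-key aggregates immediately, so queries answer from
# memory rather than scanning stored data in batch.

from collections import defaultdict

class RunningStats:
    """Keep per-key totals in memory, updated one event at a time."""
    def __init__(self):
        self.count = defaultdict(int)
        self.total = defaultdict(float)

    def update(self, key, amount):
        self.count[key] += 1
        self.total[key] += amount

    def average(self, key):
        return self.total[key] / self.count[key]

stats = RunningStats()
for customer, amount in [("acme", 120.0), ("acme", 80.0), ("zed", 40.0)]:
    stats.update(customer, amount)

print(stats.average("acme"))  # answered instantly from memory
```

Contrast this with the batch model: here the work is done at write time, event by event, so the read is nearly free—the trade the text describes when it puts data in cache or flash as close to the processing as possible.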

A data architecture that is optimized for big data’s unstructured batch processing cannot also be used for real-time analytics. And the traditional BI data warehouse infrastructure probably isn’t optimized for either of them. The solution calls for extending your existing data management infrastructure to encompass the latest analytics management wants, or designing and building yet another data infrastructure. Over the past year, however, the cloud has emerged as another place organizations can run analytics, provided the providers can overcome the latencies inherent in the cloud.
