Archive for April, 2011

Use Cloud to Avoid WikiLeaks-like Exposure

WikiLeaks material makes for interesting reading unless your company is the target. Bank of America Corp., for example, was a recent target of a WikiLeaks-style exposure. The culprit turned out to be not WikiLeaks but the hacker collective that goes by the name Anonymous, but the result—exposed information—was the same.

The point here: every business risks finding its information exposed in WikiLeaks fashion. A Bank of America spokesman said the documents were non-foreclosure-related clerical and administrative documents stolen by a former Balboa Insurance employee. Should managers feel relieved?

Absolutely not. In this case an employee of a presumably trusted associate leaked the information. In today’s networked, connected, collaborative economy, every company relies increasingly on expanding networks—ecosystems—of associated organizations with which it must share information. Fortunately, there are ways to protect against this.

For years overnight delivery has been the favored method of sharing documents or unstructured information. It’s easy, but it isn’t cheap and it certainly isn’t secure. Documents can be lost; more importantly, you have no control over what the recipient will do once the document is received.

Today perhaps the most common way to share electronic documents outside the organization is via email attachments. Email is fast, easy, and cheap, but it, too, is far from secure. And once an attachment reaches its destination, anything can happen.

IT is reasonably good at securing documents for sharing within the firewall through layers of perimeter protection. There are data loss prevention (DLP) systems that can stop employees from sending unauthorized information outside the organization. Other technologies, such as digital rights management (DRM), can be embedded in documents to extend control over what can be done with them once they leave your hands.

The cloud, however, appears ideal for securely sharing documents outside the firewall. In the cloud, providers set up a space through which you can securely share documents. Those with whom you want to share your documents access the space with a browser and view the documents there. Depending on the security provided, those viewing your documents might not be able to download, print, or forward them. They might be able to make changes to a document within the space, but not leak it.
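
To make the idea concrete, here is a minimal sketch in Python of the kind of per-recipient permission check such a shared space might perform. It is purely illustrative: the class names, fields, and the allows() helper are invented for this example and do not correspond to IntraLinks, Brainloop, Watchdox, or any other vendor's actual API.

```python
# Minimal sketch (hypothetical; not any vendor's actual API) of how a cloud
# shared space might decide what a given recipient can do with a document.
from dataclasses import dataclass, field

@dataclass
class SharePolicy:
    can_view: bool = True
    can_download: bool = False
    can_print: bool = False
    can_forward: bool = False
    can_edit: bool = False

@dataclass
class SharedDocument:
    name: str
    policies: dict = field(default_factory=dict)  # maps user -> SharePolicy

    def allows(self, user: str, action: str) -> bool:
        # Users with no explicit policy get no access at all.
        policy = self.policies.get(user, SharePolicy(can_view=False))
        return getattr(policy, f"can_{action}", False)

# An outside reviewer may view and edit the document inside the space,
# but cannot download, print, or forward it.
doc = SharedDocument("term_sheet.pdf",
                     {"reviewer@partner.example": SharePolicy(can_edit=True)})
print(doc.allows("reviewer@partner.example", "view"))      # True
print(doc.allows("reviewer@partner.example", "download"))  # False
```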

These shared spaces vary in the degree of security they provide, in their ability to control a document beyond the shared space, and in other capabilities they may offer, such as facilitating collaboration. They also vary in the volume of documents and the number of users they can manage. Because of these variables, many shared spaces are not suitable for the kind of secure document sharing organizations are likely to need; each option comes with drawbacks.

Three cloud providers, however, can do the job. IntraLinks is widely considered the industry leader. Brainloop, a German company with offices in Cambridge, MA, has staked out the document compliance management and collaboration spaces. Watchdox focuses on ease of use and extended control. Each of these companies delivers enterprise-class secure document sharing.

The main use of such spaces will be for sharing documents relating to M&A activity, Board meetings, and compliance activities. Other use cases focus on product development or collaborative marketing.

The following table lays out the common options.

| Option | Advantages | Gaps |
| --- | --- | --- |
| ERP systems | Effective secure content sharing within the system | Limited collaboration; lack of openness |
| IRM (SharePoint) | Strong collaboration | Limited access control; lack of openness |
| ERM/DRM | Strong access control | No collaboration |
| Perimeter, DLP | Strong protection within the firewall; uses existing security infrastructure | No collaboration |
| Web/cloud shared space | Easily accessible; inexpensive; allows for strong security and control; flexible | Collaboration support varies by provider; security and control vary by provider |

Whatever option or combination of options you choose, the goal is to stop your organization’s documents from showing up on the various WikiLeaks of the world or getting into the wrong hands.


Run Windows on the Mainframe

Last week IBM published a general statement of direction that essentially said the zEnterprise will get Windows in the fourth quarter via a System x blade for the zBX. The announcement reads, in part, “IBM intends to offer select IBM System x blades running Microsoft Windows in the IBM zEnterprise BladeCenter Extension Model 002.”

The zBX is the extension cabinet for the new zEnterprise/z196 mainframe and for the older z10. When the product is released later this year organizations will be able to run native Windows applications through their mainframe.

This isn’t exactly new. IBM has been hinting at this since the zEnterprise announcement last July, although it refrained from making any statement that might legally commit it to anything. I have been writing about it for about as long. Here is a Windows on z piece from earlier this year.

Windows on the System z (or on the zBX, as is the case here) raises some interesting issues. Of immediate interest is the question of which Windows workloads might be best moved to the zEnterprise. Microsoft Exchange and Microsoft SharePoint, probably the company’s two most popular enterprise server applications, are good bets to stay where they are now.

Similarly, the Microsoft Office Suite will stay where it is unless someone wants to attempt to virtualize a slew of Office desktops on a handful of zBX IBM x blades as a VDI play. IBM, however, is pushing VDI separately as a cloud workload.

Asked which workloads, IBM replies as follows: enterprises with one or more applications running in a complex, heterogeneous, multi-tiered environment have the opportunity to upgrade that infrastructure with zEnterprise and enjoy the management benefits that the Unified Resource Manager brings. The most likely candidates are those Windows apps that make use of data or processing residing on the z, which is IBM’s basic fit-for-purpose strategy.

IBM also is not expecting a wholesale migration of Windows servers to the z. Again the IBM spokesman explains: Many of the largest data centers already have far more blades and/or rack-mounted [Windows] servers in their inventory than could realistically fit into the zBX.

What IBM does hope for, according to the spokesman, are workloads that rely on System z for data serving and contain other application components running on System z, Power, or Intel that are required to complete the end-to-end business process. Companies with such applications running in these two- or three-tier environments may be ideal candidates for zEnterprise running the new System x blades.

Beyond these basic fit-for-purpose statements IBM isn’t saying much about blade pricing, performance, or software licensing. This serves two purposes for IBM: 1) it avoids binding promises to which it might be held, and probably as importantly, 2) it freezes the competitive market. Multi-platform, multi-tier enterprises considering an upgrade of their Windows servers to the latest offerings from HP or Oracle (Sun) or others now know this IBM option is coming in the fourth quarter and may wait until they see the pricing, licensing, and performance details.

With the System x blade and Windows IBM indeed can make a strong case for re-centralizing on the zEnterprise, especially for multi-tier, multi-platform enterprises. This case will be based on the zEnterprise’s centralized management efficiency, the potential for greater optimization, and the resulting performance improvements. It’s a good story that could catch on if IBM keeps the pricing competitive.

But a more intriguing question: is it enough to attract NON-mainframe shops to the zEnterprise/zBX? Maybe, especially if IBM offers compelling discounts as it has been known to do.


Commercializing IBM’s Watson Technology

IBM’s Watson-Jeopardy challenge proved to be entertaining theater, but it left hanging the question of how a business could capitalize on Watson technology. IBM almost immediately began suggesting workloads that would apply in healthcare, finance, and elsewhere. BottomlineIT discussed some of these back in early March.

The scale of the hardware Watson required for Jeopardy, however, went beyond what most businesses could or would acquire. Not many businesses are likely to dedicate to a single workload the 90 tightly integrated IBM Power 750 servers, with 2,880 POWER7 processor cores and 15TB of onboard memory, that made up Watson’s configuration when it won Jeopardy.

This week we got the answer. IBM introduced new optimized POWER7 products, both upgraded servers and new blades, capable of running Watson-like workloads. These products, according to IBM, actually provide a performance kick beyond the Power 750 used with Watson.

To do what Watson did—process complex natural language queries and come up with the right answer extremely fast—is not a job for wimpy servers. It still requires high end servers, not commodity x86 Wintel machines.

IBM leads the high end UNIX server market, growing revenue in that segment by 12%, according to IDC. The researcher’s overall discussion of the Q4 2010 worldwide server market is a little more nuanced vis-à-vis HP than IBM presents, but IDC still declares IBM the leader in the worldwide server systems market with 37.4% of factory revenue for 4Q10.

But the real questions revolve around when and how commercial businesses are going to deploy Watson technology. Along with announcing the new POWER7 servers, IBM introduced an early adopter tapping the new POWER7 technology, if not exactly the Watson capabilities.

That business, RPM Technologies, provides wealth management software to some of the largest banks and financial services companies in Canada. In terms of the new POWER7 technology, “POWER7 chips along with AIX 6.1 provided a big boost to the batch and threading speed of our products,” said RPM’s chief architect. With the POWER7 chips, batch job runtimes improved by upwards of 35% while using fewer resources, he reported. As part of the upgrade, RPM also moved to a fully virtualized environment across two POWER7 16-core P750 machines, which reduced the time and effort to manage the boxes.

Another early adopter, the University of Massachusetts-Dartmouth, may be more on track to tap Watson-like capabilities. The school’s researchers are using two IBM POWER7 blades to study the effect of cosmic disturbances, called gravitational waves, on black holes in space.

“We are running billions of intense calculations on the POWER7 blades… able to get results as much as eight times faster than running the same calculations on an Intel Xeon processor. Calculations that used to take a month to run are now finished in less than a week,” reported Gaurav Khanna, professor of physics at UMass-Dartmouth. Not fast enough to win Jeopardy but impressive nonetheless.

The new POWER7 products include the following enhancements:

  • Enhanced IBM Power 750 Express—the same system that powers Watson—further optimized with a faster POWER7 processor delivering more than three times the performance of comparable 32-core servers.
  • 16-core, single-wide IBM BladeCenter PS703 and 32-core, double-wide IBM BladeCenter PS704 blade servers, which provide an alternative to sprawling racks.
  • Enhanced IBM Power 755, a high-performance computing cluster node with 32 POWER7 cores and a faster processor.

Along with the servers, IBM announced new switches closely integrated with its Power servers to support workloads such as cloud computing, financial services, Web 2.0, streaming video, medical and scientific research, and business analytics. According to a recent report by The Tolly Group, the new IBM switches demonstrated an average of 55% better price/performance over comparable switches.

So, everyone is still waiting for a business user that actually is tapping Watson-like capabilities to address a business problem. It will happen. As you know, it takes time to get systems implemented, tested, and put into production.  Stay tuned.


Virtualization changes the economics of IT

To most business managers the IT function is a cost center to be minimized. Attendees at Gartner’s Data Center conference in December reportedly overflowed a session on reducing data center costs. An audience poll at the session, however, showed that about 20% of the audience had no IT cost accounting in place at all, and over half the attendees were basically flying blind on their IT budgets. They had, in effect, no way to control IT costs.

With personnel making up about 38% of IT costs, the surest way to cut IT expenses is to cut people, Gartner noted. Another suggestion: buy cheaper IT hardware.

Inna Kuznetsova, vice president of IBM Systems Software, suggested in a recent analyst briefing that there is a different way to change the economics of IT. The key: virtualization. Even basic virtualization using x86-based servers can deliver 8:1 server consolidation, which can save $600 per server in energy costs alone. Virtualizing on IBM’s eX5 x86-based servers can get you 78% more virtual machines for the same license cost, the company reports.
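
As a rough illustration of that arithmetic, here is a back-of-the-envelope calculation in Python. The 8:1 consolidation ratio and the $600-per-server energy figure come from the briefing cited above; the 400-server fleet size is a made-up number used only to show the shape of the math.

```python
# Back-of-the-envelope consolidation savings using the figures cited above.
# The fleet size (400 physical servers) is invented for this example.
physical_servers = 400
consolidation_ratio = 8            # 8:1 basic x86 server virtualization
energy_savings_per_server = 600    # dollars saved per server retired, per the briefing

hosts_after = physical_servers // consolidation_ratio         # 50 virtualization hosts remain
servers_retired = physical_servers - hosts_after              # 350 boxes removed
energy_savings = servers_retired * energy_savings_per_server  # $210,000 in energy alone

print(f"Hosts remaining after consolidation: {hosts_after}")
print(f"Estimated energy savings: ${energy_savings:,}")
```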

The rapid digital transformation of the global economy is putting IT infrastructures at companies everywhere under great pressure. From 2007 to this year, digital data is projected to grow by a factor of 10. Compounding that challenge is the growth of unstructured data, which will make up 80% of that growth and is one factor fueling interest in the advanced analytics needed to make sense of it.

The scale of this digital transformation is astounding. Six TB of information is exchanged over the Internet every second! The number of devices connected to the Internet by the end of this year will reach one trillion (10^12). Driving this is the emergence of the Internet of Things, which BottomlineIT previously covered here.

Certainly shrinking the IT technology footprint through consolidation and reducing IT staff remain key to lowering costs, but technology virtualization, noted Kuznetsova, is the best way to get there. By 2013, she reports, 69% of all server workloads will be virtualized.

Kuznetsova sees a four-step journey to new, improved IT economics through virtualization:

  1. Start with server virtualization but extend it to storage and networks, which also can be virtualized and consolidated. Through IT resource virtualization, organizations can boost efficiency and increase IT utilization, which boosts ROI.
  2. Manage workloads to further improve staff productivity or reduce staff. This will require integrated systems management tools that enable you to increase server, storage, and network resource-to-staff ratios.  Where it took one storage admin to manage 10 TB of storage, that admin could now handle 100 TB or more through a virtualized IT infrastructure.
  3. Deploy automation to achieve consistent and repeatable processes.  This not only further reduces staffing requirements but enables IT to consistently meet business service levels.
  4. Optimize IT delivery to enable business agility. Ideally this will take the form of user self-provisioning.  Self-provisioning is feasible due to the flexibility of virtualized IT resources, which are not constrained by physical barriers or location. A business manager, for example, can use easy templates to self-provision a new server in minutes to support a new business initiative (a sketch of what such a template might look like follows this list).
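
Here is a minimal sketch of what such a self-provisioning template might look like. Everything in it is hypothetical: the field names and the provision() helper are invented for illustration and do not represent the actual interface of IBM Systems Director, Tivoli, or any other product.

```python
# Hypothetical self-service template a business manager might fill in.
# Field names and values are invented for this example.
vm_template = {
    "name": "marketing-campaign-01",
    "profile": "small",              # maps to a pre-approved CPU/memory/storage tier
    "os_image": "approved-linux-base",
    "network": "dmz-web",
    "owner": "marketing-line-of-business",
    "expires_in_days": 90,           # automatic reclamation helps keep VM sprawl in check
}

def provision(template: dict) -> str:
    """Stand-in for the provisioning automation a virtualized shop would run."""
    # In a real environment this call would hand the request to the
    # virtualization or cloud management layer rather than simply report it.
    return f"Provisioned {template['name']} from profile '{template['profile']}'"

print(provision(vm_template))
```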

Of course, IBM offers tools, such as IBM Systems Director and Tivoli, to assist at every step in this virtualization journey. Other vendors, including HP, EMC, and Oracle/Sun, are heading there too.

Virtualization lies at the core of cloud computing. A progression through Kuznetsova’s virtualization-driven steps invariably leads you to the cloud. At that point you decide how much cloud is right for your organization.

 


High End Servers—the Software Differentiator

Some may argue that, given the recent tumult in the high end server market, there is no high end server available today that can match the IBM zEnterprise (the z196/zBX combo) in processing power, reliability, and scalability. It holds its own in terms of speeds and feeds, number of cores, and memory. Software, however, may turn out to be the biggest differentiator for high end servers, and IBM has optimized a ton of software for its platforms, something the others mainly just talk about.

The high end server market has suddenly entered a period of change. In March Oracle announced that it will no longer support Itanium processors. HP immediately countered with a statement of support for Itanium. SGI announced a 256-core Xeon Windows system. Also in March, Quanta Computer, a Taiwanese operation, reported squeezing 512 cores into a pizza box server running the Tilera multi-core processor. Tilera’s roadmap goes out to 2013, when it expects to pack 200 cores onto a processor. And of course, IBM launched the first hybrid server, the zEnterprise, consisting of the multi-core z196 coupled with the zBX, last summer.

This recent flurry of server activity at the large-scale, multi-core end of the market leaves server buyers somewhat confused. One wrote to BottomlineIT: What will be the ultimate retail price per core?  What’s the current price per core of, perhaps, a chassis full of 8-core IBM System p blades, an HP Superdome, or an SGI UV 1000 running Windows or Linux?

Fair questions, for sure. The published OEM price last year was $900 per chip for a 64-core Tilera processor, which works out to roughly $14 per core. SGI reports that the Altix UV starts at $50,000, with Microsoft software an additional $2,999 per four sockets (32 cores). A buyer could end up facing different vendors and technologies competing at the $50, $100, $500, $1,000, $5,000, and $10,000 per-core price points. Each vendor will be promoting different architecture, configuration, memory optimization, performance, and even form factor (multi-U, pizza box, blade) attributes.
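
For readers who want to check the math, here is the same per-core arithmetic spelled out in Python. The Tilera and SGI figures are the ones quoted above; assuming a single 32-core increment for the Altix UV is our simplification, made only to show the shape of the calculation.

```python
# Per-core price arithmetic using the figures quoted above.
tilera_chip_price = 900      # published OEM price for a 64-core Tilera chip
tilera_cores = 64
print(round(tilera_chip_price / tilera_cores, 2))  # ~14.06 dollars per core

# SGI Altix UV: $50,000 base system plus $2,999 of Microsoft software per
# four sockets (32 cores). The base system's core count is not given, so a
# single 32-core increment is assumed here purely for illustration.
base_price = 50_000
software_per_32_cores = 2_999
assumed_cores = 32
print(round((base_price + software_per_32_cores) / assumed_cores, 2))  # ~1656.22 per core
```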

This is not just about price but integration, internal communication speeds, optimization, and more. At this point all the vendors need to be more forthcoming and transparent.

But this may not turn out to be a hardware, processor, memory, speeds-and-feeds battle. It may not even turn into a price-per-core battle or a total cost of ownership (TCO) vs. total cost of acquisition (TCA) battle. Ultimately, it has to come down to workloads supported and delivered, and that means software. And when it comes to workload optimization and software, IBM already has an advantage, especially compared with Oracle and HP. Remember when Oracle bought Sun and CEO Larry Ellison said it would optimize Oracle software for SPARC? Have you seen much of that yet?

A quick peek at IBM’s software lineup suggests the company has a lot of topnotch software to run on its hardware, most of it optimized for its hardware platforms.  Factor in the ISV ecosystem and the IBM picture gets even better.

Let’s start with Gartner naming IBM the worldwide market share leader overall in the application infrastructure and middleware software segment. If you drill down into the various submarkets, IBM often comes up as the leader there too. For example, IBM leads the business process management (BPM) market, with better than double the share of its closest competitor. IBM also leads in the message-oriented middleware market, the transaction processing monitor market, and the combined markets for Enterprise Service Bus (ESB) and integration appliances.

Critical segments for sure, but businesses need more. For that IBM offers DB2, a powerful enterprise database management system that can rival Oracle. WebSphere goes far beyond being just an application server; it encompasses a wide range of functionality including portals and commerce. With Rational, IBM can cover the entire application development lifecycle, and with Lotus IBM nails down communication and collaboration. And don’t forget Cognos, a powerful BI tool, plus all the IBM Smart Analytics tools. Finally, IBM provides the Tivoli product set to manage systems and storage.

The point: when it comes to high end servers it is not just about processor cores. It’s about systems optimized for the software you need to run your workloads.

 


Top performers are analytics-driven

Does your organization take advantage of data analytics?  A study from MIT Sloan Management Review and IBM’s Institute for Business Value late last year found top-performing companies are three times more likely than lower performers to be capitalizing on data analytics.

The study, based on a sample of nearly 3,000 executives and business analysts from 108 countries and 30 industries, found a clear connection between the use of analytics technology and the ability to achieve competitive differentiation and superior performance. For example, top performers are five times more likely to apply analytics rather than intuition across the widest possible range of decisions. In finance and budgeting, top performers were nearly four times more likely than others to apply analytics. So much for making decisions based on what your gut says.

A more recent study released this March,  again by the IBM Institute for Business Value, surveyed several hundred leading banks on banking success factors and also found data analytics to be critically important. Specifically, 90% of bankers believe they need to transform from the status quo for future profitability, and the successful banks will be the ones that invest in building sophisticated insights based on powerful analytics. Top performers, the researchers found, will use insights gained through analytics to focus their operations.

These studies hit at a time when organizations are being inundated with data, often at a rate faster than their people, and even their systems and processes, can effectively capture, assess, and act on. And it is not just the volume of data but the speed at which it pours in and the variety of the information that makes analytics so difficult yet so important.

IT people excel at capturing data, processing it, and reporting it. More recently IT latched onto business intelligence (BI) to extract value from data and has deployed an array of BI tools, such as Cognos or Business Objects. IT, however, has been slower to put sophisticated, automated high speed analytics at the disposal of the organization’s line of business managers.

Many organizations pretty much give up, rarely moving beyond, say, some multi-dimensional data arrays that they can slice and dice in a handful of different ways. Even BI as it is currently deployed does not help companies become true top performers.

According to the MIT-IBM study, top performers are twice as likely to use analytics to shape future business strategies as well as to guide day-to-day operations. The study also found that, despite popular complaints about the overwhelming amount of information, organizations today are far less concerned about data deluge issues than about being hindered by traditional management practices.

Those managerial hindrances fall into three areas:

  1. Lack of understanding about how to apply analytics to improve their business
  2. Lack of bandwidth due to competing priorities
  3. Lack of skills in the lines of business

This becomes most apparent when organizations attempt some analysis but can’t translate the resulting insights into effective action. The MIT-IBM study, however, suggests how to get past these initial hurdles:

Tackle the biggest challenges first: Don’t wait for complete data or perfect skills before you try to apply analytics to high-value opportunities.

Flip the data-to-insights equation: Instead of gathering data before seeking insights, identify the specific insights needed and focus on getting only the data required for answers.

Adopt techniques and tools best suited to your management: New tools like data visualization and simulation can help executives and managers alike anticipate the consequences of their decisions, explore alternative approaches, and confront the tradeoffs. These do not require unusual skills and can be applied by business leaders anywhere in the organization.

IT is perfectly positioned to work with forward-thinking business managers to deliver the data and tools they need to turn their companies into top performers. IBM, which funded both of these studies, has also taken the lead in delivering the necessary tools with its Smart Analytics toolset. Oracle, HP, and others are scrambling to get into this space.

 
