Posts Tagged Oracle

The Internet of Things Gains Traction

The Internet of Things (IoT) appears finally to be gaining real traction, with both Gartner and IDC putting out reports on it. The opportunity, however, is best understood in terms of vertical applications because the value of IoT rests on individual use cases across all verticals. “Successful sales and marketing efforts by vendors will be based on understanding the most lucrative verticals that offer current growth and future potential and then creating solutions for specific use cases that address industry-specific business processes,” said Scott Tiazkun, senior research analyst, IDC’s Global Technology and Industry Research Organization. Similarly, enterprise IT needs to understand which vertical use cases will benefit first and most.

Tiazkun was referring to IDC’s latest Worldwide Internet of Things Spending by Vertical Market 2014-2017 Forecast. To tap that market, IDC advises consultants to focus on the individual vertical opportunities that arise from IoT already in play. Here is where a vertical-business-savvy IT exec can win. As IDC noted, realizing the existence of the vertical opportunity is the first step to understanding the impact and, therefore, the IoT market opportunity that exists for enterprises, IT vendors, and consultants.

The idea of IoT has been kicking around for years. BottomlineIT wrote about it early in 2011 here. It refers to embedding intelligence into things in the form of computer processors and making them IP addressable. Linking them together over a network gives you IoT. The idea encompasses almost anything from the supply chain to consumer interests. Smart appliances, devices, and things of all sorts can participate in IoT. RFID, all manner of sensors and monitors, big data, and real-time analytics play into IoT.
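To make the idea concrete, below is a minimal sketch (Python, with a hypothetical collector address and device ID) of what an IP-addressable thing amounts to in practice: a device samples a sensor and pushes the reading over the network to a collector, where analytics can pick it up.

```python
import json
import time
import urllib.request

COLLECTOR_URL = "http://192.0.2.10:8080/readings"  # hypothetical collector endpoint

def read_temperature():
    """Stand-in for a real sensor driver; a real device would poll hardware."""
    return 21.5

def publish_reading():
    payload = json.dumps({
        "device_id": "thermostat-042",   # each 'thing' is individually addressable
        "timestamp": time.time(),
        "temperature_c": read_temperature(),
    }).encode("utf-8")
    req = urllib.request.Request(
        COLLECTOR_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # send the reading to the collector

if __name__ == "__main__":
    while True:
        publish_reading()
        time.sleep(60)  # report once a minute
```

Multiply that one loop across millions of devices and you get the network, bandwidth, and analytics implications the reports describe.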

In terms of dollars, IoT is huge. Specifically, IDC has found:

  • Technology and services revenue from the components, processes, and IT support for IoT will expand from $4.8 trillion in 2012 to $7.3 trillion by 2017, an 8.8% compound annual growth rate (CAGR), with the greatest opportunity initially in the consumer, discrete manufacturing, and government vertical industries. (A quick arithmetic check of that CAGR follows this list.)
  • The IoT/machine-to-machine (M2M) market is growing quickly, but the development of this market will not be consistent across all vertical markets. Industries that already understand IoT will see the most immediate growth, such as industrial production/automotive, transportation, and energy/utilities. However, all verticals eventually will reflect great opportunity.
  • IoT is a derivative market containing many elements, including horizontal IT components as well as vertical and industry-specific IT elements. It is these vertical components where IT consultants and vendors will want to distinguish themselves to address industry-specific IoT needs.
  • IoT also opens IT consultants and vendors to the consumer market by providing business-to-business-to-consumer (B2B2C) services to connect and run homes and automobiles, places where electronic devices increasingly will have networking capability.
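As a sanity check on IDC's forecast arithmetic, the growth rate falls straight out of the two endpoints. A quick sketch (the endpoints are rounded figures, so the computed rate lands a hair under IDC's stated 8.8%):

```python
# CAGR from IDC's endpoints: $4.8T (2012) to $7.3T (2017), 5 years of growth
start, end, years = 4.8e12, 7.3e12, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.2%}")   # ~8.75%, consistent with IDC's stated 8.8% CAGR
```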

Already, leading vendors are positioning themselves for the IoT market. To Oracle, IoT brings tremendous promise to integrate every smart thing in this world. Cisco, too, jumped early on the IoT bandwagon, dubbing it the Internet of Everything.

IBM gets almost cosmic about IoT, which it describes as the emergence of a kind of global data field. The planet itself—natural systems, human systems, physical objects—has always generated an enormous amount of data, but until recent decades, we weren’t able to hear, see, or capture it. Now we can because all of these things have been instrumented with microchips, UPC codes, and other technologies. And they’re all interconnected, so now we can actually access the data. Of course, this dovetails with IBM’s Smarter Planet marketing theme.

Enterprise IT needs to pay close attention to IoT too. First, it will change the dynamics of your network, affecting everything from network architecture to bandwidth to security. Second, once IT starts connecting the various pieces together, it opens interesting new possibilities for using IT to advance business objectives and even generate revenue. It can help you radically reshape the supply chain, the various sales channels, partner channels, and more. It presents another opportunity for IT to contribute to the business in substantive business terms.

IDC may have laid out the best roadmap to IoT for enterprise IT. According to IDC, the first step will be to understand the components of IoT/M2M IT ecosphere. Because this is a derivative market, there are many opportunities for vendors and consultants to offer pieces, product suites, and services that cover the needed IoT technology set. Just make sure this isn’t just about products. Make sure services, strategies, integration, and business execution are foremost. That’s how you’ll make it all pay off.

The promise of IoT seems open ended. Says Tiazkun: “The IoT solutions space will expand exponentially and will offer every business endless IoT-focused solutions. The initial strategy of enterprise IT should be to avoid choosing IoT-based solutions that will solve only immediate concerns and lack staying power.” OK, you’ve been alerted.

Follow BottomlineIT on Twitter: @mainframeblog


Fueled by SMAC, Tech M&A Activity to Heat Up

Corporate professional services firm BDO USA  polled approximately 100 executives of U.S. tech outfits for its 2014 Technology Outlook Survey and found them firm in the belief that mergers and acquisitions in tech would either stay at the same rate (40%) or increase over last year (43%). And this isn’t a recent phenomenon.

M&A has been widely adopted across a range of technology segments as not only the vehicle to drive growth but, more importantly, to remain at the leading edge in a rapidly changing business and technology environment that is being spurred by cloud and mobile computing. And fueling this M&A wave is SMAC (Social, Mobile, Analytics, Cloud).

SMAC appears to be triggering a scramble among large, established blue chip companies like IBM, EMC, HP, Oracle, and more to acquire almost any promising upstart out there. Their fear: becoming irrelevant, especially among the young, most highly sought-after demographics. SMAC has become the code word (code acronym, anyway) for the future.

EMC, for example, has evolved from a leading storage infrastructure player to a broad-based technology giant, driven by 70 acquisitions over the past 10 years. Since this past August, IBM has been involved in a variety of acquisitions amounting to billions of dollars. These acquisitions touch on everything from mobile networks for big data analytics and mobile device management to cloud services integration.

Google, however, probably should be considered the poster child for technology M&A. According to published reports, Google has been acquiring, on average, more than one company per week since 2010. The giant search engine and services company’s biggest acquisition to date has been the purchase of Motorola Mobility, a mobile device (hardware) manufacturing company, for $12.5 billion. The company also purchased Israeli startup Waze in June 2013 for almost $1 billion. Waze is a GPS-based application for mobile phones and has brought Google a strong position in the mobile phone navigation business, even besting Apple’s iPhone for navigation.

Top management has embraced SMAC-driven M&A as the fastest, easiest, and cheapest way to achieve strategic advantage through new capabilities and the talent that developed those capabilities. Sure, the companies could recruit and build those capabilities on their own, but it could take years to bring a given feature to market that way, and by then, in today’s fast-moving competitive markets, the company would be doomed to forever playing catch-up.

Even with the billion-dollar and multi-billion-dollar price tags some of these upstarts are commanding, strategic acquisitions like Waze, IBM’s SoftLayer, or EMC’s XtremIO have the potential to be game changers. That’s the hope, of course. But it can be risky, although risk can be managed.

And the best way to manage SMAC merger risk is to have a flexible IT platform that can quickly absorb those acquisitions and integrate and share the information, along with, of course, a coherent strategy for leveraging the new acquisitions. What you need to avoid is ending up with a bunch of SMAC piece parts that don’t fit together.


Where Have All the Enterprise IT Hardware Vendors Gone?

Remember that song asking where all the flowers had gone? In a few years you might be asking the same of many of today’s enterprise hardware vendors.  The answer is important as you plan your data center 3-5 years out.  Where will you get your servers from and at what cost? Will you even need servers in your data center?  And what will they look like, maybe massive collections of ARM processors?

As reported in The Register (Amazon cloud threatens the entire IT ecosystem), Amazon’s cloud poses a major threat to most of the traditional IT ecosystem, a team of 25 Morgan Stanley analysts write in a recently released report, Amazon Web Services: Making Waves in the IT Pond. The Morgan Stanley researchers cite Brocade, NetApp, QLogic, EMC, and VMware as facing the greatest challenges from the growth of AWS. The threat takes the form of AWS’s exceedingly low cost per virtual machine instance.

Beyond the price threat, the vendors are scrambling to respond to the challenges of cloud, mobile, and big data/analytics. Even Intel, the leading chip maker, just introduced the 4th generation Intel® Core™ processor family to address these challenges.  The new chip promises optimized experiences personalized for end-users’ specific needs and offers double the battery life and breakthrough graphics targeted to new low cost devices such as mobile tablets and all-in-one systems.

The Wall Street Journal online covered related ground from a different perspective when it wrote: PC makers unveiled a range of unconventional devices on the eve of Asia’s biggest computer trade show as they seek to revive (the) flagging industry and stay relevant amid stiff competition. Driven by the cloud and the explosion of mobile devices in a variety of forms, the enterprise IT industry doesn’t seem to know what the next device should even be.

Readers once chastised this blogger for suggesting that their next PC might be a mobile phone. Then came smartphones, quickly followed by tablets. Today PC sales are dropping fast, according to IDC.

The next rev of your data center may be based on ARM processors (tiny, stingy with power, cheap, cool, and remarkably fast), essentially mobile phone chips. They could be ganged together in large quantities to deliver mainframe-like power, scalability, and reliability at a fraction of the cost.

IBM has shifted its focus and is targeting cloud computing, mobile, and big data/analytics, even directing its acquisitions toward these areas as witnessed by yesterday’s SoftLayer acquisition. HP, Oracle, and most of the other vendors are pursuing variations of the same strategy. Oracle, for example, acquired Tekelec, a smart device signaling company.

But as the Morgan Stanley analysts noted, it really is Amazon using its cloud scale to savage the traditional enterprise IT vendor hardware strategies and it is no secret why:

  • No upfront investment
  • Pay for only what you use (with a caveat or two)
  • Price transparency
  • Faster time to market
  • Near-infinite scalability and global reach

And the more AWS grows, the more its prices drop due to the efficiency of cloud scaling.  It is not clear how the enterprise IT vendors will respond.

What will your management say when they get a whiff of AWS pricing? An extra large, high memory SQL Server database instance lists for $0.74 per hour (check the fine print). What does your Oracle database cost you per hour running on your on-premise enterprise server? That’s what the traditional enterprise IT vendors are facing.
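The back-of-the-envelope comparison is easy to run yourself. Here is a rough sketch using the $0.74/hour list price above; the on-premise figures (license, maintenance rate, amortized hardware) are purely illustrative assumptions, not vendor quotes:

```python
# Cloud side: the AWS list price quoted above
aws_rate = 0.74                       # $/hour, extra large high-memory SQL Server instance
aws_annual = aws_rate * 24 * 365      # ~$6,482 if run flat out all year

# On-premise side: hypothetical illustrative numbers, not vendor quotes
license_cost = 47_500                 # perpetual database license, per processor
annual_support = license_cost * 0.22  # assumed ~22% yearly maintenance
server_cost = 25_000                  # hardware amortized over the period
years = 3

onprem_total = license_cost + annual_support * years + server_cost
onprem_hourly = onprem_total / (years * 24 * 365)

print(f"AWS:        ${aws_rate:.2f}/hour (${aws_annual:,.0f}/year)")
print(f"On-premise: ${onprem_hourly:.2f}/hour (${onprem_total:,.0f} over {years} years)")
```

Even granting the on-premise side generous assumptions, it is the visible hourly number that gets management's attention.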


Mainframe Workload Economics

IBM never claims that every workload is suitable for the zEnterprise. The company prefers to talk about platform issues in terms of fit-for-purpose or tuned-to-the-task. With the advent of hybrid computing, the low cost z114, and now the expected low cost version of the zEC12 later this year, however, you could make a case that any workload that benefits from the reliability, security, and efficiency of the zEnterprise mainframe is fair game.

John Shedletsky, VP, IBM Competitive Project Office, did not try to make that case. To the contrary, earlier this week he presented the business case for five workloads that are optimal economically and technically on the zEnterprise. They are: transaction processing, critical data workloads, batch processing, co-located business analytics, and consolidation-on-one-platform. None of these should be a surprise; with the possible exception of analytics and platform consolidation, they represent traditional mainframe workloads. BottomlineIT covered Shedletsky’s mainframe cost/workload analysis last year here.

This comes at a time when IBM has started making a lot of noise about new and different workloads on the zEnterprise. Doug Balog, head of IBM System z mainframe group, for example, was quoted widely in the press earlier this month talking about bringing mobile computing workloads to the z. Says Balog in Midsize Insider: “I see there’s a trend in the market we haven’t directly connected to z yet, and that’s this mobile-social platform.”

Actually, this isn’t even all that new either. BottomlineIT’s sister blog, DancingDinosaur, was writing about organizations using SOA to connect CICS apps running on the z to users with mobile devices a few years ago here.

What Shedletsky really demonstrated this week was the cost-efficiency of the zEC12. In one example he compared a single workload (app production/dev/test) running on two HP Superdomes (a 16x, 32-way and an 8x, 48-way) against a 41-way zEC12. The zEC12 delivered the best price/performance by far: $111 million (5-year TCA) vs. $176 million (5-year TCA) for the two Superdomes.

In another comparison, three Oracle database workloads (Oracle Enterprise Edition, Oracle RAC, 4 server nodes per cluster) supporting 18K transactions/sec priced out at $13.2 million (3-year TCA) running on 12 HP DL580 servers (192 cores). The same work on a zEC12 running Linux on z, configured as 3 Oracle RAC clusters (4 nodes per cluster, each as a Linux guest) on 27 IFLs, priced out at $5.7 million (3-year TCA). The zEC12 came in at less than half the cost.

With analytics such a hot topic these days, Shedletsky also presented a comparison of the zEnterprise Analytics System 9700 (zEC12, DB2 v10, z/OS, 1 general processor, 1 zIIP) with an IDAA against a current Teradata machine. The result: the Teradata came in around $330K per query per hour compared to roughly $10K per query per hour for the zEC12. Workload time on the Teradata was 1,591 seconds (9.05 queries per hour) compared to 60.98 seconds (236 queries per hour) on the zEC12. Total cost was $2.9 million for the Teradata compared to $2.3 million for the zEC12.
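A plausible reading of how those per-query-per-hour figures were derived (the rounding in the original presentation may differ slightly) is simply total cost divided by measured query throughput:

```python
# Figures from Shedletsky's comparison: totals and measured throughput
teradata_total, teradata_qph = 2_900_000, 9.05   # $, queries per hour
zec12_total, zec12_qph = 2_300_000, 236          # $, queries per hour

# Cost per unit of query throughput = total cost / (queries per hour)
print(f"Teradata: ${teradata_total / teradata_qph:,.0f} per query/hour")  # ~$320K
print(f"zEC12:    ${zec12_total / zec12_qph:,.0f} per query/hour")        # ~$10K
```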

None of these are what you would consider new workloads, and Shedletsky has yet to apply his cost analysis to mobile or social business workloads. However, the results shouldn’t be much different. Mobile applications, particularly mobile banking and other mobile transaction-oriented applications, will play right into the zEC12 strengths, especially when they are accessing CICS on the back end.

While transaction processing, critical data workloads, batch processing, co-located business analytics, and consolidation-on-one-platform remain the sweet spot for the zEC12, Balog can continue to make his case for mobile and social business on the z. Maybe in the next set of Shedletsky comparative analyses we’ll see some of those workloads come up.

For social business the use cases aren’t quite clear yet. One use case that is emerging, however, is social business big data analytics. Now you can apply the zEC12 to the analytics processing part at least and the efficiencies should be similar.


PaaS Gains Cloud Momentum

Guess you could say Gartner is bullish on Platform-as-a-Service (PaaS). The research firm declares: PaaS is a fast-growing core layer of the cloud computing architecture, but the market for PaaS offerings is changing rapidly.

The other layers include Software-as-a-Service (SaaS) and Infrastructure-as-a-Service (IaaS) but before the industry build-out of cloud computing is finished (if ever), expect to see many more X-as-a-Service offerings. Already you can find Backup-as-a-Service (BaaS). Symantec, for instance, offers BaaS to service providers, who will turn around and offer it to their clients.

But the big cloud action is around PaaS. Late in November Red Hat introduced OpenShift Enterprise, an enterprise-ready PaaS product designed to be run as a private, public or hybrid cloud. OpenShift, an open source product, enables organizations to streamline and standardize developer workflows, effectively speeding the delivery of new software to the business.

Previously cloud strategies focused on SaaS, in which organizations access and run software from the cloud. Salesforce.com is probably the most familiar SaaS provider. There also has been strong interest in IaaS, through which organizations augment or even replace their in-house server and storage infrastructure with compute and storage resources from a cloud provider. Here Amazon Web Services is the best known player although it faces considerable competition that is driving down IaaS resource costs to pennies per instance.

PaaS, essentially, is an app dev/deployment and middleware play. It provides a platform (hence the name) to be used by developers in building and deploying applications to the cloud. OpenShift Enterprise does exactly that by giving developers access to a cloud-based application platform on which they can build applications to run in a cloud environment. It automates much of the provisioning and systems management of the application platform stack in a way that frees the IT team to focus on building and deploying new application functionality and not on platform housekeeping and support services. Instead, the PaaS tool takes care of it.

OpenShift Enterprise, for instance, delivers a scalable and fully configured application development, testing and hosting environment. In addition, it uses Security-Enhanced Linux (SELinux) for reliable security and multi-tenancy. It also is built on the full Red Hat open source technology stack including Red Hat Enterprise Linux, JBoss Enterprise Application Platform, and OpenShift Origin, the initial free open source PaaS offering. JBoss Enterprise Application Platform 6, a middleware tool, gives OpenShift Enterprise a Java EE 6-certified on-premise PaaS capability.  As a multi-language PaaS product, OpenShift Enterprise supports Java, Ruby, Python, PHP, and Perl. It also includes what it calls a cartridge capability to enable organizations to include their own middleware service plug-ins as Red Hat cartridges.
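To give a feel for how thin the developer-facing surface of a multi-language PaaS can be, here is a minimal sketch of the kind of application OpenShift's Python support would host. Exposing a callable named application is standard Python WSGI (PEP 3333); the exact file layout any given cartridge expects is an assumption here:

```python
# wsgi.py -- minimal WSGI entry point; a PaaS like OpenShift hosts the
# 'application' callable and handles provisioning, routing, and scaling.
def application(environ, start_response):
    body = b"Hello from the platform\n"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

if __name__ == "__main__":
    # Run locally for testing; in the PaaS the platform supplies the server.
    from wsgiref.simple_server import make_server
    make_server("", 8051, application).serve_forever()
```

The point of the PaaS pitch is everything that is absent here: no provisioning scripts, no web server configuration, no scaling logic.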

Conventional physical app dev is a cumbersome process entailing as many as 20 steps from idea to deployment. Make it a virtual process and you can cut the number of steps down to 14, a modest improvement. As Red Hat sees it, the combination of virtualization and PaaS can cut that number to six: idea, budget, code, test, launch, and scale. PaaS, in effect, shifts app dev from a craft undertaking to an automated, cloud-ready assembly line. As such, it enables faster time to market and saves money.

Although Red Hat is well along in the PaaS market and the leader in open source PaaS, other vendors already are jumping in and more will be joining them. IBM has SmartCloud Application Services as its PaaS offering. Oracle offers a PaaS product as part of the Oracle Cloud Platform. EMC offers PaaS consulting and education but not a specific technology product. When HP identifies PaaS solutions it directs you to its partners. A recent list of the top 20 PaaS vendors identifies mainly smaller players, CA, Google, Microsoft, and Salesforce.com being the exceptions.

A recent study by IDC projects the public cloud services market to hit $98 billion by 2016. The PaaS segment, the fastest growing part, will reach about $10 billion, up from barely $1 billion in 2009. There is a lot of action in the PaaS segment, but if you are looking for the winners, according to IDC, focus on PaaS vendors that provide a comprehensive, consistent, and cost effective platform across all cloud segments (public, private, hybrid). Red Hat OpenShift clearly is one; IBM SmartCloud Application Services and Microsoft Azure certainly will make the cut. Expect others.


Speed Time to Big Data with Appliances

Hadoop will be coming to enterprise data centers soon as the big data bandwagon picks up steam. Speed of deployment is crucial. How fast can you deploy Hadoop and deliver business value?

Big data refers to running analytics against large volumes of unstructured data of all sorts to get closer to the customer, combat fraud, mine new opportunities, and more. Published reports have companies spending $4.3 billion on big data technologies by the end of 2012. But big data begets more big data, triggering even more spending, estimated by Gartner to hit $34 billion for 2013 and over a 5-year period to reach as much as $232 billion.

Most enterprises deploy Hadoop on large farms of commodity Intel servers. But that doesn’t have to be the case. Any server capable of running Java and Linux can handle Hadoop. The mainframe, for instance, should make an ideal Hadoop host because of the sheer scalability of the machine. Same with IBM’s Power line or the big servers from Oracle/Sun and HP, including HP’s new top of the line Itanium server.

At its core, Hadoop is a Linux-based Java program usually deployed on x86-based systems. The Hadoop community has effectively disguised Hadoop to speed adoption by the mainstream IT community through tools like Sqoop, a tool for importing data from relational databases into Hadoop, and Hive, which enables you to query the data using a SQL-like language called HiveQL. Pig is a high-level platform for creating the MapReduce programs used with Hadoop. So any competent data center IT group could embark on Hadoop big data initiatives.
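To illustrate how approachable the programming model has become, here is the classic word count written for Hadoop Streaming, which lets any program that reads stdin and writes stdout (Python here) stand in for Java MapReduce code. Treat it as a sketch; the streaming jar name and the input/output paths vary by distribution:

```python
#!/usr/bin/env python
# wordcount.py -- mapper and reducer for Hadoop Streaming word count.
# Invocation (jar name/path is distribution-specific):
#   hadoop jar hadoop-streaming.jar -input /data/in -output /data/out \
#       -mapper "wordcount.py map" -reducer "wordcount.py reduce" -file wordcount.py
import sys

def mapper():
    # Emit (word, 1) for every word seen on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop sorts mapper output by key, so all counts for a word arrive together.
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```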

Big data analytics, however, doesn’t even require Hadoop.  Alternatives like Hortonworks Data Platform (HDP), MapR, IBM GPFS-SNC (Shared Nothing Cluster), Lustre, HPCC Systems, Backtype Storm (acquired by Twitter), and three from Microsoft (Azure Table, Project Daytona, LINQ) all promise big data analytics capabilities.

Appliances are shaping up as an increasingly popular way to get big data deployed fast. Appliances trade flexibility for speed and ease of deployment. By packaging hardware and software pre-configured and integrated they make it ready to run right out of the box. The appliance typically comes with built-in analytics software that effectively masks big data complexity.

For enterprise data centers, the three primary big data appliance players are:

  • IBM—PureData, the newest member of its PureSystems family of expert systems. PureData is delivered as an appliance that promises to let organizations quickly analyze petabytes of data and then intelligently apply those insights in addressing business issues across their organization. The machines come as three workload-specific models optimized for transactional, operational, or big data analytics workloads.
  • Oracle—the Oracle Big Data Appliance is an engineered system optimized for acquiring, organizing, and loading unstructured data into Oracle Database 11g. It combines optimized hardware components with new software to deliver a big data solution. It incorporates Cloudera’s Apache Hadoop with Cloudera Manager. A set of connectors also are available to help with the integration of data.
  • EMC—the Greenplum modular data computing appliance includes Greenplum Database for structured data, Greenplum HD for unstructured data, and DIA Modules for Greenplum partner applications such as business intelligence (BI) and extract, transform, and load (ETL) applications configured into one appliance cluster via a high-speed, high-performance, low-latency interconnect.

And there are more. HP offers HP AppSystem for Apache Hadoop, an enterprise-ready appliance that simplifies and speeds deployment while optimizing performance and analysis of extreme scale-out Hadoop workloads. NetApp offers an enterprise-class Hadoop appliance that may be the best bargain given NetApp’s inclusive storage pricing approach.

As much as enterprise data centers loathe deploying appliances, if you are under pressure to get on the big data bandwagon fast and start showing business value almost immediately, appliances will be your best bet. And there are plenty to choose from.


IBM Brings MEAP to All its Platforms

Mobile Enterprise Application Platforms (MEAP) are increasingly popular. With mobile as a strategic initiative, IBM is making its Mobile Development Lifecycle Solution v4.0 available on each of its platforms, from A (AIX) to Z (z/OS) and everything in between (mainly Power and System x), including non-IBM platforms like HP, Mac, and Oracle (Sun/Solaris).

And IBM isn’t the only vendor trying to capture the mobile platform wave. The HP Enterprise Mobility Platform is intended for communication service providers. Of course, Oracle is there with its Mobile Rapid Application Development Platform, which works on every platform by using HTML5, CSS3, and JavaScript for a truly device-agnostic solution. A smaller, lower cost player is Vervivo. An early, open standards-based player is KonyOne.

The IBM Mobile Development Lifecycle product, however, combines collaborative mobile lifecycle management with an enterprise-grade, standards-based mobile application platform built on IBM Worklight for effective team development of mobile applications.

As mobile usage continues to grow worldwide (mobile transactions are expected to have increased 50% by the end of this year), developing for mobile becomes an increasingly important consideration for organizations. Companies need to move beyond the initial one-off mobile projects that started them down the mobile path. Going forward they require a strategic approach that encompasses more than mobile device application coding and testing, just two aspects of the overall mobile app dev lifecycle.

Now the challenge is to ensure mobile apps are delivered on-time, with high quality, and meet business objectives. For this organizations need an approach that goes beyond the device SDKs. They need a comprehensive, team-based mobile app dev approach that provides not just a runtime infrastructure for deploying and running mobile applications for myriad devices but also an infrastructure to support rapid change, development, and delivery of quickly evolving mobile applications for business-critical data and transactions.

The mainframe, with its massive scalability and extremely high availability, can play a particularly important role in an organization’s mobile initiative, especially as the volume and value of mobile transactions increase. Already the System z mainframe is a leading platform for secure data serving and, according to IBM, the only commercial server to achieve Common Criteria Evaluation Assurance Level 5+ security classification, providing the confidence to run many different applications containing confidential data on the mainframe. And the mainframe is where much of the data users want to access from mobile devices will reside.

In particular, the newest mainframe, the zEC12, builds on this with innovative security and privacy features to help protect data. Specifically, the zEC12 includes a state-of-the-art, tamper-resistant cryptographic co-processor, the Crypto Express4S, which provides privacy for transactions and sensitive data. It also incorporates transactional memory technology that IBM adapted to better support concurrent operations on a shared set of data, such as financial institutions processing transactions against the same set of accounts.

Making this all the more important is the anticipated growth of mobile transactions. According to Juniper Research, the value of remote transactions conducted via mobile devices is expected to exceed $730 billion annually by 2017. While Juniper sees major brands and retailers driving mobile transaction activity, IBM sees other types of transactions, such as flight check-in, client loyalty programs, employee self-service, the signing of legal documents, and other kinds of transactions that will drive the demand for mobile transaction security. Transactions, mobile and otherwise, are where the z excels.

IBM has pulled together a diverse set of capabilities to support the entire mobile lifecycle. The main pieces include IBM Worklight, IBM Endpoint Manager for Mobile Devices, and IBM WebSphere Cast Iron (Hypervisor edition). It is supplementing the core with tools like Tealeaf CXMobile, support for mobile app testing, support for mobile agile methodologies, and more.

Worldwide smartphone sales grew 47% year over year to 147 million units in the final quarter of 2011, according to Gartner. IDC estimates global downloads of mobile apps will reach 76.9 billion by 2014. It’s apparent the mobile wave is not diminishing anytime soon.

Enterprise data centers should expect to support an increasing amount of mobile traffic from new and different devices. This will present, at the least, significant new security and capacity challenges. The z, and especially the zEC12 with its recently updated software, previously covered by DancingDinosaur here, and enhancements like the Crypto Express4S, should be able to handle the challenges in stride, maybe with nothing more than some rethinking of MIPS consumption and assist processor usage.

Finally, one favor: Please take a moment to fill out this short survey. It is from Waterstone Management Group, a tech-focused management consulting firm that is attempting to gather a unique set of benchmarks and insights from a broad group of software industry professionals with the goal of sharing the summary with everyone who participates.  The scope of the survey is enterprise-wide but the respondent self-selects the areas they have responsibility for (one or more) and only answers questions specific to their area.  Each section contains 15-16 questions and should take no more than 10 min. Again, please click here for the survey.
