
Application bloat survey: if you don't know the answer, make something up

Survey silly season is well under way. Company after company is sending out survey results and hoping ZDNet will publicize them. Unfortunately, these studies are often self-serving and badly designed, and they produce results that are seldom representative.
Written by Dan Kusnetzky, Contributor

Let's start with an apology

I accidentally jumped the gun and published this commentary prematurely. So, if this looks familiar, it was published and then withdrawn last Friday. While reading through the materials sent by the PR firm, I didn't notice that the information was being provided under a non-disclosure agreement (NDA) and that I shouldn't have published anything until today. My only excuse is that people typically send me a message asking whether I'll honor an NDA before sending me materials. This time, the request to honor an NDA was buried near the bottom of the introductory text, before the outline of the survey results.

Review of yet another survey

Recently a representative of Dell/Quest's PR team reached out to introduce me to a survey conducted by Harris Interactive and sponsored by Dell/Quest. This is the tenth survey that has crossed my desk in the past month. As with most of the others, I have serious problems with the sample and the methodology, and I don't believe the results are necessarily representative of the industry at large.

The Survey

The survey focused on what Dell/Quest called "application bloat." In the words of the announcement, "application bloat, or an overabundance of rarely used software applications deployed in business today, can lead to millions in lost revenue."

Survey findings at a glance

In Dell/Quest's own words, here are the points the company would like readers to take away from the survey:

  • 75% of respondents said that in one year, annual monetary losses experienced (e.g., decreased productivity, lost transactions/sales, missed opportunities) as a result of slow, unresponsive or crashed applications cost businesses tens of thousands to tens of millions of dollars.
  • 50% of respondents work at organizations with more than 500 applications. Yet on a typical day the majority (57%) of organizations use 249 or fewer applications.
  • Of those applications accessed daily, less than half (48%) are accessed more than 5 times each day.
  • 79% of applications are kept on-premises, while 21% are run in the cloud.
  • 58% of respondents said the performance of applications has a major impact on the performance of their business.
  • 77% of respondents would choose IT efficiency over reducing staff or outsourcing.
  • 32% of respondents plan to implement a monitoring tool in the next 18 months.
  • 89% of respondents currently have in place software that enables IT to monitor, uncover and address application-related issues.

Snapshot analysis

Having developed, fielded and then reported on quite a number of surveys during my time at IDC and, later, at the 451 Group, I have a great deal of respect for the findings of a well-designed, well-implemented survey. This type of research can shine a much-needed light on what organizations are doing and planning. Unfortunately, this study, like most of the ten others that have crossed my desk so far this month, fails to live up to that high standard. This, of course, makes the results of limited and, perhaps, questionable use.

The problems I have with quite a number of the marketing-focused surveys that come across my desk usually fall into one of the following categories:

  • Is the sample representative of the industry at large or, at least, of the market segment it is supposed to address? That is, are the right people being asked?
    • Quite often, a small sample is used and broad, global statements are made.
    • The sample is made up of respondents from a single country and broad statements are made about the worldwide market.
    • The sample is made up of representatives of a single market segment and broad statements are made about the worldwide market.
    • The sample is often made up of only the sponsor's customers, and the survey was conducted at the sponsor's own customer event, making the sample quite limited and not at all representative of the market as a whole.
    • The sample purports to present what companies are planning to do, but the sample may not include company decision-makers. A more subtle problem is that decision-makers are included, but they may not represent the business unit or department that is really responsible for the decision.
  • The survey instrument is biased or leading, making the results questionable. That is, are the right questions being asked, and are they being asked in the right way?
    • The questions might assume a given position and make no provision for respondents to give contrary answers. This is the "when did you stop beating your spouse?" type of question. There is no provision for someone who does not beat his/her spouse to respond.
    • The questions might be biased to support the need for the sponsor's product or service. In this case, the sponsor, Dell/Quest, offers products that discover, inventory and manage applications. So, of course, the questions would assume the need for this type of product.
    • The questions assume that companies would lose money because applications are not being used. I didn't see any provision made for applications having been purchased and used by another department. The assumption is that if the respondent doesn't use an application, it is not being used at all. Furthermore, the connection made from apparently unused applications to revenue loss was tenuous at best.
    • The survey instrument asks specific questions for which respondents don't have answers. If the survey offers "don't know" or "not applicable" as choices, that is far better than offering a choice that asks respondents to estimate (or make up) answers. In the case of this study, the survey instrument asks respondents to estimate or make up answers if they don't know the answer or the question is not applicable to their environment.

I have to question how useful the results of this study will be to the industry. The respondents come only from the U.S. It is not at all clear what industry each respondent represents nor is company size information included. The questions are designed to lead to the conclusion that application inventory management tools are needed.

My biggest issue is that the questions asked respondents to estimate, or make up, answers if they either didn't know the answer or it wasn't applicable to their company, business unit or department.

In the end, I would suspect that this survey, like the others that have come across my desk recently, might be useful to someone, somewhere and in some market. The use, however, appears quite limited to me.

Although broad, sweeping statements are made about the survey findings, they are not likely to be useful to everyone, everywhere, and they should be taken with a very large grain of salt.


Note: Dell/Quest's PR folks have sent along a document providing some responses from Harris Interactive. I will pass along their comments, together with my response, after I've had an opportunity to evaluate them.
