Lies, Statistics, and the PMO

Thursday Aug 26th 2004 by Jeannette Cabanis-Brewin

Some Project Management Office research compares apples with oranges and gets fruit salad. What should you be considering?

Here at the words-and-facts factory, we spend our days juggling statistics: either those we've generated ourselves by developing research questionnaires, or those we've sought out from other purveyors of research questionnaires. After ten years or so of this kind of activity, one becomes an extremely skeptical consumer of statistics. As Benjamin Disraeli, who famously railed against "lies, damned lies, and statistics," understood, the more familiar you are with all the ways statistics can mislead, the more critical the eye you cast over any use of statistical information.

Project management is a field in which much research remains to be done. In the past five years, the number of studies related to PM has increased, and that's good. Here are just a few of the studies that have been done recently that address the value of the PMO:

CIO Insight

CIO magazine with PMI

Forrester Research

But not all research is created equal. I often see questionable assertions made about project management based on sketchy research data. So, be a careful consumer of PM research studies: If you are seeking to prove the value of a PMO, either by collecting external research to make your case, or by creating internal research to justify your existence, here are six important points to keep in mind.

1. What's in a name?

The term "Project Office" suffers from having a welter of definitions. To one company, it's a single person on a mission to promote a methodology. To another, it's a bank of project mentors that fans out across the enterprise to teach people how to fish. In some large capital project contractors, a Project Office might be a kind of war room for a single project. Some companies with mature project management practices have a true organization called a Project Office, with a Director and support staff and direct reports who oversee everything from project manager training to software selection and implementation, but this is still rare and usually confined to very large enterprises. (About 3% of companies have this level of Project Office organization, according to Center for Business Practices research.) So, even to discuss the topic, you first have to find out which Project Office paradigm the participants are coming from.

This all-over-the-map character isn't a bad thing: it's an essential part of the nature of a concept under development. While companies figure out how to make a concept into a set of workable practices, we can expect those practices to take many forms. In one way, however, this variability can hurt the concept itself. In the project management field, there isn't that much research being done, so whatever studies are done tend to be widely quoted (and sometimes misquoted: see Item 4 below). One bad piece of information—an erroneous assumption drawn from the confusing results of a badly designed questionnaire—can have immense impact.

So, if you are creating a survey about PMOs, be sure to give the respondents a way to indicate what PMO means to them. A checklist of types of PMO organization with "check as many as apply" is one way to do this.

Another related question might be length of time since implementation. A one-person, brand-new PMO cannot possibly have had time to have much of an effect on organizational factors like career planning, portfolio success, and the like. Always offer a "not applicable" option in multiple-choice questions.

2. The question makes the answer.

In one recent PMO study I've seen, the results indicated that the majority of responding companies with a PMO had higher project failure rates than those without.

There are a couple of ways to explain this result. Maybe every company surveyed was really, really bad at PMO implementation. But that seems unlikely. More likely, the question didn't do much to elicit a description of reality. For example, in many companies no one is keeping track of project failure rates. Therefore, the known failure rate is ... zero. Along comes the PMO and begins collecting metrics. Aha! Project failure rate is, say, 15%. Does this mean failures went up? No, but that's what it will look like if all you ask for is pre-PMO and post-PMO failure rates.
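
To make the arithmetic concrete, here is a small illustrative sketch (the numbers are hypothetical, not drawn from any actual study) of how a naive pre/post comparison manufactures a "rising" failure rate when no baseline was ever measured:

```python
# Hypothetical illustration: the "rising failure rate" artifact.
# Before the PMO, failures happen but nobody records them, so the
# reported rate is zero. After the PMO starts tracking, the true
# rate simply becomes visible.

true_failure_rate = 0.15          # actual rate, unchanged by the PMO

projects_before = 40              # projects run before the PMO existed
tracked_failures_before = 0       # none were tracked, so none "failed"
reported_before = tracked_failures_before / projects_before

projects_after = 40               # projects run after tracking began
tracked_failures_after = int(projects_after * true_failure_rate)
reported_after = tracked_failures_after / projects_after

# A naive pre/post survey question sees failures "jump" from 0% to
# 15%, even though project performance never changed.
print(f"reported before PMO: {reported_before:.0%}")
print(f"reported after PMO:  {reported_after:.0%}")
```

The underlying performance is identical in both periods; only the measurement changed. That is exactly the distinction a pre/post survey question cannot detect unless it also asks whether a baseline was measured.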

Survey design is both art and science and a poorly-designed questionnaire can obfuscate more than it illuminates. It would have been good to know things about these companies like:

  • Did they take a baseline measurement of failures before they implemented the PMO?
  • How long have they been tracking failures?
  • What failure prevention measures have been undertaken by the best-performing companies in the study? By the worst-performing?

The problem of course is that, the more precise your questions are, the fewer of them you can ask. It's a law of questionnaire development that few people will respond to a survey questionnaire that comprises more questions than will fit on about one page. People have to be highly motivated to respond to longer surveys. That's why it's good to have some sort of pre-qualified survey group. Our organization does this by soliciting members for a research network. They are a self-selected group of people who are interested in both participating in research, and learning from it.

3. Data becomes valid (or meaningless) in the analysis phase.

Especially if there are any qualitative questions (open-ended questions that elicit individualized answers), the task of analyzing survey results belongs not just to number-crunching but to creative, critical thinking. Good numbers don't always translate into sound conclusions. That's why you should ...

4. Go to the source.

What is the source of the statistic in question trying to sell you? On the surface, it may seem like disinterested information, but in today's society, we are frequently the targets of marketing disguised as information. When collecting research studies, go to the source instead of relying on quotes. I've recently seen some of our Center for Business Practices findings misrepresented in press releases—and from a university, at that. More than that, look at the numbers yourself, not just at the packaged results offered in an executive summary. Your knowledge of your own industry may allow you to draw conclusions from the data that the report writer missed.

5. Remember that even the best surveys raise more questions than they answer.

Otherwise, research would have ended a long time ago. The thoughtful researcher—or consumer of research—won't accept as final any results that seem counterintuitive. When I see an assertion such as "instituting career paths has no effect on project success rates" I immediately wonder: Why not? How long have they been in place? Do they offer technical and non-technical tracks? Is professional development cost supported by the organization? Are project managers more interested in other aspects of reward, rather than promotion or advancement? This is why articles in scholarly journals normally end with a section listing all the further research that needs to be done to explain the results of their research!

6. When doing internal research, be mindful of the Hawthorne Effect.

Merely by the way you structure your metrics-gathering, you can alter the processes and outcomes that create that data. The best example (for the worst result) that I can think of comes not from project management but from the human services field: When a state child protection agency began reviewing the metric of "cases closed," suddenly closing cases became more important than actually protecting children, with predictable results for the kids. Research on project management, thankfully, isn't likely to create such dire consequences. But it's easy to imagine project managers, interviewed about whether their use of reward incentives translated into higher project success rates, starting to look around for a new job.


For more good tips and resources on statistics and internal research, see the December 2002 issue of People On Projects. (It isn't available online, but back issues can be ordered in electronic format (PDF) by contacting the publisher.)

About the Author

Jeannette Cabanis-Brewin is editor-in-chief of the Center for Business Practices, the research arm of project management consultancy PM Solutions.
