July 2007: Some Summer Project Management Reading

Editor’s Comments

So it’s summer once again and, for some anyway, a chance to wind down a little, relax, spend some time with the family and maybe even read one or more of the year’s best sellers. But perhaps you’re still up to your neck in work, trying to keep a bunch of projects going at the same time. However, we hope you’ll find a little time to browse through the articles we have for you in this issue of Project Times. We think it will be time well spent.

In Easy as Implementing a Package, Michael Mah discusses the intricacies of installing enterprise applications, which in some cases can take years to get working. Not one of the CIOs and CTOs he spoke with would want to install their applications again, even if they had the opportunity, because they wouldn’t want to face the hassle and stress. But Mah says it doesn’t have to be like that, and he explains how it’s possible to track project performance in terms of cost, schedule and quality across a range of large and small IT projects.

Terry Doerscher believes that the world of project management has changed, and he says that the project management office must go well beyond its traditional position to include financial and organizational capacity management. His article PMO 2.0: Expanding the Value and Reach of the PMO introduces the concept of a Management Integration Center (MIC), a term, he says, that has been coined to differentiate its expanded role and functions within an organization from those of the typical PMO. He sees the MIC as a means of running technology services like a business.

For many years, regular contributor Chris Vandersluis has been involved in the introduction of Enterprise Project Management into many corporations, often working with Microsoft on deployment of its Project Server system. He believes that the Microsoft solution is a powerful one, well received by clients. However, in his article, Commoditizing Project Management for the Mid-market, he points out that the company must look beyond its enterprise accounts to craft a solution that will be as attractive to the mid-market as to the enterprise market. What is required, he says, is the commoditizing of EPM software to provide what people always expect from Microsoft: instant results.

Kate Armel and Don Becket don’t believe that throwing people at a project is the key to its success. In their article, Adding Manpower to a Late Software Project Makes it Later, they argue that, while technology has changed dramatically, human nature remains the same. They point out that tools and methods allow us to work more efficiently, but that software development is still a uniquely human endeavor, which can present problems. Among those problems they identify over-optimism, fear of measurement, and using the wrong tools for the job. Overcoming these obstacles will allow the project manager to handle both the technical and people challenges of software development with confidence.

We hope that we’ve given you a taste of what lies ahead of you in this Project Times, and we hope you’ll read on and find some thought-provoking ideas. If you have some thought-provoking ideas of your own, please share them with us. And if you have some ideas for future articles, we’d like to hear about those too.

Easy as Implementing a Package

Last weekend I had a conversation with an uncle who recently retired from his accounting job at a large university. His family was financially secure, the children were grown (with his first grandchild on the way), and he was healthy after going through a medical scare years ago. It was time to call it quits, restore the antique motorcycle his wife had given him for Father’s Day last year, and get ready to bounce his new granddaughter on his lap.

But before it was official, his employer asked him to reconsider – for one more project: deployment of an enterprise resource planning (ERP) application across all the colleges of the university. “There was no way I was ever going to stick around for that,” he told me.

Most of us don’t have the luxury of tipping the hat and bidding adieu like my Uncle Larry did in the face of a life-changing project presented by the boss. And life-changing it will be for a lot of people. What started out as a $100 million project has ballooned to $250 million. As a friend once said to me, “Now we’re talking about a SERIOUSLY BIG man-sized pile of money!”

It’s no wonder that my fellow Cutter colleague Steve Andriole said that not one of the CIOs and CTOs he had spoken to in the last few months would install their enterprise applications again if they had a chance to do it over. In his recent Cutter Trends article entitled “Sourcing Today and Tomorrow,” he said it took many of them years to get the software to work, with some implementations costing hundreds of millions of dollars (man-sized piles). A few had even gotten fired when they exceeded their budgets and schedules. Do you really wonder why some folks head for the hills when the boss utters the words “Oracle” or “SAP” implementation?

I don’t think it has to be that way. One of my clients – a large financial services company – has a solid benchmarking initiative in place that showed how nearly 100 of their projects performed in terms of cost, schedule, and quality across small to very large IT projects. Among them was a group of package implementations, all plotted against industry trend lines. We color coded those projects on the graphs, showing the ERP batch as blue squares and the rest of the projects as green circles, to distinguish them from the overall sample.

The good news: In almost all dimensions, they behaved like the other “traditional” projects! The bad news: some had overruns and slippages just like the rest, and the piles of money for some of the overruns were pretty big. But not all behaved that way. Some were quite successful, showed high levels of productivity, and were right on target for scope, meeting their deadlines and finishing within budget.

What does that tell us? That ERP projects can succeed or fail just like any large-scale IT project, and in my view, it is within our ability to influence their outcome. Enterprise application projects are not going to go away anytime soon. Andriole thinks that CIOs will stop doing them, renting rather than buying and installing (going the ASP 2.0 route) and shifting to software as a service (SaaS), but I think that will take a long time. Meanwhile, big organizations – multi-billion dollar corporations – still need to run their businesses. Their legacy systems will either require ongoing care and feeding, or CIOs will make the shift to companies like Oracle or SAP to keep the ship moving. Like it or not, companies will have to get better at managing this kind of work.

So here’s some more good news – I believe that it’s possible to better understand, manage, and predict how these projects behave, and not suffer from year+ delays, cost overruns, and poor reliability. Having worked with dozens of clients doing package implementation and deployment projects, including the major enterprise application vendors, here’s what we know about these kinds of projects:

  • Productivity on ERP projects is very similar to traditional IT (development) projects. Their schedules, effort/cost profiles, and defect rates position similarly against other software project trends.
  • Three distinct classes of complexity seem to drive their behavior – upgrades (lowest complexity), standard implementations (medium complexity), and global deployments (highest complexity).
  • It’s possible to “size” this work by estimating and counting elements like business processes (number of major/detailed processes) and the custom artifacts to implement them (e.g., reports/tables, interfaces, conversions, enhancements, and forms).
  • The effort proportions for the “Business Blueprint” phase relative to the “Realization and Preparation” phase are highly similar to the proportions seen for the “Requirements/Design” and “Construction/Test” phases on traditional software projects.
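As a rough illustration of the sizing idea in the bullets above, you can weight the counted elements by relative complexity and apply a multiplier for the project class. The element names mirror the list; the weights and multipliers below are illustrative assumptions for demonstration, not QSM’s published sizing model:

```python
# Illustrative sketch only: weights and multipliers are invented for
# demonstration and would be calibrated from your own completed projects.

# Counted implementation elements for a hypothetical ERP deployment
elements = {
    "business_processes": 40,   # major/detailed processes
    "reports_tables":     60,
    "interfaces":         25,
    "conversions":        15,
    "enhancements":       20,
    "forms":              30,
}

# Assumed relative complexity weights per element type
weights = {
    "business_processes": 5.0,
    "reports_tables":     1.0,
    "interfaces":         3.0,
    "conversions":        2.5,
    "enhancements":       4.0,
    "forms":              1.5,
}

# Multiplier for the three complexity classes named in the article
class_multiplier = {"upgrade": 0.6, "standard": 1.0, "global": 1.8}

def size_estimate(counts, kind="standard"):
    """Weighted element count, scaled by project class."""
    base = sum(counts[k] * weights[k] for k in counts)
    return base * class_multiplier[kind]

print(size_estimate(elements, "standard"))
```

With real history, you would tune the weights against the effort your own completed deployments actually consumed, rather than using invented figures like these.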

So here’s the tricky part. Traditional projects have long suffered from cost overruns, schedule slippages, and cuts in scope. So tell me again why it’s good news that ERP projects behave similarly?

It means that companies that have improved their ability to measure and estimate their projects can apply the same skills to better forecast enterprise projects. It also means that if you collect some historical data on non-enterprise projects, there is a good chance you can leverage those existing patterns to sanity check your deadlines, budgeted effort, and scope targets against your own history. Even better, you can run one of the commercially available software project estimation models to more accurately forecast time, effort, and achievable scope on these deployments. That way, you can get realistic about what you can implement within a given deadline in the first place, and run less risk of suffering an embarrassing and potentially job-threatening overrun.
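As a sketch of that sanity check, suppose you have a handful of completed projects with their sizes and durations. Fitting a simple power-law trend line in log space (a common convention for software project trend lines; the history below is invented for illustration) lets you flag a proposed plan that falls well short of what your own data suggests:

```python
import math

# Hypothetical history: (size in implementation units, schedule in months)
history = [(100, 6), (250, 9), (400, 12), (800, 16), (1200, 20)]

# Fit a power-law trend, months = a * size^b, by least squares in log space
xs = [math.log(s) for s, _ in history]
ys = [math.log(m) for _, m in history]
n = len(history)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / \
    (n * sum(x * x for x in xs) - sum(xs) ** 2)
a = math.exp((sum(ys) - b * sum(xs)) / n)

def sanity_check(size, planned_months, tolerance=0.75):
    """Flag plans that fall well below the historical trend line."""
    expected = a * size ** b
    ok = planned_months >= tolerance * expected
    return expected, ok

# An 8-month commitment for a 600-unit deployment, checked against history
expected, ok = sanity_check(600, 8)
print(f"history suggests roughly {expected:.1f} months; plan realistic: {ok}")
```

A commercial estimation model does far more than this, but even a two-parameter trend line built from your own completed projects will catch the most optimistic commitments before they become deadlines.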

This is part one of a two-part article by Michael Mah. Part two will appear in the August Project Times.



Michael Mah is a Senior Consultant with Cutter Consortium’s Business Technology Trends & Impacts, Measurement and Benchmarking, Agile Project Management, and Sourcing & Vendor Relationships Practices. He is owner/partner at QSM Associates Inc. Mr. Mah is a recognized expert on practical applications of software metrics, project estimation/control, and IT productivity benchmarking. Over the past 10 years, he has published numerous articles on these and other management topics. His recent work merges concepts in software measurement and benchmarking with negotiation and dispute resolution techniques for IT outsourcing and relationship management. Mr. Mah’s particular interest is in people dynamics, such as the complex interactions between people, groups, divisions, and partnered companies working on the technology revolution at “Internet speed.” He is also focused on the latest research and theory on negotiation, including the use of game theory, role playing, and training to increase corporate and personal effectiveness. Mr. Mah is a frequent speaker at major trade conferences, including the Cutter Consortium Summit series, Better Software Conference, the Software Engineering Process Group, Software Best Practices Conference, the Technology Partners International Outsourcing Conferences, the Sourcing Interests Group, and others. Mr. Mah has a degree in engineering from Tufts University. His training in dispute resolution, mediation, and participatory processes is from the Program on Negotiation at Harvard Law School and the Radcliffe Institute for Advanced Study. He can be reached at [email protected].

PMO 2.0: Expanding the Value and Reach of the PMO

The world of project management has changed, limiting the effectiveness of traditional project management techniques in today’s fast-moving, knowledge-based, technology-oriented service organizations. The PMO must go beyond project or even portfolio management to include financial and organizational capacity management.

I’d like to introduce the concept of the Management Integration Center as a means of establishing a center of excellence to develop the expertise and capabilities to run technology services like a business.

Rescuing Project Management

The application of traditional project management techniques in today’s knowledge worker organizations has yielded little success. Billions of dollars are lost each year due to poor planning, and less than 30 percent of all projects are successfully completed. To reverse this unsettling trend, PMOs must reach beyond project and portfolio management to balance demand for resource and financial capacities across the enterprise.

Significant operational efficiency increases of 20 percent or more can be expected when organizations implement a center of excellence to focus on business functions and better align work with money and resources across the organization.

Introducing the Management Integration Center

Typically, organizational processes are operationally structured into three distinct levels: strategy, management and development. The strategy level includes executive processes such as business planning, governance and other high-level decision-making functions.

The management level provides the various linking functions between strategy and development to convert concept into products and services.

The development level is where specialized technology processes are defined and employed. This can include computer technology, R&D, manufacturing or construction processes.

Functions traditionally provided by a PMO reside as a part of the management level. For example, the PMO does not make strategic investment decisions, nor does it dictate the engineering approach. Rather, a good PMO facilitates the investment decision-making process and ensures the engineering approach applied to a project will result in the expected deliverables within defined constraints.

The term Management Integration Center (MIC) has been used to differentiate its expanded role and functions in the organization from those of a typical PMO. The MIC encompasses a wide variety of management and control functions to include support for all organizational work and resources, as well as consolidating other business management processes.

A MIC is needed for:

  • Collaboration between various groups within the organization
  • Defining a common roadmap to foster consistency of process, terminology, roles and responsibilities
  • Gathering, analyzing and disseminating business information
  • Administering enabling technology and applications
  • Developing and deploying staff with appropriate business skills.

A well-established, full-service MIC provides vital services to executives, internal departments, and customers alike by integrating business functions and information within a dedicated organization.

Since the MIC improves processes applied throughout the organization, it amplifies efficiency gains through increased performance of the entire enterprise.

The MIC as an Internal Services and Consultancy Provider

One of the overt drivers for developing an MIC is to first establish a mechanism for consolidating or creating the infrastructure and core competencies around business management. This may take the form of actually providing the service itself, such as an integrated business management application, or a dedicated planner/scheduler staff deployed to assist technical managers.

In some cases, it may manifest itself through educating the organization to raise general business awareness and process maturity. Regardless, the MIC must be positioned as a service provider rather than a domineering power center.

Of the many roles the MIC plays, one of the most important is as a liaison for the relationship between the product and service consumers and the provider organization.

Facilitating Business Process Superhighways

Processes form the framework for the flow of information, decisions, work assignments, and practically everything that is done in the context of executing business today. The effectiveness and efficiency of these process ‘highways’ dictate the amount of traffic they can handle, while governance establishes the ‘rules of the road.’

By virtue of being the facilitator of the governance model and owner of the business processes, the MIC is a significant player in defining operational controls and organizational efficiency.

As the owner and administrator of the supporting technology, the MIC also develops control functions related to the automation of processes, such as defining the appropriate lifecycle and workflow models.

Every Ship Needs a Navigator

If we apply the analogy that any given business is much like a ship on a voyage, perhaps the most beneficial role of the MIC is to help it along its journey by knowing where it is at any given time by ‘reading the maps.’ While the destination is always defined by senior executives, a large component of success for the journey is ensuring the course is defined, communicated and followed.

This involves developing the analytical skills and an intimate knowledge of the information within supporting business systems to transform it into usable business intelligence. Such a service is invaluable for identifying and leveraging information about past, current and future conditions.

Significant savings can be realized from this function alone. For example, if five percent of a $20 million budget is identified as being allocated to low-priority work, an organization can then spend an additional $1 million on important projects.
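The arithmetic behind that example is simple to verify, which is part of its appeal when making the case for the MIC’s analysis function:

```python
# Worked version of the article's example: five percent of a $20 million
# budget identified as low-priority work becomes money to redirect.
budget = 20_000_000          # total annual budget
low_priority_share = 0.05    # portion found allocated to low-priority work

freed_up = budget * low_priority_share
print(f"${freed_up:,.0f} available to redirect to important projects")
```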

Business Performance Management

Information collection and analysis is a key service provided by the MIC as the bridge between tracking and controlling functions. The MIC is uniquely positioned to have the proper combination of organizational perspective, skills, access and capability to render data into useful information across departmental lines.

Business information is then further distilled into that which requires action vs. continued monitoring. Performance tracking is commonly applied to individual work items and resources, such as scheduled performance to baseline, earned value, cost versus budget or resource utilization.

However, particularly for the MIC, other important tracking functions include monitoring performance at a macro level, such as portfolio performance, average effort per particular type or class of work or process performance.

A key capability for the MIC is to have an integrated business application environment so that data can roll up and accumulate from the lowest level of planning detail to the upper levels of the organization and its portfolios.

Project, work and resource managers each have responsibilities for analyzing the information within their respective areas. And departments may also perform analysis from their perspective. But they all tend to concentrate their analysis efforts vertically. Only the MIC is uniquely positioned and equipped to perform further analysis across departmental boundaries to gain a comprehensive view of overall performance.

The Role of the MIC in Analytics and Reporting

The MIC should help define and produce the top 15-30 Key Performance Indicators (KPIs) that are used to manage at the executive level as part of routine governance and strategy meetings. KPIs refer to the set of core metrics that are an organization’s primary measures of health and performance. If you find you need more, consider rotating KPI reviews from one major grouping to another at each meeting, rather than trying to cover them all at one time.

The MIC should facilitate executive reviews by providing the KPI package to members ahead of time, flagging those in need of discussion, and keeping track of actions decided upon or those that need to be reassessed at the next meeting.

As a communication tool, appropriate KPIs should get exposure beyond the governance board. Every employee should have the opportunity to see and understand organizational performance, so they can internalize their personal stake in those they influence, share in the success of performance gains, and better understand when and why improvements or changes are made.

Consolidating Business Expertise

Technology-based organizations inherently understand the value of developing depth of expertise and specialization when it comes to development-level functions. However, these same organizations have been slow to recognize that the same specialization needs are present when it comes to business management functions. A sign of organizational maturity is the recognition that business functions, such as those discussed, are as critical to being seen as a valued service provider as the technology products and services themselves.

The MIC represents corporate recognition of these business capabilities as a mission-critical function, and it becomes the focal point for building and applying essential business skills. In addition to improving the business functions themselves, it lowers the total cost of execution.

Assigning the Right Resource to the Right Job

Often, for lack of other options, expectations for accomplishing business functions are placed upon managers and department heads. Such functions are likely not their area of training and expertise, nor their top concern or priority. If processes, tools and techniques are formally established for these functions, they are often a product of department silos, making it difficult to consolidate results. If a common business application is available to accomplish the function, they probably aren’t using it often enough to become proficient in its use.

The net result is that if these functions are being routinely accomplished at all, they are being done by expensive resources that are better applied in other ways. And they are probably not being done as consistently or efficiently as they could be.

By comparison, the MIC can facilitate administration of these business processes using a trained and dedicated staff at considerably lower cost. This staff still works closely with managers to execute decision-making, provide reports and help them manage their work and resources better, without the administrative burden.



Terry Doerscher has over 24 years of practical process development, project management, PMO, business strategy, and work and resource management experience in the construction, nuclear, and IT fields. Mr. Doerscher is currently the Chief Solution Architect for Planview, responsible for developing Planview PRISMS™ Adaptive IT Management Best Practices, and coordinating its integration with Planview Enterprise software functionality. Prior to that, he was a business consultant and Director of Professional Services for Planview, managing the implementation of Planview for over 25 customers and supporting dozens more.

Commoditizing Project Management for the Mid-market

Over the last five years I’ve spent a fair amount of time working with Microsoft on deployments of its Project Server system. Microsoft refers to its entire solution as the Microsoft EPM (Enterprise Project Management) Solution, as it encompasses much more than just Project Server. To consider the total solution, we think of a “stack” of technology. There is Microsoft Windows Server 2003 to start with. One part of Windows Server that’s critical to this kind of deployment is Internet Information Services, the web server that delivers all the web content. Alongside Windows Server is Windows SharePoint Services, which provides collaboration and web portal functionality. We often do authentication with Active Directory, so that’s part of the solution too. There’s also SQL Server, where we’ll house the database. Microsoft Office Project Professional, Project Server and the Project Web Access interface are the more commonly expected pieces of software. Finally, some elements of the functionality might require Microsoft Office, the Microsoft Office System, SQL Reporting Services or SQL OLAP Services.

Quite a mouthful, isn’t it?

There’s no doubt that the end result is a powerful one, and no doubt that the solution has been well received. Even among Microsoft’s detractors, there is widespread opinion that Microsoft is a force to be reckoned with in the EPM space. The initial target for this new enterprise functionality was, to no one’s surprise, Microsoft’s enterprise accounts. Microsoft doesn’t publish numbers on how many such accounts exist, but it is no secret that these accounts typically number in the thousands of PCs. In these kinds of companies, there are numerous resources that can be applied to an EPM deployment. The IT department has network administrators, installers and technical support personnel. There are database administrators and programmers, and so on.

I bring up this whole topic because what Microsoft is about to confront is surely a trend to be considered by everyone who creates systems for enterprise project and portfolio management. Microsoft must, over the coming years, start to look beyond just its enterprise accounts and see how it can craft a solution and a sales and deployment message that is as attractive to the mid-market as the one for the enterprise market was.

In our business we get calls almost every day from mid-market-sized firms. “How long will it take us to implement Project Server?” they ask. The question is worded in different ways, but it becomes clear quite quickly that an answer denominated in months isn’t going to find any traction. I can’t tell you how many times we’ve gotten a call on this subject that sounds like, “Can you get the whole job done by Friday?”

With enterprise-level clients, there is almost always an understanding that the deployment of such systems must be managed as a change management project. It’s the culture change, not the technology, that is the big challenge. This is no surprise in a large firm. We’re talking about managing a major aspect of the business in a different way, and this may well have a ripple effect through the organization.

There are aspects of this that are true at the mid-market level also, but it’s a truism to say that the smaller the organization, the more maneuverable it is. So when we explain how challenging changing behaviour may be, this is often met with more resistance at the mid-market or small-market level.

If you were Microsoft, or another project management software vendor, you’d have to think about how to tackle this market. The same sales model that worked for the enterprise isn’t going to fly here. What will be required is what people always expected from Microsoft: instant results.

This leads to what I believe will be a major trend in project management tools and their manufacturers over the next five years: the commoditizing of EPM software. Publishers must ask themselves, “How can we provide a solution that enables the correct process, is a minimal drain on management to design and configure, and is priced in a way that mid-market companies can afford the total cost of ownership?”

So, how do you go about commoditizing such a product/service offering? I have a few ideas.

Make the technology all install at once. This is within the technical grasp of the large EPM system publishers. Make a one-click install that works for most mid-market size deployments. When we think of an enterprise deployment, we start talking about multiple servers, web farms, load balancing and other high-end challenges that just don’t exist when the total number of users is 200-300.

Next, pre-configure the software for my use. Sure it’s true that every company is a little different but there are many commonalities between firms. Instead of having the software arrive with nothing in it, the publishers could spend time making sure it’s pre-configured for the most common use with reports, customized fields, lookup values etc. all pre-set. Just add users and you’re there!

Make training available to the masses. There are so many great ways to distribute training now that EPM publishers need to take advantage of: online instructor-led or computer-based courses, teach-yourself books, mixes of online and textbook material, and so on. The costs of such training would need to drop dramatically, and the training would have to be broken into bite-sized pieces so any sized organization could digest it.

Don’t forget to abandon acronyms. Any arcane science has its secret codes. In the project management world we talk about things like CPM, SPI, CPI, EV, BCWS and so on. Even in high-end project management circles, these acronyms and abbreviations are being abandoned in favor of straight descriptive language.

Finally, build the processes into the software. There is a process to being effective at managing projects, but the 80/20 rule has always applied here: twenty percent of the process delivers eighty percent of the value. A basic, fundamental process, created perhaps around the tenets of the PMI, could be woven right into the software of most EPM systems so that organizations could adopt it or not as they saw fit.

If you think of the project management systems market as though it were a pyramid, with the most experienced users at the top and the neophytes at the bottom, then the use of project management so far has been focused at the very tip of the pyramid. I sometimes hear people say that all kinds of complex algorithmic functionality should be added to project management software in order to do better analysis, but it’s certain that there’s little return on such an investment. No, the big returns for systems publishers lie in making project management systems and project management methodology accessible to the masses. It should be like acquiring any other commodity: a bar of soap or a tube of toothpaste. Project management software, as a commodity, would be used by millions upon millions of users, and that’s where the big payoffs come for software firms.

That makes commoditizing EPM software inevitable.


Chris Vandersluis is the founder and president of HMS Software based in Montreal, Canada. He has an economics degree from Montreal’s McGill University and over 22 years experience in the automation of project control systems. He is a long-standing member of both the Project Management Institute (PMI) and the American Association of Cost Engineers (AACE) and is the founder of the Montreal Chapter of the Microsoft Project Association. Mr. Vandersluis has been published in numerous publications including Fortune Magazine, Heavy Construction News, the Ivey Business Journal, PMI’s PMNetwork and Computing Canada. Mr. Vandersluis has been part of the Microsoft Enterprise Project Management Partner Advisory Council since 2003. He teaches Advanced Project Management at McGill University’s Executive Institute. He can be reached at [email protected]

Adding Manpower to a Late Software Project Makes it Later

In 1975 a mighty clue bat was unleashed on the software world. In The Mythical Man-Month, Fred Brooks reminded us there are finite limits to our ability to compress the development process. Moreover, throwing people onto troubled projects often backfires. These insights should not have surprised us; after all, time and effort are hardly fungible commodities. Even with the best tools and methods, nine women still can’t deliver a baby in one month.



If Brooks merely reminded people of what they already suspected, why do so many software projects still come in late and over budget? A recent study of the QSM database showed that large projects (defined as over 50,000 ESLOC) have only a 19% chance of meeting their planned schedules and a 30% probability of making their budgeted effort. It’s discouraging to see organizations struggling after thirty years of technological change and process improvement effort. Why does this still happen so frequently? More importantly, what can we do to change it?

Technology Advances, But People Remain All Too Human

Part of the problem is that while technology has changed rapidly, human nature remains constant. A critical ingredient in software development – perhaps the critical ingredient – is people. This is an insight technical managers sometimes forget to factor into their plans.

Tools and methods allow us to do things more efficiently, but software development remains a uniquely human endeavor. Consequently, successful project management requires a mastery of both people and technical skills. The first part of this paper deals with the human factors that trip up so many software projects. The latter part brings data to the problem-solving table.

The people problems that plague software teams tend to involve over-optimism, fear of measurement, and using the wrong tools for the job. They fall into three broad categories:

The Triumph of Hope over Experience:

  • Competitive pressure. Bid solicitation (especially in the outsourcing world) involves a great deal of internal pressure on participants to win business. This competitive ‘tunnel vision’ often leads to overly optimistic assumptions that ignore an organization’s proven ability to deliver software.
  • Unfounded productivity assumptions. If it has always taken 20 hours to produce a widget, assembling a crack team of developers will not cut that number to 10. Productivity improvement is a long-term endeavor, not a short-term fix.


Fear of Measurement:

  • Not learning from history. Companies that measure projects well develop organizational self-knowledge, identify capacities and patterns, and come to know their strengths and weaknesses. In short, they learn from experience and develop an empirical basis for project planning. Unfortunately, most organizations lack formal software measurement and evaluation capacity, or measure and plan haphazardly. Lacking self-knowledge, these organizations continually put themselves at risk.
  • Not planning for growth. The planned project generally differs from the delivered project in one key respect: it is smaller and promises less functionality than what is ultimately delivered. Good project management and effective change control help mitigate scope creep, but a recent QSM study showed a median size growth of about 20%. Projects locked into budgets and schedules based on one set of requirements will be sorely pressed to meet those commitments when the requirements increase.
  • Not watching where we’re going. Most software teams work hard and want to succeed. There is an admirable human tendency to double one’s efforts when problems arise. Such industry should be encouraged, but Herculean effort makes a poor substitute for timely, gentle course corrections. In fact, it is usually too late to take effective countermeasures when problems finally manifest themselves.


Applying the Wrong Band-Aid:

  • Ineffective or inappropriate countermeasures. There are only three possible courses of action when a project threatens to exceed budget or schedule. Each works within a limited range of possibility and carries an accompanying cost.
    • Relax the schedule. Results in a less expensive project with fewer defects. There are good and bad reasons why this option is not used more often. Legal or contractual requirements may mandate delivery by a certain date; late delivery may invoke penalties or loss of customer goodwill. Also, organizations may have committed project staff to other endeavors. The bad reasons center more on reluctance to change and unwillingness to “lose face”.
    • Reduce the scope of the delivery. Deferring non-critical functionality until a later release (or eliminating it entirely) can keep a project within time and cost constraints. The cost is obvious: less is delivered than was promised or expected.
    • Add staff. Within a narrow range, adding staff can reduce schedule, albeit slightly and at considerable cost. As many managers have discovered, schedule/effort tradeoff is non-linear: a single unit of schedule reduction “costs” many units of effort and this ratio increases exponentially as the schedule is compressed.
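
The third option’s non-linear cost can be made concrete with Putnam’s software equation, the model long associated with QSM’s estimation work, which relates size, effort, and schedule as Size = C × Effort^(1/3) × Time^(4/3). Rearranged for effort, schedule compression inflates effort by the fourth power of the compression ratio. The productivity constant and project figures in this sketch are invented purely for illustration:

```python
# Putnam's software equation: Size = C * Effort**(1/3) * Schedule**(4/3),
# rearranged for effort:      Effort = (Size / C)**3 / Schedule**4,
# so compressing the schedule inflates effort by the 4th power of the ratio.
# The productivity constant C and the project figures below are invented.

def effort(size_sloc: float, c: float, schedule_months: float) -> float:
    """Effort (person-months) implied by the software equation."""
    return (size_sloc / c) ** 3 / schedule_months ** 4

baseline = effort(100_000, c=800, schedule_months=12)
compressed = effort(100_000, c=800, schedule_months=10)  # ~17% schedule cut

print(f"effort multiplier for a 12 -> 10 month squeeze: {compressed / baseline:.2f}x")
# → 2.07x
```

In this sketch, trimming two months from a twelve-month schedule roughly doubles the effort (and therefore the cost): exactly the “single unit of schedule reduction costs many units of effort” pattern described above.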

Challenging the Conventional Wisdom

So, what are harried software managers to do when faced with non-linear relationships between time and effort, technology that changes constantly, and human behaviors that, despite experience, remain stubbornly entrenched? This is where measurement is invaluable. Having a good metrics program in place tells organizations several important things: what they have built in the past, what their historical capabilities are, and which patterns in the data may be helpful in the future. A good metrics program does one more thing: armed with a good historical baseline, managers can monitor their progress and make timely course corrections as projects unfold. For managers who need to assess the risks/benefits of using new technologies in real time, this kind of feedback is priceless.

As technology continues to shift the productivity curve outward, managers are tempted to challenge the conventional wisdom. The allure of Agile programming may make them wonder if it isn’t possible, after all, to make that baby in one month instead of nine. This is not necessarily a bad thing. As new tools and methods appear it makes sense to reexamine old assumptions about the relationships between time, effort, and productivity. But that reexamination should be grounded in empirical methods and hard data, not pie in the sky optimism.

Take Fred Brooks’ famous maxim, “Adding manpower to a late software project makes it later.” QSM researchers have found a strong correlation between project size and most other metrics. In our experience, the non-linear relationships between size, time, effort, and defects often make simple rules of thumb less than universally applicable. In practice, these tried-and-truisms hold for many, if not most, projects; but since many software relationships ‘go exponential’ at certain points along the size spectrum, it’s probably not a bad idea to test them against the data.
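
One way to “test them against the data” is simply to slice a completed-project database by size band and team size, then compare medians. The sketch below does this with a handful of invented records; a real analysis would draw on an organization’s own project history (or a database like QSM’s), and the 100,000 ESLOC and 50-person cutoffs are arbitrary choices for the demonstration:

```python
# Toy sketch: compare median schedules for large-team vs. normal-team
# projects within each size band. All records below are invented.
from statistics import median

# (effective SLOC, average staff, schedule in months)
projects = [
    (40_000, 8, 10), (45_000, 55, 9), (60_000, 10, 12),
    (120_000, 15, 16), (150_000, 60, 18), (400_000, 25, 24),
    (500_000, 120, 30), (900_000, 40, 34), (1_000_000, 200, 44),
]

LARGE_TEAM = 50  # hypothetical cutoff for a "large" average staff

def median_schedule(small_size: bool, large_team: bool) -> float:
    """Median schedule for projects in one size band / team-size cell."""
    months = [m for esloc, staff, m in projects
              if (esloc < 100_000) == small_size
              and (staff >= LARGE_TEAM) == large_team]
    return median(months)

for small in (True, False):
    band = "under 100K ESLOC" if small else "100K+ ESLOC"
    print(f"{band}: large teams {median_schedule(small, True):.0f} mo, "
          f"normal teams {median_schedule(small, False):.0f} mo")
```

In this invented sample, the large-team projects finish sooner below 100,000 ESLOC but later above it; the point is the method, not the numbers, and the same grouping works for defects or cost per unit of size.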

“Adding Manpower to a Late Project Makes It ????”

We looked at large Information Technology software projects completed in the last decade to answer the question, “Just how does the ‘mega staff’ strategy affect large projects?” On a scatter plot of effective (new and modified) size vs. average staff we found an interesting separation in projects at the high end of the staffing curve. We call this gap the “Unglued Point”: where staffing runs wild.

Below 100,000 lines of code, the projects are evenly distributed. But beginning at the 100,000 ESLOC mark, a hole opens up, separating the bulk of these projects from those staffed at far higher levels.

The trend lines in the first chart are average, plus, and minus one standard deviation lines. At any point on the size spectrum, there is a wide range of staffing strategies. Above the range of ‘normal’ variability is the unglued point, representing projects with exceptionally high staffing. These high staff projects sit well above the +1 standard deviation line; since roughly 68% of projects fall within one standard deviation of the mean, that places them above the 84th percentile for staffing.
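
As a quick sanity check on that percentile: if staffing variation around the trend line is roughly normal (an assumption, not something the charts prove), the share of projects falling below the +1 σ line is the standard normal CDF evaluated at z = 1:

```python
# Where does the +1 sigma line fall on a normal distribution?
import math

def normal_cdf(z: float) -> float:
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

within_one_sigma = normal_cdf(1.0) - normal_cdf(-1.0)
below_plus_one_sigma = normal_cdf(1.0)

print(f"within +/-1 sigma: {within_one_sigma:.1%}")     # → 68.3%
print(f"below +1 sigma:    {below_plus_one_sigma:.1%}")  # → 84.1%
```

So a project sitting above the +1 σ staffing line is, under this assumption, in roughly the top 16% of staffing intensity for its size.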

What can these high staff projects tell us? How do their schedules compare with other, more reasonably staffed projects? How does the high staff strategy impact project quality? And of course, what are the cost implications of such a strategy?

Let’s find out.

The second graph displays only projects above the unglued point for staffing. The parallel lines show average, plus and minus 1 σ trend lines for “reasonably staffed” projects. Crossing these diagonally is the trend from the high-staffed projects shown. For projects up to 100,000 lines of code, using large teams seems to deliver projects at or below the QSM average for schedule.

However, matters deteriorate rapidly as projects increase in size. At best, aggressive staffing may keep a project’s schedule within the normal range of variability but this strategy becomes increasingly ineffective as project size increases.

What about quality? Again, only high staff projects are shown. The steeply sloped line crossing the QSM defect trend lines is the average of the mega-staffed projects. Their defect density is consistently higher than average, and it rises precipitously as the projects increase in size. The impact of high staffing on project quality is clearly negative.

Finally, what are the cost implications of the large team strategy? First, let’s review what is purchased in terms of schedule reduction: at best, high staffing moves a project into the range of normal schedule variation, though this strategy becomes increasingly ineffective as projects increase in size. Overall project quality, the project’s lasting legacy to its users, is worse than normal. Now the cost: as the following table illustrates, high staffed projects are several times more expensive.

As to the question at the start of this section: If you answered ‘later,’ you were correct.

Conclusion

So, how did Brooks’ famous maxim hold up against the evidence? Does adding staff to a late project only make it later? It’s hard to tell. Large team projects, on the whole, did not take notably longer than average. For small projects the strategy had some benefit, keeping deliveries at or below the industry average, but this advantage disappeared at the 100,000 line of code mark. At best, aggressive staffing may keep a project’s schedule within the normal range of variability.

Contrary to Brooks’ law, for large projects the more dramatic impacts of bulking up on staff showed up in quality and cost. Software systems developed using large teams had more defects than average, which would adversely affect customer satisfaction and, perhaps, repeat business. The cost was anywhere from 3 times greater than average for a 50,000 line of code system up to almost 8 times greater for a 1 million line of code system. Overall, mega-staffing a project is a strategy with few tangible benefits, and it should be avoided unless you have a gun pointed at your head. One suspects some of these projects found themselves in exactly that situation: between a rock and a hard place.

How do managers avoid these types of scenarios? Software development remains a tricky blend of people and technical skills, but having solid data at your fingertips and challenging the conventional wisdom wisely can help you avoid costly mistakes. Measurement allows you to manage both the technical and people challenges of software development with confidence whether you are negotiating achievable schedules based on your proven ability to deliver software, finding the optimal team size for that new project, planning for requirements growth, tracking your progress, or making timely mid-course corrections. You might even avoid that giant clue bat!


Kate Armel is a technical manager with Quantitative Software Management, Inc. She has 8 years of experience in technical writing, metrics research and analysis, and helping Fortune 1000 firms estimate, track, and benchmark software projects. Ms. Armel was the chief editor and co-author of the QSM Software Almanac.

Donald M. Beckett is a consultant for Quantitative Software Management with more than 20 years of software development experience, including 10 years specifically dedicated to software metrics and estimating. Beckett is a Certified Function Point Specialist with the International Function Point Users Group and has trained over 300 persons in function point analysis in Europe, North America, and Latin America. He was a contributing author to “IT Measurement: Practical Advice from the Experts.” Beckett is a graduate of Tulane University.