Wednesday, 06 November 2013 08:06

Project Management Best Practices: Tracking and Learning

Written by Karl Wiegers

In the previous three articles in this series, I’ve described seventeen practices the project manager can apply to lay the foundation for a successful project, plan the project, and estimate the work to be done. In this final article I share three good practices for tracking your progress throughout the project and one practice for learning how to execute future projects more successfully.

Tracking Your Progress

Practice #18: Record actuals and estimates.

Unless you record the actual effort or time spent on each project task and compare them to the estimates, your estimates will forever remain guesses. Someone once asked me where to get historical data to improve her ability to estimate future work. My answer was, “If you write down what actually happened today, that becomes historical data tomorrow.” It’s really not more complicated than that. Each individual can begin recording estimates and actuals, and the project manager should track these important data items on a project task or milestone basis. In addition to effort and schedule, you could estimate and track the size of the product, in units of requirements, user stories, lines of code, function points, classes and methods, GUI screens, or other units that make sense for your project.
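The bookkeeping here can be quite lightweight. The sketch below, in Python, shows one way an individual might record an estimate and an actual per task and compute the estimation error; the `TaskRecord` structure and field names are illustrative, not something the article prescribes.

```python
from dataclasses import dataclass

@dataclass
class TaskRecord:
    """One row of historical data: a task's estimate and what actually happened."""
    name: str
    estimated_hours: float
    actual_hours: float = 0.0  # filled in when the task is finished

    @property
    def estimation_error(self) -> float:
        """Relative error: positive means the task overran its estimate."""
        return (self.actual_hours - self.estimated_hours) / self.estimated_hours

# What you write down today becomes historical data tomorrow.
history = [
    TaskRecord("design login screen", estimated_hours=8, actual_hours=12),
    TaskRecord("write unit tests", estimated_hours=6, actual_hours=6),
]

for task in history:
    print(f"{task.name}: {task.estimation_error:+.0%}")  # e.g. +50%, +0%
```

A spreadsheet works just as well; the point is that the same few columns, kept consistently, are all it takes to turn today's work into tomorrow's calibration data.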

Practice #19: Count tasks as complete only when they’re one hundred percent complete.

We tend to give ourselves a lot of partial credit for tasks we’ve begun but not yet fully completed: “I thought about the algorithm for that module in the shower this morning, and the algorithm is the hard part, so I’m probably about sixty percent done.” It’s difficult to accurately assess what fraction of a sizable task has actually been finished at a given moment.

One benefit of using inch-pebbles (see Practice #6 in Part 2 of this series) for task planning is that you can break a large activity into a number of small tasks (inch-pebbles) and classify each small task as either done or not done—nothing in between.

Your project status tracking is then based on the fraction of tasks completed, weighted by their size, not on the percent completion of each task. If someone asks you whether a specific task is complete and your reply is, “It’s all done except...,” then it’s not done! Don’t let people “round up” their task completion status. Instead, use explicit criteria to determine whether an activity truly is completed.
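The arithmetic behind this style of status tracking is simple enough to sketch. In the illustrative Python function below (my construction, not the article's), each inch-pebble carries a size (hours, story points, whatever unit you estimate in) and a strictly boolean done flag; there is deliberately no way to express “sixty percent done.”

```python
def percent_complete(tasks):
    """Project status from inch-pebbles: a task counts only when fully done.

    Each task is a (size, done) pair. Size is the task's estimated size in
    whatever unit you use; done is a boolean -- no partial credit allowed.
    """
    total = sum(size for size, _ in tasks)
    finished = sum(size for size, done in tasks if done)
    return finished / total if total else 0.0

# Two small tasks done, two larger ones still open.
tasks = [(4, True), (2, True), (8, False), (6, False)]
print(f"{percent_complete(tasks):.0%}")  # 6 of 20 size units done -> 30%
```

Because each inch-pebble is small, the error introduced by treating an in-progress task as zero percent done stays small too, and the overall figure is far harder to “round up.”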

Practice #20: Track project status openly and honestly.

An old riddle asks, “How does a software project become six months late?” The rueful answer is, “One day at a time.” The painful problems arise when the project manager doesn’t know just how far behind (or, occasionally, ahead) of plan the project really is. Surprise, surprise, surprise.

If you’re the PM, create a climate in which team members feel it is safe for them to report project status accurately. Strive to run the project from a foundation of accurate, data-based facts, rather than from the misleading optimism that can arise from the fear of reporting bad news. Use project status information and metrics data to take corrective actions when necessary and to celebrate when you can. You can only manage a project effectively when you really know what’s done and what isn’t, what tasks are falling behind their estimates and why, and what problems, issues, and risks remain to be tackled.

The five major areas of software measurement are size, effort, time, quality, and status. It’s a good idea to define a few metrics in each of these categories. Instilling a measurement culture into an organization is not trivial. Some people resent having to collect data about the work they do, often because they’re afraid of how managers might use the measurements. The cardinal rule of software metrics is that management must never use the data collected to either reward or punish the individuals who did the work. The first time you do this will be the last time you can count on getting accurate data from the team members. Read Chapters 12 and 13 of my book Practical Project Initiation to learn about basic principles of software measurement and metrics traps to avoid.

Learning for the Future

Practice #21: Conduct project retrospectives.

Retrospectives (also called postmortems and post-project reviews) provide an opportunity for the team to reflect on how the last project, phase, or iteration went and to capture lessons learned that will help enhance your future performance. During such a review, identify the things that went well, so you can create an environment that enables you to repeat those success contributors. Also look for things that didn’t go so well, so you can change your approaches and prevent those problems in the future. In addition, think of events that surprised you. These might be risk factors to look for on the next project. Finally, ask yourself what you still don’t understand about the project, so you can try to learn how to execute future work better.

It’s important to conduct retrospectives in a constructive and honest atmosphere. Don’t make them an opportunity to assign blame for previous problems. Chapter 15 of Practical Project Initiation describes the project retrospective process and provides a worksheet to help you plan your next retrospective.

It’s a very good idea to capture the lessons learned from each retrospective or process improvement exploration and share them with the entire team and organization. This is a way to help all team members, present and future, benefit from your experience. Chapter 14 of Practical Project Initiation talks about best practices, worst practices, and a way to record lessons learned.

The twenty-one project management best practices I’ve described in this series of articles won’t guarantee your project a great outcome. They will, however, help you get a solid handle on your project and ensure that you’re doing all you can to make it succeed in an unpredictable world.


Karl Wiegers

Prior to starting Process Impact in 1997, Karl spent 18 years at Eastman Kodak. His responsibilities there included experience as a photographic research scientist, software applications developer, software manager, and software process and quality improvement leader. Karl has provided training and consulting services worldwide on many aspects of software development, management, and process improvement. He's the author of several technical books and one self-help book, has written more than 150 articles on many aspects of software, and has spoken at many software conferences and professional society meetings.

© ProjectTimes.com 2017
