I Changed From Being Process-Skeptic to Process-Ambassador

Mubeen Nazimuddin

I have always been a “get things done” person, with a belief that too many processes are a waste of time and resources. I found innovative excuses to defer process implementation in my project.

One day, I was asked to be on the CMMI Assessment Team. I smiled, and I agreed.

And the jaunt started.

In order to understand how the transformation happened, I need to take you through the journey of those 13 days.

The 8 nominated Appraisal Team Members (ATMs) were trained for 3 days on an Introduction to CMMI. The theoretical study of maturity levels, process areas, goals, practices, and sub-practices was overwhelming. It was like a jigsaw puzzle that started falling into place only after reading the CMMI model a couple of times from end to end.

The next step was to get into the actual assessment process. When the lead assessor from KPMG laid down his commandments, expecting us to stretch and exercise, practice hand-writing, and have our back-ups in place so our routine work was not impacted, I smiled and said to myself, “What a sheer waste of time.”

The exercise started, with day 1 and part of day 2 spent on a walkthrough of the SCAMPI-A assessment process.

Days 3 & 4: The painstaking task of looking at the process areas, and digging into the project artifacts of all 5 sample projects to find objective evidence of process compliance against each goal, practice, and sub-practice. Each ATM was assigned a couple of process areas to work on.

A glance at the documents would not be sufficient, as we needed to look for both direct and indirect references, and write a detailed description of compliance/non-compliance. No one-liners. Discussions among the ATMs whenever we were in doubt led to amazing insights into the sample projects.

On day 5, we prepared questions for the project manager interviews to happen the following week.

During this process, the detail with which we looked at each process was an eye opener for a process-skeptic like me.

Week two: The Lead Assessor (LA) led an opening meeting with all stakeholders, setting the expectations for the next 5 days.

Before each of the 9 interviews, which lasted 60-90 minutes each, you would see the ATMs stretching arms, flexing fingers, fidgeting with their pens and notepads, all poised for a get-set-go from the LA. The moment one person asked a question, the remaining 7 would scribble aggressively in their notepads. The verdict was clear – write by hand, verbatim. The gap between two interviews was used to document the affirmations against each process area and the objective evidence we had recorded.

By the end of the interviews with the Business Unit Heads, the L&D team, the Quality Team, Engineering Functional Area Representatives, and the Project Managers of the 5 sampled projects, we had accumulated over 1200 pages of hand-written notes. I didn’t write this much even in my college days.

I still felt I would rather have spent these days on something more productive. My opinion about processes hadn't changed.

The turning point in my journey from process-skeptic to process-ambassador was the 16 hours we spent non-stop on the Thursday of the 2nd week. We brainstormed on each of the 17 process areas, with their 146 processes and their 600+ sub-practices. Each ATM would read out the practice definition, its goals and sub-practices, and then go to the objective evidence and oral affirmations that we had collected. Then we would vote, for each of the 5 projects: was the process Fully Implemented (FI), Largely Implemented (LI), Partially Implemented (PI), or Not Implemented (NI)? Sometimes the vote was unanimous, and we had consensus in seconds. Other times, active discussion would go on for as long as an hour for one process area.

The best thing was the wisdom shared by the LA on how and why each process was relevant for the organization and its success. As we unfolded different process areas, the inter-links between the processes started coming to light. The jigsaw puzzle started taking the appearance of a meaningful image.

In order to be CMMI Level 3 compliant, we needed proof – both objective evidence and oral affirmations – of FI, or at least LI, on all 146 processes.
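To make that rule concrete, here is a minimal sketch in Python (not any official SCAMPI tooling) of how the per-project characterizations effectively roll up into the weaknesses we were hunting for; the practice IDs, project names, and ratings are made up for illustration.

```python
# A minimal sketch (not official SCAMPI tooling) of the rule we were applying:
# a practice counts toward Level 3 only if no sampled project rated it
# below Largely Implemented. All IDs and ratings below are hypothetical.

RATING_ORDER = {"FI": 3, "LI": 2, "PI": 1, "NI": 0}

def practice_satisfied(project_ratings):
    """project_ratings: dict of project name -> 'FI' / 'LI' / 'PI' / 'NI'."""
    return all(RATING_ORDER[r] >= RATING_ORDER["LI"] for r in project_ratings.values())

def find_weaknesses(characterizations):
    """characterizations: dict of practice id -> per-project ratings."""
    return [practice for practice, ratings in characterizations.items()
            if not practice_satisfied(ratings)]

# Hypothetical characterizations for two practices across the 5 sampled projects.
example = {
    "MA SP1.1":   {"P1": "FI", "P2": "LI", "P3": "FI", "P4": "FI", "P5": "LI"},
    "RSKM SP2.1": {"P1": "FI", "P2": "PI", "P3": "LI", "P4": "FI", "P5": "FI"},
}

print(find_weaknesses(example))  # -> ['RSKM SP2.1'] because project P2 is only PI
```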

It hurt when we identified the first miss. But the words of the Lead Assessor cheered us up, when he said that the two weeks of effort would be worthless if we couldn't identify areas of improvement that would help take the organization to a higher level. We were now looking at weaknesses with a positive mindset of adding value rather than finding faults.

Just before the stroke of midnight, we finally completed the 146th process, identifying 12 weaknesses.

The last day was all about final consolidation. The 12 weaknesses were not strong enough to mark any of the 17 process areas “Not satisfactory”. The organization had once again passed the test. We were jubilant, and proud to be part of the exercise.

The only heart-breaking task of the whole exercise was destroying every bit of printed and written paper before the Lead Assessor presented the final findings to all stakeholders.

What I learnt from this exercise:

We always look at processes in silos, and end up struggling before every audit to see if everything is in place. A week ahead, reminders arrive from the Quality Representatives asking us to “check” if all documents are in place. This always leads to anxiety, mistakes, and short-cuts.

Understanding the true intent of each process, and how it intertwines with other processes, can help make process implementation a way of life.

Here are a couple of classic examples of the many interlinked processes that we discovered during this exercise.

* The Process Capability Baselines (PCBs) came up while discussing the Measurement and Analysis process area. The Quality team published organization-level PCBs periodically. They also prepared department-level PCBs for the individual departments. However, projects were measured for their capabilities against the organization-level PCBs.

* Now, different departments have different capability levels due to various factors – the experience of associates, the maturity of the process/project, and the kind of client they work with.

* Using organization-level PCBs is unfair when some departments work at a higher capability level than others; it gives those departments an undue advantage in the form of a lower benchmark.

* This resulted in a weakness in the Risk Management process area: departments operating at a lower capability level hadn't identified the risk that they would struggle to reach the organization-level PCBs. A toy numeric sketch of the mismatch follows this list.
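
The sketch below is purely illustrative, assuming a single hypothetical “defect density” metric with made-up upper limits and department names; it only shows how the choice of baseline can flip the verdict.

```python
# Purely illustrative: the same observed value can look "capable" against the
# organization-level PCB and "not capable" against the department-level PCB
# (or vice versa). Metric, limits, and department names are hypothetical.

org_pcb_upper_limit = 0.9  # e.g. an upper limit on defect density

dept_pcb_upper_limit = {
    "HighMaturityDept": 0.4,  # tighter baseline earned over time
    "NewDept": 1.2,           # looser baseline for a younger department
}

observed_defect_density = {"HighMaturityDept": 0.7, "NewDept": 1.0}

for dept, value in observed_defect_density.items():
    vs_org = value <= org_pcb_upper_limit
    vs_dept = value <= dept_pcb_upper_limit[dept]
    print(f"{dept}: within org PCB = {vs_org}, within dept PCB = {vs_dept}")

# HighMaturityDept: within org PCB = True,  within dept PCB = False
# NewDept:          within org PCB = False, within dept PCB = True
```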

Another interesting example relates to defect analysis.

At the end of every project/iteration, we analyze and review the defects, categorize them based on certain criteria, and create Pareto charts to identify the most frequently occurring categories. A Causal Analysis and Resolution (CAR) is performed, resulting in preventive and corrective actions.

However, in the next iteration, when a defect review is done again, do we cross-verify against the past Pareto charts to see if a particular category has shifted sides on the 80-20 graph? We were not looking at the trend to judge the effectiveness of the corrective/preventive actions. A small sketch of such a trend check appears below.
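
Here is a small, purely illustrative sketch of such a trend check, with hypothetical defect categories and counts; the idea is simply to compare which categories sit in the “vital few” 80% band from one iteration to the next.

```python
from collections import Counter

def vital_few(defect_categories, threshold=0.8):
    """Return the 'vital few' categories that together account for ~threshold of defects."""
    counts = Counter(defect_categories)
    total = sum(counts.values())
    selected, cumulative = [], 0
    for category, count in counts.most_common():
        selected.append(category)
        cumulative += count
        if cumulative / total >= threshold:
            break
    return set(selected)

# Hypothetical defect logs for two consecutive iterations; CAR actions after
# iteration 1 targeted the "UI" category.
iteration_1 = ["UI"] * 40 + ["Logic"] * 30 + ["Config"] * 20 + ["Docs"] * 10
iteration_2 = ["UI"] * 5  + ["Logic"] * 35 + ["Config"] * 25 + ["Docs"] * 10

print("Iteration 1 vital few:", vital_few(iteration_1))  # {'UI', 'Logic', 'Config'}
print("Iteration 2 vital few:", vital_few(iteration_2))  # {'Logic', 'Config'}
print("Left the 80% band after CAR:", vital_few(iteration_1) - vital_few(iteration_2))  # {'UI'}
```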

An even simpler example is the process of writing minutes of meeting (MOM) after a meeting, which has a direct impact on the Stakeholder Management process area.


In conclusion, what we learnt in 10 days about the processes and their practical applications and implications wouldn't have been possible in weeks or months of process training. These 10 days, especially day 9, which lasted 16 hours, transformed me from a process-skeptic to a process-ambassador.