Performance Architect update 45/2011

It is time to reinvent management. You can help.

An interesting question I came across this week got me thinking. The question, rather rhetorical, was along the lines of: “Why follow some model developed by a consultant when hundreds of smart people from hundreds of companies have spent tens of years developing and refining models like Baldrige and EFQM?”

The long answer to such a question, from my perspective, is the following:

  • For the same reason that several of the dozen companies involved in the Nolan Norton Institute study experimented in 1990 with Balanced Scorecard prototypes expanded from Art Schneiderman’s original “Corporate Scorecard” piloted at Analog Devices. It is a classic tale of practitioner insight, mixed with academic rigour and consulting acumen, and embraced by organizations willing to innovate while contributing to the enrichment of management as a discipline. (Details in the preface of Kaplan and Norton’s 1996 book.)
  • For the same reason Motorola’s CEO Bob Galvin embraced in 1985 the quality improvement ideas expressed in a research report by two of its employees: Mikel Harry, PhD (academic rigour) and Bill Smith (practitioner insight, with 35 years of experience in engineering and quality assurance). Their proposed MAIC problem-solving approach became a stepping stone in the evolution of Six Sigma. The D was added by IBM and other early adopters after Motorola won the Baldrige Award in 1988.
  • For the same reason that, after being presented in an efficiency report to the Executive Committee, Donaldson Brown’s return on investment formula was adopted by Du Pont in 1912. Brown was a 27-year-old engineering graduate at the time. The subsequent work done by Brown at Du Pont and General Motors is legendary, with many cost accounting techniques and principles, such as Return on Investment, Return on Equity, Forecasting and Flexible Budgeting, being established and used in a corporate context. They were gradually adopted by corporate America and grew to become part of the financial fabric of today’s corporate environment.

The short answer to the question is innovation and progress. Management is constructivist in nature. It is based on innovative ideas being proposed, tested and followed as they prove their value.

My 15 years of work as a management practitioner and consultant and 6 years of academic research have offered me the opportunity to analyze plenty of models, frameworks, methodologies and abstract concepts. Some of them are puerile; some make sense to me; some don’t make sense to me, but make sense to others; many are trademarked in an effort to protect and monetize them; and lots of good work is inaccessible to many because it is published in academic journals. My advice to anyone, be it consultant, researcher or practitioner, is to never stop learning and exploring with an open mind. Great management concepts do not emerge overnight. Over time, some ideas lead to others, some impractical tools had good points that inspired new hybrids, many organizations had the courage to support innovative staff members and consultant-promoted prototypes, and great things happened.

Then again, we have the option to put blinkers on and follow industry standards, widely recognized methodologies and popular management tools. That is perfectly fine, too. I myself hold a TOGAF certification (Enterprise Architecture) and I am a PRINCE2-certified practitioner (Project Management). That didn’t stop me from learning about and using components of the Zachman Framework, DODAF and FEAF (Enterprise Architecture), as well as PMBOK and other hybrid project management concepts. I enjoy exploring and generating my own taxonomies, typologies and conceptual systems.

Earlier this year, Gary Hamel launched a call for Management 2.0 in his blog posts Inventing Management 2.0 and Improving our capacity to manage. The movement is already gaining traction at The Management Innovation eXchange (MIX) and is supported by academic institutions (e.g. London Business School), industry organizations (e.g. Dell, National Australia Bank) and research-driven consulting companies (e.g. McKinsey & Company and Gartner). In my opinion, such a triumvirate is essential for progressing administrative science theory and practice.

Gary Hamel’s answer to the above question would probably be: It’s time to reinvent management. You can help.

Aurel Brudan
Performance Architect,

Performance Architect update 44/2011

Advice on KPI documentation and configuration

Configuring KPIs, once they have been selected, means documenting the complete set of relevant details for each KPI and activating them so that data can be reported and analyzed. Here are 15 things to consider.

  1. Link KPIs upstream with business objectives and downstream with organizational initiatives. KPIs should be connected to organizational objectives, as they make objectives SMART. Initiatives should be established to support the achievement of objectives by improving KPI results.
  2. Assign a data custodian responsible for gathering measurement data for the KPI. Data gathering for each KPI requires clarity and ownership. Having a person responsible for collecting KPI data is a management approach that ensures accountability, with data being available for analysis on time.
  3. Assign a KPI owner responsible for the achievement of the desired results. Each KPI should have a manager allocated as its owner, to ensure responsibility regarding its analysis, results and improvement options.
  4. Avoid tunnel KPI definitions – repeating the KPI name in the definition doesn’t add value. Good practice in working with KPIs requires thorough documentation of what they represent. Proper KPI definitions should go beyond repeating the KPI name, by providing a plain English explanation of what the KPI is about.
  5. Categorize KPIs by their reporting status - active = data is tracked, inactive = data not available. Activating KPIs is the process of moving a KPI status from inactive, when data is not available, to active, when data is reported and a clear process is in place for doing so on a regular basis.
  6. Clearly identify the unit type, most of the time % (percentage), # (number) or $ (dollar value). As KPIs are measurable entities, they have an associated unit type. To simplify communication, the symbol should be used instead of the word expressing it.
  7. Data accuracy for each KPI should be evaluated as low, medium or high and treated as such. Not all KPIs have the same data reliability. Survey-based KPIs are always going to be less reliable than revenue KPIs, due to objectivity issues. Other aspects to be considered are data automation and auditing.
  8. Determine the frequency of data generation and the frequency of reporting for each KPI. Data for some KPIs, such as ‘# Website visits‘ can be easily gathered on a daily basis. For other KPIs, such as ‘% Employee engagement‘, data gathering requires considerable costs and efforts, impacting a large number of staff. The frequency of reporting is influenced by factors such as cost, effort and technical complexity.
  9. Develop a customized KPI documentation form that contains the relevant details describing the KPI. Documenting KPIs can be easily done in a template that structures the main description fields considered relevant for the organization. Such a model can be customized at organizational level.
  10. Document if the trend is good when increasing, decreasing or when data is within a range. For some KPIs the results are good when they are decreasing from one period to another - for example ‘# Customer complaints’. For others, such as ‘$ Revenues‘, the results are good when increasing, while in the case of ‘% Budget variance‘, the results are good when within a specific range.
  11. Document where the reporting data for each KPI is sourced from and who produces it. Understanding a KPI relies on having a clear understanding of the data behind it and its source.
  12. Don’t worry too much about a KPI being leading or lagging. Differentiating between the two is debatable and confusing. What is considered a leading KPI for some is a lagging KPI for others. As agreement around this differentiation is oftentimes difficult to achieve, it is secondary in importance and impact.
  13. Ensure each KPI is clearly explained in a definition and has a purpose for usage. The separation between definition and purpose is essential. The purpose expresses the reason for using the KPI and is one of the key components of the documentation form.
  14. KISS - keep it short and simple: Use the # and % symbols to replace “number” and “percentage” in KPI names. Standardizing KPI names and shortening them supports communication and enables clear data visualization of KPIs in dashboards and scorecards.
  15. Simplify KPI names by eliminating the word “of”. As a “common denominator” it can be cut from the name. KPIs are analytical in nature and their names should be as concise as possible. Definitions, calculation and purpose fields provide context and can be more wordy.
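Taken together, the documentation fields recommended above can be sketched as a simple record. The following is a minimal Python sketch, assuming a field set drawn from the points above; the field names, the `KPIRecord` class and the example values are illustrative assumptions, not a prescribed template:

```python
from dataclasses import dataclass

@dataclass
class KPIRecord:
    """Minimal KPI documentation form (illustrative field set)."""
    name: str                # short, symbol-prefixed name, e.g. "# Customer complaints"
    definition: str          # plain-English explanation, not a repeat of the name
    purpose: str             # reason for using the KPI (kept separate from definition)
    objective: str           # upstream business objective the KPI makes SMART
    owner: str               # manager accountable for analysis, results and improvement
    data_custodian: str      # person responsible for gathering the data
    unit_type: str           # "%", "#" or "$"
    status: str              # "active" (data tracked) or "inactive" (data not available)
    trend_is_good_when: str  # "increasing", "decreasing" or "within range"
    data_source: str         # where the reporting data is sourced from
    accuracy: str            # "low", "medium" or "high"
    frequency: str           # reporting frequency, e.g. "monthly"

# Hypothetical example entry:
complaints = KPIRecord(
    name="# Customer complaints",
    definition="Count of formal complaints received from customers in the period.",
    purpose="Monitor service quality from the customer's perspective.",
    objective="Improve customer satisfaction",
    owner="Customer Service Manager",
    data_custodian="CRM administrator",
    unit_type="#",
    status="active",
    trend_is_good_when="decreasing",
    data_source="CRM complaint log",
    accuracy="high",
    frequency="monthly",
)
```

A structured record like this also makes it easy to keep all KPIs in a central repository and to filter them by status or owner.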

Aurel Brudan
Performance Architect, Performance Architect update 43/2011

Advice on KPI selection

Selecting KPIs is a process which seems simple, yet is inherently complex, due to the interdependencies involved. Here are 15 things to consider before embarking on this journey.

  1. Review existing internal reports and support documents at the beginning of the KPI selection exercise. These may include previous business / strategy plans, annual reports, performance reports and other documentation that relates to performance management, measurement and benchmarking.
  2. Use external lists of examples and other secondary documentation to inform and support KPI selection. It is always a good idea to begin a journey having the end in mind. Reviewing KPI examples used in the industry or functional area, by competitors or other organizations, provides context around what is used in practice by others and improves understanding of the desired output.
  3. Engage internal stakeholders in the process of KPI selection through interactive workshops. KPI selection is not a desk exercise. It is an opportunity to communicate and learn, hence an open discussion in a workshop format is a better approach for enabling not only KPI selection, but also understanding and ownership.
  4. Calibrate KPI selection around business objectives and value drivers. KPIs are not used in isolation. They are just one component of the value creation chain and of the performance management system. A simple way to position them is as links between business objectives and related organizational initiatives.
  5. Select KPIs based on the realities of organizational activity and environment. Each organization is different, operating in different environments, with different guiding principles. Hence the KPIs used need to reflect the specifics of each organization first and industry/functional area characteristics second.
  6. Maintain a centralized catalogue of KPIs for the entire organization. Structuring KPI documentation in a central repository facilitates their understanding and usage in a similar way across the organization, growing the know-how and facilitating KPI selection and usage on an ongoing basis.
  7. Understand the difference between input, process, output and outcome KPIs. This value creation sequence is essential in facilitating the understanding of KPIs in the context of the value added by the process/activity they are related to. It is an essential mapping technique that facilitates KPI selection.
  8. Don’t hesitate in changing KPIs in scorecards and dashboards. KPIs should reflect activity and activity should adapt to a changing environment. The use of KPIs should be fluid and flexible, reflecting the change in business priorities as a result of the change in the operating environment.
  9. Review KPI relevance regularly. If new KPIs are required, they can be established at any time. This is an essential aspect of double-loop learning. Using KPIs is not only about achieving set targets and objectives, but also about ensuring that the objectives and targets were the right ones to set in the first place and that the KPIs used to track their achievement were the appropriate ones.
  10. KPI selection and target setting should be done in accordance with organizational maturity and direction. There is no one size fits all approach when it comes to using KPIs. As strategies vary from one organization to another, the use of KPIs also varies.
  11. Project milestones are not KPIs. Understanding the difference between what is and what is not a KPI is a prerequisite of successful KPI selection.
  12. Targets are not KPIs. Understanding the anatomy of a KPI is essential in KPI selection and usage.
  13. Some things are not worth measuring. For example, measuring love might not be such a good idea. Not everything that can be measured should be measured with KPIs.
  14. Some things are too difficult to measure. For example, cuteness. The “measuring everything that moves” mentality should be avoided.
  15. Eliminate or replace inactive KPIs with simpler, yet measurable ones. Using some KPIs may have seemed a good idea at the time of their selection, however if measuring them proves to be too costly or time consuming, they should be replaced. An active KPI is better than an inactive KPI.
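Two of the points above — maintaining a centralized catalogue (point 6) and eliminating or replacing inactive KPIs (point 15) — can be combined in a simple routine. The sketch below is a hypothetical illustration, assuming a catalogue held as a list of records with a `status` field; the KPI names and statuses are invented for the example:

```python
# Hypothetical central KPI catalogue; in practice this would live in a
# shared repository so all parts of the organization document and use
# KPIs in a similar way.
catalogue = [
    {"name": "# Website visits", "status": "active"},
    {"name": "% Employee engagement", "status": "active"},
    {"name": "% Brand cuteness", "status": "inactive"},  # proved too hard to measure
]

def inactive_kpis(kpis):
    """Return the names of KPIs whose data is not being reported,
    i.e. candidates for elimination or replacement with simpler,
    measurable alternatives."""
    return [k["name"] for k in kpis if k["status"] != "active"]

print(inactive_kpis(catalogue))  # prints ['% Brand cuteness']
```

Running such a check as part of the regular KPI relevance review (point 9) keeps the catalogue populated with active KPIs only.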

Aurel Brudan
Performance Architect,