Daily Tips for Consultants
Between 2005 and 2011, IMC published Daily Tips every weekday on consulting ethics, marketing, service delivery, and practice management. You may search more than 800 tips on this website using keywords in "Search all posts" or by clicking on a tag in the Top Tags list to return all tips with that tag. Comment on individual tips (members and registered guests) or use the Contact Us form to reach Mark Haas CMC, FIMC, Daily Tips author/editor. Daily Tips are being compiled into several volumes that will be available through IMC USA and Mark Haas.

 


#707: Increase Your Consulting S/N Ratio

Posted By Mark Haas CMC FIMC, Tuesday, November 29, 2011
Updated: Tuesday, November 29, 2011
Consultants appreciate a client who provides a lot of information at the start of an engagement if it helps us get up to speed and avoids repeating analyses already done or recollecting data already gathered. Recently, however, many clients "helpfully" hand us every bit of data they have, which can be a real burden. How do we use it efficiently without incurring a scope increase just to sort through it?

Of course we want just the relevant data and not an invitation to view the client's file room. Both the consultant and client share the responsibility for information triage. In electronics, a measure of signal quality relative to the background noise is the S/N (signal to noise) ratio. This is a useful concept for consultants starting a project. Your first weeks often involve digesting data (hopefully information), product and market research, personnel records, strategy and planning documents, and the opinions of many people inside and outside the organization.

You want to increase the ratio of signal (accurate, timely, relevant, usable information) to noise (inaccurate, outdated, irrelevant, or false information). Without straying into electronics engineering, the preferred way to improve measurement of the desired signal is to minimize interference from background noise. For consultants, this means working with the client to specify only the information (sometimes raw data) you have a process to use. It is easy to fall into the trap of wanting to absorb all the available information and only then decide how to use it. Stick to your information management and analysis plan (you did specify these in your engagement project plan, right?) and the noise will decrease.
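
To make the idea concrete, here is a minimal sketch (in Python; the scoring criteria and field names are hypothetical illustrations, not an IMC method) of triaging a client's document dump against your information plan so the signal items rise to the top:

```python
# A minimal sketch of information triage: "signal" is anything the analysis
# plan can actually use; everything else is noise. All criteria are hypothetical.

from dataclasses import dataclass

@dataclass
class Item:
    name: str
    relevant_to_plan: bool   # does the analysis plan actually call for it?
    current: bool            # recent enough to trust?
    verified_source: bool    # accurate and attributable?

def sn_ratio(items):
    """Signal = usable items; noise = everything else."""
    signal = sum(1 for i in items
                 if i.relevant_to_plan and i.current and i.verified_source)
    noise = len(items) - signal
    return signal / noise if noise else float("inf")

inbox = [
    Item("FY2011 sales by region", True, True, True),
    Item("2003 strategic plan", True, False, True),
    Item("Entire shared drive export", False, True, False),
]
print(f"S/N ratio: {sn_ratio(inbox):.2f}")  # 1 signal item vs. 2 noise -> 0.50
```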

Tip: The concept of signal to noise ratio is useful in other areas. What is your website's S/N ratio (useful and sticky information compared to total content)? What about your presentations (how many PowerPoint slides does it take to make your point compared to the total size of the slide deck)? Your marketing (how often do prospects have to ask follow-up questions about your marketing calls)? Increase your consulting S/N ratio in all your communication.

© 2011 Institute of Management Consultants USA

Tags:  analysis  communication  customer understanding  diagnosis  information management 

 

#685: Consultants Need to Understand Type I and Type II Errors

Posted By Mark Haas CMC FIMC, Friday, October 28, 2011
Updated: Friday, October 28, 2011
I always hear about Type I and Type II errors in business and how important it is that consultants understand these concepts. Why should I care about this?

People are referring to a statistical concept: a Type I error is a false positive and a Type II error is a false negative. For the statistician, a Type I error is rejecting the null hypothesis when it is actually true. For a businessperson or consultant, a Type I error is seeing something that is not really there. A Type II error is missing something that really is there (and potentially company-making or company-breaking).
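
A short simulation makes the two error rates tangible. This is an illustrative sketch only (Python with NumPy and SciPy; the sample size and effect size are arbitrary choices, not from the original tip):

```python
# Simulate many t-tests at alpha = 0.05. When the null is true, ~5% of tests
# reject anyway (Type I, false positive); when a real effect exists, some
# tests still miss it (Type II, false negative).

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
alpha, runs, n = 0.05, 2000, 30

# Null is true: both groups drawn from the same distribution.
type1 = sum(
    stats.ttest_ind(rng.normal(0, 1, n), rng.normal(0, 1, n)).pvalue < alpha
    for _ in range(runs)
) / runs

# Real effect: second group's mean is shifted by 0.5.
type2 = sum(
    stats.ttest_ind(rng.normal(0, 1, n), rng.normal(0.5, 1, n)).pvalue >= alpha
    for _ in range(runs)
) / runs

print(f"Type I rate (false positive):  {type1:.3f}")  # ~0.05 by construction
print(f"Type II rate (false negative): {type2:.3f}")  # depends on n and effect size
```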

A Type II error (false negative) can be serious when looking at competitive markets or human resource issues such as culture or employee opinions. Inadequate surveys or incomplete analysis may lead a consultant to conclude that there are no serious competitors or impending employee revolts when, in fact, there are. Depending on the situation, a Type II error may cause serious losses for a company or put it out of business.

False positives are of most interest to consultants engaged in diagnostic or investigative work, in two ways. As consultants whose job is to find problems to solve or opportunities to capture, we are looking for something on which to act. Maybe a process is "broken" or a market is "large and available" to your client; in either case, you may identify something that is not significant enough to justify expending resources. Alternatively, you may conclude that the impact of your own work is significant when it really is not. In both cases, you have overstated the significance, or even the existence, of your role to the client. Understanding Type I and Type II errors gives you good perspective on your role and significance to a client.

Tip: Think in terms of medical testing when you consider how to control for Type I and Type II errors. The worst outcome when looking for a serious disease is to conclude it is not present when it is (Type II). To guard against that, we use screening procedures that are relatively fast and cheap and for which we can tolerate a Type I (false positive) error. As a consultant, you may want to develop protocols that quickly tease out potential problem areas, recognizing that some of what turns up will be false positives. Then you can proceed with more focused and rigorous protocols to examine each issue closely, because at that stage what you most want to avoid is a Type II error (false negative). You don't have to be a statistician to understand the concept, and your ability to mitigate risk on behalf of your client is significant added value.
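
Here is what that two-stage protocol looks like in a toy simulation (the prevalence, sensitivity, and specificity figures below are hypothetical, chosen only to illustrate the trade-off):

```python
# Stage 1: a cheap screen tuned to almost never miss a real problem (low Type II),
# accepting false alarms. Stage 2: a rigorous follow-up on flagged cases only.

import numpy as np

rng = np.random.default_rng(7)
n_areas = 1000
has_problem = rng.random(n_areas) < 0.05  # 5% of areas have a real issue

# Stage 1: cheap screen - 98% sensitivity, only 70% specificity.
flagged = np.where(has_problem,
                   rng.random(n_areas) < 0.98,
                   rng.random(n_areas) < 0.30)

# Stage 2: rigorous test on flagged areas - 90% sensitivity, 99% specificity.
confirmed = flagged & np.where(has_problem,
                               rng.random(n_areas) < 0.90,
                               rng.random(n_areas) < 0.01)

missed = has_problem & ~confirmed
print(f"Real problems: {has_problem.sum()}, flagged by screen: {flagged.sum()}")
print(f"Confirmed: {confirmed.sum()}, missed entirely (Type II): {missed.sum()}")
```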

© 2011 Institute of Management Consultants USA

Tags:  analysis  assessment  assumptions  consulting terminology  consulting tools  diagnosis  information management  recommendations  risk analysis  statistics 

 

#682: Make Assumptions Carefully

Posted By Mark Haas CMC FIMC, Tuesday, October 25, 2011
Updated: Tuesday, October 25, 2011
Part of being a good consultant is getting through the diagnosis and to a solution as quickly as possible (while still getting it right). To do that we must make assumptions, but when are assumptions appropriate and when are they erroneous shortcuts?

You know what they say about assumptions. We can't realistically base our diagnostic conclusions entirely on the empirical research done at the beginning of an engagement. We make what we consider reasonable assumptions based on discussions with the client and staff, market or technical research, our own analysis, and any other information we collect, including years of our own experience with analogous or similar cases. That judicious combination of facts, intuition, and experience is the hallmark of a consultant's detective-like skills.

However, professionalism compels us to be on watch for assuming too much, too fast. It is all too easy, after years of experience, to be impressed with our knowledge and comfortable with believing we "have seen this case a thousand times before." To keep this in check, a professional has processes in place, maybe even formal ones, to challenge and verify all assumptions made on the way to a diagnosis. What are the ways you make sure you are not assuming too much without knowing it?

Tip: Write out the steps you take in your normal process (or more than one) of scoping a project, collecting data, completing a diagnosis, and presenting findings and recommendations. Note the type and criticality of your assumptions at each stage. Finally, describe the implications for this diagnostic chain of each assumption and what you could do to mitigate the risk of a wrong assumption. Now, when people talk about your assumptions, they will have only good things to say.
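
One lightweight way to keep such a process honest is an explicit assumption log. A minimal sketch follows (the structure and field names are hypothetical, not an IMC template):

```python
# Log each assumption by engagement stage, with its criticality, what breaks
# if it is wrong, and how it will be verified. Review high-criticality first.

from dataclasses import dataclass

@dataclass
class Assumption:
    stage: str          # scoping, data collection, diagnosis, recommendations
    statement: str
    criticality: str    # low / medium / high
    if_wrong: str       # implication for the diagnostic chain
    verify_by: str      # mitigation: how and when it gets checked

log = [
    Assumption("scoping", "Client's sales data is complete for FY2009-2011",
               "high", "Trend analysis invalid",
               "Reconcile against audited financials"),
    Assumption("diagnosis", "Turnover is driven by compensation, not culture",
               "medium", "Wrong root cause in recommendations",
               "Exit-interview review"),
]

# High-criticality assumptions sort to the top of the review list.
for a in sorted(log, key=lambda a: a.criticality != "high"):
    print(f"[{a.criticality.upper()}] {a.stage}: {a.statement} "
          f"-> verify: {a.verify_by}")
```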

© 2011 Institute of Management Consultants USA

Tags:  analysis  assessment  assumptions  methodology  recommendations 

 

#653: Think Twice About Data You Use in Your Recommendations

Posted By Mark Haas CMC FIMC, Wednesday, September 14, 2011
Updated: Wednesday, September 14, 2011
We are a quantitatively-oriented and capable consulting firm but not so much that we would be considered a heavy-duty analytics one. Here's my question. Our clients can vary considerably on the extent to which they are convinced by numbers. At the risk of sounding harsh, are those that discount quantitative analysis not being responsible managers?

We shouldn't be quick to criticize a client's reluctance to adopt our quantitative analyses. As much as fact-based decision making is a good practice, it implies that those decisions are based on valid and reliable data. Reluctance to base decisions on your data can be a competent approach by managers. Do your clients consider the data you use to develop your recommendations valid or not? Your analytical methods? Whose data and models should you, or your clients, trust?

It is widely accepted that the data series used for public policy and private decision making are less than perfect. It is increasingly recognized, however, that some of these data, and the concepts on which they are based, are fundamentally flawed. Research over the past decade shows several macro-scale financial concepts (e.g., CAPM, VaR, shareholder wealth) fail to stand up to empirical analysis, despite still being taught in business school. At the national policy level, GDP has fallen from favor because it excludes the majority of asset value and infrastructure investment, prompting some countries, such as the UK, to develop replacement measures. The World Bank admits that 80% of the wealth of nations is left out of asset accounts, even as those flawed accounts are used for policymaking. The unemployment rate, used by economists and the media to track the state of the job market, is understated by about half because it counts only people actively looking for work who can't find a job, not everyone who is actually unemployed. See Shadow Stats as one of many emerging alternative statistics sources.

At the company level, executives complain about the distorted financial and tax measures they have to contend with. An investment in training is counted, and taxed, as an expense rather than the investment it is, and thus never shows up on the balance sheet. Uncompensated overtime by salaried employees is treated as free labor. Data series, business concepts, and tax laws all add to this distorted view of the world. As a result, portfolio managers and consultants are adopting ESG (environment, social, governance) triple-bottom-line asset valuation models, for which there is growing evidence of better predicting financial success than traditional models.

If you were trying to lose weight and your scale were off by 5 (or maybe 10) pounds in one direction (or maybe the other) every few days, to what extent would you base your diet on that scale? We ask our executives to make fact-based decisions, but we should also let them be responsible for judging the validity of the data on which they make those decisions.
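
The scale analogy is easy to quantify. In this toy simulation (illustrative numbers only), a genuine loss of one pound per week is regularly misread as a gain because each reading carries up to five pounds of noise:

```python
# Measurement noise vs. a real trend: a steady 1 lb/week loss read on a scale
# that can be off by up to 5 lb in either direction.

import numpy as np

rng = np.random.default_rng(1)
weeks = np.arange(12)
true_weight = 200 - 1.0 * weeks                    # steady 1 lb/week loss
readings = true_weight + rng.uniform(-5, 5, 12)    # scale off by up to 5 lb

week_to_week = np.diff(readings)
print("True change each week: -1.0 lb")
print(f"Measured changes: {np.round(week_to_week, 1)}")
print(f"Weeks the scale says you gained: {(week_to_week > 0).sum()} of 11")
```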

Tip: This is a good reminder to review with our clients at the beginning of an engagement our assumptions about data sources, analytical models and philosophies, and the extent to which we will base our recommendations on analytical vs. other findings.

© 2011 Institute of Management Consultants USA

Tags:  analysis  data visualization  management theory  recommendations 

 

#638: It's What You Know That Just Ain't So

Posted By Mark Haas CMC FIMC, Wednesday, August 24, 2011
Updated: Thursday, August 25, 2011
I like the comment attributed to Will Rogers: "It's not what you don't know that's the problem, it's what you know that just ain't so." We are all human, but getting trapped by such false certainties is particularly dangerous for consultants. How can we avoid making these mistakes?

It may be that consultants are prone to being misled because of the nature of their work. We are expected to take incomplete and often confusing information and make sense of it, then use that information to chart a course for our clients. We often know of similar circumstances but, unless we have been working with a particular client for a while, we are new to the organization.

The nature of our work compels us to "jump to conclusions," and we usually consider that speed a good thing. So why do we jump to the wrong conclusions, even after a good analysis? The question refers to a number of well-documented biases that affect our decision making. The most potent is confirmation bias: a piece of information seems to fit comfortably into the "model" we have constructed, which strengthens our conviction that we are on the right solution path (and our assumption that we are indeed smart consultants).

Other biases, such as recency bias (overweighting the last piece of information we received), anchoring (overweighting initial information), and vividness bias (overweighting the most stimulating information), can likewise lead us to assume we are right when we could be really wrong. The antidote is to constantly challenge your working model of the client's situation and to become a student of overconfidence.

Tip: Snopes.com lists a number of rumors that many people "know" to be true that just aren't. These persist because they are interesting, amusing or curious - and are either hard to verify or just not worth the effort. Find some of these rumors or urban legends (there are websites with plenty of these as well) you might believe and consider why you believe them. This might refresh your inner skeptic and help you avoid jumping to the wrong conclusion in your next engagement.

© 2011 Institute of Management Consultants USA

Tags:  advice  analysis  assumptions  consulting process  diagnosis  interpretation  learning  market research  professional development 

 