Daily Tips for Consultants
Between 2005 and 2011, IMC published Daily Tips every weekday on consulting ethics, marketing, service delivery and practice management. You may search more than 800 tips on this website using keywords in "Search all posts" or by clicking a tag in the Top Tags list to return all tips with that tag. Members and registered guests may comment on individual tips, or use the Contact Us form above to contact Mark Haas CMC, FIMC, Daily Tips author/editor. Daily Tips are being compiled into several volumes that will be available through IMC USA and Mark Haas.

Top tags: client relations  communication  customer understanding  your consulting practice  marketing  consultant role  learning  client service  reputation  goodwill  consulting process  market research  practice management  sales  ethics  planning  client development  engagement management  innovation  proposals  professional development  professionalism  knowledge assets  prospect  trends  presentations  recommendations  consulting colleagues  intellectual property  product development 

#730: Prove That Your Consulting Practices Are Effective

Posted By Mark Haas CMC FIMC, Friday, December 30, 2011
Updated: Friday, December 30, 2011
How would you recommend management consulting as a whole improve its effectiveness?

The traditional definition says, "A management consultant is a professional who, for a fee, provides independent and objective advice to management of client organizations to define and achieve their goals through improved utilization of resources." Buried in this widely held definition lies the challenge for consultants. "Independent and objective" often ends up interpreted as thinking in novel ways about business and management, adapting a presumed "best practice" to a new situation, or developing entirely new management concepts to promote a portfolio of services with which we are familiar and practiced. Nowhere in this definition is there any requirement to evaluate and prove that what we propose actually works. Many commonly used and highly promoted consulting practices lack validation. To be sure, our approaches are logical, they align with other management theories, and our clients seem to have done OK after we applied them. But where is our proof of value? Evidence-based intervention is increasingly required in medicine, but not yet in consulting.

We as professionals need to develop a deeper capability to recommend and deliver to our clients only those practices and strategies that are provably effective. Proving effectiveness is hard, which is why it is rarely pursued. So we develop consulting approaches that are:
  • Too old - we propose approaches that were (maybe) effective a decade ago when the economy, culture and management practices were entirely different but are no longer applicable.
  • Too new - we propose something we just read about in a management journal (most of which these days are written by consultants) but that has only been tried a few times, much less proven effective widely or over the long term.
  • Too abstract - we propose convoluted and theoretical processes that we understand well but for which the client and staff have no realistic capability to adopt or sustain.
A healthy skepticism toward consulting techniques is our best defense against obsolescence, as a profession and as individual consultants. Look at most "standard" management concepts from the past thirty years and you can find legitimate, well-researched evidence that they are inappropriate for consultants to apply in many circumstances and potentially hazardous in others. We are now fully into a VUCA world (volatile, uncertain, complex, and ambiguous), where the pace and scope of business exceed the ability of any individual to think through improvement approaches alone. The standard of proof for consulting effectiveness will continue to rise.

Tip: Seek out disconfirming evidence for every concept, process, approach or technique you have in your consulting portfolio. There are good resources available. For an overview of how to think critically about your consulting approach at a high level, read carefully Flawed Advice and the Management Trap: How Managers Can Know When They're Getting Good Advice and When They're Not. For a more specific critique of individual techniques, look at Calling a Halt to Mindless Change: A Plea for Commonsense Management. Being a true professional means that, before we promote approaches we assume to be effective, we make sure we can defend our current practices against logic and evidence suggesting they neither make sense nor work all that well.

© 2011 Institute of Management Consultants USA

Tags:  agility  assessment  client service  consulting process  consulting skills  consulting terminology  consulting tools  diagnosis  education  innovation  learning  management theory  methodology  performance improvement  practice management  professional development  professionalism  quality  roles and responsibilities  sustainability  technology  trust  values  your consulting practice 

 

#707: Increase Your Consulting S/N Ratio

Posted By Mark Haas CMC FIMC, Tuesday, November 29, 2011
Updated: Tuesday, November 29, 2011
Consultants appreciate a client who provides plenty of information at the start of an engagement if it helps us get up to speed and avoids redoing analyses already done or recollecting data already gathered. Recently, however, many clients "helpfully" hand us every bit of data they have, which can be a real burden. How do we use it efficiently without incurring a scope increase just to sort through it?

Of course we want just the relevant data, not an invitation to view the client's file room. Consultant and client share responsibility for information triage. In electronics, the measure of signal quality relative to background noise is the S/N (signal to noise) ratio. This is a useful concept for consultants starting a project. Your first weeks often involve digesting data (hopefully information), product and market research, personnel records, strategy and planning documents, and the opinions of many people inside and outside the organization.

You want to increase the signal (accurate, timely, relevant, quality and usable information) to noise (inaccurate, outdated, irrelevant, low quality, and false information) ratio. Without straying into electronics engineering, the preferred approach to improve the measurement of the desired signal is to minimize the interference of background noise. This means working with the client to specify only that information (sometimes raw data) that you have a process to use. It is easy to get caught in the trap of wanting to absorb all the available information and then decide how to use it. Stick to your information management and analysis plan (you did specify these in your engagement project plan, right?) and the noise will decrease.
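The triage idea above can be sketched in code. This is a minimal illustration, not anything from the original tip: the `sn_ratio` helper, the topic labels and the sample inventory are all hypothetical, standing in for the step of checking each client document against the analysis plan in your engagement project plan.

```python
# Hypothetical sketch: treat the engagement's analysis plan as a set of topics
# you have a process to use, then score a client's document dump against it.

def sn_ratio(items, plan_topics):
    """Return (signal, noise, ratio): signal counts items whose topic
    appears in the analysis plan; everything else is noise."""
    signal = sum(1 for topic in items if topic in plan_topics)
    noise = len(items) - signal
    return signal, noise, signal / noise if noise else float("inf")

# Hypothetical engagement: the plan only calls for market and cost data.
plan = {"market research", "cost data"}
inventory = ["market research", "cost data", "1998 org chart",
             "holiday party photos", "cost data", "old strategy deck"]

signal, noise, ratio = sn_ratio(inventory, plan)
print(f"signal={signal}, noise={noise}, S/N={ratio:.2f}")
# -> signal=3, noise=3, S/N=1.00
```

The point of the sketch is the discipline it encodes: an item only counts as signal if your plan already has a use for it, which is exactly how you decline the invitation to the file room.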

Tip: The concept of signal to noise ratio is useful in other areas. What is your website's S/N ratio (useful and sticky information compared to total content)? What about your presentations (how many PowerPoint slides does it take to make your point compared to the total size of the slide deck)? Your marketing (how often do prospects have to ask follow up questions about your marketing calls)? Increase your consulting S/N ratio in all your communication.


Tags:  analysis  communication  customer understanding  diagnosis  information management 

 

#685: Consultants Need to Understand Type I and Type II Errors

Posted By Mark Haas CMC FIMC, Friday, October 28, 2011
Updated: Friday, October 28, 2011
I always hear about Type I and Type II errors in business and how important it is that consultants understand these concepts. Why should I care about this?

People are referring to a statistical concept: a Type I error is a false positive and a Type II error is a false negative. For the statistician, a Type I error is rejecting the null hypothesis when it is actually true. For a businessperson or consultant, a Type I error is seeing something that is not really there; a Type II error is missing something that really is there (and potentially company making or breaking).

A Type II error (false negative) can be serious when looking at competitive markets or human resource issues such as culture or employee opinions. Inadequate surveys or incomplete analysis may lead a consultant to conclude that there are no serious competitors or impending revolts among employees when, in fact, there are. Depending on the situation, a Type II error may result in serious losses for a company or put it out of business.

False positives are of most interest to consultants engaging in diagnostic or investigative activities, in two ways. As a consultant whose job it is to find problems to solve or opportunities to capture, we are looking for something on which to act. Maybe a process is "broken" or a market is "large and available" to your client. In either case, you may identify something that is not really significant enough to expend resources on. Alternatively, as a result of your activities, you conclude that your impact is significant when it really is not. In both cases, you have overstated the significance, or even existence, of your role to the client. Understanding Type I and Type II errors gives you good perspective on your role and significance to a client.

Tip: Think in terms of medical testing when you consider how you are going to control for Type I and Type II errors. The worst outcome when looking for a serious disease is to conclude it is not present when it is (Type II). To guard against that, we use screening procedures that are relatively fast, cheap and for which we can tolerate a Type I (false positive) error. As a consultant, you may want to develop protocols that let you quickly tease out potential problem areas and for which you recognize there may be Type I errors. Those items that show up may be real or, more likely, false positives. Then you can proceed with more focused and rigorous protocols to look more closely at an issue, recognizing that what you want to avoid is a Type II error (false negative). You don't have to be a statistician to understand the concept, and your ability to mitigate this risk on behalf of your client adds significant value.


Tags:  analysis  assessment  assumptions  consulting terminology  consulting tools  diagnosis  information management  recommendations  risk analysis  statistics 

 

#679: Make Sure Your Client Asks You the Right Question

Posted By Mark Haas CMC FIMC, Thursday, October 20, 2011
Updated: Thursday, October 20, 2011
How should a consultant handle a client who, after telling us the nature of the challenges the company faces, asks us to provide services to solve a different set of problems?

Consensus between client and consultant is critical to a successful outcome, and to recognition of that success. Presumably the client has brought you in to give an independent and objective view of the challenges and opportunities the client organization faces. If the client has already decided on the symptoms, underlying problems and solutions, then your role as diagnostician is eliminated, along with your role as designer of appropriate solutions. If this is the case, the first question is whether you are the right consultant. Consultants provide diagnostic and assessment expertise; if the client just wants you to implement their own solution, they are better off with a contractor.

The second question is whether you and the client have really focused on the right problem. The client or staff may be wedded to a problem definition that may be correct but that leads to a specific solution that is wrong. Most consultants know that the issues clients present first do not necessarily represent the full picture. Sometimes dividing an engagement (or offering to divide it) into several parts - diagnosis, design and implementation - can break this thinking and get the client to give you more latitude to help define the issues to be addressed.

Finally, recognize that you both benefit from an orderly discussion that moves from what you are trying to solve, through where the solution needs to be applied, to how it will be achieved and who (with what resources) is accountable for results. Read a short article on how to avoid misdirected projects.

Tip: It is important to have in hand a process to identify ill-defined projects and deal with them before you get too deeply engaged. Know how you can direct the project scoping conversation to either (1) open up a serious debate on fact-based and independent diagnosis or (2) disengage from the project respectfully. Don't agree to a project's scope, sequence and content until you and the client agree that you are asking the right question.


Tags:  client relations  customer understanding  diagnosis  engagement management 

 

#638: It's What You Know That Just Ain't So

Posted By Mark Haas CMC FIMC, Wednesday, August 24, 2011
Updated: Thursday, August 25, 2011
I like the comment attributed to Will Rogers: "It's not what you don't know that's the problem, it's what you know that just ain't so." Being human, we all get trapped in such assumptions, but doing so is particularly dangerous for consultants. How can we avoid making these mistakes?

It may be that consultants are prone to being misled because of the nature of their work. We are expected to take incomplete and often confusing information, make sense of it, and then use it to chart a course for our clients. We often know of similar circumstances elsewhere but, unless we have been working with a particular client for a while, we are new to the organization.

The nature of our work compels us to "jump to conclusions," and we usually consider that a good thing. So why do we jump to the "wrong" conclusions, even after a good analysis? You are referring to a number of well-documented biases that affect our decision making. The most potent is confirmation bias, in which a piece of information seems to fit comfortably into the "model" we have constructed. This strengthens our conviction that we are on the right solution path (and our assumption that we are indeed smart consultants).

A number of other biases, such as recency bias (overweighting the last piece of information we received), anchoring (overweighting initial information), and vividness bias (overweighting the most stimulating information), can also lead us to assume we are right when we could be badly wrong. The antidote is to constantly challenge your working model of the client's situation and to become a student of overconfidence.

Tip: Snopes.com lists a number of rumors that many people "know" to be true that just aren't. These persist because they are interesting, amusing or curious - and are either hard to verify or just not worth the effort. Find some of these rumors or urban legends (there are websites with plenty of these as well) you might believe and consider why you believe them. This might refresh your inner skeptic and help you avoid jumping to the wrong conclusion in your next engagement.


Tags:  advice  analysis  assumptions  consulting process  diagnosis  interpretation  learning  market research  professional development 

 