Whenever management introduces a change, things seem to get worse before they get better, or get worse and prompt further change in the hope of fixing the newly created problem. Sometimes it just spirals out of control from compounded reactions and feels like there is no way to pull out before the crash. It is interesting to look at this phenomenon from a systems perspective. I am going to use a lot of "geek speak," but I think the analogy helps explain what causes spiraling out of control and how to prevent it. So put on your propeller hat and be assured you can take it off when you reach the end.
Linear feedback systems are characterized by a mathematical model of the relationship between inputs and outputs. Signals in the model have two attributes: amplitude and phase. Amplitude is the strength of a signal; phase is when a signal change occurs. Management systems are complex, but they have a similar characteristic: there are actions with direction and force, and reactions that occur in time, usually delayed time.
It is desirable for a system to have a simple relationship between input and output, such that the output follows the input. When the input increases, the output increases, and vice versa. However, in some systems, when the input increases, the output decreases at first, then increases later. The mathematical description of this is a Right Half Plane Zero. The physical cause is related to energy storage and delay.
In some power conversion circuits, energy is stored during one time period and transferred to the output during a later time period. Other conversion circuits store and transfer at the same time. The circuit that delays the energy transfer has a Right Half Plane Zero and responds to an increase in input by decreasing its output before increasing it. Management systems are full of energy storage and delay, and behave in a similar manner.
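To make the "worse before better" shape concrete, here is a minimal sketch in plain Python (Euler integration; the time constants are illustrative assumptions, not values from any real converter) of a first-order system with a right-half-plane zero, G(s) = (1 - Tz*s)/(tau*s + 1):

```python
# Step response of G(s) = (1 - Tz*s) / (tau*s + 1): a first-order lag
# with a right-half-plane zero at s = 1/Tz. Plain Euler integration.
def rhp_zero_step(tau=1.0, Tz=0.5, dt=0.001, t_end=5.0):
    x, u = 0.0, 1.0              # lag state and unit step input
    ys = []
    for _ in range(int(t_end / dt)):
        dxdt = (u - x) / tau     # first-order lag dynamics
        y = x - Tz * dxdt        # the RHP zero subtracts Tz * dx/dt
        ys.append(y)
        x += dxdt * dt
    return ys

ys = rhp_zero_step()
# The output starts at -Tz/tau (worse), then recovers and settles near 1.
print(ys[0], ys[-1])
```

The instant the input steps up, the output dips negative, then recovers and settles at the new, higher level: exactly the "gets worse before it gets better" behavior described above.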
Let's look at an example from a startup I was employed by many years ago. This startup manufactured semiconductor devices. Each month wafers would exit the FAB and be sliced, packaged, and tested. There was a pattern of wafers exiting the FAB the first week of the month and parts shipping on the last day of the month. The accounting system produced metrics each month. The board reviewed the numbers and managed the expectations of the stockholders, in this case the VCs. There was huge pressure to have good numbers each month to maintain confidence in order to get another round of funding.
Unfortunately, this was an inefficient operation because flow was not smooth. I suggested that we smooth the process, which means packages ship after the end of the month during the time that new wafers are exiting the FAB, thus removing the peaks in the testing process. How would the suggested change impact the system? First, there would be a delay in shipments of one week. This would be a one-time delay. It would be followed by increased efficiency, which would eventually increase the plant's capacity. From a systems point of view, the response is decreased output followed by increased output. A side effect would be poor metrics for one month, followed by better metrics each month as the improved efficiency took effect. Also, some customers might be angry about the delays.
Like the power converter, this system had a storage mechanism followed by a delayed transfer. Silicon processing is batch oriented. A batch exits the FAB and is packaged. Further batches leave packaging and go to test. Then these batches are re-batched and shipped. The change in delay caused the same effect as a Right Half Plane Zero. This phenomenon is very common. If customer demand increases suddenly, working capital increases, inventory drops, and so on. We deal with these problems by buffering with inventory, building lean systems, improving predictive analytics, and the like. The bullwhip effect in a supply chain is another example of energy storage and delay, but in that case it leads to oscillations.
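The bullwhip dynamic is easy to reproduce in a toy simulation. The sketch below (illustrative numbers, not data from any real supply chain) models a single retailer who orders to restore a target inventory while deliveries arrive only after a lead time; a small step in demand produces order swings many times larger:

```python
def bullwhip(periods=60, lead_time=2, gain=1.0):
    """One retailer orders to restore a target inventory, but orders
    arrive only after a lead time. Storage (inventory) plus delay
    (lead time) amplifies a small demand step into large order swings."""
    inventory, target = 20.0, 20.0
    pipeline = [0.0] * lead_time              # orders already in transit
    demands, orders = [], []
    for t in range(periods):
        demand = 12.0 if t >= 20 else 10.0    # small step in demand
        inventory += pipeline.pop(0) - demand # receive, then ship
        order = max(0.0, demand + gain * (target - inventory))
        pipeline.append(order)
        demands.append(demand)
        orders.append(order)
    return demands, orders

demands, orders = bullwhip()
# Demand varies by 2 units; orders swing many times more than that.
print(max(demands) - min(demands), max(orders) - min(orders))
```

With a lead time of two periods and a full-strength correction each period, the orders oscillate between zero and several times the demand change, which is the reaction-to-a-reaction spiral in miniature.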
Things get really bad when management fails to understand these dynamics and reacts to the temporary decrease in output/performance or increase in cost. This leads to oscillation like the bullwhip effect, or worse. In some cases, if the reactions continue, the feedback turns positive, and the system no longer self-regulates and self-destructs. This happens when a change temporarily decreases performance, and before the system switches to increased performance, there is another change which results in a second, more intense decrease in performance, followed by another... driving it into destruction. Sometimes fear is the real force behind the reactions.
There are a couple of ways to avoid spiraling out of control. One way is to recognize the pattern and wait long enough to learn whether the reduced performance is temporary and will self correct like the Right Half Plane Zero, or continue. If it continues, the relationship between the change and its effect may not be understood. A second approach is to anticipate the effect and counter it with another change. In engineered systems, this is called feed forward. Essentially, you compensate for the Right Half Plane Zero by providing another path that allows rapid changes in input to bypass part of the system and go straight to the output. A third approach is to remove the Right Half Plane Zero by redesigning the system.
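In circuit terms, feed forward can be sketched with a small simulation of a first-order lag with a right-half-plane zero (the constants are illustrative assumptions): a direct path driven from the input side cancels the undershoot term, so the output no longer dips before rising.

```python
def feedforward_step(tau=1.0, Tz=0.5, dt=0.001, t_end=5.0):
    x, u = 0.0, 1.0
    raw, comp = [], []
    for _ in range(int(t_end / dt)):
        dxdt = (u - x) / tau
        y = x - Tz * dxdt        # plant with RHP zero: undershoots first
        ff = Tz * dxdt           # feed-forward path from the input side
        raw.append(y)
        comp.append(y + ff)      # compensated output: a plain first-order lag
        x += dxdt * dt
    return raw, comp

raw, comp = feedforward_step()
# raw dips below zero; comp starts at zero and rises toward 1.
print(min(raw), min(comp))
```

In the startup example, the pre-built inventory plays exactly the role of the `ff` path: it supplies output directly while the slow internal path catches up.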
Here are some recommendations:
If you don't understand a management system, delay reactions to change and study its behavior. Don't overreact, and certainly don't let reactions compound. You must learn the relationship between input and output. Also consider making only one change at a time.
If your business model will allow it, go lean. Going lean will effectively speed up the effect of the Right Half Plane Zero or eliminate it. If the effect happens fast enough, you will reduce the chance of reacting to a reaction.
Use feed forward. In the startup case, this implies that for several months you work hard to build some inventory. Then, when the system is changed, use the inventory to avoid late shipments and poor metrics. Manage management's expectations and let them know that the system will still have some temporary swings in metrics that will settle out within a few months, but their magnitude will be smaller than just making the change and taking the hit all at one time.
Well, enough "Geek Speak." I hope you see the value of systems thinking and not overreacting when making changes. Remember that human systems are more complex than mechanical or electrical ones. The same principles apply, but the complexity demands more attention and experimentation.
For the non-geeks out there, a good resource is Peter Senge, who gives examples of system behavior in less geeky terms.
Do values matter to New Product Development? I suspect most people would answer yes to the question, but not agree on what those values are. Let's work by analogy and see if we can sniff them out. An example from manufacturing will be used to tease out the four values of Efficiency, Effectiveness, Design, and Optimization. These will then be applied to product development, context will be discussed, and some observations will be given.
I spend a lot of time in semiconductor manufacturing environments, in particular final electrical test. Test floors in Asia are usually organized as a grid of work stations, with operators managing multiple workstations. Each work station has a robot called a "handler" and a "tester": the handler moves product, and the tester finds manufacturing defects.
The handler's input stage feeds a robotic mechanism that moves devices between operations such as rotate, test, mark, and inspect. Devices then exit the machine and are placed into reels if they are good devices, and a trash bucket if they are bad devices.
The market for semiconductors is very competitive and the equipment is very expensive. No one survives for long unless this process is very efficient. Units per hour is universally monitored and managed. Small improvements in throughput directly hit the bottom line. The core value is Efficiency.
However, end users of the devices care about quality. Devices must meet specifications. As always, the world is gray, and there are tradeoffs between test time and test quality. A more expensive tester can improve quality and throughput, but raises the capital cost. Averaging a measurement can reduce variation and the probability of test escapes, but increases test time. When the goal is quality, the core value is Effectiveness.
Somewhere in the manufacturing organization someone worries about test strategy, capital allocation, market strategy, and competitive positioning. Decisions regarding purchase of buildings and equipment operate on a different time scale than everyday improvements. This core value is Design.
The last value is more holistic and organic. A well functioning manufacturing organization will maximize value and minimize cost, but you will find many localized sub-optimum processes. Someone at the top has the responsibility to ensure that all functions work as a whole to reach a global maximum. We can call this core value Optimization.
Values vs Behaviors
The reason I like to call these values rather than perspectives or behaviors is that people get attached to them. If you have effectiveness tendencies, how do you feel when you can't get something done because someone with an efficiency bug won't work around the process? And if you have spent your entire career improving efficiency, how do you feel when someone wants to trash your beloved process while changing the organizational design or getting something done a different way? These emotional attachments make them values, and they can be very persistent, even to the detriment of the organization.
New Product Development Example
How do these values relate to New Product Development? Let's change perspectives and consider the companies that develop the robotics and test equipment, and their development processes.
Capital equipment is very complex and the design process is expensive. A typical system has mechanics, software, firmware, and electronics. With long development cycles, the risk of delivering a product that does not sell is large.
First, consider the software. Suppose the team uses an agile process. Sprints are by design a process of translating requirements into deployable code. Well designed sprints are very efficient. Roles, process, and tools are well defined. At the end of each sprint there may be a postmortem where improvements to the process are discussed. At this level there is not a lot of work on effectiveness, because it is all about execution efficiency. (Agile guys and gals: please don't beat me up over the simplification; I know it is not as simple as this.)
The sprints consume a backlog of requirements, typically managed by a product owner or product manager. The product manager is responsible for delivering maximum value to a market or customer. There is some negotiation with the scrum master over the backlog, effectively managing the tension between delivering value and managing execution. This is all about effectiveness, and about managing the relationship between efficiency and effectiveness.
When it comes to mechanics and electronics, things work a bit differently. The cost of iteration is too high to run an agile team, so much more upfront requirements effort is required. This changes the dynamics for the product manager, who must spend more time in the field understanding the market and the context the equipment will be used in. The product manager must generate an effective product definition, then hand it off to an efficient development team.
Both values of efficiency and effectiveness are in play, but their emphasis and time relationships have changed due to the nature of the product. Agile processes can relax the upfront need for effectiveness a bit and depend on feedback coupled with efficiency. A staged process with high iteration costs can relax efficiency a bit and depend on a very effective front end that gets the requirements right the first time. (If you are in a fast moving market for hardware based products, start thinking platforms as a way to deal with the mismatch between the need for speed and the need for upfront definition.)
Design plays a larger role in developing capital equipment. The huge capital risk puts pressure on strategic thinking, because a bad bet can result in company failure. The design of the organization may be customized on a product-by-product basis. Optimization plays a smaller role, due to the small number of products in development. With less routine and data, optimization is difficult.
Dynamic Nature of Values
The four values of Efficiency, Effectiveness, Design, and Optimization are universal. However, their emphasis and interrelationships are context dependent. In addition to context dependency, individuals have biases and tend to favor one above all the others. Group bias is self reinforcing, working against the need to change when the context changes.
My goal here is to create a language for discussion of values in the context of product development. The values are Efficiency, Effectiveness, Design, and Optimization. I propose using the language as follows:
- Look at your current situation and categorize behavior according to the four values. Ask where these values are found in your new product development, who holds them, their relative strength, and how they relate to each other.
- Look at your market and assess how well the current values address it in terms of product development. Do they create value for the market, or hinder the creation of value?
- Decide what shifts in values must take place to improve value creation.
My experience seems to indicate that effectiveness in product development is harder to come by than efficiency. Efficiency is much easier to measure, and the personal risk is much lower. If one is measured by conformance to process, a product can fail in the market while one receives high marks. Such risk avoidance happens in other areas of product development. In "Leading Product Development," Wheelwright and Clark give the following reasons why senior leaders tend to get involved in new product development when things go wrong, rather than up front:
- Low risk/high return when they arrive late in the game.
- It's urgent and visible when it is in trouble.
- Firefighting skills are rewarded.
- It's exciting to put out fires.
- It is easy to be wrong at the beginning; at the end, if one fails to fix the problem, it is not your fault.
- A lot of knowledge is required to be involved at the beginning.
- At the front end problems are not well defined, and tools/roles are not clear.
- Lack of metrics/long feedback paths means less feedback.
- Absence of urgency.
I think that effectiveness inherently involves risk, because it requires judgment. In a risk-averse environment, this skews the values toward efficiency. If senior managers are subject to this bias, I imagine everyone else is too. Yet what I think works best is a focus on effectiveness on the front end of product development, efficiency on the back end, and a well managed boundary between them.
This implies companies must deal with the risk-averse attitudes that block effective behavior.
Four values that are important for product development are Efficiency, Effectiveness, Design, and Optimization. The emphasis on and relations between these values are context dependent. People and groups have preferences and tendencies that need to be managed. The front end of product development tends to require effectiveness, and the back end tends to require efficiency. Risk aversion tends to skew values toward efficiency.
The core categories of Efficiency, Effectiveness, Design, and Optimization are inspired by Adizes. Please see his books if you want to dig deeper into his framework. Note that he uses different words and has different nuances than I do. I don't make any claims of representing his work, I just want to give credit to his work because it led to my categorization.
Here is a great introduction to the life of a Product Manager from Stanford University
There are several thinking tools I use, but I don't often think about the tools, their advantages, and their limitations. When I did a search on Mind Maps AND Affinity Diagrams, not much showed up that compared the two, so I thought I would put some thoughts down and share them to see what other people think.
My most common tools are linear notes, mind maps, and affinity diagrams. In all three cases, creating an artifact aids thinking by feeding back to me a prior thought. I experience the same dynamic for other creative processes. When I create software, I model architecture with UML and code. Both speak back to me and generate more ideas. If I am building a financial model, I can run scenarios and let the model show me the results.
These tools have the greatest advantages when two conditions are present: complexity and groups of people. Software is complex. Just try to write software in your head, then write it down after you are done. The mind is amazing, but it can't keep very much code in it. With multiple people, an artifact works like shared memory. Every idea feeds back to every mind to generate more ideas.
IDEO works with these dynamics. A group of people have a problem, and they prototype solutions as a group. The prototype is the artifact. When complexity and multiple people are combined with a flexible medium, there is a third dynamic. Not only do ideas go into the artifact, and the artifact speaks back, people discuss the artifact with each other. This cross-pollination process tests concepts, compares interpretations, and generates ideas.
Let's look at the three tools individually, compare them, and consider how they influence thinking.
Notes are as old as the hills. In the old days of paper and pencil, the only place you could write was at the bottom or between the lines. With computers, you can write anywhere. Nonetheless, information is mostly one dimensional, except to the extent that you use bullets and tabs. Notes are easy because you can record a stream of consciousness, record a lecture, or just dump your short term memory before you forget something. Linear notes match how you read a book or an article like this one. If you are writing a book, even though the book has structure, you are serializing the information. When you read, you are processing information serially, much like listening. When you write, you communicate thoughts, much like speaking.
Linear notes match our input-output devices more than our thought patterns.
Mind Maps and Affinity Diagrams
These diagrams present information in a more parallel fashion. They are inherently multi-dimensional. Information is connected in tree form. The paper layout is two dimensional, and there is hierarchy. If nodes contain sentences, these diagrams contain linear notes within them. If nodes contain single words, they are more parallel in nature.
A mind map starts with a central image. Then you draw your first branches. These are described by Tony Buzan as Basic Ordering Ideas. Branches are then added to branches. Child branches are associations of ideas. Buzan emphasizes that branches get smaller as you get farther from the central image and that each branch has one word. By avoiding sentences on a single branch, you can add ideas at the granularity of a single concept.
This mind map is a combination of single words and sentences.
Affinity diagrams are normally built with sticky notes. The process begins with a collection of notes, and affinities are discovered by looking and grouping. Once groups are formed, higher level groups are formed until you have a small number of groups or just one. The end result is a hierarchy. Individual notes are almost always sentences, so there is always a linear aspect to an affinity diagram. It reminds me of a concentration game. You flip two cards and try to match them. You have to remember where cards are, so when you flip one card, you remember where its match is. It is a pattern matching game.
This is a generic example from mindtools.com. Random ideas are organized into themes. (Source Mind Tools)
Even though the form of the result appears similar, the processes involved could not be more different. Mind mapping tends to be a very top down process. You start with a central idea, then put down Basic Ordering Ideas. (Note the term "Ordering") From these branches come more. If you work from the center out, the mind map can develop very analytically from whole to part. Buzan claims that the mind map breaks the left brained linear thinking process and aids whole brain thinking. I believe this depends completely on the process, not the representation. Nonetheless, because a mind map begins with Basic Ordering Ideas and unfolds by triggering ideas in the mind, it will tend to work top down at first. As the map gains complexity, new ideas have to be placed on the map where they associate. This part of the process is more creative and right brained, but is constrained by the Basic Ordering Ideas.
Affinity diagramming is a very bottom-up, right-brained process. It starts with a collection of ideas. The goal is not to break an idea apart, or to add new ideas, but to find patterns in the ideas available. More analytical thought processes might find affinities based on common nouns in the ideas, but a more creative thought process will find affinities that are not so obvious. Many times the affinities are somewhat goal directed. For example, if you are making affinity diagrams from interview notes on a work process, the affinities should be related to the work process. The end goal is creative discovery.
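As a toy illustration of the difference, the sketch below mechanizes only the "analytical" affinity pass, grouping notes by shared keywords. The note texts and keywords are invented for the example:

```python
from collections import defaultdict

def affinity_by_keyword(notes, keywords):
    """Group notes under the first keyword they mention. This mechanizes
    only the 'analytical' pass; finding non-obvious affinities is the
    creative work a rule like this cannot do."""
    groups = defaultdict(list)
    for note in notes:
        for kw in keywords:
            if kw in note.lower():
                groups[kw].append(note)
                break
        else:
            groups["unsorted"].append(note)   # no keyword matched
    return dict(groups)

notes = [
    "Operators wait for the handler to finish",
    "Test time dominates throughput",
    "Handler jams require operator intervention",
]
print(affinity_by_keyword(notes, ["handler", "test"]))
```

A human doing the same exercise might instead group the first and third notes under "operator idle time," an affinity no keyword rule would find; that gap is exactly the point.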
Both mind maps and affinity diagrams can be linearized when done, by creating prose or by storytelling. Mind maps work well for deductive and analytical thinking, and affinity diagrams work well for inductive and synthetic thinking. This is a somewhat stereotypical view. In reality, the mind is not always stuck in one of these modes, and there is a blurring effect.
One can also work from linear notes. A mind map can be created by outlining prose. An affinity diagram can be created by segregating sentences and finding themes.
What this means to me is that linear notes, prose, and storytelling are good vehicles for communication and capture. Mind maps and affinity diagrams are good tools for processing and creating. Trying to create from linear notes usually means sucking them into a head, working with them there, then spitting them back out into notes. The whole purpose of these tools is to work with an artifact rather than do everything in your head. This allows one to solve more complex problems, with more than one person.
Storytellers, mind mappers, and researchers each prefer their respective tools. I wonder, though, how much we constrain ourselves. Tools have built-in biases. They are like the electrician's tool belt. When you need to strip a wire, pull out the wire strippers. When you need to attach a wire to a plug, bend the wire with pliers, then fasten it with a screwdriver. Each problem has its own tool. On the other hand, the tool belt as a whole allows for creative combinations of tools and orders of application.
Does the mind really work this way? Does the mind move from tool to tool to solve a problem? Is not the mind an inherently parallel system? Is it not capable of analysis and synthesis at the same time? Yet once we interact with artifacts, the process is sequentialized. All communication and group creation is constrained by the dimensionality of our interfaces and mediums.
I think it is important to recognize that visual processing is highly parallel, compared to listening to someone speak. Even though we are limited to serial interfaces for speaking, writing, and drawing, once a visual artifact is created, the artifact speaks back to a parallel interface. This is what makes models so effective.
To get the most out of visual artifacts requires a dynamic process of creation, listening, and interpersonal interaction. It is the using of all our sensory dimensions, our minds, and our social nature. By extension it is probably best to use multiple models as well, quite possibly at the same time.
I recently asked the CEO of a startup, one who is thoroughly an effectual thinker, what he thought of building an economic model similar to that found in "Developing Products in Half the Time." The answer was that he would not believe the model. I then asked a product manager of a company serving a mature market the same question, and the answer was that not only do they build models, but they drive the model with data from past projects and industry analysis, and the CEO hammers on every corner of the model until they believe it represents reality. Only then is a project approved. This CEO is a hard core causal thinker.
I was not at all surprised. The effectual thinker knows that data and assumptions are suspect and constant feedback is more reliable. The causal thinker knows that if the data is good, they can make better tradeoffs. Each has its place.
Regardless of ones thinking tendencies, a lot can be learned by experimenting with a model. By playing with assumptions and their effect on the economics, one can get a feel for which assumptions need better testing and how they relate to development tradeoffs.
I'll walk through an example and see what we can learn. Here is the story: we are a capital equipment supplier selling manufacturing equipment to electronics manufacturers in the US and Asia. The market size is $100M per year, our product is priced at $200K, and the market grows at 8% per year. The product is targeted at a newly developing market segment.
Let's get started on a model. The first thing we need to do is model revenue. Because we are targeting a new segment, we will model the market and sales separately. The market model has two components: the available market and the diffusion of the new product. The available market is:
Available = Market Size × Quality × Awareness × Buy
The Quality factor is used to model the effect of entering the market and then improving quality. This is a simple way of accounting for learning through market experience. Awareness is one of the A's in the classic ATAR model. Buy models the fact that there are substitutable solutions and this is the portion of sales that can be expected.
The diffusion effect is the standard model that uses a Trial Rate and a Copy Rate.
Let's build out a spreadsheet for this and talk about its assumptions:
Market size is in units sold, with an 8% growth rate. The market will only buy half as many systems in the first year, and four years later quality no longer matters. Awareness starts out at 25% and reaches 100% in year 7. The assumption is that there are many small customers all over the world that are hard to find. 60% of the market will buy this type of solution, and 40% will buy substitute solutions or adapt what they already use. The diffusion model uses a trial rate of 10% and a copy rate of 20%. The Adopter column displays the number of systems per year that the market will buy. The Penetration column shows the total systems sold. At year ten, 350 systems will be in use.
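A rough sketch of this market model in code may help. The translation of the spreadsheet into formulas, and the quality and awareness ramp shapes, are my assumptions; the unit base of 500 systems per year comes from the $100M market at $200K per system:

```python
def market_model(base_units=500, growth=0.08, years=10,
                 trial=0.10, copy=0.20, buy=0.60):
    """Available market x diffusion, per year. base_units = $100M market
    at $200K per system; the quality and awareness ramp shapes are
    assumptions, not the post's actual spreadsheet formulas."""
    F = 0.0                                           # cumulative diffusion
    rows = []
    for y in range(1, years + 1):
        size = base_units * (1 + growth) ** (y - 1)
        quality = min(1.0, 0.5 + 0.5 * (y - 1) / 3)      # 50% -> 100% by year 4
        awareness = min(1.0, 0.25 + 0.75 * (y - 1) / 6)  # 25% -> 100% by year 7
        available = size * quality * awareness * buy
        f = (trial + copy * F) * (1 - F)              # new-adopter fraction
        F = min(1.0, F + f)
        rows.append((y, available, available * f))    # year, available, adopters
    return rows

for y, available, adopters in market_model():
    print(f"year {y}: available {available:6.1f}, adopters {adopters:6.1f}")
```

The diffusion line is a Bass-style update: each year's new-adopter fraction combines the trial rate with copying from the installed base, so adoption starts slowly, accelerates, then saturates.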
The important assumption is that there are multiple competitors entering the market at the same time, and this represents how the addressable market will play out as a whole. We now have to model how much of this market we can take:
This model says that we can make 80% of the addressable market aware of our product, but initially we can only support sales to 30% of the available market. By year 4 we can support 100% of the available market. Our sales conversion rate is 30%, which is a way of modeling market share. The model says we can take 30% of the market we can reach. The Quantity column represents repeat business on average for this type of product. Sales ramps up over the first few years as customers gain experience with the product in production and purchase risk decreases. The end result is a number of units sold.
Multiply these models together, apply the sales price, and we have a revenue model for the base case:
The goal is to model four sensitivities: COGS, Development Cost, Performance, and Delay to Market. We can model performance reduction and market delay in the revenue model. Performance is modeled by reducing the Buy column of the market model by 10%: if performance is lower, people will turn to substitutes. We could also model this in the sales conversion rate, but using Buy simplifies the model. Market delay is modeled by delaying availability one year and lowering the sales conversion rate to reflect a loss of market share. The assumption is that a one year delay will cause a 33% reduction in market share. If you play around with the spreadsheet, it becomes clear that the loss of market share causes the largest loss. If this is not the driving factor in a new product, you can model the delay in other ways. For example, if your competitor can lock up the input value chain, you can model higher COGS. Here are all three models side by side:
The left graph is the 10% performance reduction, the middle graph is the base case, and the right graph is the one year delay to market. Clearly, in this case, delay to market has the larger effect.
Let's now model profit and cash flow, and look at the other two sensitivities:
COGS is set to 55%, giving a 45% margin, which is a very conservative number. We model higher production costs by raising COGS 10%. Our SG&A is 25%. Development cost is $1M, and there is a $100K yearly development cost associated with product improvements. A cost overrun is modeled as a 10% increase in development cost. We use a 30% tax rate and make an approximate cash flow projection. We then calculate an IRR of 44%, an NPV, etc.
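The cash-flow metrics at the end are standard calculations. Here is a minimal sketch of NPV and IRR in plain Python; the cash-flow numbers are hypothetical, not the post's spreadsheet values:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return by bisection; assumes NPV crosses zero
    once between lo and hi (true for spend-then-earn profiles)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid     # NPV still positive: the IRR is higher
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical flows in $M: development spend, then growing net cash in.
flows = [-1.0, 0.2, 0.5, 0.8, 1.0, 1.1]
print(round(irr(flows), 3), round(npv(0.15, flows), 3))
```

Bisection is crude but transparent, which suits the point of the post: the model is there to be interrogated, not trusted blindly.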
A note on the model: the base values are set up in a table so we can tweak the base assumptions.
The sensitivities are handled with switches:
Each sensitivity is tested using the switches and the result is graphed:
A one-year delay to market has the greatest effect, followed by manufacturing cost. The performance impact is much smaller, and the development cost impact is smaller still! However, what gets a lot of attention in the heat of development? Development cost, especially if you are in a startup. Sometimes you just don't have the cash and have no choice but to starve development. But if your product development looks like this, your decisions should reflect it. This is the basis of Reinertsen's books on Lean Product Development.
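The switch idea can be sketched as a toy profit model where each sensitivity is a multiplier that defaults to zero. The rates follow the post (55% COGS, 25% SG&A, $1M development); the unit volume is illustrative. The pattern matches the conclusion above: a 10% COGS increase hits profit far harder than a 10% development overrun.

```python
def profit(units, price=0.2, cogs=0.55, sga=0.25, dev=1.0,
           cogs_up=0.0, dev_up=0.0):
    """Toy one-year profit in $M with sensitivity 'switches' that
    default to zero. Rates follow the post (55% COGS, 25% SG&A,
    $1M development); the unit volume is illustrative."""
    revenue = units * price
    costs = revenue * (cogs * (1 + cogs_up) + sga) + dev * (1 + dev_up)
    return revenue - costs

base = profit(100)
# COGS scales with revenue, development cost does not, so the same 10%
# "switch" produces a much larger hit on the COGS side.
print(profit(100, cogs_up=0.10) - base, profit(100, dev_up=0.10) - base)
```

The reason is structural: COGS scales with every unit sold, while development is a one-time expense, so identical percentage overruns have very different dollar impacts.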
How realistic is the model? As with most models, the market model is the most difficult, and market share is the hardest number to estimate. If you have developed similar products, you can use analogies. Performance requires some guesswork, but development cost and manufacturing cost are usually fairly accurate, and when they are not, you can improve the numbers as the project develops. We can deal with the market delay estimation by testing, but guess what? You have to at least have a prototype product to do that. You can do better by concept testing before engineering development. There is no choice but to do the best you can and refine the model as you go along.
Much of the value of economic modeling is in the thinking process. Are you a fast follower? Does late entry affect market share? Does it affect development cost? Is this a winner-take-all market? How does performance affect sales? Even if the model is not perfect, you will probably gain a first order approximation of what is driving cost and what should be managed. The point is that your intuition may be inaccurate, so using models will test it. If you can build the model as a cross functional team, you will have a much better model. Modeling as a team forces you to justify assumptions and reach common agreement on how to manage the cost of product development, so that everyone is aligned rather than working at cross purposes with different assumptions about cost.
So, if this were your model, what would you do with it? What tradeoffs would you make? How would it affect your process and decision making?
Note: if you want a copy of the Numbers spreadsheet used in this post, send me an e-mail. See the contact page of the Website.
Standard product development wisdom says use a Phase Gate process, but is that always the best practice? Reinertsen is appropriately suspicious of methods and best practices, and offers principles instead in his Lean Product Development. In general I side with Reinertsen, but are there circumstances where even Lean Product Development principles dare not go? Let's look at a special case of product development: products created by entrepreneurs.
According to Saras D. Sarasvathy at the University of Washington, entrepreneurs use effectual reasoning as opposed to causal reasoning:
Causal rationality begins with a pre-determined goal and a given set of means, and seeks to identify the optimal - fastest, cheapest, most efficient, etc - alternative to achieve the goal.
Effectual reasoning... begins with a given set of means and allows goals to emerge contingently over time from the varied imagination and diverse aspirations of the founders and the people they interact with.
The contrast between the two forms of reasoning is shown in the following diagram from Sarasvathy:
The causal model starts with markets, and ends with segmentation. This approach is very similar to Phase Gate approaches to product development. The standard PDMA front-end stages are:
- Opportunity Identification
- Concept Generation
- Concept Evaluation
Opportunity Identification selects markets and segments. At the end of Concept Evaluation the business case is complete, and it is time to execute Development and Launch. The Phase Gate framework assumes causal thinking.
Lean principles begin with an economic model that quantifies four economic objectives:
- Cycle Time
- Product Cost
- Product Value
- Development Expense
The goal is to be able to trade off each objective against the others to maximize economic gains. The remaining principles are about maximizing economic outcomes. This works well with causal thinking, because causal thinking is about goals and how to reach them. Lean aims to maximize economic results by achieving flow.
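As a sketch of what "trading off each objective" means in practice: once the economic model prices every objective in the same currency (lifecycle profit), a decision like buying cycle time with extra development expense reduces to simple arithmetic. The cost-of-delay figure below is hypothetical, not a number from the post:

```python
# Hypothetical trade-off in the Lean style: with every objective priced in
# dollars of lifecycle profit, decisions become number comparisons.
# The cost-of-delay figure is invented for illustration.

COST_OF_DELAY_PER_WEEK = 25_000  # lifecycle profit lost per week of delay

def worth_it(extra_expense, weeks_saved):
    """Accept extra development expense if the delay cost it avoids is larger."""
    return weeks_saved * COST_OF_DELAY_PER_WEEK > extra_expense

# Spend $50k on outside contractors to pull the schedule in by 4 weeks?
print(worth_it(50_000, 4))   # prints True: $100k of delay avoided vs $50k spent
```

The same comparison works in the other direction: if saving $200k of expense would cost those same 4 weeks, the model says keep the schedule.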
Effectual thinking turns causal thinking on its head. Rather than plan and then execute, one executes and then plans. Start with the customer, even selling what you don't yet have, then figure it out as you go. Entrepreneurs are never sure what market they will end up in. Phase Gate is irrelevant in this context, because you Launch first, develop the product, then define your market as you go.
With Lean one can sell an undeveloped product, develop an economic model, and then apply Lean principles. In general, however, effectual thinkers are playing with new products in new markets. Sarasvathy describes this situation with the term Suicide Quadrant. It is very difficult to build an economic model in the Suicide Quadrant.
Nonetheless, other Lean principles apply. According to Sarasvathy, effectual thinking operates on three principles:
- Effectual reasoning emphasizes affordable loss
- Effectual reasoning is built upon strategic partnerships
- Effectual reasoning stresses the leveraging of contingencies
Lean has similar principles:
- Reduce loss from bad outcomes via fast feedback
- Exploit unplanned economic opportunities
- Use fast feedback to make learning faster and more efficient
Effectual reasoning and Lean both rely on feedback. This commonality makes Lean Product Development a better match for effectual thinkers than Phase Gate. Most entrepreneurs would reject Phase Gate simply because it represents causal thinking, which they reject outright. Sarasvathy points out that companies tend to shift from effectual thinking to causal thinking as they mature. Applying a limited form of Lean Product Development during the early life of an entrepreneurial company should allow a smooth transition to causal thinking. As the shift occurs, Lean principles can be applied as appropriate until the foundational economic model becomes practical.
Phase Gate is more appropriate for a very mature company that wants to minimize risk at the expense of cycle time. Trying to force Phase Gate on an entrepreneur will trigger an immediate immune response. So if you find yourself working for an entrepreneur and want to put some process in place, I suggest avoiding Phase Gate and taking a look at Lean Product Development. Look up Reinertsen on Amazon and buy a couple of books, or give me a jingle.