Our latest version of the Fast Excel Development Template fully supports modular systems. We have written here and there about modular systems. But, up until now, we have yet to provide in-depth content on how modules work and why you would use them instead of a single workbook. This article will be the first in a series to do exactly that.
The latest version of the Fast Excel Development Template is built by my good friend and Production-Scheduling.com Partner, Gabriele Tettamanzi. Gabriele has a unique perspective. He has both management experience (operations manager for a tier-one automotive supplier) and consulting capability (World-Class Manufacturing at Fiat Chrysler Auto). He knows Operational Excellence and supply chain inside-out. But also has deep knowledge in Management Systems and is a capable programmer.
I tell you this because I want to illustrate why it is important for us at Production-Scheduling.com to combine process excellence with systems development skills. In this article, I’ll show how these modular systems, built with the Fast Excel Development Template, can apply powerful ideas that sit at the heart of lean-Agile best practices.
These ideas involve a bottom-up approach to planning. This contrasts with the top-down approach that has largely failed with enterprise applications like ERP. The main takeaway is this: Planning systems are only as strong or weak as the underlying approach allows them to be. It is important to get the concept right and understand why top-down planning has often failed to help companies respond to uncertainty and prepare for the future.
This article will take a step back and challenge some of the assumptions around planning systems. It will then suggest some practical principles that can lead to better planning. If you would like a more hands-on tutorial on exactly how to develop modular Excel-based systems with the Development Template then go here. This link will take you to a webinar (or after 1 Nov 2022, a replay of a webinar) where we will provide exactly that. In the meantime, it might be helpful to understand some context.
Before we dive in, a word on software versus spreadsheets. The weaknesses of top-down planning may be illustrated with examples of packaged software. Our suggestion of a bottom-up solution may be implemented in the form of spreadsheets. However, this argument is not about how ERP software is dumb and spreadsheets should rule the world. We offer an Excel solution only because it is something that you can take away and start using today. It may be that a software platform is available that genuinely supports these ideas. If there is then let me know in the comments below. But it makes sense to understand and apply the ideas before embarking on a project to evaluate, approve and implement yet another software product.
I would like to give you an overview of how the Fast Excel Development Method works with modules and tools versus modular systems, and exactly what we mean by these terms. But first, let’s take a step back and ask: why?
ERP Failure and the Persistence of Spreadsheets
MRP (Material Requirements Planning) was one of the very first enterprise applications. We will leave the exact history to another article. For now let’s say that early computers were first used to explode Bills of Material at aircraft engine-makers GE and Rolls Royce in the 1950s. The process and system of MRP was created commercially by Joe Orlicky and IBM in 1964.
Enterprise systems grew around this original function: MRP came first, then MRPII (Manufacturing Resources Planning) and finally ERP (Enterprise Resource Planning). ERP seems to have stuck as the term for software that runs the operations of a business. The “P” for planning is more of a historical artifact than a true description of the software’s function.
From its beginnings as a planning routine, ERP has had functions upon functions added over the last 20 years. Now most ERP systems have very little to do with planning. They are much more like systems of transactional history and employee control than tools to help us make decisions about the future.
So, in spite of the name, it shouldn’t be a surprise that planning is often the area where ERP software systems perform the worst. On the whole, it seems like the accounting and transactional functions work better than the planning and analytics functions do. This is true for both operations and finance.
Why is this? Maybe because management pays more attention to closing the books and controlling finance than the nitty-gritty of operations and supply chain. Perhaps the transactional stuff is easier to standardise across companies that follow the same accounting standards, whereas operational strategy is more diverse from company to company.
ERP was often sold to management and the board of directors as a tool to achieve financial visibility and control. In order to accommodate the diverse needs of different businesses, many implementations involved customisation. And that customisation made things go wrong. Others attempted to put in the standard system without customisation. But if those standard processes didn’t match the needs of the business, then inevitably people don’t follow them, even though they are “supposed to”.
The result is that ERP/MRP may bring value to a business, but it is likely that value has less to do with planning and more to do with control.
You read this article on a website that provides ways to develop planning tools in Excel. The site has been going for more than 20 years. Currently, we get a few thousand visitors a month and have 17,419 subscribers (as of 15 October 2022). If ERP was good at planning then perhaps this would not be the case. Moreover, these subscribers are not small companies who cannot afford the “Best in Class” ERP systems. Very many are from multinational companies running $100m+ plants. The best estimate of median revenue of the organisations represented in the readership is in the range of tens to a hundred million US dollars or Euros per year.
Various surveys persistently rate the success of ERP projects at around 20-30%. Take a minute to think about that. If any other non-IT technology had a 70% failure rate then very quickly the capital spend would dry up. What company would commit capital spend to something with that track record? And so it would be for capital equipment, plant or almost any other capital project. There is something special about ERP (and other large enterprise applications) and it seems to play by different rules.
Now, you might say: the high-profile failures (Lidl, Worth and Co, Revlon and MillerCoors, to name but a few) draw undue attention and ignore the many successful case studies. But if you strip out vendor-written or sponsored cases (SAP declared Lidl a winner in its “Innovation” category for 2016; the €500m SAP project was cancelled in 2017!) there is very little in the way of independent evidence of consistent success, and many independent findings of failure.
Compare this with another technology that was born at around the same time: CNC (Computer Numerical Control) Machining. Fair to say that manufacturing has been transformed by the precision, speed, productivity and quality of CNC machines. Any machine shop commissioning a new CNC today would have a very different expectation of success rate. If you tried to sell an engineering manager a CNC that had a track record of 70% failure in qualification and commissioning, you would rightly get laughed out of the room.
So, why do IT systems fail? Why do so many companies go into these projects with unrealistic expectations? A big question and not one I can attempt to fully answer here. I am confident to say that complexity plays a large part.
ERP and Complexity
An ERP is a complex system. An ERP implementation is a complex project. A complex system is defined as:
“Complex systems are systems whose behavior is intrinsically difficult to model due to the dependencies, competitions, relationships, or other types of interactions between their parts or between a given system and its environment.” (Wikipedia, Complex System)
Two critical features of a complex system that enterprise systems share are non-locality (action at a distance) and non-proportionality (small changes can have very big effects). In his excellent book, Avoiding IT Disasters: Fallacies about enterprise systems and how you can rise above them, Lance Gutteridge discusses why our common sense, developed in the physical world, is ill-equipped to cope with the world of software.
Our “common-sense” view of systems is mechanical. The way we think about systems (and organisations…) has roots in the industrial revolution. Like many other ways of thinking (at least in the West), this model is mostly derived from the Victorian Era with some roots in ancient Greece. It is so common and widespread that we accept it simply as how the world works. But it does not describe reality, it is just a mental model. Many times it is the wrong mental model.
In a mechanical view, a system is always local and proportional. If there is a problem, then the root cause is always found near to where the symptom occurs. And small errors will only have small effects, so you can go forward with known issues and resolve them through incremental improvement.
In this context, the contrast between enterprise software and machines like CNC is insightful. With machining, if you are getting out-of-tolerance parts, then it is likely to be something to do with the drive or cutting mechanism. Zoom into that part of the machine and find a worn or misaligned tool. If it is a programming error, then you can trace it back to the CAD file and the area of the part that corresponds to that failing tolerance. Improvements in CAD software, say, or the drive mechanism or tool maintenance can be made independently of each other and will benefit every part that is made in the future with that improved system.
Not so with ERP. The flow of data between relational database tables, input screens, scanners and decision-makers is tangled and complicated. It takes dozens or hundreds of document pages to describe the function and logic. The only thing that fully describes the system is the source code itself. This is many layers removed from the user requirements and business process flows. A problem in one part can have multiple and disparate effects.
A mechanical system supports a top-down approach. You can start with requirements and specifications at the top level and cascade them all the way down to the individual elements. The whole is the sum of its parts. A complex system defies this top-down approach: the whole is more than the sum of its parts.
A complex system requires a way of thinking which challenges the mechanical model. It looks for bottom-up order, which starts with the elements and how they actually interact with each other, rather than how they “ought to” interact based on top-down objectives. Thinking about it this way, it is going to be difficult to take a top-down approach to implementing and managing a complex system. Perhaps this is why so many ERP projects fail.
If you have been part of a failed ERP implementation – and judging by my conversations over the past 10 years, many readers have – then it is not your fault. With enterprise software, we have unleashed a beast that humans cannot fully comprehend.
This is counter-intuitive. After all, most executives think they understand what an ERP system does. Most ERP systems have very rudimentary logic; the “thinking” an ERP system performs is very basic. The vast majority of manufacturers are a great distance away from machine learning and artificial intelligence, despite the hype around Industry 4.0. Yet, even with basic logic and mathematics, these projects still fail.
To be fair, the definition of failure is broad. It might mean the project blew the budget or overran the schedule. It doesn’t have to be a multi-million dollar write-off. More commonly, failure means that the system does not perform to specified requirements. It doesn’t do what it set out to do. As most companies don’t write off the expense and start again, the people who work in the business have to make do. And that usually means taking data out of the system and working it in Excel.
Excel and the Persistence of Spreadsheets
No surprise then, that Excel is still by far the most widely used software product in manufacturing, and probably business as a whole. Software vendors are keen to point out the perils of spreadsheets, not least because it is often embedded in the processes that they are trying to replace. And it is tempting to point out the various catastrophic errors that have arisen from spreadsheet mistakes.
But comparing ERP failure with spreadsheet errors is a category mistake. Every Excel workbook starts with a blank spreadsheet. Aside from purely home-grown solutions, every ERP system starts with a packaged product. Vendors will often enable and even encourage customisation (it drives revenue and shifts liability) but the starting point is usually a legacy system. Or more likely, a smush of legacy systems built up over years of acquisition and/or upgrade.
We have all seen some shockingly bad slideshows and read some diabolical documents. But we look to the author for responsibility; we don’t blame PowerPoint or Word. Excel is a tool… for developing tools. Sometimes many Excel files might work together as a system. And these tools and systems can be good, bad or ugly, depending on how they have been built.
Why develop Excel-based systems for enterprise applications? Because the alternative often doesn’t work and people spend too much time doing repetitive manual data work that can be easily automated. Do we need to throw out our ERP? Of course not. There are likely some things it does well (probably the transactional stuff) and we need to use that data not replicate it.
How do Modular Systems in Excel help With Complexity?
We develop systems using modules for two key reasons:
- To Control the Flow – divide a complex flow of information and decision-making into manageable chunks.
If we are planning demand, inventory, materials and capacity, we can encapsulate each of these functions in their own module. The user organisation can test and validate the inputs and outputs of each. This builds detailed understanding and trust. It avoids the “black-box” approach which takes one set of inputs and gives a final set of outputs, but little understanding of how they were derived.
- To Create Reusable Building-Blocks – modules that are defined once and can be deployed multiple times.
This means there is much more focus on improving a standard block of logic. That cycle of use and feedback is crucial to improvement. The faster the cycle turns, the faster the module will improve. When we have an improved version, we swap it with the one in deployment without needing to re-engineer the entire system.
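The two ideas above can be sketched in code. Here is a minimal illustration in Python with pandas (purely for the sake of the sketch; the real modules are Excel workbooks, and the module names, columns and figures are invented): two hypothetical modules, each a function with a defined input and output table, that can be tested and validated independently and then chained in a flow.

```python
import pandas as pd

def demand_module(sales_history: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical demand module: average recent sales per item as the forecast."""
    return (sales_history.groupby("item", as_index=False)["qty"]
            .mean()
            .rename(columns={"qty": "forecast_qty"}))

def inventory_module(forecast: pd.DataFrame, stock: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical inventory module: net the forecast against stock on hand."""
    merged = forecast.merge(stock, on="item", how="left").fillna({"on_hand": 0})
    merged["net_requirement"] = (merged["forecast_qty"] - merged["on_hand"]).clip(lower=0)
    return merged[["item", "net_requirement"]]

# Each module can be tested on its own inputs and outputs
# before being joined in a flow.
sales = pd.DataFrame({"item": ["A", "A", "B"], "qty": [10, 20, 5]})
stock = pd.DataFrame({"item": ["A"], "on_hand": [5]})
requirements = inventory_module(demand_module(sales), stock)
```

Because each function’s input and output is an explicit table, an improved version of either module can be swapped in without touching the other, which is the reusable building-block idea in miniature.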
Now, if we were to develop an entire integrated planning system, but just break the design into modules that connect together, there would be benefits. But this would still be a top-down approach. It might not help with those issues arising from complexity. If instead, each module is created with its own processes, able to make decisions and to respond to feedback, then it might gain some of the benefits from being an adaptive agent in a complex system rather than a set of gears and pulleys in a bigger machine.
This idea will take much more time to illustrate thoroughly, and I will flesh it out in later articles in this series. For now let’s consider an example with two scenarios:
Scenario A: Inventory is too high with too many shortages so we need to improve demand forecasting. We go out and buy new forecasting software and integrate it with our ERP. We now take a generated forecast and feed it straight into our planning routine so that it can reduce shortages and turn inventory faster. A small change in the forecasting algorithm can have some big effects in our performance. But we don’t really understand those changes (it is based on the vendor’s proprietary algorithm) and we have to wait and see if the changes are positive or negative.
Scenario B: Inventory is too high with too many shortages so we need to improve demand forecasting. We equip our demand planners with a module that creates forecasts and measures their accuracy. The planners use this measurement feedback to tune the algorithm. (Most forecasting software uses exponential smoothing and this formula is widely available and understood, so tweaking it doesn’t require proprietary knowledge). The demand planners use this information about forecasting performance to improve the linkage with sales and operations. The outputs help them prioritise their time and focus on the critical few parts that are difficult to forecast and need attention. This makes them more effective and directs the improvements into the place where there is most leverage. The effects on inventory availability and costs are then targeted and more predictable.
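Because the exponential smoothing formula is widely available and understood, the tuning loop in Scenario B is easy to sketch. Here is a minimal Python illustration (the demand figures and smoothing constants are made up for the example): compute simple exponential smoothing forecasts, measure accuracy with mean absolute deviation, and let the planner compare smoothing constants to find the best fit.

```python
def exponential_smoothing(actuals, alpha):
    """Simple exponential smoothing: each period's forecast blends the last
    actual with the previous forecast. alpha near 1 reacts fast to change;
    alpha near 0 smooths hard."""
    forecast = actuals[0]  # seed the first forecast with the first actual
    forecasts = []
    for actual in actuals:
        forecasts.append(forecast)
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecasts

def mean_absolute_deviation(actuals, forecasts):
    """Accuracy measure the planner uses as feedback to tune alpha."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(actuals)

demand = [100, 120, 90, 110, 130]
for alpha in (0.2, 0.5, 0.8):
    mad = mean_absolute_deviation(demand, exponential_smoothing(demand, alpha))
    print(f"alpha={alpha}: MAD={mad:.1f}")
```

The point is not the formula itself but the closed loop: the planner sees the accuracy measure, adjusts the parameter, and understands exactly why the forecast changed.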
Scenario A depends on a full top-down approach to getting results. Management select the forecasting software and foist it onto the planners. The success is dependent on something hidden in the source code of the product that no-one in the company understands. Scenario B uses a bottom-up approach. It empowers the planners to identify the advantages for themselves and then use those benefits to achieve better overall performance.
Scenario A requires one big implementation, but Scenario B can be done in a step-wise fashion. Scenario A requires a substantial capital budget up-front and requires a leap of faith. B has a more organic progression and uses success to build the business case for more.
There are strong advantages to creating semi-autonomous blocks of planning and feedback. This is the beginning of Systems Thinking with a focus on how the system is made up from interacting elements. It can lead to a much more responsive, adaptive organisation. There is much more to say on this, but for now we will move on to the practicalities of developing in Excel as in Scenario B.
How to Develop Good Excel Systems
On to developing good systems in Excel. Before I give you downloads, screen-share videos and tutorials, let’s think about some concepts. There are about 1800 words left in this article and it will take you about 7-8 minutes to finish reading it. Please take the time before jumping into spreadsheets and I promise it will be worthwhile.
How do we avoid both the problems of complexity and the typical spreadsheet mistakes? Here are five simple rules to follow:
- Clearly define the process and how it creates value.
Make sure you know what you want to achieve and that it matches what the user wants and needs. You should be able to explain it in one picture or a few simple sentences.
- Take a step-wise or phased approach to the scope.
Start with the simple “Must-Haves” and then add the “Nice-To-Haves” later. Better to deploy a small number of effective functions than one big system. Use a closed-loop cycle (Plan-Do-Check-Act or similar) and verify each step or phase is working before moving onwards.
- Use a structured approach.
I offer up the Fast Excel Development Method as one approach to use and the template will help. “Separate out Data, Calculations and Reporting” – do this one thing and you are halfway there.
- Manage it in a flow.
Create the tool or system as a number of linear steps that link together in a flow. If there are a number of functional areas (material, capacity, inventory, demand) then put them each in a module and join the modules up in a flow. Do things one at a time; test and verify each step.
- Make it visible and visual.
Keep the logic visible, not hidden away and buried in code. Users may not know the detailed mechanics of the tool or system, but they should know the inputs, the main logic steps and how the outputs are calculated. Making the flow visible and visual is one of the key benefits of spreadsheets over other forms of systems development. Avoid the “black box” approach where no-one but the developer knows how it works.
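To make the structured-approach rule concrete, here is a minimal sketch in Python (the function names and reorder logic are invented for illustration) of what “Separate out Data, Calculations and Reporting” looks like as three distinct layers, each of which can be changed without disturbing the others:

```python
def load_data():
    """Data layer: raw inputs only, no logic.
    In a spreadsheet this would be the green input sheets."""
    return [{"item": "A", "qty": 10}, {"item": "B", "qty": 3}]

def calculate(rows, reorder_point=5):
    """Calculation layer: pure logic, no formatting, no I/O.
    In a spreadsheet this would be the grey calculation sheets."""
    return [dict(r, reorder=r["qty"] < reorder_point) for r in rows]

def report(results):
    """Reporting layer: presentation only, no logic.
    In a spreadsheet this would be the yellow report sheets."""
    return "\n".join(
        f"{r['item']}: {'REORDER' if r['reorder'] else 'ok'}" for r in results
    )

print(report(calculate(load_data())))
```

Swap the data source, tweak the reorder rule, or restyle the report, and in each case only one layer needs to change.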
I could write an article on each of these, and perhaps I will. In the remainder of this article, I would like to show how the Fast Excel Method and Template can scale up. From a simple tool to an integrated planning system, the Fast Excel Methodology and Template can scale in a step-wise manner. All without needing to write any VBA code.
A Few Words On Terminology
Many of these terms are used in other fields. I find that sometimes people come to the site expecting something from a certain word, for example “template”. When the download does not match this expectation, it can cause confusion or disappointment. This simple glossary might help align on certain terms.
- Fast Excel Development Method
A set of principles and techniques for developing tools and systems using Excel spreadsheets. Here is an overview.
- Fast Excel
Abbreviation of Fast Excel Development Method above.
- Development Template
A free Excel workbook prebuilt with template sheets and VBA macros for developing tools and systems using the Fast Excel Development Method.
- Planning Template
An Excel workbook that has some simple planning logic in it. We tend not to offer planning templates because they usually violate the principle of “Separate Data, Calculations and Reporting”. Read here for an example of a planning template and more explanation of the difference. Instead we offer free planning tools and systems that are built using the Development Template.
- (Planning) Tool
This is usually a single Excel workbook that performs a planning function. Examples might be Capacity Planning, Material Planning or Material-Constrained Scheduling. It is more powerful than a template and is more equivalent to a software tool. A tool will usually have data table inputs rather than typing values into a spreadsheet.
- (Planning) System
A set of Excel workbook modules that each perform a function. The system is often multi-functional and can scale up to full integrated planning (demand, inventory, capacity, MRP, purchasing etc.). A system has a number of advantages over a tool, which will be explained below.
- Module
A single Excel workbook that performs a single planning function. Multiple modules work together as a system. The output of a module is usually a table of data that serves as an input to another module. A tool or a set of tools can be made to work together as modules in a system.
Structure and Flow in a Fast Excel Development
There are two key ideas in the rules above for effective Excel development: Structure and Flow. The Development Template provides both and developments that use it can be understood by anyone who understands the Fast Excel Development Method.
The information flows from the input tables, through the calculations and into the output table or a report. You can see the flow in three key ways:
- From cell to cell using formulas in the same worksheet. This has been a standard of the Fast Excel Development Method since the beginning. All calculations in a single sheet are placed in formulas in row 8. The formulas work left to right, starting in column A in the case of a Table sheet, or in the most immediate right-hand column in the case of Query, Stack or Pivot sheets.
- From worksheet to worksheet in the same workbook. The direction of flow is from left to right. Hence, the first sheets are green input sheets; they pass data to grey calculation sheets, which combine with other input sheets and then a final calculation sheet before the yellow report sheets at the end. The links between worksheets are the usual cell references, look-ups from named ranges and pivot tables.
- From Module to Module in the same system. Each of the modules will have sheets that flow as above. Then the output of one module becomes the input of the next. This flow is defined in a master module. The link between modules is not cell references and linked workbooks, but rather text files. Here are two different examples.
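The module-to-module hand-off via text files can be sketched as follows. This is a minimal Python illustration (the file name and table contents are hypothetical; in the Template the writing and reading is done by the workbooks themselves): one module publishes its output table as a plain text CSV file, and the next module reads that file as its input table.

```python
import csv
import tempfile
from pathlib import Path

def write_module_output(path, header, rows):
    """A module publishes its output table as a plain text (CSV) file."""
    with path.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)

def read_module_input(path):
    """The next module in the flow reads that text file as its input table."""
    with path.open(newline="") as f:
        return list(csv.DictReader(f))

# Hypothetical hand-off: the demand module writes, the next module reads.
handoff = Path(tempfile.gettempdir()) / "demand_output.txt"
write_module_output(handoff, ["item", "forecast_qty"], [["A", 15], ["B", 5]])
forecast_rows = read_module_input(handoff)
```

Passing text files rather than linking workbooks keeps each module self-contained: the hand-off point is visible on disk, can be inspected and validated, and neither workbook breaks if the other is renamed, moved or upgraded.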
With three defined ways for information to flow, we have the beginnings of a hierarchy. This is listed above from the bottom (most detail) upwards.
How do we know when to add another formula versus start another sheet? How do we know whether to have one big workbook with lots of sheets or break it up into modules? Here are some guidelines for each:
- Formulas. Keep going with formulas for calculations on the same table of data. If you have things to be done on sales orders at the level of sales order lines, then do all of that in the same sales orders sheet. When there is a change in table structure, usually when performing a table operation, you need a new worksheet.
- Worksheets. So, a new worksheet for each table. The older style of Fast Excel Development would just keep adding worksheets. Now, we move to a new module when there is a change in function, or when we would like to contain one block of function and control its inputs and outputs.
- Module Workbooks. You can add new modules with a change in function (demand planning, MRP, capacity, etc.). This way you can scale up without needing to change the things that are already working. There are a lot of benefits in keeping each module in a controlled size so you don’t end up with 30mb files with 10’s or 100’s of sheets.
The structure means that you can manage things top-down (what high-level function in the system is being deployed) and bottom-up (what is the logic and data required to achieve the function). You drill down to formulas and zoom out to modules. Developers need to know the fine details in each formula; admins will be happy understanding the worksheet level; and decision-makers the modules and how they tie to the business.
How Structure and Flow Help Manage Complexity
Planning systems like this one are very much more specialised and focused than an ERP system. Planning is about the future and a focused planning system will not try and manage historical transactions. It would not be possible to map out the information flow for a typical ERP system without many documents and diagrams that show all the paths.
So, we can work alongside the transactional system. Our Excel-based planning system can use all that transactional data. But by maintaining focus on the future and a narrow set of decisions to be made in planning (what, how much and when do we buy, make and send), the system can keep a clear structure and flow.
The structure enables us to define exactly what inputs and outputs each step can take. These inputs can be verified, validated and controlled. This means the overall project can be divided into small, self-contained units of delivery. Anyone who has seen the power of Process Mapping (and Value-Stream Mapping, SIPOC, Agile, DevOps et al) will appreciate having a clearly defined flow that shows how value is created.
This means that systems developed with this method can be done in days and weeks rather than months, quarters or even years. And the project success rate is in the 90%+ range. Not because this method works magic, but because the expectations are set against a small number of critical deliverables. And the visible structure and flow means that we humans with our physical worldview can understand and control it better.
Takeaways: How Do You Get Started?
We will shortly be releasing version 4.2 of our Fast Excel Development Template. Soon, this will link to an article on the demonstration system that my colleague Gabriele Tettamanzi will write. The webinar on 1 November 2022 will show you how to use it.
Like many things, it is best understood with an example. We will show you how to build a simple system with modules. See you in the webinar.