Practitioners in some fields, such as software development, regularly use the similarity heuristic. A software developer applies it when debugging: a software bug exhibits a set of symptoms indicating the existence of a problem, and, in general, similar symptoms are caused by similar types of programming errors. By comparing these symptoms with those of previously corrected software flaws, a developer can identify the most probable cause and take an effective course of action. Over time, a developer’s accumulated experience makes this use of the similarity heuristic highly effective, quickly selecting the debugging approach most likely to reveal the problem’s source.
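As a minimal sketch of the idea (the case library, symptom sets, and scoring rule below are all hypothetical, not taken from any particular tool), the similarity heuristic in debugging amounts to matching a new bug's symptoms against a library of previously resolved cases and reusing the fix from the closest match:

```python
# Illustrative sketch of the similarity heuristic applied to debugging.
# The case library, symptom names, and scoring rule are hypothetical.

def jaccard_similarity(a, b):
    """Overlap between two symptom sets (0.0 = disjoint, 1.0 = identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def most_similar_case(new_symptoms, case_library):
    """Return the past case whose symptoms best match the new bug."""
    return max(case_library,
               key=lambda case: jaccard_similarity(new_symptoms, case["symptoms"]))

case_library = [
    {"symptoms": {"crash", "null pointer"}, "fix": "add null check"},
    {"symptoms": {"slow response", "high memory"}, "fix": "fix memory leak"},
]

best = most_similar_case({"crash", "null pointer", "startup"}, case_library)
print(best["fix"])  # → add null check
```

Note that the sketch only ever consults one static stream (the historic case library), which is exactly the limitation the following sections take issue with.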
— Definition from Wikipedia, the free encyclopedia
In the somewhat hermetically sealed world of programming, the similarity heuristic model described above makes a great deal of sense, because one is usually dealing with a single stream of what are commonly referred to as static elements or attributes. The operational attributes of the real world, however, are vastly different: they are dynamic in nature and consist of multiple transactional streams. When you take this same historic-baseline approach and confine your understanding of the problem to a single transactional stream (in this case, historic attributes), you end up with a myopic view of the process. This is characteristic of the equation-based models employed by the majority of ERP software vendors, and it is one of the main reasons the resulting applications have been largely ineffective.
The key differentiator between the methodology outlined above and those now emerging is the recognition that, while past experience can be one of many important indicators of future performance, a single-stream point of reference quickly becomes irrelevant when it lacks the adaptive capacity to identify, capture, and apply real-world variables and thereby produce a sustainable reference model.
Like the similarity heuristic, other approaches are limited by the same capture-once, use-many, sequential thought process. The iterative or successive approximation methodology, for example, estimates the value of an unknown quantity by repeatedly comparing it to a sequence of known quantities (note the word sequence). Once again, and at least on the surface, this approach seems reasonable. But as with the similarity heuristic, problems arise when the known quantities, or declarative values, are not dynamically monitored and updated to reflect real-world circumstances, or are not broad enough to include all of the attribute streams that influence a collective outcome. The reliability of a declarative value for a “known” quantity therefore diminishes over time, as it becomes either too narrow in scope or entirely ineffective once its assumptive elements or sequence are outdated. My research has shown that this degeneration of reliability accelerates further when the known elements are not properly understood because of a lack of collaboration at the formulation stage.
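For readers unfamiliar with the mechanics, here is a minimal sketch of successive approximation (the target quantity and tolerance are illustrative): an unknown value is narrowed down by repeatedly comparing candidate guesses against it, each comparison halving the interval of known bounds.

```python
# Minimal sketch of successive approximation: estimate an unknown quantity
# by repeatedly comparing candidate guesses against it. The target (sqrt(2))
# and the tolerance are illustrative choices.

def successive_approximation(is_too_high, low, high, tolerance=1e-6):
    """Narrow [low, high] around the unknown value via repeated comparison."""
    while high - low > tolerance:
        mid = (low + high) / 2
        if is_too_high(mid):
            high = mid  # mid overshoots: the unknown lies below it
        else:
            low = mid   # mid undershoots: the unknown lies above it
    return (low + high) / 2

# Estimate sqrt(2): a guess is "too high" when its square exceeds 2.
estimate = successive_approximation(lambda x: x * x > 2, low=0.0, high=2.0)
print(round(estimate, 4))  # → 1.4142
```

The method works precisely because the comparison test and the bounds are assumed fixed; if the “known” quantities drift while the approximation runs, the converged answer describes a world that no longer exists, which is the failure mode described above.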
Applying the above principles to supply chain practice shows that the iterative model reflects a sequential architecture or thought process: it relies predominantly on attributes previously defined in the static “chain,” such as purchasing (including indigenous sub-attributes like price, historic quality, and delivery performance), tracking and order fulfillment, and financial reconciliation (internal, multiple streams). While the attribute tags themselves can remain constant, their assigned weighted values (that is, their levels of importance), including those of the corresponding sub-attributes, can and do change dynamically at the transactional level. The weighted importance given to each attribute and sub-attribute, combined with the impact of external attribute streams (which are often subject to unanticipated or previously unexpected changes), means that the collective effect will likely and dramatically alter the “known values” upon which the model was originally built. Neither the attributes upon which the comparative approximations are based nor the external variables are static, which makes the comparative model increasingly unreliable as time progresses. The traditional application of the iterative methodology, in which either a declarative or imperative programming method is used to define a single attribute stream of practice, is therefore largely ineffective for quantifying and managing the supply chain process.
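The re-weighting effect can be sketched concretely. In the toy example below (the attribute names, weights, and scores are hypothetical), the attribute scores never change, yet shifting the transaction-level weights alone changes the composite “known value” on which a static model would have been built:

```python
# Illustrative sketch: a supplier score built from weighted attributes.
# Attribute names, weights, and scores are hypothetical; the point is that
# re-weighting at the transaction level changes the "known value" itself.

def composite_score(attributes, weights):
    """Weighted average of attribute scores (weights need not sum to 1)."""
    total_weight = sum(weights.values())
    return sum(attributes[name] * w for name, w in weights.items()) / total_weight

attributes = {"price": 0.9, "quality": 0.6, "delivery": 0.7}

# Baseline weighting: price dominates the decision.
baseline = composite_score(attributes, {"price": 3, "quality": 1, "delivery": 1})

# A supply disruption shifts importance to delivery at the next transaction.
disrupted = composite_score(attributes, {"price": 1, "quality": 1, "delivery": 3})

# Same static attributes, different composite value under the new weights.
print(baseline, disrupted)  # → 0.8 0.72
```

A model calibrated against the baseline value would silently misrank the same supplier once the weights move, without any change in the underlying data it was monitoring.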
The use of equation-based methodologies, which attempt to establish a reference baseline around something that is not static but in a state of constant evolution, has contributed to the 85% rate of initiative failure.
In fact, by attempting to establish a single-stream, non-adaptive reference baseline, organizations are forced to institute a change management strategy in an effort to align the operational realities of their business with a model that does not reflect how they or their transactional partners actually function in the real world.
We Need to Build Camcorders, Not Improve Canvases
In a paper I wrote in early 2005, I argued that software developers needed to build camcorders rather than improve canvases. It was an interesting paper to write on many levels, as the concepts of agent-based versus equation-based modeling were not widely known in the general market.
There was even less interest in their significant differences at that time, as the rate of initiative failures had only just begun to enter public consciousness.
And while the analogy may be somewhat “dated” so to speak, its message has taken on even greater meaning with the emergence of Web 2.0 (through to Web 4.0) as well as new business models such as Software as a Service (SaaS) and even Free and Open Source Software (FOSS).
Here is the excerpt from that paper. (Please note that I have removed the name of the original vendor and have replaced it with the ABC Company moniker.)
Traditional solutions such as those developed by ABC Company are built around what is referred to as an equation-based model. Equation-based models utilize interactive assumptions and therefore focus primarily on the somewhat static interaction between independent entities. (The key words here are static interactions and assumptions.)
Recall for a moment the example of the artist attempting to paint a portrait of a subject who is in perpetual motion. While the artist may capture for a brief moment a portion of the image on canvas, the reality is that the effort will always be a work in progress. (It is important to note once again, that most e-procurement initiatives take 1, 2 or more years to implement, only to meet customer expectations a mere 25% of the time.)
While you can improve the quality of the paint the artist is using, add more advanced lighting, and even improve brush stroke speed and techniques, the fact remains that the end result is unlikely to change in any significant way. The only way to achieve the desired result is to create an environment where the subject is stationary at all times. To be specific, you have to restrict the subject from moving, from doing what is natural. (Reference change management.)
Now picture, if you will, the same “moving” subject being captured on film with a camcorder. Regardless of the time of day or changing locations, the subject is always captured in real time. In essence, you are now able to adapt and even interact with the subject’s environment. Unlike a painting, when you finish one shot, you can quickly and easily move on to the next without losing what you have already captured. When required, you can play back the previous images within seconds.
In this example, you are not asking the subject to change the way he or she operates. Nor are you trying to restrict his or her natural movements. This is at the heart of agent-based modeling. It is the reason that traditional equation-based applications such as the one offered by ABC Company will never achieve the maximum results in the shortest period of time.
The fact remains that it is imperative to engage all parties to a transaction simultaneously, on a real-time basis, with the ability to meet an evolving set of demands as required. Intelligent, strategic sourcing and procurement, through tracking and fulfillment, and finally financial reconciliation with a “true cost” auditing capability, is the earmark of a solution that truly leverages the power of the Internet. To develop this type of solution effectively, one must abandon equation-based modeling, which is driven by the aforementioned interactive assumptions, and replace it with agent-based modeling, which is driven by the ability to understand the unique operating attributes of all trading partners on a continuous, real-time basis. This leads to a synchronization of independent capabilities, which in turn creates an environment of adaptive responsiveness, producing immediate results in supply chain efficiency.
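The agent-based idea in the excerpt can be sketched in a few lines. In this hypothetical example (the agent class, event shape, and lead times are all invented for illustration), each trading partner is an agent holding its own state and reacting to events as they occur; the synchronized answer adapts without refitting any global equation:

```python
# Minimal agent-based sketch (hypothetical agents and events): each trading
# partner is an agent with its own state that reacts to real-time events,
# rather than being summarized by a fixed equation fitted in advance.

class PartnerAgent:
    def __init__(self, name, lead_time_days):
        self.name = name
        self.lead_time_days = lead_time_days

    def observe(self, event):
        """Adapt local state to a real-time event (e.g. a shipping delay)."""
        if event.get("type") == "delay" and event.get("partner") == self.name:
            self.lead_time_days += event["extra_days"]

def earliest_fulfillment(agents):
    """Synchronize: the best currently achievable lead time across partners."""
    return min(a.lead_time_days for a in agents)

agents = [PartnerAgent("supplier_a", 5), PartnerAgent("supplier_b", 7)]
print(earliest_fulfillment(agents))  # → 5

# A disruption arrives; only the affected agent updates its own state,
# and the synchronized answer adapts immediately.
event = {"type": "delay", "partner": "supplier_a", "extra_days": 4}
for a in agents:
    a.observe(event)
print(earliest_fulfillment(agents))  # → 7
```

The contrast with the equation-based approach is that nothing here assumes a static interaction: the “model” is simply the current state of the agents, continuously updated by the events they observe.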
This paper, which interestingly enough experiences periods of renaissance thanks to a growing international readership, argues that the problem-solving methodologies commonly employed by most organizations are represented by the artist attempting to paint a still-life subject (attribute).
In reality, however, the “subjects” or attributes in a supply (chain) practice are in a state of constant motion and change. This means that the artist, no matter how skilled or advanced, is always limited by the tools (or methodologies) he or she is using.
To effectively capture the dynamic elements of this kind of environment, a different methodology such as strand commonality (the camcorder) must be employed to ensure that an accurate picture is captured on an ongoing basis, thereby bridging or synchronizing the chasms between multiple transactional streams.
While this example is an illustrative, 10,000-foot introduction to the differences between the traditional methodologies still in use and those now emerging, it will hopefully create a starting point of reference that widens the lens through which supply chain practice is understood and ultimately improved.