I was recently asked to provide a contextual point of reference that would assist the uninitiated public in understanding the reasons why the GoC’s Shared Services program has become such a contentious issue.
The Answer was Baseball!
Rather than launching into a complicated dissertation on the differences between an agent-based model and an equation-based model, in which the prescribed processes reflect and therefore adapt to stakeholder characteristics and objectives, I tried to think of what I would say to one of my sports-fanatic friends. And it hit me: the answer is baseball!
Think of it this way: what would happen if the Commissioner of Baseball delivered an edict that all players had to use the same bat? Same weight, same length, same everything. Would it make baseball a better game?
What about hockey? Would Wayne Gretzky have been as successful as he was if, during his playing days, the hockey hierarchy had made it mandatory that all players use the same sticks, the same skates and so on? You see where I am going here.
The fact is that each player is unique, and while there are established “generalized” standards in terms of equipment specifications, by and large there is a wide and diverse range of products in use. And each piece of equipment is carefully selected by the players themselves to reflect their individual skills and suit their unique playing style.
So what would happen if the Commissioner of Baseball extended his reach beyond centrally establishing a set of guiding principles, balancing player proclivities regarding equipment against ensuring a consistently fair game, and instead dictated a single or “shared” uniformity around one particular type of product?
And while this is an overly simplified analogy of the Shared Services concept, it is nonetheless a reasonable reflection of the GoC’s desire to arbitrarily enforce a single platform across the entire government enterprise regardless of individual department interests.
Tunnel Vision
Like individual athletes, various government departments and agencies have selected different “equipment” through which they can best meet their unique organizational requirements from an operational standpoint.
Transport Canada, for example, has made a heavy commitment to Oracle, while Heritage Canada is firmly established within the SAP platform. And while the objective is to deliver value at both the department level and on a collective government-wide basis, each is using the tools they feel most comfortable with in terms of “playing at their peak levels of performance.”
Rather than recognizing the individual skill sets (re unique operating attributes) of individual departments or agencies, the GoC hierarchy has made the decision that they alone know what is best in terms of delivering superior results.
In essence, the “Commissioners” of the Federal Government have decided to make all players use the same equipment rather than establishing a collaborative guideline in which important differences are taken into consideration.
Worse yet, and as demonstrated by past player strikes and lockouts in the professional leagues, external stakeholders such as the fans (or in the case of the GoC, suppliers) are left out in the cold.
Adaptive Capacity leads to True Stability and Consistency of Outcome!
Unfortunately the GoC Shared Services strategy employs the somewhat simplistic and unimaginative approach of imposing centralized control at an operational level. This is based on the belief that centralization of functional capacity will provide a stable platform through which best value results will be achieved. Think of the “one bat” for all players scenario.
History has proven that this conclusion could not be further from the truth. In reality, the adaptive capacity of a program reflects a collaborative process in which key stakeholders work toward identifying and achieving a collective, best value outcome. In practical terms, the unique operating strengths of different stakeholders are actually leveraged to achieve both stability of process and consistency of outcome.
Ignoring the steroids fiasco for a moment, while Barry Bonds used a 32 oz bat to hit 73 home runs in 2001, Mark McGwire preferred a 35 oz bat to hit 70 dingers in 1998.
And as a means of illustrating the importance of being able to adapt to changing conditions, Babe Ruth started out using a 54 oz bat early in his career, only to ultimately change to a 27 oz bat in 1927 when he hit his 60 home runs.
Can you imagine telling the Babe or Bonds or for that matter McGwire that they could only use one type of bat? Would it have made them better players? Even more important would it have made the game of baseball better?
Thinking beyond the Known
So in the absence of tangible evidence that a broadly applied Shared Services strategy, in which an unwilling majority of key stakeholders are being dragged along kicking and screaming, will produce superior and sustainable results, why do it?
At the beginning of all my conferences I state that the one objective that I am hoping to achieve is to inspire the people in attendance to think outside of the framework of that with which they are most familiar and comfortable.
There is nothing radical or innovative about a Shared Services strategy. It is not representative of a new lexicon in supply chain terminology or principles, nor is it particularly creative when instituted as an enterprise-wide program.
However, for most traditional-thinking executives, especially from within the IT ranks, it is a comfort zone of familiarity from the days when technology was seen as the future of business (anyone remember the paperless office?), and computers, which usually occupied an entire floor or two, were managed by a rigid set of standards and by individuals few understood.
In a dynamic world where synchronization has replaced sequential process thinking, this tenet is not only out of step, it is out of time.
Or to end where I began, let’s not limit the batters to a standard that stifles their ability to do their best for both themselves and the game as a whole.
To inquire about upcoming conference dates or to schedule seminars for your organization contact Jennifer Cameron at thesenses@rogers.com, or (phone 613-231-7116).
Dan McCabe
March 13, 2008
If baseball bats cost $1M each, there would be fewer bats in baseball.
When it comes to looking at IT solutions, the price of many different solutions becomes prohibitive.
Chet Frame
March 13, 2008
Great post, Jon! Isn’t the purpose of the bat, regardless of its weight or composition, to help the team win? Baseball has implemented some rules as to the sizes and weights of bats and how far up the barrel one can put the pine tar, and they have some fairly strict guidelines about the generic bat.
In systems speak, you can have different solution systems, but they have to meet some specific criteria and they have to be able to communicate with other systems. The system may have an Oracle label or an SAP label (like a Louisville Slugger with the name of your favorite player). Isn’t it more important that the system be used to help the GoC to move ahead more rapidly and communicate what is being done more efficiently?
procureinsights
March 13, 2008
Thank you for your comment, Dan. In that one single statement you have demonstrated why traditional technology-centric initiatives (such as the one the GoC is pursuing) do not work, especially from a cost standpoint.
This is because the $1 million to which you have referred is not based on nor is it a reflection of the quality of the end product that is being delivered (a $1 million bat just doesn’t make sense no matter what it is made of). In reality, the $1 million price tag is actually based on supporting the business model (re infrastructures) of the “bat” manufacturer.
So what is the justification for spending the $1 million? Certainly the success of the Commonwealth of Virginia’s eVA program clearly demonstrates that you do not need to spend $1 million to obtain a superior product or to achieve superior results.
So what you have really uncovered, Dan, is a problem that plagues all levels of the Government. An individual department’s expenditure of $1 million for their bat is no more justified than the GoC hierarchy’s expenditure of many times that amount under the auspices of saving the taxpayer money through a platform (or bat, if we want to stay with the baseball analogy) rationalization program.
Or, using the most simplistic illustration possible, instead of having 20 different departments pay $1 million each for their own “bat” (a $20 million expenditure collectively), the GoC hierarchy is saying that everyone can use the same “bat” for only $10 million. (Note: while this example illustrates the approach from a conceptual perspective, given that the GoC hierarchy has yet to disclose the “actual” costs associated with the Shared Services program – including the costs of implementing the necessary conversion and compliance mechanisms – we are to a degree shadow boxing. However, you get my point.)
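To keep the arithmetic of the analogy in plain view (all figures here are the illustrative numbers from the paragraph above, not actual GoC costs), the comparison can be sketched as:

```python
# Hypothetical figures from the "bat" analogy -- not actual GoC expenditures.
DEPARTMENTS = 20
COST_PER_DEPARTMENT = 1_000_000    # each department buys its own "bat"
SHARED_PLATFORM_COST = 10_000_000  # the single shared "bat" for everyone

individual_total = DEPARTMENTS * COST_PER_DEPARTMENT
claimed_savings = individual_total - SHARED_PLATFORM_COST

print(f"Individual total: ${individual_total:,}")    # $20,000,000
print(f"Shared platform:  ${SHARED_PLATFORM_COST:,}")
print(f"Claimed savings:  ${claimed_savings:,}")     # $10,000,000
```

Of course, as the note above points out, the claimed savings figure is only as good as the undisclosed conversion and compliance costs that would have to be subtracted from it.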
That said, I was recently asked a question about the “future of ERP/SCM software.” And in the spirit of the comment I made about inspiring those who attend my conferences to “think outside of the framework with which they are most familiar and most comfortable,” I feel it is appropriate to share my response to this question.
My Response:
There is both a technical as well as business answer to your question David (the name of the individual who had asked the question).
From a business perspective, Software as a Service (SaaS) is an emerging model that is being offered by many of the “next generation” vendors such as COUPA. However, it is not exclusive to new organizations or even to a specific “emerging” technology.
Based on my series of interviews with an Ariba senior executive, I wrote an article titled The Ariba Interviews: Re-engineering the Future of On-Demand (I have provided a URL link to the article below).
SaaS, which was originally referred to as an On-Demand model, is not new to the industry. However, its adoption and subsequent introduction to the market by incumbent vendors such as Ariba and SAP have been somewhat slow because the model does not support their current infrastructures. Specifically, the SaaS model substantially reduces the revenue streams associated with traditional licensing models and adjunct maintenance contracts, thereby making it virtually impossible for these companies to generate the levels of revenue necessary to maintain and/or sustain profitability.
In fact, it appears that Ariba’s more aggressive move of late to an On-Demand pricing structure was predicated more on their continuing financial struggles (between 2001 and 2004 Ariba lost $3 billion on $1 billion in sales) than on a shift in technological capabilities or visionary ideals.
Once again the Ariba Interviews article should prove to be an interesting read.
From the technical perspective, the utilization of an agent-based model versus a traditional equation-based model relative to application development represents a significant technological breakthrough. This in turn has led to the emergence of what Mohanbir Sawhney and Jeff Zabin referred to in their 2001 book, The Seven Steps to Nirvana: Strategic Insights into eBusiness Transformation, as the meta-enterprise application.
I have included links to two additional articles that you should find rather interesting as well.
To sum it up best, the dramatic transformations that are taking place from both a business and technological perspective will reshape supply chain practices for the next 25 years and beyond.
Dan, there is no longer any justification for the multi-million-dollar initiatives spanning a period of many years. With the emergence of new methodologies, technologies and pricing models, instead of $1 million, each “bat” now costs $359.99. And as I am sure you will agree, this is a pricing model within which every player (re department) should have the flexibility to choose, within acceptable and collectively agreed-upon guidelines.
Anne
March 13, 2008
great analogy. baseball must put some parameters around bats: length, no lighter than and no heavier than etc, but within that range the choice is personal? so in the shared services situation, given the choices are already made (I do think some limitations could be put in this area, but I also think the market itself is rationalizing so the choices are fewer), should the focus move to making them more interoperable? or should they simply make sure the outputs, in terms of what is reported and how and to whom, are more standard, i.e. financial data and supporting information? I think determining a standard series of reports that would help manage, attribute accountability and governance would then help determine the inputs, and eventually maybe more complete and more timely information to make decisions with. your thoughts? wouldn’t it be nice to know what it actually costs to deliver a program…. to actually have the data on all aspects of operations and delivery including procurement?
procureinsights
March 13, 2008
Thank you for your comment, Chet.
Your last statement speaks volumes.
I recently read an excerpt from one of my favorite books by Dale Neef titled e-Procurement from Strategy to Implementation. What was interesting (and applicable to your comment) is that Neef indicated that at one time business people considered EDI and e-procurement to be synonymous with one another. He pointed out however that the challenges associated with standardization combined with being prohibitively expensive (especially in the case of SME suppliers) limited its overall effectiveness.
Unfortunately, the mindset that had governed that generation of business thought process (including IT) is far too prevalent in today’s “synchronized” world. As a result, the current strategy eschews the logic of your suggestion that it should be “more important that the system be used to help the GoC to move ahead more rapidly and communicate what is being done more efficiently.”
The real solution (and resolution) is in the approach. In essence the GoC or for that matter the majority of organizations in both the private and public sector, have to look outside of the framework with which they are most familiar.
Just yesterday a question regarding the utilization of the SCOR methodology came across my desk.
I believe the answer I provided will illustrate what I mean by thinking “outside of the framework” of current thinking.
Here is what I said:
While I am in the process of completing an evaluation of the SCOR methodology, my initial findings have raised a number of red flags relative to the repeated references to supply “chains” and what some of the research material has cited as the “5 steps (to) repeat over and over again between suppliers, the company, and customers.” This concern with the SCOR methodology includes the assertion that “each step is a link in the supply chain that is critical in getting a product successfully along each level.”
The fact remains that close to 85% of all initiatives fail to achieve the expected results, and what the SCOR methodology recommends is a “sequential” approach to both testing the veracity of, and structuring the guidelines for, an organizational supply “practice.”
In essence, it is reflective of the traditional equation-based models in which an attempt is made to establish a “set” chain of characteristics in which stakeholder adoption or compliance is a key component.
With the emergence of the Metaprise, the term supply chain has been dropped and replaced with the more appropriate “supply practice” moniker. In this methodology, an agent-based model is utilized to first understand the unique operating attributes of diverse stakeholders, and then seeks to develop an “adaptive” model whereby the prescribed “process” reflects and therefore adapts to stakeholder characteristics and objectives.
For example, when Boeing refers to a complex adaptive network, what they are really discussing is using an agent-based model whereby the unique operating attributes of key stakeholders are first understood individually and then (through a collaborative effort) are linked collectively through establishing what they refer to as “flow paths.” This latter exercise is tied into identifying the common points of connectivity between seemingly disparate stakeholders (and stakeholder objectives). In essence, it reflects a theory of process I discovered and developed starting in 1998 and what I have come to call “strand commonality.”
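As a purely illustrative toy sketch of the distinction being drawn here (the stakeholder names and attributes are hypothetical, and this is in no way Boeing’s or anyone’s actual implementation), the difference between an equation-based model that imposes one “set” process and an agent-based model whose process adapts to each stakeholder might look like:

```python
# Hypothetical stakeholders with distinct operating attributes.
stakeholders = [
    {"name": "Dept A", "preferred_batch_size": 50,  "lead_time_days": 5},
    {"name": "Dept B", "preferred_batch_size": 200, "lead_time_days": 14},
]

def equation_based_plan(agents):
    """One 'set' process imposed on everyone, ignoring individual attributes."""
    return {a["name"]: {"batch_size": 100, "lead_time_days": 10}
            for a in agents}

def agent_based_plan(agents):
    """The prescribed process reflects, and adapts to, each agent's attributes."""
    return {a["name"]: {"batch_size": a["preferred_batch_size"],
                        "lead_time_days": a["lead_time_days"]}
            for a in agents}

# The uniform plan treats Dept A and Dept B identically; the adaptive plan does not.
print(equation_based_plan(stakeholders))
print(agent_based_plan(stakeholders))
```

The point of the sketch is simply that in the first function every stakeholder receives the same prescribed process regardless of fit, while in the second the process is derived from each stakeholder’s own attributes, which is the essence of the adaptive approach described above.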
What I like about the Boeing story (as well as similar case references) is that it creates a context, or a point of reference in real-world applicability, that can be easily recognized from a general market perspective.
So as you pursue a greater understanding of the various models that are being offered as the “new standard” for measuring supply practice performance (including agent-based modeling), you should keep in mind that an effective methodology understands and then adapts to the real-world operating attributes of your organization, and not the other way around.
In the meantime, here is one of the many links to sites that provide an overview of the SCOR model:
http://scm.ncsu.edu/public/facts/facs041027.html
In conclusion Chet, your logic and insight point the way to an improved mindset which requires business leaders to think and see a situation in a new light. Whether or not they are up to the challenge will ultimately determine if their organization is part of the 85% majority of failed initiatives or if they will in fact make the transformation to become one of the 15% of companies that are successful.
Mike O'Neil
March 13, 2008
Not a bad analogy. I would suggest that you separate applications software from the infrastructure software / tools needed to support and run those applications. PWGSC has already said that departments will continue to control their individual applications; what they haven’t said is that departments may have to modify those applications to run on the Shared / Managed Service infrastructure. That means COTS products that user departments have already spent a lot of money integrating with their departmental applications may not be acceptable to the Shared / Managed Service supplier who delivers and controls the service, because the supplier either doesn’t have the product in their product suite or has a competing internal / partner product they would prefer to use, despite the fact that the department’s product may cost less and / or provide more functionality. Unless PWGSC is willing to commit independent resources to continually monitor and insist on product offerings that add value and cost savings for the GoC, Shared / Managed Service initiatives will simply be a reflection of what the winning vendor has or wants to offer as COTS solutions, regardless of price or functionality improvement considerations.
procureinsights
March 13, 2008
Thank you for your response, Anne.
The key part of your statement is when you refer to “parameters.” Much like the State of North Carolina’s MOU with its higher education institutions, centrally establishing “guidelines” or parameters is important. However, where the limitations or the “rationalization” to which you referred occur is in those instances where the interests at the local or departmental levels are ignored or not properly understood. And this is my point.
What is interesting is that with the emergence of the Metaprise platform in which the historic challenges associated with interoperability and timely data sharing constraints have been substantially removed, the need for establishing a restrictive standard across an enterprise no longer exists.
However, the majority of project champions (at both the client and vendor level) have come from an era of proprietary protectionism and therefore are having some difficulty in grasping the new processes that reflect today’s synchronized practices. In essence, they are making decisions based on technological limitations that no longer exist instead of the operational imperatives that are required.
As for quantifying and qualifying the costs, that is a discussion that in and of itself requires a separate post. However I cannot help but wonder why a set implementation price has not been established and published against an accurately calculated return in terms of savings.
Make no mistake, I appreciate the complexity of such an undertaking. That said, embarking on a course of action in which the objectives and results are not clearly defined from the beginning opens the door to a repeat of the AT Kearney situation, or of the problems that have been experienced by other governments such as the City of Houston and the State of California.
procureinsights
March 13, 2008
I appreciate you taking the time to comment, Mike.
You are of course correct about the need to separate applications from infrastructure, etc. (please see my response to Dan McCabe from CSC in Ottawa re conversion and compliance, as well as my other comments). However, attempting to introduce this element into the mix as part of an effort to provide a point of reference would likely have lost a fair number of people.
That said, once a much broader and collective understanding of the core principles that govern the Shared Services strategy has been successfully established, introducing your relevant points on COTS, for example, becomes much easier.
And the more people who understand the GoC’s program, as well as its implications both domestically and internationally, the greater the interest and the resulting opportunity to effect meaningful change.