My Generation — Baby Boom Economics, Demographics, and Technological Stagnation

“You promised me Mars colonies, instead I got Facebook.” — MIT Technology Review cover over photo of Buzz Aldrin

“As a boy I was promised flying cars, instead I got 140 characters.”  — attributed to Marc Maron and others

I have been in a series of meetings over the last couple of weeks with colleagues describing the state of the technology industry and the markets it serves.  What seems to be a generally held view is that both the industry and the markets for software and technology are experiencing a hardening of the arteries and a resistance to change not seen since the first waves of digitization in the 1980s.

It is not as if this observation has gone unnoted by others.  Tyler Cowen at George Mason University examined the trend of technological stagnation in the eBook The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better.  Cowen's thesis is not only that innovation has slowed since the late 19th century, but that it has slowed a lot, as we have been slow to exploit "low-hanging fruit."  I have to say that I am not entirely convinced by some of the data, which is anything but reliable in demonstrating causation in the long-term trends.  Still, his observations of technological stagnation seem to be on the mark.  His concern, of course, is also directed at technology's effect on employment; he points out that, while making some individuals very rich, recent technological innovation doesn't result in much employment.

Cowen published his work in 2011, when the country was still in the early grip of the slow recovery from the Great Recession, and many seized on his thesis as an opportunity for excuse-mongering, looking for deeper causes than the most obvious ones: government shutdowns, wage freezes, reductions in the government R&D that is essential to private sector risk handling, and an austerian fiscal policy (with sequestration) in the face of weak demand created by the loss of $8 trillion in housing wealth, which translated into a consumption gap of $1.2 trillion in 2014 dollars.

Among the excuses that were manufactured is the meme, still making the rounds, about a jobs mismatch due to a skills gap.  But, as economist Dean Baker has pointed out again and again, basic economics dictates that the scarcity of a skill manifests itself in higher wages and salaries–a reality not supported by the data for any major job category.  Unemployment stood at 4.4 percent in May 2007, prior to the Great Recession.  The previous low between recession and expansion was the 3.9 percent rate in December 2000.  Yet we are to believe that suddenly, in the four years since the start of one of the largest bubble crashes and the resulting economic and financial crisis, people no longer have the skills needed to be employed (or have suddenly become lazy and shiftless).  The data do not cohere.

In my own industry and specialty there are niches for skills that are hard to come by, and these people are paid handsomely, but the pressure among government contracting officers across the board has been to drive salaries down–a general trend seen across the country and pushed by a small economic elite.  Therein, I think, lies the answer, more than in some long-term trend tying patents to "innovation."  The effect of this downward push is to deny the federal government–the people's government–access to the highly skilled personnel needed to make it both more effective and responsive.  Combined with austerity policies, there is a race to the bottom in terms of both skills and compensation.

What we are viewing, I think, behind our current technological stagnation is a reaction to the hits in housing wealth, in real wealth and savings, in employment, and in the downward pressure on compensation.  Absent active government fiscal policy as the backstop of last resort, there is no other place to make up for $1.2 trillion in lost consumption.  Combine this with the excesses of the patent and IP systems, which create monopolies and stifle competition, particularly under the Copyright Term Extension Act and the recent Leahy-Smith America Invents Act.  These two acts have combined to undermine the position of small inventors and companies, encouraging the need for large budgets to anticipate patent and IP infringement litigation, and raising the barriers to entry for new technological improvements.

No doubt exacerbating this condition is the Baby Boom.  Since university economists don't seem to mind horning in on my specialty (as noted in a recent post commenting on the unreliability of data mining by econometrics), I don't mind commenting on theirs–and what has always surprised me is how Baby Boom economics never seems to play a role in understanding trends, nor as a predictor of future developments in macroeconomic modeling.  Wages and salaries, even given Cowen's low-hanging fruit, have not kept pace with productivity gains since the late 1970s (which probably explains a lot of wealth concentration)–a time that coincides with the Baby Boomers entering the workforce in droves.  A large part of this condition has been a direct consequence of government policies–through so-called "free trade" agreements–that have exposed U.S. workers in industrial and mid-level jobs to international competition from low-wage economies.

The Baby Boomers, given an underperforming economy, not only saw their wages and salaries lag, but also saw their wealth and savings disappear with the Great Recession–when corporate mergers and acquisitions weren't stealing the negotiated defined benefit plans they had received in lieu of increases in compensation.  This has created a large contingent of surplus labor.  The number of long-term unemployed, though falling, is still large compared to historical averages and is indicative of this condition.

With attempts to privatize Social Security and Medicare, workers now find themselves squeezed and under a great deal of economic anxiety.  On the ground I see this anxiety even at the senior executive level.  The workforce is getting older as people hang on for a few more years, perpetuating older ways of doing things.  Even when there is a changeover, the successor manager often has not received the amount of mentoring and professional development expected in more functional times.  In both cases people are risk-averse, feeling that there is less room for error than there was in the past.

This does not an innovative economic environment make.

People whom I had known as risk takers in their earlier years now favor the status quo and a quiet glide path to a secure post-employment life.  Politics and voting behavior also follow this culture of lowered expectations, which further perpetuates the race to the bottom.  In high tech this condition favors the perpetuation of older technologies, at least until economics dictates a change.

But it is in this last observation that there is hope for an answer, one which suggests that this is but a temporary condition.  For under the radar there are economies upon economies in computing power, and in the ability to handle larger amounts of data with exponential improvements in handling complexity.  Collaboration among small inventors and companies, developing synergy between compatible technologies, can overcome the tyranny of the large monopolies, though the costs and risks are high.

As the established technologies continue to support the status quo–and postpone needed overhauls of code mostly written 10 to 20 years ago (the equivalent of 20 to 40 software generations)–their task, despite the immense amounts of talent and money involved, is comparable to a Great Leap Forward–and those of you who are historically literate know how those efforts turned out.  Some will survive, but there will be monumental–and surprising–falls from grace.

Thus the technology industry, in many of its more sedentary niches, is due for a great deal of disruption.  The key for small entrepreneurial companies and thought leaders is to be there before the tipping point.  But keep working the politics too.

Saturday Music Interlude — Phox performing “Slow Motion” and “Blue and White”

Phox is a sextet performing alternative folk and indie music out of Madison, Wisconsin, where they landed after returning to their home state following various individual career failures and wrong turns.  According to their own copy, they originated as friends in high school in the hamlet of Baraboo, WI, "a place where kids often drink poisoned groundwater and become endowed mutants."  They hit the musical circuit last year and became overnight successes, after having hit the woodshed for two or so years prior.  They caught everyone's attention at SXSW, the iTunes Festival, and Lollapalooza.  Their debut self-titled album was released June 24th on Partisan Records.  Here is a video of them from the beginning of their rise:

 

Why Can’t We Be Friends — The Causes of War

Paul Krugman published an interesting opinion piece in the Sunday New York Times entitled "Why We Fight Wars," in which he attempts to understand why developed, relatively affluent countries still decide to wage war despite the questionable economics.  Many websites seconded his observations, particularly those that view social systems and people as primarily rational economic actors.  The problem with Mr. Krugman's opinion–and there is no doubt that he is a brilliant economist and observer of our present state of affairs, with a Nobel to his name no less–is that he doesn't fully comprehend that, while the economic balance sheet may argue against warfare, there are other societal forces that lead a nation to war.

Warfare, its causes, and the manner of conducting it were part of my profession for most of my life.  My education was dedicated not only to my academic development but also to its utility in understanding the nature of civilization's second oldest profession–and how to make what we do in waging war, at the tactical, operational, and strategic levels, that much more effective.  In the advanced countries we attempt to "civilize" warfare, though were it to be waged in its total state today, it would constitute post-industrial, technological mass murder and violence on a scale never seen before.  This knowledge, which is recognized even by peripheral "Third World" nations and paramilitary organizations, actually makes such a scenario both unthinkable and unlikely.  It is most likely this knowledge that keeps Russian ambitions limited to insurgents, proxies, Fifth Columnists, and rigged referendums in its current war of conquest against Ukraine.

Within the civilized view of war, it begins with Clausewitz's famous dictum: "War is the attainment of political ends through violent means."  Viewing war as such, we have established laws for its conduct.  The use of certain weapons–chemical and biological agents, for instance–is considered illegal and their use a war crime; a prohibition honored throughout World War II, Korea, Vietnam, and most other major conflicts.  Combatants are to identify themselves and, when they surrender, are to be accorded humane treatment–a rule that has held up fairly effectively, with notable exceptions recorded by Showa Japan, North Korea, and North Vietnam and–tragically and recently–by the United States in its conduct of the War on Terror.  War is not to be purposely waged on non-combatants, and collective punishment as reprisal for resistance is prohibited.  Other rules also apply, such as the protection of Red Cross and medical personnel from attack.  In the U.S. military, the conduct of personnel at war is further restricted by the rules of engagement.  But in all cases the general law of warfare dictates that only the amount of force necessary to achieve the desired political ends is to be exercised–the concept of proportionality applied to a bloody business.

Such political ends typically reflect a society's perception of its threats, needs, and grievances.  Japan's perception that the United States and Western Europe were denying it resources, and that it needed its own colonial possessions, is often cited as the cause of its expansion and militarism under Showa rule.  Germany's economic dislocations and humiliation under the Allies are often blamed for the rise of Hitler, rabid nationalism, and expansionism.  In both cases it seemed that, at the societal level, both nations possessed on the eve of war the characteristics typically seen in psychotic individuals.  At other times these characteristics seemed to behave like a disease, infecting other societies and countries in proximity with what can only be described as sociopathic memes–a type of mass hysteria.  How else to explain the scores of individuals with upraised hands in fealty to obviously cruel and inhumane political movements across the mess of human history–or the systematic collection and mass murder of Jews, Gypsies, intellectuals, and other "undesirables," not just in Germany but wherever the influence of this meme spread across Europe, Africa, and Asia?

Nations can also fool themselves by learning the wrong lessons from history.  Our own self-image of righting the wrongs of the Old World goes back to our anti-colonial roots and the perceptions of our immigrant ancestors, who were either rejected by or rejected that world.  Along these lines, the example of Munich was much misused in the 20th century as a pretext for wars that ended disastrously, or created disastrous blowback amid the fog of war, simply because the individuals assessing the strategic situation told themselves convenient stories gleaned from an inapplicable past and ignored the reality on the ground.  We have seen this in my lifetime in Vietnam, Iraq (twice), and Afghanistan.

For all of the attempts to "civilize" warfare and lend it rationality, there comes a time when its successful prosecution requires the rejection of rationality.  This is why soldiers and combatant personnel use euphemisms to dehumanize their enemy: it is easier to obliterate a person who is no longer seen as human.  Correspondingly, the public is inflamed, the press becomes a tool of the war party, and dissent is viewed with suspicion and penalized.  This is why warfare cannot be interpreted as an extension of neo-classical–or any–economics.  There are no rational actors; at least, not as war is presently conducted by modern nation-states, no matter their level of economic development or the maturity of their political systems.  War is unhinged–part of the savagery found in all of us from our primate past.

One of my most effective professors when I was seeking my Masters in Military Arts and Sciences was the brilliant historian Dr. Roger J. Spiller–a protégé of T. Harry Williams.  "We are always learning," he would say, repeating a familiar refrain in the military profession, "the lessons of the last war."  For students at the Army Command and General Staff College it was the critique that doctrine (and therefore the organization and construction of the force) was based on the 1967 Arab-Israeli War–probably the only analogue that could be used in Iraq and, unfortunately for Russia, should it decide to turn its armor on Ukraine or any Article V NATO country.

Aside from these few exceptions, however, the American way of total warfare–which we learned first in our own Civil War and then perfected on the battlefields of Europe and Asia–has, through our very success in its use, been rendered largely obsolete.  It has been obsolete for quite some time because warfare has changed; its practitioners have evolved.  It has changed because its present incarnation is being prosecuted by people and groups that have no significant power and so use violence to destroy power.  This is the purpose of the terrorist.  Even the strength of this new form of warfare–Low Intensity Conflict–is transient, evident only in tactical situations.  What it cannot do is establish legitimacy or power.  Thus, meeting violence with violence only exacerbates the situation in these cases, because power is further eroded and–along with it–legitimacy.  We see the results of the vacuum caused by this inability to adjust to the new warfare in the political instability in both Iraq and Afghanistan–and in the rise of ISIS.

While I would argue that economic balance sheets are not what we need for assessing how to ensure global stability and peace, we do require a new theory of war that infuses greater rationality into the equation.  Clausewitz–and his intellectual successor Antoine-Henri Jomini–in looking at the guerrilla war in Spain against French rule, simply admonished war's practitioners not to go there.  It is not until T. E. Lawrence and Mao Zedong that we have a modern theory to address this new, societal form of "revolutionary" warfare, and then only from the perspective of the revolutionary who wishes to establish neo-colonial, authoritarian, or totalitarian regimes.

Thus, we possess the old templates and they no longer work.  With the specter of nuclear weapons still held over the head of humanity, we can ill afford to view every situation as a nail needing a hammer.  We must, I think, follow the lead of Hannah Arendt, who distinguished among power, strength, force, violence, and authority.  There is, as John Dewey observed, a connection in consequences between means and ends.  The modern form of violence through terrorism or paramilitary revolution has all too often, in the last quarter of the 20th century and the first decades of the 21st, led to new oppression and totalitarianism.  This has probably been inevitable given the indiscriminate brutality of the struggles.  Diplomacy backed by credible power, and sufficient military force to counteract such violence, is the new necessary face of achieving stability.  Contrary to the assertions of the neo-cons at the time, the very thing we needed in the wake of 9/11 was an effective police action in lieu of chaotic regional warfare.

Interestingly, the insight into means and ends in warfare was perceived early by George Washington, when he won his struggle over the conduct of the war against the methods of Nathanael Greene.  Greene's irregular methods of warfare were designed to win the war but would have unmade the nation, while Washington's method–the existence of the disciplined Continental Army as the strategic center of the revolution–was designed to make a nation once the war was over.  Unfortunately for the French and the Russians, there was no George Washington to see this important distinction in their own revolutions.

So too in the 21st century is this connection between means and ends important in the handling of conflict–and terrorism.  The years since the fall of the Soviet Union seem to have turned the clock back to 1914 for the pressures and conflicts that were once held in check by a bi-polar world: the Balkans, the Middle East, and Eastern Europe have all been engulfed in conflict.  The tragedy that can result, given the new technologies and approaches for inflicting violence and chaos on civilization, requires that we not apply 1914 methods in meeting them.

Sixteen Tons — Data Mining, Big Data, and the Asymmetry of Variables and Observations

Last Thursday I came upon what I can only interpret as an ironic comment at Mark Thoma's Economist's View blog entitled "Data Mining Can be a Productive Activity."  The link led to a VOX article by Castle and Hendry entitled "Data mining with more variables than observations."  All I could think after the opening line–"A typical Oxford University econometrics exam question might take the form: 'Data mining is bad, so mining with more candidate variables than observations must be pernicious. Discuss.'"–was: are these people serious?

Data mining is a general term in high tech, not a specific approach to finding patterns and trends in large collections of data.  The authors–and I'm guessing that they are not alone in the econometrics profession–seem to be taking a "Just Say No" approach to what most of us who deal in statistical analysis and modeling of large datasets do every day, largely because it involves these scary things called computers that run this mysterious thing behind the scenes called "code."  Who knows what horrors may await us as we mistakenly draw causation from correlation by anything more than the use of Access or Excel spreadsheets?  It seems that Oxford dons need to get out more.

The use of microeconomic data mining has been general practice for quite some time in many businesses and business disciplines, with a great deal of confidence and success (so much success in the medical insurance, financial services, and social networking fields as to raise legal and ethical objections).  So an assertion that seems to rest on the practices of a single group of econometricians does seem odd.  In the end it appears to be a setup for a proprietary set of calculations placed within an Excel spreadsheet and given the name "Autometrics."  This argues largely for the proper organization of data rather than constituting a criticism of data mining in general.

The discriminators among data mining and data mining-like technologies involve purpose, cost, ease of use, scalability, and sustainability.  New technologies arise every year that allow for increased speed, more efficient grouping, and compression, allowing organizations to handle what was previously thought to be "big data."  Thus the concept of data mining and big data is a shifting one, as our technologies drive toward greater capability in integrating and interpreting large datasets.  The authors cite the techniques of using large data to demonstrate anthropogenic global warming as one of the success stories of large-scale modeling based on large data.  Implicit in acknowledging this is that not every variable needs to be included in a model–only the relevant variables that drive and explain the results being produced.  There is no doubt that reification of statistical results is a common fallacy, but people were committing it long before the development of silicon-based artificial intelligence.
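Since the point deserves to be made concretely: selecting a handful of relevant variables from a candidate set larger than the number of observations is a routine, well-understood statistical exercise, not something pernicious.  Below is a minimal sketch in Python, assuming NumPy and scikit-learn are available (my own choice of tools, not Castle and Hendry's Autometrics), with synthetic data in which only five of 200 candidate variables actually drive the outcome.

```python
# A minimal sketch of mining more candidate variables than observations:
# 200 candidate variables, 60 observations, only 5 truly relevant.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(42)
n_obs, n_vars = 60, 200                       # more variables than observations
X = rng.normal(size=(n_obs, n_vars))
true_coefs = np.zeros(n_vars)
true_coefs[:5] = [3.0, -2.0, 1.5, 4.0, -1.0]  # only the first 5 drive the outcome
y = X @ true_coefs + rng.normal(scale=0.5, size=n_obs)

# Cross-validated lasso shrinks irrelevant coefficients to exactly zero,
# letting the data nominate the relevant variables.
model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(f"Selected {selected.size} of {n_vars} candidates: {selected}")
```

The L1 penalty is one standard way to include only the variables that drive and explain the results, rather than reifying every candidate in the model.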

There is no doubt that someday we will reach the limit of computational capabilities.  But for someone who lived through the nonexistent "crisis" of limited memory in the early '90s, followed not too long after by the bogus Y2K "bug," I am not quite ready to throw in the towel on the ability of data mining and modeling to provide effective tools for the more general discipline of econometrics.  We are only beginning to crack heuristic models for approaching big data, and we are on the cusp of strong AI.

 

Monday Night Music Interlude — Shannon McNally on “One on One”

A lot of blogging to catch up on as I return from yet another extended trip.

The classic country lilt is somewhat misleading for this Long Island native.  According to the site Allmusic, she was greatly influenced by the folk-blues in her parents' record collection.  I first heard about (and heard) her at her electric live performance at the 2007 New Orleans Jazzfest, where her rendition of "Sweet Forgiveness" (which can be heard on the critically acclaimed album North American Ghost Music) set the crowd on fire.  Please enjoy.

Weekend Musical Interlude — Guy Clark performing “El Coyote”

Not much more can be said of Guy Clark than has already been written.  I had the pleasure of seeing him in concert with Lyle Lovett, Joe Ely, and John Hiatt a few years ago as they toured the southwest.  Clark embodies the best of folk and what was once called "country and western" music, with emphasis on the western.  His songs tell stories that are genuine and organic to their surroundings, capturing what pure art should do: clarifying and transforming what is apparent into something else–an insight into the human condition, recorded with all of its imperfections and in all of its embodiments.

"El Coyote" tells a story that has been much in the news these last few years, generating outrage and suspicion: the migration of people into the United States from south of the border.  There are many reasons for the migration, just as our forebears had their reasons for coming to this land.  The reasons in the American southwest are a bit more complicated than many would acknowledge, the border having been somewhat fluid over the last 160 years, with trade and movement flowing both ways–something I learned firsthand when I resided for many years in New Mexico.  Clark's story-song is told in the third person, but from the perspective of the campesinos.  As such, it harkens back to the music of Woody Guthrie, telling the story from the perspective of those whose lives and destinies are being recorded in song.

A few days ago I viewed for the first time the excellent biopic Hannah Arendt, starring Barbara Sukowa.  Arendt is a philosopher whose intellectual power and influence mark her as the essential source for understanding the human capacity for doing evil.  Her clear-eyed observations of people in extraordinary times and circumstances disturbed many of her contemporaries, but it is this intellectual honesty that marks her as one of the giants in recording and understanding human nature.  Her first-hand insights confirmed what Joseph Conrad had written sixty-three years before in "Heart of Darkness": "The mind of man is capable of anything–because everything is in it, all of the past as well as all of the future."

Arendt's insight in her work, most especially in Eichmann in Jerusalem, was that human evil is not only banal, but in its most common form derives from the denial of thought–the most basic human activity, the one that defines each of us as human.  By refusing to think about (and therefore take responsibility for) the consequences of his actions, Adolf Eichmann, a petty bureaucrat, was able to commit a very great evil, a horrendous crime.  The humanity of the people being led to their slaughter became unimportant–a commodity–and so it became easy for him to do what he was ordered to do, because they had been stripped of their humanity by the absence of thought.

We must be mindful as a people, I think, that thought leads to the acceptance of the humanity of others, which leads to empathy, sympathy, and–eventually–to basic human compassion and decency.  Stereotypes, euphemisms, and slogans are evils designed to deny people their basic humanity and there is no doubt that the purveyors of such devices do so with that harmful intent.  We must resist the easy path of thoughtlessness, and appeals to fear and tribal loyalty.

People can be undocumented in coming into a strange land, but people can never be "illegal."  It wasn't too long ago that my own swarthy forebears were pejoratively called "W.O.P.s"–a term said to derive from the acronym "with out papers"; that is, those of Italian descent who were undocumented and therefore "illegal."

We are a nation of immigrants.  My sympathy and advocacy for decency is with the campesinos and with the children seeking safety in “the land of the free and the home of the brave.”  We simply need to live up to those words which, in the end, is what defines a people as exceptional, given the all too common penchant for cruelty in human history.

Synchronicity — What is proper schedule and cost integration?

Much has been said about the achievement of schedule and cost integration (or the lack thereof) in the project management community.  Much of it consists of hand-waving and magic asterisks that hide the significant reconciliation going on behind the scenes.  An intellectually honest approach, one that does not use the topic as a means of promoting a proprietary solution, is the paper authored by Rasdorf and Abudayyeh back in 1991, entitled "Cost and Schedule Control Integration: Issues and Needs."

It is worthwhile revisiting this paper, I think, because it was authored in a world not yet fully automated, and so it is immune to the software tool-specific promotion that oftentimes dominates the discussion.  In their paper the authors outlined several approaches to breaking down cost and work in project management in order to provide control and track performance.  One of the most promising methods they identified at the time was the unified approach that had originated in aerospace: a work breakdown structure (WBS) constructed of discrete work packages in which budget and schedule are unified at a particular level of detail, allowing for full control and traceability.
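To make the unified approach concrete, here is a minimal sketch of such a structure in Python.  The class and field names are hypothetical illustrations rather than anything drawn from Rasdorf and Abudayyeh; the point is simply that each work package carries both its schedule dates and its time-phased budget, so cost rolls up the same WBS tree that the schedule decomposes.

```python
# A sketch of a unified WBS: budget and schedule live on the same
# control point (the work package), so both are traceable to one structure.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class WorkPackage:
    wbs_id: str                           # e.g. "1.2.3.1", tying cost to the WBS/OBS
    start: date
    finish: date
    time_phased_budget: dict[str, float]  # month -> budgeted cost of work scheduled

    @property
    def budget_at_completion(self) -> float:
        return sum(self.time_phased_budget.values())

@dataclass
class WBSElement:
    wbs_id: str
    children: list["WBSElement"] = field(default_factory=list)
    work_packages: list[WorkPackage] = field(default_factory=list)

    def rollup_budget(self) -> float:
        # Budget rolls up the same tree the schedule decomposes:
        # one structure, not two systems needing reconciliation.
        return (sum(wp.budget_at_completion for wp in self.work_packages)
                + sum(child.rollup_budget() for child in self.children))

wp = WorkPackage("1.2.3.1", date(2015, 1, 5), date(2015, 3, 27),
                 {"2015-01": 40_000.0, "2015-02": 55_000.0, "2015-03": 30_000.0})
root = WBSElement("1.2.3", work_packages=[wp])
print(root.rollup_budget())  # 125000.0
```

Because the dates and the budget live on the same control point, traceability falls out of the structure itself; there is nothing to reconcile after the fact.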

The concept of the WBS and its interrelationship with the organizational breakdown structure (OBS) has become much more sophisticated over the years, but a barrier has prevented this ideal from being fully achieved.  Ironically, it is the introduction of technology that is the culprit.

During the first phase of digitization in the project management industry, not too long after Rasdorf and Abudayyeh published their paper, there was a boom in dot-coms.  For businesses and organizations the practice was to find a specialty or niche and fill it with an automated solution, taking over the laborious tasks of calculation previously achieved by human intervention.  (I still have both my slide rule and my first scientific calculator hidden away somewhere, though I have thankfully wiped square root tables from my memory.)

For those of us who worked in project and acquisition management, our lives were built around the 20th century concept of division of labor.  In PM this meant we had cost analysts, schedule analysts, risk analysts, financial analysts and specialists, systems analysts, and engineers broken down by subspecialties (electrical, mechanical, systems, aviation) and sub-subspecialties (naval engineering, aviation, electronics and avionics, specific airframes, software, etc.).  As a result, the first phase of digitization followed the pathway of the existing specialties, finding niches to inhabit, which provided a good, steady, and secure living to software companies and developers.

For project controls, much of this infrastructure remains in place.  There are entire organizations today that will construct a project's schedule using one set of specialists and its performance management baseline (PMB) using another, and then reconcile the two, not just in the initial phase of the project but across its entire life.  Against the standard of an integrated structure that brings together cost and schedule, this makes no practical sense.  From a business efficiency perspective, it is an unnecessary cost.

As often as it is cited by authors and speakers, the Coopers & Lybrand with TASC, Inc. paper entitled "The DoD Regulatory Cost Premium" is impossible to find on-line.  Despite its widespread citation, the study demonstrated that by the time one got down to the third "cost" driver attributed to regulatory requirements, the projected "savings" amounted to a fraction of 1% of total contract cost.  The interesting question the study did not face is: were the tables turned, how much would such contracts be reduced if all of the company's own management controls were reduced or eliminated, since they too contribute as elements of overhead and G&A?  More to the point here, if the processes applied by industry were optimized, what would be the cost savings involved?

A study conducted by the RAND Corporation in 2006 accurately points out that a number of studies had been conducted since 1986, all of which promised significant cost savings by focusing on what were perceived as drivers of unnecessary cost.  The Department of Defense, and the military services in particular, took the Coopers & Lybrand study very seriously because of its methodology, but achieved minimal savings against those promised.  Of course, the various studies do not clearly articulate the cost risk associated with removing the marginal cost of oversight and regulation.  Given our recent experience with the lack of regulation in the mortgage and financial sectors of the economy, which brought about the worst economic and financial collapse since 1929, one may look at these various studies in a new light.

The RAND study outlines the difficulties in the methodologies and conclusions of the studies undertaken, especially the acquisition reforms initiated by DoD and the military services as a result of the Coopers & Lybrand study.  But how, you may ask, does this relate to cost and schedule integration?

The means that industry presently uses in many places amount to a sub-optimized approach to project management, particularly as applied to cost and schedule integration, which in practice consists of physical cost and schedule reconciliation.  What is clearly one entity is split into two, constructed separately, and then adjusted by manual intervention, which defeats the purpose of automation.  This may be common practice, but it is not best practice.

Government policy, which has pushed compliance onto the contractor, oftentimes rewards this sub-optimization and provides little incentive to change the status quo.  Software manufacturers embedded in old technologies are all too willing to promote the status quo–appropriating the term "integration" while, in reality, offering interfaces and workarounds after the fact.  Personnel residing in line and staff positions defined by the mid-20th century division of labor are all too happy to continue operating with outmoded methods and tools.  Paradoxically, these are the same people in industry who would never advocate using outmoded airframes, jet engines, avionics, or ship types.

So it is time to stop rewarding sub-optimization.  The first step is normalizing the data from these niche proprietary applications and "rewiring" it at the proper level of integration, so that systemic faults can be viewed by all stakeholders in the oversight and regulatory chain.  Nothing is more effective in correcting a hidden defect than some sunshine and a fresh set of eyes.
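What that normalization might look like in miniature: exports from a niche cost tool and a niche scheduling tool, mapped into a single neutral schema keyed by WBS element.  This is a sketch under assumed column names (WBS, BCWS, TaskWBS, Finish, and the rest are hypothetical placeholders, not any vendor's actual export format), but it shows how a common key makes gaps and mismatches visible.

```python
# A sketch of normalizing two proprietary exports into one neutral schema,
# joined on WBS ID so cost and schedule meet at the same level of integration.
import csv

COMMON_FIELDS = ("wbs_id", "bcws", "bcwp", "acwp", "finish_date")

def normalize_cost_export(path: str) -> dict[str, dict]:
    """Map a cost tool's export (hypothetical column names) to the schema."""
    with open(path, newline="") as f:
        return {row["WBS"]: {"wbs_id": row["WBS"],
                             "bcws": float(row["BCWS"]),
                             "bcwp": float(row["BCWP"]),
                             "acwp": float(row["ACWP"])}
                for row in csv.DictReader(f)}

def normalize_schedule_export(path: str) -> dict[str, dict]:
    """Map a scheduling tool's export (hypothetical column names) to the schema."""
    with open(path, newline="") as f:
        return {row["TaskWBS"]: {"wbs_id": row["TaskWBS"],
                                 "finish_date": row["Finish"]}
                for row in csv.DictReader(f)}

def merge(cost: dict, sched: dict) -> list[dict]:
    """Join both views on WBS ID; gaps surface as empty fields for all to see."""
    merged = []
    for wbs_id in sorted(set(cost) | set(sched)):
        record = dict.fromkeys(COMMON_FIELDS)  # start with every field blank
        record.update(cost.get(wbs_id, {}))
        record.update(sched.get(wbs_id, {}))
        merged.append(record)
    return merged
```

A work package that appears in one export but not the other surfaces immediately as a record with missing fields, which is exactly the sunshine a hidden defect needs.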

If industry and government are truly serious about reforming acquisition and project management in order to achieve significant cost savings in the face of tight budgets and increasing commitments driven by geopolitical instability, then systemic reform from the bottom up is the means to achieve that goal, not the elimination of controls.  As John Kennedy once said, paraphrasing Chesterton, "Don't take down a fence unless you know why it was put up."  The key is not to undermine the strength and integrity of the WBS-based approach to project control and performance measurement (or to eliminate it), but to streamline it so that it achieves its ideal as closely as our inherently faulty tools and methods will allow.