Let the Journey Begin — Mentoring a Better Project Manager

I have been involved in discussions lately regarding mentoring in the project management and IT business management field.  The question is: what does it take to build a better project manager given the rapidly changing paradigm defining the profession?

Having mentored many younger people over the course of a 22-plus-year career in the United States Navy–and afterward in private business–I have given this line of thought a great deal of consideration.  Over the years I have applied personnel development and growth strategies while assigned to lead both men and women across the uniformed military, civil service, and contractor communities.  Some of these efforts were notable for their successes.  In a few cases I failed to inspire or motivate.

Thus, I have kept a catalogue of lessons learned that helps me identify the key elements in keeping people motivated as they pursue a specialty or career.  Among these are the opportunity for growth, greater responsibility, and recognition of achievement.  Note that I do not mention compensation.  What I have found is that compensation, while important to one’s quality of life, is not a significant issue so long as it is viewed as equitable and commensurate with the effort involved.

In most cases where financial rewards were considered inadequate for the job at hand, compensation was a negative factor in employee retention, which, after all, is the key factor in developing someone to be a project manager or to eventually take on any kind of senior responsibility.  In very few cases was it a net positive or a significant motivator, except in sales.  Young people tend to accept lower levels of compensation if they can see a path to advancement and greater rewards within a reasonable amount of time.  Firms that fail to provide this path, or that do not show loyalty to their employees by crafting career-focused compensation, can expect no loyalty in return.

The Foundation

Education, obviously, is the antecedent factor.  But one must view life as an educational journey in order to be effective in all of the phases of growth and responsibility.  The most common basic credential in proving one’s ability to learn is the Baccalaureate degree.  This is not a necessary condition.  I have known many brilliant people who were self-taught in any number of subjects.  But given that project management involves, in most cases, some technical knowledge and expertise, it is essential that the individual at least be exposed to that knowledge and demonstrate proficiency in the basics that underlie the area of competency.

But we can take this too far.  For example, in my experience, there are too many people who are extremely good in the technical aspects of what they are developing but who make very poor project managers.  The reason for this is that a project is not an end item.  It is a social system consisting of people.  The people in the project management office have already proven a level of proficiency at their jobs.  If they have not, then that is where the greatest value lies in having a project manager, who must select the team, manage the team, coach the team, and lead the team.  The project is given resources, a scope, and a project charter, and it is the project team that develops the plan and will execute against that plan.  A project manager who feels that they are the only technically competent member of the team will soon burn out.

Thus, the best project managers–the best managers of any kind–generally should have a multi-disciplinary education.  This education can be formal or informal, through accredited institutions, augmented by technical training, and perhaps by graduate education.  I have often said that I would prefer someone with a strong liberal arts education over the dedicated specialist.  I can mold someone with a broad outlook and teach them what they need to know through a dedicated plan of adult learning, job assignment, and development.  It is also easier to explain concepts to someone at least exposed to information management, English usage and literature, history, human psychology, organizational behavior, mathematics, statistics, applied and theoretical science, and all of the other areas of knowledge which at one time or another must be referenced in running an organization in a technical field in the real world.  It is very difficult to teach someone to unlearn preconceptions or bad habits, to reach an individual who has a doctrinaire attitude, or to develop someone totally clueless about human motivations, emotions, and needs.  This is aside from winnowing out the run-of-the-mill sociopath.

Proficiency in language and communication is also essential.  What is written is a direct reflection of one’s quality of thought.  Thus, when I find that a young individual does not have the verbal acuity necessary to be understood clearly, that is the first area of remediation that I undertake.  Writing a cohesive and logical sentence–or expressing oneself verbally in a clear and logical manner–is essential to one’s personal growth and the ability to work with others.

Step One — Beginnings

The first assignment for the individual slated for project management should be at the most basic level of proficiency, within one of the project management competencies.  This could be to work with project schedulers, systems engineers, cost managers, technical personnel, or specialists in risk management, logistics, procurement, or any of the other areas necessary to support the organization’s projects.  But rather than developing a specialist who will then rise through that specialty to a senior position, the individual should be laterally transferred on a regular schedule from one specialty to another, given the appropriate skillset and expertise.

This method of variety in work assignment over time in the early stages of the individual’s career will lay the foundation for an appreciation of all aspects of the business.  It also establishes the culture of the learning organization, maintaining the interest of the individual through variety and personal growth.  The point of this method is to get the individual to a stage gate in assessing their capabilities and potential for further growth.

If the individual demonstrates the ability to adapt to different environments and challenges, to obtain new skillsets, and to thrive across multiple job assignments, they can then be advanced to the next level toward greater responsibility, perhaps involving supervisory or management duties.

For others, their limitations will also identify how they can best contribute to the organization.  Perhaps one specific specialty appeals to them.  Advancement and opportunities for these individuals may be more limited, depending on the size of the organization, but that is not necessarily the case.  Subject matter experts (SMEs) are essential to the success of the project team; without them there would be no team.  In these cases, investment in further education and training in the area of expertise is essential to employee development and retention.  The experience that they garner from working in other areas of the business also increases their value to the organization since they do not have to learn the basics on the job.

Step Two — Intermediate Development

In the development of military officers the Services first focus on the knowledge necessary at the tactical level of the organization.  As the individual rises in rank and continues to prove their competency, they are transitioned to the operational level–the area that bridges smaller units and junior staff on one side and medium-sized units involving a great deal of responsibility on the other.  This may involve junior command of some sort.  The final step in this process is to teach the officer about the strategic level, which involves duties related to senior command or senior staff.

The rationale is that one must walk before they can be expected to run.  In the area of project management this will involve assignments over time serving in various competencies in support of projects of increasing dollar value and complexity.  Thus, for example, we may assign a junior individual to be a project manager with a small staff involving limited funds for the development and deployment of a system of fairly short duration–one to two years.  As they continue to develop over time we will assign them to roles that are a natural progression commensurate with their skills and track record.

As with employee development in step one, the purpose of step two (which may involve subsequent lateral assignments as the individual rises in the organization) is to get them to a stage gate to assess whether they will eventually be able to handle the most complex project assignments, involving the greatest risk to the organization.

Step Three — Professional Competency

A few years ago a famous bit of pop psychology ran through the professional business community asserting that 10,000 hours of practice is required to turn someone into an expert.  It turns out that this assertion is not supported by the science.  There are a number of factors that contribute to one’s competency in a subject, and it could be as simple as emotional affinity–or involve any number of factors such as socio-economic background and upbringing, education and training, the ability to concentrate, the structure of the mind, and neonatal development, among others.  This is beyond the odd child prodigy who burns brightly before the age of 10 and then fails to maintain their advantage into adulthood.

But given that the individuals we have mentored have thrived under our approach to adult learning, and have survived under real-life conditions as they were brought along in a progressive manner commensurate with their ability to grow and learn, their areas of competency should be readily apparent.  This should be the last step in the mentoring effort.

The challenge for project management today is to break down the traditional barriers constructed by line-and-staff and division-of-labor thinking.  Project management now demands a cross-disciplinary skillset.  With our information systems able to provide a cohesive and integrated view of the condition of the project in ways that were impossible just five years ago, we need to develop a cadre of individuals who can not only understand the information, but who possess the critical skills, maturity, knowledge, education, judgment, and context to use that information in an effective manner.

Sunday Music Interlude — Leon Bridges performing “Coming Home” Live

Hailing from Fort Worth, Texas, Leon Bridges has broken onto the music scene with a handful of essential-when-you-hear-them soul and R&B songs.  His voice–both smooth and deep–is reminiscent of “Dock of the Bay”-era Otis Redding and “Chain Gang”-era Sam Cooke.  He made WXPN The Key’s “Gotta Hear Song of the Week” in its February 9th edition.  In looking up his bio I found that he is a mystery.  He has a website, Soundcloud, and Facebook page.  Obviously he feels content to speak through his music, which is extraordinary.

Here he is performing live at Little Rock’s White Water Tavern.

Over at AITS.org — Red Queen Race: Why Fast Tracking a Project is Not in Your Control

“Well, in our country,” said Alice, still panting a little, “you’d generally get to somewhere else—if you run very fast for a long time, as we’ve been doing.”

“A slow sort of country!” said the Queen. “Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!”

–Through the Looking-Glass and What Alice Found There, Chapter 2, Lewis Carroll

There have been a number of high profile examples in the news over the last two years concerning project management.  For example, the initial rollout of the Affordable Care Act marketplace web portal was one of these, and the causes of its faults are still being discussed.  As I write this, an article in the New York Times indicates that the fast-track efforts to create an Ebola vaccine are faltering…

To read the remainder of this post please go to this link.

Margin Call — Schedule Margin and Schedule Risk

A discussion at the LinkedIn site for the NDIA IPMD regarding schedule margin has raised some good insight and recommendations for this aspect of project planning and execution.  Current guidance from the U.S. Department of Defense for those engaged in the level of intense project management that characterizes the industry has been somewhat vague and open to interpretation.  Some of this, I think, is due to the competing proprietary lexicons of the software manufacturers that have been dominant in the industry.

But mostly the change in defining this term is due to positive developments.  That is, the change is due to the convergence, garnered from long experience among the various PM disciplines, that allows us to more clearly define and distinguish between schedule margin, schedule buffer, schedule contingency, and schedule reserve.  It is also due to the ability of more powerful software generations to actually apply the concept in real planning without it being a thumb-in-the-air exercise.

Concerning this topic, Yancy Qualls of Bell Helicopter gave an excellent presentation at the January NDIA IPMD meeting in Tucson.  His proposal makes a great deal of sense and, I think, is a good first step toward true integration and a more elegant conceptual solution.  In his proposal, Mr. Qualls clearly defines the scheduling terminology by drawing analogies to similar concepts on the cost side.  This construction certainly overcomes a lot of misconceptions about the purpose and meaning of these terms.  But, I think, his analogies also imply something more significant and it is this:  that there is a common linkage between establishing management reserve and schedule reserve, and there are cost/schedule equivalencies that also apply to margin, buffer, and contingency.

After all, resources must be time-phased, and these are dollarized.  But usually the relationship stops there and is distinguished by the characteristic being measured: measures of value or measures of timing.  That is, the value of the work accomplished against the Performance Measurement Baseline (PMB) is different from the various measures of progress recorded against the Integrated Master Schedule (IMS).  This is why we look at both cost and schedule variances on the value of work performed from a cost perspective, and at physical accomplishment against time.  These are fundamental concepts.
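As a quick reference, the standard earned value relationships behind those variances are:

```latex
% Conventional EVM notation: EV = earned value, PV = planned value, AC = actual cost
CV  = EV - AC \qquad SV  = EV - PV \\
CPI = \frac{EV}{AC} \qquad SPI = \frac{EV}{PV}
```

Note that even the “schedule” variance here is denominated in dollars against the PMB; progress against the IMS is measured in units of time, which is precisely the distinction drawn above.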

To date, the most significant proposal advanced to reconcile the two different measures was put forth by Walt Lipke of the Oklahoma City Air Logistics Center in the method known as earned schedule.  But the method hasn’t been entirely embraced.  Studies have shown that it has its own limitations, but that it is a good supplement to those measures currently in use, not a substitute for them.
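For readers unfamiliar with the technique, here is a minimal sketch, in Python and with invented period data, of the core idea: earned schedule translates the current earned value into time units by finding the point on the planned value curve at which that value was scheduled to be achieved, so schedule performance can be judged in time rather than dollars.

```python
# A minimal sketch of the earned schedule idea (illustrative only; the
# monthly planned value data and the linear interpolation are simplifying
# assumptions, not a full implementation of Lipke's method).

def earned_schedule(pv_cumulative, ev_now):
    """Return earned schedule in periods, given cumulative planned value
    per period and the current cumulative earned value."""
    c = 0
    # C = number of whole periods whose planned value has been fully earned
    while c < len(pv_cumulative) and pv_cumulative[c] <= ev_now:
        c += 1
    if c == len(pv_cumulative):
        return float(c)                       # the entire plan has been earned
    prior = pv_cumulative[c - 1] if c > 0 else 0.0
    # interpolate linearly into the partial period
    return c + (ev_now - prior) / (pv_cumulative[c] - prior)

# Example: planned value of 100 per month over five months; at month 3 the
# project has earned 230 of value.
pv = [100, 200, 300, 400, 500]
es = earned_schedule(pv, 230)   # about 2.3 months of schedule earned
at = 3                          # actual time elapsed, in months
spi_t = es / at                 # time-based schedule performance index
sv_t = es - at                  # time-based schedule variance (negative = late)
print(round(es, 2), round(spi_t, 2), round(sv_t, 2))
```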

Thus, we are still left with the need to make a strong, logical, and cohesive connection between cost and schedule in our planning.  The baseline plans constructed for the IMS and the PMB do not stand apart or, at least, should not.  They are instead the end result of a continuum in the construction of our project systems.  As such, there should be a tie between cost and schedule that allows us to determine the proper amount of margin, buffer, and contingency in a manner that is consistent across both sub-system artifacts.

This is where risk comes in: the correct assessment of risk at the appropriate level of measurement, given that our measures of performance are taken against different denominators.  For schedule margin, in Mr. Qualls’ presentation, that assessment is the Schedule Risk Analysis (SRA).  But this then leads us to look at how that would be done.

Fortuitously, during this same meeting, Andrew Uhlig of Raytheon Missile Systems gave an interesting presentation on historical SRA results, building models from such results, and using them to inform current projects.  What most impressed me in this presentation was his finding that actual results from schedule performance do not conform to any of the usual distribution curves found in the standard models.  Instead of normal, triangular, or PERT distributions, what he found is a spike, in which a large percentage of the completions fell exactly on the planned duration.  The distribution was skewed around the spike, with the late durations–the right tail–much longer than the left.

What is essential about the work of Mr. Uhlig is that, rather than using small samples with their biases, he uses empirical data to inform his analysis.  This is a pervasive problem in project management.  Mr. Qualls makes this same point in his own presentation, using the example of the Jordan-era Chicago Bulls: each subsequent win–even combined with probabilities suggesting that the team could win all 82 games–does not mean that they will actually perform the feat.  In reality the probability of this occurring is quite small.  Glen Alleman at his Herding Cats blog covers this same issue, emphasizing the need for empirical data.
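To make that point concrete with an assumed, purely illustrative number: even if the team’s chance of winning any single game were as high as 90 percent, the chance of winning all 82 would be

```latex
0.9^{82} \approx 1.8 \times 10^{-4} \quad (\text{about } 0.02\%)
```

Favorable results game by game tell us very little about the probability of a perfect season.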

The results of the Uhlig presentation are interesting, not only because they throw into question the results obtained from the three common distributions used in schedule risk analysis under Monte Carlo simulation, but also because they may suggest, in my opinion, an observation or reporting bias.  Discrete distribution methods, as Mr. Uhlig proposes, will properly model the distribution for such cases in our parametric analysis.  But they will not reflect the quality of the data collected.
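A minimal sketch of the difference, using a purely hypothetical set of historical duration ratios (actual duration divided by planned duration): rather than drawing task durations from a triangular distribution, the simulation draws directly from the empirical, discrete distribution observed in the historical record, spike and all.

```python
import random

# Hypothetical historical record of actual/planned duration ratios for
# completed short-duration activities.  Note the spike at exactly 1.0 and
# the longer right tail -- the shape reported by Mr. Uhlig, not the shape
# assumed by normal, triangular, or PERT models.
historical_ratios = ([1.0] * 29 + [0.9] * 8 + [0.95] * 10 + [1.05] * 15 +
                     [1.1] * 12 + [1.25] * 14 + [1.5] * 8 + [2.0] * 4)

def draw_empirical(planned_days):
    """Sample a duration from the observed, discrete distribution."""
    return planned_days * random.choice(historical_ratios)

def draw_triangular(planned_days):
    """Sample from a triangular distribution, the conventional assumption."""
    return random.triangular(0.9 * planned_days, 2.0 * planned_days, planned_days)

# Monte Carlo over a toy serial path of three 20-day activities.
trials = 10_000
empirical = sorted(sum(draw_empirical(20) for _ in range(3)) for _ in range(trials))
triangular = sorted(sum(draw_triangular(20) for _ in range(3)) for _ in range(trials))

# Compare the 80th-percentile path duration under each assumption.
print("empirical 80th percentile: ", empirical[int(0.8 * trials)])
print("triangular 80th percentile:", triangular[int(0.8 * trials)])
```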

Short duration activities are designed to overcome subjectivity through their structure.  The shorter the duration, and the more discrete the work being measured, the less likely the occurrence of “gaming” the system.  But if we find, as Mr. Uhlig does, that 29% of 20-day activities report exactly 20 days, then there is a need to test the validity of the spike itself.  It is not that it is necessarily wrong.  Perhaps the structure of the short duration combined with the discrete nature of the linkage to work has done its job.  One would expect a short tail to the left and a long tail to the right of the spike.  But there is also a possibility that variation around the target duration is being judged as “close enough” to warrant a report of completion at day 20.

So does this pass the “So What?” test?  Yes, if only because we know that the combined inertia of all of the work performed at any one time on the project will eventually be realized in the form of a larger amount of risk in proportion to the remaining work.  If the reported results are pushing risk to the right because the reported performance is optimistic against the actual performance, then we will get false positives.  If the reported performance is pessimistic against the actual performance–a less likely scenario in my opinion–then we will get false negatives.

But regardless of these further inquiries that I think need to be made regarding the linkage between cost and schedule, and the validity of results from SRAs, we now have two positive steps in the right direction, clarifying areas that in the past have perplexed project managers.  Properly identifying schedule reserve, margin, buffer, and contingency, combined with properly conducting SRAs using discrete distributions based on actual historical results, will go quite far in allowing us to introduce better predictive measures in project management.

One-Trick Pony — Software apps and the new Project Management paradigm

Recently I have been engaged in an exploration and discussion regarding the utilization of large amounts of data and how applications derive importance from that data.  In an on-line discussion with the ever-insightful Dave Gordon, I first postulated that we need to transition into a world where certain classes of data are open so that the qualitative content can be normalized.  This is what for many years was called the Integrated Digital Environment (IDE for short).  Dave responded with his own post at the AITS.org blogging alliance, countering that while such standards are necessary in very specific and limited applications, modern APIs provide most of the solution.  I then responded directly to Dave here, countering that IDE is nothing more than data neutrality.  Then, also at AITS.org, I expanded on what I proposed to be a general approach to understanding big data, noting the dichotomy between software approaches that organize the external characteristics of the data to generalize systems and note trends, and those that are focused on the qualitative content within the data.
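As a minimal sketch of what data neutrality means in practice (the vendor formats and field names below are entirely hypothetical), the point is that once proprietary exports are mapped onto a neutral schema, any application, whether it reads files or consumes an API, works against the same normalized content:

```python
# Hypothetical example: two vendors export the same schedule activity using
# different proprietary field names.  Mapping both into one neutral schema
# is what "data neutrality" buys us; an API, a file drop, or a database
# query can all deliver records in this common form.

vendor_a_record = {"ActID": "1080", "Descr": "Integrate radar module", "RemDur": 15}
vendor_b_record = {"task_code": "1080", "task_name": "Integrate radar module",
                   "remaining_duration_days": 15}

NEUTRAL_FIELDS = {
    # neutral name       : (vendor A field, vendor B field)
    "activity_id"        : ("ActID", "task_code"),
    "description"        : ("Descr", "task_name"),
    "remaining_duration" : ("RemDur", "remaining_duration_days"),
}

def normalize(record, vendor_index):
    """Map a proprietary record into the neutral schema."""
    return {neutral: record[sources[vendor_index]]
            for neutral, sources in NEUTRAL_FIELDS.items()}

# Both proprietary records resolve to the same neutral record.
assert normalize(vendor_a_record, 0) == normalize(vendor_b_record, 1)
```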

It should come as no surprise then, given these differences in approaching data, that we also find similar differences in the nature of the applications found on the market.  With the recent advent of on-line and hosted solutions, there are literally thousands of applications in some categories of software that propose to do one thing with data, or that are narrowly focused, one-trick pony applications meant to be mixed and matched to somehow provide an integrated solution.

There are several problems with this sudden explosion of applications of this nature.

The first is in the very nature of the explosion.  This is a classic tech bubble, albeit limited to a particular segment of the software market, and it will soon burst.  As soon as consumers find that all of that information traveling over the web with the most minimal of protections is compromised by the next trophy hack, or that too many software providers have entered the market prematurely–not understanding the full needs of their targeted verticals–the crash will hit like the last one did in 2000.  It only requires a precipitating event that triggers a tipping point.

You don’t have to take my word for it.  Just type a favorite keyword into your browser now (and I hope you’re using a VPN when you do) for a type of application for which you have a need–let’s say “knowledge base” or “software ticket systems.”  What you will find is that there are literally hundreds if not thousands of apps built for this function.  You cannot test them all.  Basic information economics, however, dictates that you must invest some effort in understanding the capabilities and limitations of the systems on the market.  Surely there are a couple of winners out there.  But basic economics also dictates that 95% of those presently in the market will be gone in short order.  Being the “best” or the “best value” does not always win in this winnowing out.  Certainly chance, the vagaries of your standing in the search engine results, industry contacts–virtually any number of factors–will determine who is still standing and who is gone a year from now.

Aside from this obvious problem with the bubble itself, the approach of the application makers harkens back to an earlier generation of one-off applications that attempt to achieve integration through marketing while actually achieving, at best, only old-fashioned interfacing.  In the world of project management, for example, organizations can ill afford to revert to the division of labor, which is what would be required to align with these approaches in software design.  It’s almost as if, having made their money in an earlier time, software entrepreneurs cannot extend themselves beyond their comfort zones to take advantage of the last TEN software generations that provide new, more flexible approaches to data optimization.  All they can think to do is party like it’s 1995.

The new paradigm in project management is to get beyond the traditional division of labor.  For example, is scheduling such a highly specialized discipline, rising to the level of a profession, that it is separate from all of the other aspects of project management?  Of course not.  Scheduling is a discipline–a sub-specialty actually–that is inextricably linked to all other aspects of project management in a continuum.  The artifacts of the process of establishing project systems and controls constitute the project itself.

No doubt there are entities and companies that still ostensibly organize themselves into specialties as they did twenty years ago: cost analysts, schedule analysts, risk management specialists, among others.  But given that the information from these systems–schedule, cost management, project financial management, risk management, technical performance, and all the rest–can be integrated at the appropriate level of their interrelationships to provide us a cohesive, holistic view of the complex system that we call a project, is such division still necessary?  In practice the industry has already moved to position itself toward integration, realizing the urgency of making the shift.
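A minimal sketch of the kind of integration described above, using invented records keyed to a common work breakdown structure (WBS) element; the point is simply that schedule, cost, and risk information relate naturally at a shared level of the project structure rather than living in separate stovepipes:

```python
# Hypothetical, simplified records from three "separate" systems,
# all keyed to the same WBS element.
schedule = {"1.2.3": {"finish_variance_days": 12}}
cost     = {"1.2.3": {"cpi": 0.91, "spi": 0.88}}
risk     = {"1.2.3": {"top_risk": "Supplier qualification slip", "exposure": 250_000}}

def integrated_view(wbs_id):
    """Join the three sources into one holistic view for a WBS element."""
    view = {"wbs": wbs_id}
    for source in (schedule, cost, risk):
        view.update(source.get(wbs_id, {}))
    return view

print(integrated_view("1.2.3"))
# {'wbs': '1.2.3', 'finish_variance_days': 12, 'cpi': 0.91, 'spi': 0.88,
#  'top_risk': 'Supplier qualification slip', 'exposure': 250000}
```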

For example, to utilize an application to query cost management information in 1995 was a significant achievement during the first wave of software deployment that mimicked the division of labor.  In 2015, not so much.  Introducing a one-trick pony EVM “tool” in 2015 is laziness–hoping to turn back the clock while ignoring the obsolescence of such an approach–regardless of which slick new user interface is selected.

I recently attended a project management meeting of senior government and industry representatives.  During one of my side sessions I heard a colleague propose the discipline of Project Management Analyst in lieu of the previously stove-piped specialties.  His proposal is a breath of fresh air in an industry that develops and manufactures the latest aircraft and space technology, but has hobbled itself with systems and procedures designed for an earlier era that no longer align with the needs of doing business.  I believe the timely deployment of systems has suffered as a result during this period of transition.

Software must lead, and accelerate the transition to the new integration paradigm.

Thus, in 2015 the choice is not between data that adheres to conventions of data neutrality and data that is accessed via APIs; the choice is in favor of applications that do both.

It is not between different hard-coded applications that provide the old “what-you-see-is-what-you-get” approach.  It is instead between such limited, hard-coded applications and those that provide flexibility, so that business managers can choose from a nearly unlimited palette of options governing how and which data, converted into information, is made available to the user or classes of user based on their role and need to know, aggregated at the appropriate level of detail for the consumer to derive significance from the information being presented.

It is not between “best-of-breed” and “mix-and-match” solutions that leverage interfaces to achieve integration.  It is instead between such solution “consortiums,” which drive up implementation and sustainment costs and bring with them high overhead, and those that achieve integration by leveraging the source of the data itself, reducing the number of applications that need to be managed, allowing data to be enriched in an open and flexible environment, and achieving its transformation into useful information.

Finally, the choice isn’t among applications that save their attributes in a proprietary format so that the customer must commit themselves to a proprietary solution.  Instead, it is between such restrictive applications and those that open up data access, clearly establishing that it is the consumer that owns the data.

Note: I have made minor changes from the original version of this post for purposes of clarification.

Saturday Music Interlude — Anoushka Shankar featuring Norah Jones performing “The Sun Won’t Set” and “Traces of You”

The beautiful song “The Sun Won’t Set” has come back to me from time to time during both happy and challenging moments.  Anoushka Shankar and Norah Jones are half-sisters, having in common their father, the legendary sitarist Ravi Shankar.  Shankar died at the age of 92 during the making of the album that comprises these songs.  The album, Traces of You, is up for the Grammy Award in the Best World Music Album category.  Traces of You is a song cycle that follows the cycle of life.  Jones collaborates with Shankar on three songs, providing an occasional narrative voice between ten emotive instrumental pieces.  Given that it was born during a very difficult and painful time in the life of Ms. Shankar, Traces of You, as with most artistic work of this kind, moves beyond being a merely good album into being a great one, reflecting the turmoil, pain, love, and hope that she apparently experienced at the time–transcending pedestrian concerns into ones that are insightful and empathic.

Rolling Stone has an interesting interview with Jones regarding the album and the experience of performing with her sister.  Sadly, there are no video recordings of “The Sun Won’t Set.”  The audio recording, however, is all you need.  “Traces of You”, which follows, shows the comfortable interplay between Shankar and Jones.

I Get By With A Little Help… — Avoiding NIH in Project Management

…from my colleagues, friends, family, associates, advisors, mentors, subcontractors, consultants, employees.  And not necessarily in that order.

The term NIH in this context is not referring to the federal agency.  It is shorthand, instead, for “Not Invented Here”.  I was reminded of this particular mindset when driving through an old neighborhood where I served as a community organizer.  At one of the meetings of a local board, which was particularly dysfunctional (and whose dysfunction I was attempting to reform), a member remarked:  “I am tired of hearing about how this or that particular issue was handled somewhere else.”  Yes, I thought, why would we possibly want to know how Portland, or D.C., or Boston, or Denver, or Phoenix–or any number of other places faced with the same issue–effectively or ineffectively dealt with it before us?  What could they possibly teach us?

When we deal with a project management organization, we are dealing with a learning system.  Hopefully an effectively adaptive system.  The qualifier here is important.  The danger with any tight-knit group is to fall into the dual traps of Groupthink and NIH.  The first deals with the behavior relating to conformity within small groups, based on the observations and study of William H. Whyte and his successors.  The second is the mindset that the issues faced by the group are unique to it, and that therefore the use of models, tools, experience, and proven statistical and leading indicators does not apply.

A project management organization (or any complex human organization) is one that adapts to pressures from its environment.  It is one with the ability to learn, since it is made up of entities with the ability to create and utilize intelligence and information, and so it is distinct from biological systems that adapt over time through sexual and natural selection.  Here is also an important point:  while biological evolution occurs over long spans of time, we don’t see the dead ends and failures of adaptation until the story is written–at least, not outside of the microbiological field, where the evolution of viruses and bacteria occurs rapidly.  So for large animals and major species it appears to be a Panglossian world, which it definitely is not.

When we take Panglossian thinking into the analogies that we find in social and other complex adaptive systems, the fallacies in our thinking can be disastrous and cause great unnecessary suffering.  I am reminded here of the misuse of the concept of self-organization in complex systems and of the term “market” in economics.  Organizations and social structures can “self-organize” not only into equilibrium but also into spirals of failure and death.  Extremely large and complex organizations like nation-states and societies are replete with such examples: from Revolutionary France to Czarist Russia, to recent examples in Africa and the Near East.  In economics, “the market” determines price.  The inability of the market to self-regulate–and the nature of self-organization–resulted in the bursting of the housing bubble in the first decade of this century, precipitating a financial crisis.  This is the most immediate example of a systemic death spiral of global proportions, one that was resolved (finally) only with a great deal of intervention by rational actors.

So when I state “hopefully an effectively adaptive system,” I mean one that does not adapt itself by small steps into unnecessary failure or wasted effort (as our business, financial, economic, and political leaders did in the early 2000s).  Along these same lines, when we speak of fitness in surviving challenges (or fitness in biological evolution), we do not imply the “best” of something.  Fitness is simply a convenient term to describe the survivors after all of the bodies have been counted.  In nature one can survive due to plain luck, through capabilities or characteristics of inheritance fit to the environmental space, through favorable chance isolation or local conditions–the list is extensive.  Many of these same factors also apply to social and complex adaptive systems, but on a shorter timescale and with a higher degree of traceable proximate cause-and-effect, depending on the size and scale of the organization.

In project management systems, while it is important to establish the closed-loop systems necessary to gain feedback from the environment to determine whether the organization is effectively navigating itself to achieve its goals against a plan, it is also necessary to have those systems in place that allow for leveraging both organizational and competency knowledge, as well as third-party solutions.  That is, broadening the base in applying intelligence.

This includes not only education, training, mentoring, and the retention and use of a diversified mix of experienced knowledge workers, but also borrowing solutions outside of the organization.  It means being open to all of the tools available in avoiding NIH.  Chances are, though the time, place, and local circumstances may be different, someone has faced something very similar before somewhere else.  With the availability of information becoming so ubiquitous, there is very little excuse for restricting our sources.

Given this new situation, our systems must now possess the ability to apply qualitative selection criteria in identifying analogous information, tempered with judgment in identifying the differences in the situations where they exist.  And given that most systems–including systems of failure–organize themselves into types and circumstances that can be generalized into assumptions, we should be able to leverage both the differences and the similarities to develop a shortcut that doesn’t require all of the previous steps to be repeated (with a high likelihood of repeating past failures).

In closing, I think it important to note that failure here is defined as the inability of the organization to come up with an effective solution to the problem at hand, where one is possible.  I am not referring to failure as the inability to achieve an intermediate goal.  In engineering and other fields of learning, including business, failure is oftentimes a necessary part of the process, especially when pushing technologies for which previous examples and experience do not apply.  The lessons learned from a failed test in this situation, for example, can be extremely valuable.  But a failed test that resulted from the unwillingness of the individuals in the group to consider similar experience or results, due to NIH, is inexcusable.