The Monster Mash — Zombie Ideas in Project and Information Management

I just completed a number of meetings and discussions among thought leaders in complex project management this week, and I was struck by the zombie ideas in project management, especially those related to information, that just won't die.  The term zombie idea is usually attributed to the Nobel economist Paul Krugman, based on his excellent and highly engaging (as well as brutally honest) posts at the New York Times.  For those not familiar, a zombie idea is "a proposition that has been thoroughly refuted by analysis and evidence, and should be dead — but won't stay dead because it serves a political purpose, appeals to prejudices, or both."

The point, to a techie–or to anyone engaged in intellectual honesty–is that these ideas are often posed in the form of question begging; that is, they advance invalid assumptions in the asking or the telling.  Most often they take the form of the assertive half of the coin minted by "when did you stop beating your wife?"-type questions.  I've compiled a few of these for this post, and it is important to understand the purpose for doing so.  It is not to take individuals to task or to bash non-techies–who have a valid reason to ask basic questions based on what they've heard–but to address propositions put forth by people who should know better based on their technical expertise or experience.  Furthermore, knowing and understanding technology and its economics is essential today for anyone operating in the project management domain.

So here are a few zombies that seem to be most common:

a.  More data equals greater expense.  I dealt with this issue in more depth in a previous post, but it's worth repeating here:  "When we inform Moore's Law by Landauer's Principle, that is, that the energy expended in each additional bit of computation becomes vanishingly small, it becomes clear that the difference in cost in transferring a MB of data as opposed to a KB of data is virtually TSTM ("too small to measure")."  The real reason why we continue to deal with this assertion is both political in nature and based in human social interaction.  People hate oversight and they hate to be micromanaged, especially to the point of disrupting the work at hand.  We see behavior, especially in regulatory and contractual relationships, where the reporting entity plays the game of "hiding the button."  This behavior is usually justified by pointing to examples of dysfunction, particularly on the part of the checker, where information submissions lead to the abuse of discretion in oversight and management.  Needless to say, while such abuse does occur, no one has yet pointed quantitatively to data (as opposed to anecdote) showing how often this happens.
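To put a rough number on "too small to measure," here is a back-of-envelope sketch in Python.  It uses the Landauer bound (kT ln 2 per bit at room temperature) as the theoretical floor for the energy of computation; it is an illustration of scale, not a model of what any real network or data center actually charges.

```python
# Back-of-envelope illustration, not a cost model: the Landauer bound is the
# theoretical minimum energy per bit of computation at a given temperature.
import math

k_B = 1.380649e-23                      # Boltzmann constant, J/K
T = 300.0                               # room temperature, K
energy_per_bit = k_B * T * math.log(2)  # Landauer bound, roughly 2.9e-21 J

def landauer_joules(n_bytes: int) -> float:
    """Theoretical minimum energy to process n_bytes of data."""
    return n_bytes * 8 * energy_per_bit

kb = landauer_joules(1024)
mb = landauer_joules(1024 * 1024)
print(f"1 KB lower bound: {kb:.3e} J")
print(f"1 MB lower bound: {mb:.3e} J")
print(f"difference:       {mb - kb:.3e} J")  # on the order of 1e-14 joules
```

Even granting several orders of magnitude of real-world overhead above this theoretical floor, the marginal difference between handling a kilobyte and a megabyte is nowhere near the cost figures usually asserted.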

I would hazard a guess that virtually anyone with some experience has had to work for a bad boss, where every detail and nuance is microscopically interrogated to the point where it becomes hard to make progress on the task at hand.  Such individuals, who have been advanced under the Peter principle, must, no doubt, be removed from such positions.  But this happens in any organization, whether in private enterprise–especially where there is no oversight, checks and balances, means of appeal, or accountability–or in government, and it is irrelevant to the assertion.  The expense item being described is bad management, not excess data.  Thus, such assertions are based on the antecedent assumption of bad management, which goes hand-in-hand with…

b. More information is the enemy of efficiency.  This is the other half of the economic argument that more data equals greater expense.  I should also mention that where conflict has been engaged over these issues, some unjustifiable figure is usually given for the cost of the additional data–a figure certainly not supported by the high tech economics cited above.  Another aspect of both of these perspectives comes from the conception among non-techies that more data and information is equivalent to pre-digital effort, especially in conceptualizing the work that often went into human-readable reports.  This is really an argument for shifting the focus in software from fixed report formatting functionality based on limited data to complete data, which can be formatted and processed as necessary.  If the right and sufficient information is provided up-front, then additional questions and interrogatories that demand supplemental data and information–with the attendant multiplication of data streams and data islands that truly do add cost and drive inefficiency–are at least significantly reduced, if not eliminated.

c.  Data size adds unmanageable complexity.  This was actually put forth by another software professional–and no doubt the non-techies in the room would have nodded their heads in agreement (particularly given a and b above) if opposing expert opinion hadn't been offered.  Without putting too fine a point on it, a techie saying this in an open forum is equivalent to whining that your job is too hard.  This will get you ridiculed at development forums, where you will be viewed as an insufferable dilettante.  Digitized technology has been operating under the phenomenon of Moore's Law for well over 40 years.  Under the original definition, computational and media storage capability doubles at least every two years, though the doubling period has accelerated to somewhere between 12 and 24 months.  Thus, what was considered big data in, say, 1997, when NASA researchers first coined the term, is not considered big data today.  No doubt, what is considered big data this year will not be considered big data two years from now.  Thus, the term itself is relative and may very well become archaic.  The manner in which data is managed–its rationalization and normalization–is important in successfully translating disparate data sources, but the assertion that big is scary is simply fear mongering because you don't have the goods.
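For a sense of how relative the term is, here is a quick Python sketch of the compounding involved.  The doubling periods are the ones cited above; the figures are illustrative arithmetic, not a forecast.

```python
# How many times capacity multiplies between 1997 and 2016 at a given
# doubling period (Moore's Law style compounding, illustrative only).
def growth_factor(years: float, doubling_months: float) -> float:
    """Capacity multiplier over `years` given a doubling period in months."""
    return 2 ** (years * 12 / doubling_months)

years = 2016 - 1997
for months in (24, 18, 12):
    print(f"doubling every {months} months: x{growth_factor(years, months):,.0f}")
# every 24 months -> roughly x724; every 12 months -> roughly x524,288
```

A data set that strained 1997-era hardware occupies a rounding error on hardware a few hundred to a few hundred thousand times more capable, which is the whole point about "big" being relative.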

d.  Big data requires more expensive and sophisticated approaches.  This flows from item c above as well and is often self-serving.  Scare stories abound, often using big numbers which sound scary.  All data that has a common use across domains has to be rationalized at some point if it comes from disparate sources, and there are a number of efficient software techniques for accomplishing this.  Furthermore, support for agnostic APIs and common industry standards, such as UN/CEFACT XML, takes much of the rationalization and normalization work out of a manual process.  Yet I have consistently seen suboptimized methods being put forth that require an army of data scientists and coders to engage in brute force data mining–a methodology that has been around for almost 30 years, except that now it carries the moniker of big data.  Needless to say, this approach is probably the most expensive and slowest out there.  But then, the motivation for its use by IT shops is usually based in rice bowl and resource politics.  This is flimflam–an attempt to revive an old zombie under a new name.  When faced with such assertions, see Moore's Law and keep on looking for the right answer.  It's out there.
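To make the "efficient techniques" claim concrete, here is a minimal sketch of declarative rationalization in Python.  Each source declares how its fields map into a common schema, so translation becomes a lookup rather than a brute force mining exercise.  The system names, field names, and unit conventions are hypothetical, invented for illustration; they are not drawn from UN/CEFACT or any particular product.

```python
# A minimal sketch of declarative rationalization: each source declares how
# its fields map to a common schema; no hand-coded "mining" pass is needed.
# All source and field names here are hypothetical.
from datetime import datetime

COMMON_FIELDS = ("task_id", "cost_usd", "period_end")

SOURCE_MAPPINGS = {
    "system_a": {
        "task_id": lambda r: r["TaskCode"],
        "cost_usd": lambda r: float(r["ActualCost"]),
        "period_end": lambda r: datetime.strptime(r["PeriodEnd"], "%m/%d/%Y").date(),
    },
    "system_b": {
        "task_id": lambda r: r["wbs_element"],
        "cost_usd": lambda r: r["acwp_k_usd"] * 1000.0,  # this source reports in $K
        "period_end": lambda r: datetime.fromisoformat(r["period"]).date(),
    },
}

def rationalize(source: str, record: dict) -> dict:
    """Translate one source record into the common schema."""
    mapping = SOURCE_MAPPINGS[source]
    return {field: mapping[field](record) for field in COMMON_FIELDS}

print(rationalize("system_a", {"TaskCode": "1.2.3", "ActualCost": "1500.50",
                               "PeriodEnd": "01/31/2016"}))
print(rationalize("system_b", {"wbs_element": "1.2.3", "acwp_k_usd": 1.5,
                               "period": "2016-01-31"}))
```

Adding a new source is a matter of adding one mapping entry, not re-coding the pipeline, which is why data volume by itself does not drive the cost curve that the scare stories assume.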

e.  Performance management and assessment is an unnecessary "regulatory" expense.  This one keeps coming up as part of a broader political agenda beyond just project management.  I've discussed in detail the issues of materiality and prescriptiveness in regulatory regimes here and here, and have addressed the obvious legitimacy of organizations establishing such regimes in fiduciary, contractual, and governmental environments.

My usual response to the assertion of expense is to simply point to the unregulated derivatives market largely responsible for the financial collapse, and the resulting deep economic recession that followed once the housing bubble burst (and that is aside from the cost of human suffering and joblessness, as well as the expenses related to TARP).  So much for how well the deregulation of banking went.  Even after the Band-Aid of Dodd-Frank the situation probably requires a bit more vigor, and should include the ratings agencies as well as the real estate market.  But here is the fact of the matter: such expenses cannot be monetized as additive because "regulatory" expenses usually represent an assessment of the day-to-day documentation, systems, and procedures required when performing normal business operations and due diligence in management.  I attended an excellent presentation last week where the speaker, tasked with finding unnecessary regulatory expenses, admitted as much.

Thus, what we are really talking about is an expense that is an essential prerequisite to entry in a particular vertical, especially where monopsony exists as a result of government action.  Moral hazard, then, is defined by the inherent risk assumed by contract type, and should be assessed on those terms.  Given that the current trend is to raise thresholds, the question in the government sphere is going to be whether public opinion will be as forgiving in a situation where moral hazard assumes $100M in risk when things head south, as they do with regularity in project management.  The way to reduce that moral hazard is through sufficiency of submitted data.  Thus, we return to my points in a and b above.

f.  Effective project assessment can be performed using high level data.  It appears that this view has its origins in both self-interest and a type of anti-intellectualism/anti-empiricism.

In the former case, the bias is usually based on the limitations of either individuals or the selected technology in providing sufficient information.  In the latter case, the argument results in a tautology that reinforces the fallacy that absence of evidence is evidence of absence.  Here is how I have heard the justification for this assertion: identifying emerging trends in a project does not require that either trending or lower level data be assessed.  The projects in question are very high dollar value, complex projects.

Yes, I have represented this view correctly.  Aside from questions of competency, I think the fallacy here is self-evident.  Study after study (sadly not all online, but performed within OSD at PARCA and IDA over the last three years) has demonstrated that high level data averages out and masks indicators of risk manifestation which could have been detected by looking at data at the appropriate level–the intersection of work and assigned resources.  In plain language, this requires integration of the cost and schedule systems, with risk first being noted through consecutive schedule performance slips.  When combined with technical performance measures, and effective identification of qualitative and quantitative risk tied to schedule activities, the early warning comes two to three months (and sometimes more) before the risk is reflected in the cost measurement systems.  You're not going to do this with an Excel spreadsheet.  But, for reference, see my post Excel is not a Project Management Solution.
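For the "consecutive schedule performance slips" signal, here is a hedged sketch of the idea in Python.  The activity names, SPI histories, and the three-period threshold are illustrative assumptions on my part, not the method used in the studies cited above.

```python
# A hedged sketch: flag any activity whose schedule performance index (SPI)
# declines for n consecutive reporting periods. All data here is illustrative.
def flag_consecutive_slips(spi_by_period: list[float], n: int = 3) -> bool:
    """True if SPI has declined for at least n consecutive period-to-period steps."""
    streak = 0
    for prev, curr in zip(spi_by_period, spi_by_period[1:]):
        streak = streak + 1 if curr < prev else 0
        if streak >= n:
            return True
    return False

activities = {
    "design_review":   [1.02, 1.00, 0.97, 0.94, 0.91],  # steady slippage
    "test_procedures": [0.98, 1.01, 0.99, 1.02, 1.00],  # noise, no trend
}

for name, history in activities.items():
    if flag_consecutive_slips(history, n=3):
        print(f"early warning: {name} has slipped 3+ consecutive periods")
```

With these illustrative numbers, the same check applied to the program-level average of the two series would not trigger, which is exactly the masking effect that assessing only high level data produces.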

It’s time to kill the zombies with facts–and to behead them once and for all.

Islands in the Stream — Data Streams and Data Islands in Project Management

I am in meeting, conference, and travel mode, but here are a few thoughts on issues that have arisen over the last few days.

One of these is the concept of understanding how data flows within and between organizations.  The confusion often arises from mixing issues of reporting and report formatting into the discussion.  No doubt these perspectives continue to persist–a type of zombie argument–because it is hard for non-techies to understand that, given the data, such views–which oftentimes are posed as impediments and counterfactuals to data optimization–become trivial.

Here is the deal: right now, especially in contractual relationships in the public sphere, project performance management data is submitted to demonstrate accountability and progress in the expenditure of public monies.  A similar relationship also exists in private industry among parties both within large organizations and through contractual and fiduciary relationships.

When it comes to project management, there are a number of areas relevant to measuring progress from which data must be collected.  The issue here is determining what data to collect, store, and process–and the most effective way of disseminating, analyzing, and recording responses to such data once it is transformed through processing into information.  Thus, we have a plethora of data streams in a typically complex organization or system.  In working with project management data there are a number of areas of overlap, redundancy, and suboptimization.  This last is typically represented by proprietary and stove-piped data repositories that resist optimization of data across all areas of the organization or system that require access.  These islands also resist integration with other data that can optimize the value of the data in providing further insights.

To eliminate this redundancy and unnecessary complexity requires a systemic approach to data streaming.  What are the systems of record?  Who are the necessary consumers of the data?  How can this data be optimized?  What does not go into this equation are concerns of report formatting (especially artifacts based on human-readable formatting that were conceived under non-digital technology), and levels of reporting.  This last consideration becomes trivial when answering the questions listed at the beginning of this paragraph.  Furthermore, technologies that break down proprietary barriers in the translation and integration of data are an important consideration toward optimization.  For example, the application of the international UN/CEFACT XML standard with well-defined data exchange instructions (DEIs) focused on particular industries is one way of overcoming limitations imposed by proprietary barriers.  The use of standard APIs, given open rules of engagement in defining data syntax, is also another approach.
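As an illustration of the first two questions in the previous paragraph (systems of record and necessary consumers), here is a small Python sketch that flags streams duplicating data already held by a designated system of record.  The stream names, systems, and data elements are hypothetical; the point is only that the inventory-and-consolidate step becomes mechanical once those questions are answered.

```python
# Illustrative only (hypothetical stream and system names): group streams by
# the data elements they carry, keep the designated system of record, and
# flag the rest as redundant islands.
from collections import defaultdict

streams = [
    {"name": "monthly_cost_extract", "system": "ERP",       "elements": {"cost", "hours"}},
    {"name": "pm_status_workbook",   "system": "Excel",     "elements": {"cost", "hours"}},
    {"name": "ims_export",           "system": "Scheduler", "elements": {"schedule"}},
]
system_of_record = {"cost": "ERP", "hours": "ERP", "schedule": "Scheduler"}

by_elements = defaultdict(list)
for s in streams:
    by_elements[frozenset(s["elements"])].append(s)

for elements, group in by_elements.items():
    authoritative = {system_of_record[e] for e in elements}
    for s in group:
        if s["system"] not in authoritative:
            print(f"redundant stream: {s['name']} duplicates {sorted(elements)} "
                  f"held by {sorted(authoritative)}")
```

In practice the consolidation decision still involves judgment about consumers and timing, but the redundancy itself is discoverable from the data inventory rather than from report formats.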

In the end, the objective is to reduce the number of data streams in order to resolve redundancies and to meet the needs of the multiple project stakeholders who can leverage commonalities.  Such an approach also reduces the disruption of organizational processes that results from supplemental information requests, as well as the inefficiency of relying on suboptimized human-readable submittals.


The Future — Data Focus vs. “Tools” Focus

The title in this case is from the Leonard Cohen song.

Over the last few months I've come across this issue quite a bit, and it goes to the heart of where software technology is leading us.  The basic question is whether software should be thought of as a set of "tools" or as an overarching solution that can handle data in the way the organization requires.  It is a fundamental question because what we call Big Data–despite all of the hoopla–is really a relative term that changes with hardware, storage, and software scalability.  What was Big Data in 1997 is not Big Data in 2016.

As Moore’s Law expands scalability at lower cost, organizations and SMEs are finding that the dedicated software tools at hand are insufficient to leverage the additional information that can be derived from that data.  The reason for this is simple.  A COTS tools publisher will determine the functionality required based on a structured set of data that is to be used and code to that requirement.  The timeframe is usually extended and the approach highly structured.  There are very good reasons for this approach in particular industries where structure is necessary and the environment is fairly stable.  The list of industries that fall into this category is rapidly becoming smaller.  Thus, there is a large gap that must be filled by workarounds, custom code, and suboptimized use of Excel.  Organizations and people cannot wait until the self-styled software SMEs get around to providing that upgrade two years from now so that people can do their jobs.

Thus, the focus must be shifted to data and to the software technologies that maximize its immediate exploitation for business purposes to meet organizational needs.  The key here is the rise of Fourth Generation applications that leverage object oriented programming languages and most closely replicate the flexibility of open source.  What this means is that, in lieu of buying a set of "tools"–each focused on solving a specific problem and stitched together by a common platform or through data transfer–software that deals with both data and UI in an agnostic fashion is now available.

The availability of flexible Fourth Generation software is of great concern, as one would imagine, to incumbents who have built their business model on defending territory based on a set of artifacts provided in the software.  Oftentimes these artifacts are nothing more than automatically filled-in forms that previously were filled in manually.  That model was fine during the first and second waves of automation in the 1980s and 1990s, but such capabilities are trivial in 2016 given software focused on data that can be quickly adapted to provide functionality as needed.  This development also eliminates, and makes trivial, those old checklists that IT shops used to send out as a lazy way of assessing the relative capabilities of software to simplify the competitive range.

Tools, by definition, restrict themselves to a subset of data in order to provide a specific set of capabilities.  Software that expands to include any set of data, and allows that data to be displayed and processed as necessary through user configuration, adapts itself more quickly and effectively to organizational needs.  It also tends to eliminate the need for multiple "best-of-breed" toolset approaches that are not the best of any breed and, more importantly, it goes beyond the limited functionality and ways of deriving importance from data found in structured tools.  The reason for this is that the data drives what is possible and important, rather than tools imposing a well-trod interpretation of importance based on a limited set of data stored in a proprietary format.
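A minimal sketch of the distinction in Python, with field names and configuration keys that are purely illustrative (not any product's actual API): the view below is driven by a user-editable configuration plus whatever fields arrive in the data, rather than by a report format compiled into the tool.

```python
# Data-first sketch: the user's configuration, not hard-coded report logic,
# decides which fields are shown, filtered, and sorted. Names are illustrative.
records = [
    {"wbs": "1.1", "cpi": 1.05, "spi": 0.97, "risk_score": 2},
    {"wbs": "1.2", "cpi": 0.88, "spi": 0.91, "risk_score": 7},
]

view_config = {
    "columns": ["wbs", "spi", "cpi"],          # user chooses, no code change
    "filters": [lambda r: r["spi"] < 1.0],     # show only slipping elements
    "sort_by": "spi",
}

def render(records, cfg):
    rows = [r for r in records if all(f(r) for f in cfg["filters"])]
    rows.sort(key=lambda r: r[cfg["sort_by"]])
    header = " | ".join(cfg["columns"])
    lines = [" | ".join(str(r[c]) for c in cfg["columns"]) for r in rows]
    return "\n".join([header] + lines)

print(render(records, view_config))
```

Swapping columns, filters, or sort order is a configuration change made by the user, not a feature request submitted to the publisher, which is the practical difference between a data-driven solution and a fixed-function tool.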

An important effect of Fourth Generation software that provides flexibility in UI and functionality driven by the user is that it puts the domain SME back in the driver’s seat.  This is an important development.  For too long SMEs have had to content themselves with recommending and advocating for functionality in software while waiting for the market (software publishers) to respond.  Essential business functionality with limited market commonality often required that organizations either wait until the remainder of the market drove software publishers to meet their needs, finance expensive custom development (either organic or contracted), or fill gaps with suboptimized and ad hoc internal solutions.  With software that adapts its UI and functionality based on any data that can be accessed, using simple configuration capabilities, SMEs can fill these gaps with a consistent solution that maintains data fidelity and aids in the capture and sustainability of corporate knowledge.

Furthermore, for all of the talk about Agile software techniques, one cannot implement Agile using software languages and approaches that were designed in an earlier age and that resist optimization of the method.  Fourth Generation software lends itself most effectively to Agile, since configuration using simple object oriented language gets us to the ideal–without a reliance on single points of failure–of releasable solutions at the end of a two-week sprint.  No doubt there are developers out there making good money who may challenge this assertion, but they are the exceptions to the rule that prove the point.  An organization should be able to optimize the pool of contributors to solution development and rollout in supporting essential business processes.  Otherwise Agile is just a pretext to overcome suboptimized developmental approaches, software languages, and the self-interest of developers who can't plan or produce a releasable product in a timely manner within budgetary constraints.

In the end the change in mindset from tools to data goes to the issue of who owns the data: the organization that creates and utilizes the data (the customer), or the proprietary software tool publishers?  Clearly the economics will win out in favor of the customer.  It is time to displace “tools” thinking.

Note:  I’ve revised the title of the blog for clarity.

Sunday Music Interlude — Patti Smith performing “My Blakean Year”

Blogging is still light due to travel and other responsibilities.  But in the meantime, I strongly recommend that you stop off at AITS.org for the latest thoughts and trends in IT project management.  In particular, check out the important blog post from Dave Gordon regarding aligning projects with organizational strategy.  I have a post coming to the Blogging Alliance as well, and a few posts I've been pecking at for this page.  In the meantime, here is some music that came to me on the radio while returning home from a recent trip, and which speaks to the heart, from the irreplaceable punk poet laureate Patti Smith.


For What It’s Worth — More on the Materiality and Prescriptiveness Debate and How it Affects Technological Solutions

The underlying basis of the materiality vs. prescriptiveness debate that I previously wrote about lies in two areas:  contractual compliance, especially in the enforcement of public contracts, and the desired outcomes under the establishment of a regulatory regime within an industry.  Sometimes these purposes are in agreement, and sometimes they are in conflict and work at cross-purposes to one another.

Within a simple commercial contractual relationship, there are terms and conditions established that are based on the expectation of the delivery of supplies and services.  In the natural course of business these transactions are usually cut-and-dried: there is a promise for a promise, a meeting of the minds, consideration, and performance.  Even in cases that are heavily reliant on services, where the terms are a bit more "fuzzy," the standard is that the work being performed be done in a "workmanlike" or "professional" manner, usually defined by the norms of the trade or profession involved.  There is some judgment here depending on the circumstances, and so disputes tend to be contentious, and justice is oftentimes elusive where ambiguity reigns.

In research and development contracts the ambiguities and contractual risks are legion.  Thus, the type of work and the ability to definitize that work will, to the diligent contract negotiator, determine the contract type that is selected.  In most cases in the R&D world, especially in government, contract types reflect a sharing and handling of risk that is reflected in the use of cost-plus type contracts.

Under this contract type, the effort is reimbursed to the contractor, who must provide documentation on expenses, labor hours, and work accomplished.  Overhead, G&A, and profit are negotiated by the parties based on a determination of what is fair and reasonable against benchmarks in the industry.  A series of phases and milestones are established to mark the type of work that is expected to be accomplished over time.  The ultimate goal is to produce a prototype end item application that meets the needs of the agency, whether that agency is the Department of Defense or some other civilian agency in the government.
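For readers unfamiliar with how such a voucher rolls up, here is a simplified arithmetic sketch in Python.  The rates and amounts are invented for illustration, and the fee treatment is deliberately simplified: in a fixed-fee arrangement the fee is negotiated against the estimate rather than billed as a percentage of actuals.

```python
# Illustrative arithmetic only: rates, bases, and amounts are made up, and the
# fee is computed from cost here purely for simplicity of illustration.
direct_labor = 100_000.00
materials = 25_000.00
overhead_rate = 0.45   # negotiated, applied to direct labor
ga_rate = 0.10         # negotiated, applied to total cost input
fee_rate = 0.08        # negotiated fee percentage (simplified)

labor_with_overhead = direct_labor * (1 + overhead_rate)
cost_input = labor_with_overhead + materials
total_cost = cost_input * (1 + ga_rate)
fee = total_cost * fee_rate

print(f"reimbursable cost: ${total_cost:,.2f}")      # $187,000.00
print(f"fee:               ${fee:,.2f}")             # $14,960.00
print(f"total voucher:     ${total_cost + fee:,.2f}")
```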

The period of performance of the contracts in these cases, depending on the amount of research and development risk involved in pushing the requisite technology, usually spans several years.  Thus, given the usually high dollar value, inherent risk, and extended periods, the areas of concern involve:

  1. The reliability, accuracy, quality, consistency, and traceability of the underlying systems that report expenditures, effort, and progress;
  2. Measures that are indicative of whether all of the aspects of the eventual end item will meet elements that constitute expectations and standards of effectiveness, performance, and technical achievement.  These measures are conducted within the overall cost and schedule constraints of the contracted effort;
  3. Assessment over the lifecycle of the contract regarding external, as well as internal technical, qualitative, and quantitative risks of the effort;
  4. The ability of items 1 through 3 above to provide an effective indication or early warning that the contractual vehicle will significantly vary from either the contractual obligations or the established elements outlining the physical, performance, and technical characteristics of the end item; and
  5. The more mundane, but no less important, verification of contractual performance against the terms and conditions to avoid a condition of breach.

Were these the only considerations in public contracting related to project management, our work in evaluating these relationships, while challenging, would be fairly cut-and-dried, given that they would be looked at from a contracting perspective.  But there is also a systemic purpose for a regulatory regime, and these purposes are often in conflict with one another.  Such requirements as compliance, surveillance, process improvement, and risk mitigation look at the same systems, but from different perspectives with, ultimately, differing reactions, levels of effectiveness, and results.  What none of these purposes includes is a punitive purpose or result–a line oftentimes overstepped, in particular, by private parties.  This does not mean that some regulations that require compliance with a law do not come with civil penalties, but we'll get to that in a moment.

The underlying basis of any regulatory regime is established in law.  The sovereign–in our case the People of the United States through the antecedent documents of governance, including the U.S. Constitution and the constitutions of the various states, as well as common law–possesses an inherent right to regulate the health, safety, and welfare of the people.  The Preamble of the U.S. Constitution actually specifies this purpose in writing, though in broader terms.  What is at issue here, then, are the purposes of a regulatory regime as applied to this specific area.

The various reasons usually are as follows:

  1. To prevent an irreversible harm from occurring.
  2. To enforce a particular level of professional conduct.
  3. To ensure compliance with a set of regulations or laws, especially where ambiguities in civil and common law have yielded judicial uncertainty.
  4. To determine the level of surveillance of a regulated system that is needed based on a set of criteria.
  5. To encourage particular behaviors.
  6. To provide the basis for system process improvement.

Thus, in applying a regulation there are elements that go beyond the overarching prescriptiveness vs. materiality debate.  Materiality only speaks to relevance or significance, while prescriptiveness relates to “block checking”–the mindless work of the robotic auditor.

For example, let’s take the example of two high profile examples of regulation in the news today.

The first concerns the case where Volkswagen falsified its emissions test results for a good many of its vehicles.  The role of the regulator in this case was to achieve a desired social end where the state has a compelling interest–the reduction of air pollution from automobiles.  The regulator–the Environmental Protection Agency (EPA)–found the discrepancy and issued a notice of violation of the Clean Air Act.  The EPA, however, did not come across this information on its own.  Since we are dealing with multinational products, the initial investigation occurred in Europe under a regulator there, and the results were passed to the EPA.  The corrective action is to recall the vehicles and "make the parties whole."  But in this case the regulator's remedy may only be the first line of product liability.  It will be hard to recall the pollutants released into the atmosphere, or to remedy the breach of implicit contract with the buyers of the automobiles.  Whether a direct harm can be proven is now up to the courts, but given that an internal review (article already cited) found that executives knew about the deception, the remedies now extend to fraud.

The other high profile example is the highly toxic levels of lead in the drinking water of Flint, Michigan.  In this case the same regulator, the EPA, has issued a violation of federal law in relation to safe drinking water.  But as with the European case, the high levels of lead were first discovered by local medical personnel and citizens.  Once the discrepancy was found a number of actions were required to be taken to secure proper drinking water.  But the damage has been done.  Children in particular tend to absorb lead in their neurological systems with long term adverse results.  It is hard to see how the real damage that has been inflicted will make the damaged parties whole.

Thus, we can see two things.  First, the regulator is firmly within the tradition of regulating health, safety, and welfare, particularly the first and second categories.  Second, the regulatory regime is reactive.

While the specific illnesses caused by the additional pollution from Volkswagen vehicles are probably not directly traceable to it, the harm in the case of elevated lead levels in Flint's water supply is both traceable and largely irreversible.

Thus, in comparing these two examples, we can see that there are other considerations than the black and white construct of materiality and prescriptiveness.  For example, there are considerations of irreversible harm, prevention, proportionality, judgment, and intentional results.

The first reason for regulation listed above speaks to irreversible harm.  In these cases proportionality and prevention are the main concerns.  Ensuring that those elements are in place that will prevent some catastrophic or irreversible harm through some event or series of events is the primary goal in these cases.  When I say harm I do not mean run of the mill, litigious, constructive "harm" in the legal or contractual sense, but great harm–life and death, resulting disability, loss of livelihood, catastrophic market failure, denial of civil rights, and property destruction kinds of harm.  In enforcing such goals, these cases fall most in line with prescriptiveness–the establishment of particular controls which, if breached, would make it almost impossible to fully recover without a great deal of additional cost or effort.  Furthermore, when these failures occur, a determination of culpability or non-culpability is made.  The civil penalties in these cases, where not specified by statute, are ideally determined in proportion to the damage.  Oftentimes civil remedies are not appropriate, since these cases often involve violations of law.  This follows, in real life, from the two main traditional approaches to audit and regulation being rooted in prescriptive and judgmental methods.

The remainder of the reasons for regulation provide degrees of oversight and remedy that are not only proportional to the resulting findings and effects, but also to the goal of the regulation and its intended behavioral response.  Once again, apart from the rare and restricted violations given in the first category above, these regulations are not intended to be enforced in a punitive manner, though there can be penalties for non-compliance.  Thus, proportionality, purpose, and reasonableness are additional considerations to take into account.  These oftentimes fall within the general category of materiality.

Furthermore, going beyond prescriptiveness and materiality, a paper entitled Applying Systems-Thinking to Reduce Check-the-Box Decisions in the Audit of Complex Estimates, by Anthony Bucaro at the University of Illinois at Urbana-Champaign, proposes an alternative auditing approach that is also applicable to other types of regulation, including contract management.  The issue he addresses is that today, in using data, a new approach is needed that shifts the emphasis to judgment and other considerations in deciding whether a discrepancy warrants a finding of some sort.

This leads us, then, to the reason why I went down this line of inquiry.  Within project management, either a contractual or a management prerogative already exists to apply a set of audits and procedures to ensure compliance with established business processes.  Particular markets are also governed by statutes regulating private conduct of a public nature.  In the government sphere, there is an added layer of statutes that prescribe a set of legal and administrative guidance.  The purposes of these various rules vary.  Obviously, breaking a statute will garner the most severe and consequential penalties.  But the regulatory and administrative standards often act at cross purposes and, in their effect, do not rise to the level of breaking a law, unless they are necessary elements in complying with that law.

Thus, a whole host of financial and performance data assessing what, at the core, is a very difficult “thing” to execute (R&D leading to a new end item), offers some additional hazards under these rules.  The underlying question, outside of statute, concerns what the primary purpose should be in ensuring their compliance.  Does it pass the so-what? test if a particular administrative procedure is not followed to the letter?

Taking a broader approach, including a data-driven and analytical one, removes much of the arbitrariness when judgment and not box-checking is the appropriate approach.  Absent a consistent and wide pattern that demonstrates a lack of fidelity and traceability of data within the systems that have been established, auditors and public policymakers must look at the way that behavior is affected.  Are there incentives to hide or avoid issues, and are there sufficient incentives to find and correct deficiencies?  Are the costs associated with dishonest conclusions adequately addressed, and are there ways of instituting a regime that encourages honesty?

At the core is technology–both for the regulated and the regulator.  If the data that provides the indicators of compliance comes, unhindered, from systems of record, then dysfunctional behaviors are minimized.  If that data is used in the proper manner by the regulator–driving a greater understanding of the systemic conditions underlying the project, as well as minimizing subjectivity–then the basis for trust is established in determining the most appropriate means of correcting a deficiency.  The devil is in the details, of course.  If the applied technology simply reproduces the check-block mentality, then nothing has been accomplished.  Business intelligence and systems intelligence must be applied in order to achieve the purposes that I outlined earlier.


Sunday Music Interlude — Dylan LeBlanc performing “Cautionary Tale”

What better way to get back to usual blogging than to share the latest discovery in new music.

According to Allmusic, Dylan LeBlanc hails from Louisiana and is the son of Muscle Shoals session singer/songwriter/guitarist James LeBlanc.  The elder LeBlanc’s music has been performed by artists as varied as Jo Dee Messina, Rascal Flatts, Billy Ray Cyrus, Travis Tritt, Trace Adkins, Chris LeDoux, Kenny Chesney, and a number of other artists.  What this means for LeBlanc fils is that he has been immersed in music from the start.  He began writing music at the age of 11 and has followed the alt-country, singer-songwriter, indie-rock, and Americana genres.  His style, to my ears, is a bit more bluesy and within the American folk music tradition, if a bit updated.  He is out of Shreveport, Louisiana, and has just released a new album entitled Cautionary Tale, which has gotten raves by DJs.  According to the reviewer at NPR’s First Listen, LeBlanc’s early success in landing a recording contract (he is but 25 years old) pushed him to substance and alcohol abuse, from which he emerged just prior to recording this album.  As a result, the lyrics and sound display a maturity beyond his years.  Here he is performing the apropos “Cautionary Tale,” the title track of the album.


A Short Word From Our Sponsor

The holidays are over and the New Year has arrived, so it's time to get back into serious blogging. No doubt, given the weight of the topic of my last post, you may have already figured that out, but there is a slight difference. I have tried to restrict myself mostly to subjects closest to my vocation (which seems to have chosen me): project management in general, but also information theory and systems, risk, the physical nature of complex adaptive systems of a particular kind, and related topics. All of these posts are related to issues that I've encountered, have read about, or am actively working on in the discipline. Some are reintroductions, reconsiderations, and extensions of academic work that I completed. But I've also attempted to mix it up and bring in other areas of interest, some of which provide insight into life and everything else.

A little over a month ago I wrote the post The Report of My Death Was An Exaggeration and many colleagues, followers, and friends (and a few detractors–haters gonna hate) speculated and asked what it was all about. I’m a bit old fashioned in this tell-all plugged-in society on personal matters. I compartmentalize my public and business personas, and separate them from my personal life. As a U.S. Naval officer I learned to do this a long time ago and considered it an essential means of meeting my professional responsibilities. My position as a commissioned officer was a public trust, that is, I had a duty, and as a public trust my personal preferences, prejudices, opinions, etc. were walled off from what the job demanded. To be effective one has to militate against one’s ego. After all, the office is more important than the individual who fills it. But I opened the door and now that we are no longer in the moment I will spill the beans.

Well, I had a bit of a cancer scare and had to go through a battery of tests, one of which was particularly painful. The verdict delivered by my doctor pretty well followed, in tenor if not in the details, the Woody Allen snippet that I provided from the movie Hannah and Her Sisters–both in my imagination and in what turned out to be the good news of reality. Even if the news had gone the other way I'm sure I could have found a humorous snippet to match it: there can never be enough gallows humor. Much as in the case of Allen's character, this caused a reassessment on my part. It's not a first. This was, after all, not the first time that I have had to contemplate my mortality under trying or frightening circumstances. In the end, after a bit of a hiatus, my conclusion was that it's time to get back to work, which had been the same conclusion after those other events. For those still searching for the "meaning" of life, here is one perspective from this small voice: the purpose of life is to live it, and, in the words of Mark Twain, "to live so that when we come to die even the undertaker will be sorry."

From the blogging perspective, what this means is that there will occasionally be more posts on topics other than project management, like the last one, which, I think, are essential to understanding our times. My thirst for knowledge and understanding is expansive, and so this blog is part of my exploration. Some of my posts are extensions of my formal education in business, contract, and project management, literature, political science, and history. But others cover subjects in which I am not as strong, where my formal education was limited and where I am largely self-taught. In this latter category my limitations are the same as those of others who have engaged in self-education outside of the university, and so this blog is an attempt to overcome them. I promise to try to keep it interesting, informative, well researched, current, and essential. I write to test out ideas and to share my exploration of the way the world works, at both the macro and micro level, on topics of interest. It is my conversation with the world. So don't expect perfection or the rigor of a peer-reviewed paper. Also, don't expect this to be an advertisement or marketing for my commercial pursuits–those worlds are separate.

Okay, now back to business.
