There is a Hole… in your Software Development

There’s a growing divide lurking in the heart of today’s system-of-systems product development as system complexity continues to steadily increase. Left uncorrected, the dysfunction stands to topple some of the most elaborate (and colossally expensive) enterprise software solutions to date by rendering their perceived benefits moot. All those nice, warm promises of accelerated time to market and improved execution might get tossed right out the nearest window. What could possibly do such a thing? Some kind of inter-dimensional space-potato-monster thing? Ben Affleck Batman? No, the truth is less horrifying but equally devastating: it’s the growing digital divide between hardware and software development, between Product Lifecycle Management (PLM) and software source control.

Up to this point the challenge has been met mostly with the strategy of not worrying and being happy – which works great for a reggae cover but not so much for actually overcoming obstacles. Let’s look at the situation at the epicenter of product development, where the most complex products meet many millions of lines of code. We’re talking about the very source and pillar of most PLM technologies, where requirements management and complex source control are paramount: that would be Aerospace & Defense. But for Defense especially, the prognosis isn’t so hot. An article in the December 2 issue of Aviation Week has some rather humbling revelations about the health of defense projects:

“Citing a 2011 Government Accountability Office (GAO) report that identified $402 Billion in budget overruns and schedule slippages of up to 22 months for the largest acquisition programs, [US Air Force Lt. Gen (ret.) George] Muellner says that “most major weapons systems development cost overruns are in excess of 30 percent, and because of that, several major defense acquisition programs fail.”  At the Society of Experimental Test Pilots symposium in Anaheim, California in September, Muellner said the common thread to many of the issues is the increasing complexity and integrations of systems.  “Software flight test has become enormous.”

Excuse me, I think my wallet is burning.  But is it just testing that’s the problem?  Of course not.  Hold on to your butts:

“The GAO report concluded that testers had not in fact played a significant part in the endemic problems.  However, it did uncover major issues, including weak alignment of the requirement, development and test communities.  “All three treat it as a serial process and as a result, all the trades that need to occur don’t get done.  It’s a key part of the problem,” he says.  The report also noted serious flaws in systems engineering and the obvious point that without improvements in this overarching discipline, problems with inadequate software will persist.”

Mr. Wizard.  Get me the hell out of here.

But why? Despite the fact that many PLM systems have long retained some type of source control capability, the truth is the two are very different universes. The very nature of software development, i.e. the continuous branching, forking, and merging inherent in any codebase, has proven rather unnatural for PLM. When code spans product lines in highly non-linear ways, capturing a configuration across a hierarchical product structure becomes especially daunting, if not practically impossible. English translation: you can’t draw straight lines between hardware and software. The reaction to this problem, for the most part, has been a cheat. The most common approach is to manage software development in a wholly separate system, with design cycles completely independent of the hardware development lifecycle. In other cases, certain software milestones are represented in the hardware structure. But such representations are only symbolic, serving to keep project schedules aligned to an arbitrary Work Breakdown Structure (WBS) and keep the bean-counters satiated, while doing nothing to address the fundamental design alignment. And how effective has that cheat been? Judging by the GAO’s numbers, at least $402 billion less effective than it should be.
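For the programmers in the audience, the mismatch boils down to a data-structure problem. Here’s a minimal sketch (the part names, commit IDs, and dictionaries are purely illustrative – no real PLM or version-control API is being shown): a BOM is a tree, where every item has exactly one parent, while a merged codebase is a directed acyclic graph, where a commit can have several.

```python
# Hardware BOM: a strict tree -- every item has at most one parent.
bom_parent = {
    "flap_actuator": ["wing"],
    "wing": ["airframe"],
    "airframe": [],            # root of the product structure
}

# Software history: a DAG -- the merge commit "d4" has TWO parents.
commit_parents = {
    "a1": [],                  # initial commit
    "b2": ["a1"],              # feature branch
    "c3": ["a1"],              # hotfix branch
    "d4": ["b2", "c3"],        # merge of both branches
}

def ancestors(node, parents):
    """Collect every ancestor reachable by walking parent links upward."""
    seen, stack = set(), list(parents[node])
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(parents[n])
    return seen

print(ancestors("flap_actuator", bom_parent))  # one unambiguous chain of parents
print(ancestors("d4", commit_parents))         # ancestry converges from two branches
```

The merge commit is precisely what a strict hierarchy can’t represent: there is no single “parent assembly” to file it under, so any mapping of software configurations onto the product tree is lossy by construction.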

But why hasn’t this problem toppled product development entirely? For one, we humans are really good at delivering on one principle in business: the brute-force methodology, otherwise known as the throw-more-warm-bodies-at-it philosophy. Given reasonable time, the million monkeys at their typewriters will poop out all the works of Shakespeare, marginally stealthy fighter jets, and probably-good-enough warp drive, in approximately that order. But as complexity continues to increase, the monkeys are starting to think about jumping back to the trees.

Is there anywhere else to go?  Perhaps.

“As systems become more complex and integrated, developers and testers must identify and implement new approaches to software development and testing to reduce cost and schedule impact.”

The solution lies in eliminating the serial handoff between software and hardware so that system-of-systems design occurs in parallel across all disciplines. And that will require embracing that nonlinearity, which is exceptionally hard. Such an approach likely requires all-new methodologies and all-new software tools, whether as one monolithic system or as a collection of highly specialized yet tightly integrated federated systems. It may seem like tumbling down the rabbit hole, but it sounds like it’s time to slam down that red pill and get started. Well, unless you happen to have an extra $402 billion lying around.

  • jan takke

    Ed, Very true!! Ever since Computer Aided Software Engineering (CASE) was called that rather than CAD-S (next to CAD-E and CAD-M), some 40 years ago, this has legitimized a very isolated way of (not) looking at the integration of the various disciplines present in state-of-the-art product development.

    Since functionality of products has become distributed across all disciplines (and subject to interrelated change processes) it is about time for a more holistic view on design tools.

    • It’s going to be a difficult transition, because you’ll also need holistic engineers to use those new tools. The advantage is that the new generation has been trained since birth to process multiple inputs – as much as that trait has been panned, it may save us yet.

  • Ryan

    Sounds more like the traditional “greed is good” problem. If we let programs go over budget like that, what else can you attribute it to? Plain and simple greed.

    • It’s a good point, Ryan – for defense spending especially, the US government simply allows this to happen. It’s really perplexing to me, though, having been involved with a few of these… I often see product development teams worked to death on their own time, yet budget performance continues to be disappointing. All signs point to a management crisis and (as I mentioned in the article) a distinct lack of a holistic view of the design. Despite all the management tools, methodologies, and software, too many uncontrolled surprises are wrecking programs.

  • Youhey

    One aspect of the problem is the low integration maturity, rigid data models, etc. of our IT systems (PLM, SLM, etc.), which can hopefully be solved in 3 to 7 years!!

    The more important aspect, however, is cultural, and will probably need 10 to 15 years. Indeed, a new generation of engineers and system developers is needed!

    I read somewhere that the University of Kaiserslautern in Germany wants to introduce “Systems Engineering” as a new M.Sc. study program. It would be a mixed discipline of mechanical, software, and process engineering lectures with a focus on real PLM strategy. Just wondering what the syllabus looks like :).