High performance computing to solve acquisition?

Enter HPCMP, the High Performance Computing Modernization Program. HPCMP provides the DoD with massive supercomputing capabilities and all classes of networks to transport data, as well as software to help the DoD execute trillions of computations per second in support of its development programs. All that capability translates into the potential to dramatically accelerate development timelines. It is not so much that HPCMP is a new organization, because it is not. It’s that the DoD is changing the way it does development, and HPCMP is an enabler to help make that happen.

At the recent 20th Annual Systems Engineering Conference, the keynote speaker, Vice Adm. Paul Grosklags, commander of Naval Air Systems Command, had a simple message: “We need to increase the speed of capability development.” He said the way to do that was to design, develop and sustain fully integrated capabilities in a model-based digital environment.

What does that mean? In basic terms, it means taking the archaic design-build-test development cycle, which takes anywhere from 10 to 30 years to execute, and inserting a modeling step on the front end.

Modeling should not be an afterthought. To the contrary, good systems design begins with modeling, and the two must be closely linked in an iterative development process. Models are physics-based, high-fidelity tools that provide an authoritative digital surrogate, or “twin,” that can be used to rapidly test new designs and performance attributes and to further develop a system before any metal is cut. The digital twin, as it is called, is used to make informed decisions throughout a system’s life cycle.

That was from a Jan. 2018 Defense News article. Here’s a recent update on that.
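Before getting to my concerns, it helps to make the model-in-the-loop idea concrete. Below is a minimal sketch of scoring candidate designs against a physics-based surrogate before any hardware exists. The projectile model and the candidate parameters are my own illustrative assumptions, not anything drawn from HPCMP’s actual tools.

```python
# Minimal sketch of the "digital twin" loop: score candidate designs
# against a physics-based surrogate before cutting any metal.
# The projectile model and candidates are illustrative assumptions,
# not part of any actual HPCMP toolchain.
import math

def surrogate_range(angle_deg, velocity_mps, g=9.81):
    """Physics-based surrogate: ideal projectile range on flat ground."""
    theta = math.radians(angle_deg)
    return velocity_mps ** 2 * math.sin(2 * theta) / g

# Candidate designs: (launch angle in degrees, muzzle velocity in m/s).
candidates = [(20, 300), (35, 300), (45, 300), (55, 300)]

# Evaluate every candidate in the model instead of building each one.
scores = {c: surrogate_range(*c) for c in candidates}
best = max(scores, key=scores.get)
print(f"Surrogate picks angle={best[0]} deg, range={scores[best]:,.0f} m")
```

The appeal is obvious: a loop like this costs seconds, while a physical prototype of each candidate costs years.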

My fear about such a modeling program is that it necessarily has an incomplete concept of what is achievable. It will push our attention toward the system configurations that follow from its parameters, when in fact the most important part of the competitive design, test, and evaluation process is learning new things and discovering which parameters the model should have included in the first place.
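Here is a toy illustration of that fear, continuing the projectile sketch from above (all numbers invented): when the model omits a relevant parameter, such as drag, optimizing inside the model converges on a design that real-world testing would reject.

```python
# Toy illustration: the model omits a relevant parameter (air drag),
# so its optimum diverges from reality's. All numbers are invented.
import math

def model_range(angle_deg, v=300, g=9.81):
    """The model's world: vacuum ballistics, no drag."""
    return v ** 2 * math.sin(2 * math.radians(angle_deg)) / g

def real_range(angle_deg, v=300, g=9.81, drag=0.02):
    """'Reality': a crude drag penalty that grows with time of flight."""
    flight_time = 2 * v * math.sin(math.radians(angle_deg)) / g
    return model_range(angle_deg, v, g) * math.exp(-drag * flight_time)

angles = range(10, 81)
model_best = max(angles, key=model_range)
real_best = max(angles, key=real_range)
print(f"Model's optimum:   {model_best} deg")  # 45 deg in a vacuum
print(f"Reality's optimum: {real_best} deg")   # lower: drag punishes hang time
```

Every contractor who optimizes to that model will crowd around 45 degrees; the one who would have stumbled onto the lower angle through real-world testing never gets the chance.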

Now, I’m not saying this HPCMP project shouldn’t be pursued. It definitely should. It can lead us to neglected designs and push us closer to something “better.”

But it cannot be the sole source of design validation before prototyping and development. That would make it dangerous. It shouldn’t even be a focal point, for fear of biasing too many engineers into thinking only in terms of optimizing within its parameters.

The Army was using the model to whittle down competitors for a contract award. If contractors knew the method of evaluation in advance, they would optimize to it. That seems backward. HPCMP can never be an “objective” evaluation tool unless we have a complete and consistent theory of the physical world (I wonder what Gödel would say about that). Adhering to such an evaluation could further suppress diversity and exploratory development, and handicap future weapon systems development.

But the DoD has also used the models to aid government selection of requirements and design choices. The computer proposes solutions that officials can narrow down. I think this can be appropriate, but they shouldn’t use the tool to preempt contractor design creativity. It should be used by expert designers to aid judgment about the best design, and nothing more. Tests are the only real evaluation.

Many times, the most important advances come from unarticulated hunches that are doggedly pursued through trial and error, with feedback from the real world.
