Here’s the Government Accountability Office’s (GAO) take on the Middle Tier of Acquisition (MTA) pathway in space systems, from a recent report.
Remember, MTA efforts must complete rapid prototyping or rapid fielding within a 5-year timeframe, or else seek an extension from the acquisition executive. Moreover, an MTA may be just one piece of a larger set of capabilities. For example, the Space Force has an MTA effort for the Protected Tactical Enterprise Service, which provides anti-jam communications in support of another MTA effort, the Protected Tactical SATCOM itself.
Multiple MTAs can be strung together into a larger program; they can run sequentially (e.g., rapid prototyping into rapid fielding), spin off into major capability acquisitions, be paired with software acquisitions, and so forth. Here’s what GAO had to say on that:
Throughout this testimony, we refer to programs currently using the MTA pathway as “MTA programs,” although some of these programs may also currently use or plan to subsequently use one or more other pathways before fielding an eventual capability. For the purposes of this testimony, we use the word “effort” to refer specifically to the activities undertaken using a single adaptive acquisition framework (AAF) pathway or any of the paths provided by an AAF pathway (for example, the rapid prototyping path of the MTA pathway).
That gets at a very deep issue with oversight. Usually, a defense program is acquired under a single acquisition plan and pathway. That allows the program’s entire cost, schedule, and technical characteristics to be baselined, providing a yardstick for measuring delivery over the program’s life. But if an MTA effort delivers just one component of a larger program that draws on several acquisitions, how do you measure performance of the overarching “program”?
Here’s GAO again:
The flexibility the MTA pathway provides acquisition programs also entails some challenges for reporting, monitoring, and oversight. In June 2021, we reported that DOD had trouble tracking cumulative cost, schedule, and performance data for programs transitioning between acquisition pathways or conducting multiple efforts using the same pathway and had yet to develop an overarching data collection and reporting strategy. In that report, we found that the lack of such a strategy not only limits DOD’s visibility into these programs but also hinders the quality of its congressional reporting and makes the full cost and schedule of the eventual weapon system more difficult to ascertain.
DOD’s ongoing efforts to improve its reporting on MTA programs will become increasingly important as these programs, some of which involve critical space-based capabilities, approach the end of their planned 5-year prototyping phase and transition to other pathways. In a recent review of efforts to implement acquisition reforms, we found that DOD had yet to determine key aspects of its efforts, such as what information to report. We made two recommendations, including that DOD fully implement leading practices for managing reform efforts, such as by developing an implementation plan to track progress. DOD concurred with both recommendations and described planned or ongoing actions to address them.
The current concept of the “program of record” reduces the defense enterprise to a set of analytically independent programs of record; there is no method for baselining efforts that evolve over time, merge into one another, and leverage enterprise tools. The “program of record” reliance on measuring variance to a baseline is an industrial-era notion that worked well for the repetitive manufacturing of widgets. It does not capture the value generated by the creative, adaptive, and innovative behavior associated with modern technology development.
Efforts to cohere every software, urgent, or MTA program into a larger program simply destroy the original intention of the new pathways, creating a new system-of-systems design problem. Iterative/modular acquisition pathways exist precisely because no one is smart enough to fully specify an entire major program in advance of prototyping and incremental delivery.
Treating each program as a stovepipe of capability that doesn’t interact with other programs has led directly to the interoperability problems DoD has today, not to mention everything being built “full stack,” forgoing the enterprise efficiencies that come from sharing IT and component backbones.
However, Congress and other oversight stakeholders will need transparency and some measure of control over these MTA efforts that evolve over time. If they cannot rely on Acquisition Program Baselines to measure cost growth, then what is the method of oversight in a dynamic, innovative world?
This is something that needs to be worked on, pronto. Here are some ideas for what contextual oversight could look like within a broader concept of a portfolio:
- Real-Time Spend Reports. Organizations should report obligations and expenditures tagged along multiple program dimensions, with traceability to deliverables (a rough data sketch follows this list).
- Metrics of Effectiveness. Metrics should be tailored to the program context. For example, a command and control system might track the number of connected shooters and sensors, the number and types of users, time to complete particular workflows, system uptimes, time to restore critical capabilities, user satisfaction, and so forth.
- Descriptive Analysis. Rather than spending months at a time creating a lifecycle estimate, actual cost data should be continually curated and connected with technical attributes into a single source of truth that helps inform incremental decisions.
- Program Traceability. Project costs and technical outcomes at the lowest possible level should be mapped to their antecedents and dependencies between programs, creating a “family tree” of individual efforts (see the dependency-graph sketch after this list).
- Human Factors. Participant and stakeholder perspectives should be reported using the multi-disciplinary methods of project histories and linked to the strategic landscape.
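To make the spend-report idea concrete, here is a minimal sketch of what a multi-dimensionally tagged spend record might look like. Everything here, from the field names to the tag dimensions and the PTES-style identifier, is a hypothetical illustration, not an existing DoD data standard:

```python
from dataclasses import dataclass, field

# Hypothetical layout for a real-time spend record. Field names and tag
# dimensions are illustrative assumptions, not an existing DoD schema.
@dataclass
class SpendRecord:
    effort_id: str        # one AAF-pathway effort, e.g., a single MTA prototype
    appropriation: str    # "color of money," e.g., RDT&E vs. procurement
    obligated: float      # dollars obligated to date
    expended: float       # dollars expended to date
    tags: dict = field(default_factory=dict)          # portfolio, mission, pathway, ...
    deliverables: list = field(default_factory=list)  # traceability to concrete outputs

record = SpendRecord(
    effort_id="PTES-prototype",  # hypothetical identifier
    appropriation="RDT&E",
    obligated=42_000_000.0,
    expended=31_500_000.0,
    tags={"pathway": "MTA rapid prototyping",
          "portfolio": "SATCOM",
          "mission": "anti-jam communications"},
    deliverables=["ground segment prototype", "key management service"],
)
```

Because the tags are open-ended dimensions rather than a single program-of-record label, the same record can roll up into a portfolio view, a pathway view, or a mission view without re-reporting.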
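And here is a rough sketch of the “family tree” idea: efforts as nodes in a small dependency graph, with a helper that walks an effort’s antecedents. The effort names and graph structure are invented for illustration:

```python
# Hypothetical "family tree" of efforts: each key is one AAF-pathway effort,
# each value lists the efforts it descends from or depends on.
family_tree = {
    "PTS (MTA prototyping)": [],
    "PTES (MTA prototyping)": ["PTS (MTA prototyping)"],
    "PTES (rapid fielding)": ["PTES (MTA prototyping)"],
}

def lineage(effort, tree):
    """Walk an effort's antecedents depth-first, so a cost or schedule question
    about one effort can be traced back through the efforts it builds on."""
    seen = []
    for parent in tree.get(effort, []):
        if parent not in seen:
            seen.append(parent)
        for ancestor in lineage(parent, tree):
            if ancestor not in seen:
                seen.append(ancestor)
    return seen

print(lineage("PTES (rapid fielding)", family_tree))
# -> ['PTES (MTA prototyping)', 'PTS (MTA prototyping)']
```

The point is that oversight questions would traverse the graph rather than stop at an artificial program boundary.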
Note that all these methods of contextual reporting should be part of a “living” stream of reporting, continually updated and curated. There’s more on the history of defense oversight and recommendations for the future in my recent NPS paper, Pathways to Defense Budget Reform.
Ideas on the future of oversight are very welcome here!
P.S. I wonder whether GAO had trouble a decade ago with the GPS program, which was split into three MDAPs: the GPS III space vehicle, the GPS ground segment (OCX), and GPS Military User Equipment (MGUE). I believe each was treated as its own independent APB, even though the satellites were launched many years before the ground station or the user equipment that could make use of the new M-Code was ready.