Innovative tool streamlines requirements process

  • By Katherine Kebisek
  • Air Mobility Command Public Affairs
Air Mobility Command officials recently introduced an innovative tool that is transforming the command's process for gathering and prioritizing Command, Control, Communications, Computers and Intelligence, or C4I, requirements; it will soon be expanded to cover other requirement types as well.

The Enterprise Requirements Evaluation Tool, or ERET, streamlines the C4I requirements process, giving senior leaders the information they need to make decisions on the command's C4I investments and ultimately providing Airmen the capabilities they need to accomplish the mission. The tool, developed collaboratively by personnel in three headquarters AMC directorates, was built entirely within existing resources; it simply took some creativity to bring it to life.

When a program or platform -- an information technology system or an aircraft, for example -- needs to be modified or upgraded, personnel submit a requirement to the capability's designated Requirements and Planning Council, or R&PC. R&PCs gather requirements from across the command, then prioritize them based on factors such as mission criticality, risk and available funding. Once the requirements are "racked and stacked," senior leaders make the final determination on which will receive investment funding.

The ERET, a simple tool built using SharePoint, walks personnel through the process of submitting C4I requirements by having them answer a series of questions, selecting responses from drop-down menus. The tool then calculates a score for the requirement, allowing it to be objectively prioritized against other requirements. One of ERET's most valuable features is that it provides a repository of requirement information, so users can reference and update past requirements rather than researching and resubmitting the information every year.
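The article does not spell out ERET's scoring formula, but the general idea of turning drop-down responses into a single comparable number can be illustrated with a short sketch. Everything below is a hypothetical assumption made for illustration only -- the question names, response options, point values and weights are not taken from the tool.

    # Hypothetical sketch of drop-down-driven scoring, loosely modeled on the
    # process described above. The criteria, options and weights are invented
    # placeholders, not ERET's actual values.

    # Each question maps every drop-down option to an assumed point value.
    QUESTIONS = {
        "mission_criticality": {"critical": 10, "significant": 6, "routine": 2},
        "fills_capability_gap": {"yes": 8, "partially": 4, "no": 0},
        "risk_if_unfunded": {"high": 6, "medium": 3, "low": 1},
    }

    # Relative weight of each question in the total score (assumed values).
    WEIGHTS = {
        "mission_criticality": 0.5,
        "fills_capability_gap": 0.3,
        "risk_if_unfunded": 0.2,
    }

    def score_requirement(responses: dict[str, str]) -> float:
        """Turn a set of drop-down responses into a single comparable score."""
        total = 0.0
        for question, answer in responses.items():
            total += WEIGHTS[question] * QUESTIONS[question][answer]
        return round(total, 2)

    # Example: score two very different requirements and rank them.
    submissions = {
        "Aircrew scheduling IT upgrade": {
            "mission_criticality": "significant",
            "fills_capability_gap": "yes",
            "risk_if_unfunded": "medium",
        },
        "Training facility refresh": {
            "mission_criticality": "routine",
            "fills_capability_gap": "no",
            "risk_if_unfunded": "low",
        },
    }

    ranked = sorted(submissions,
                    key=lambda name: score_requirement(submissions[name]),
                    reverse=True)
    for name in ranked:
        print(name, score_requirement(submissions[name]))

With every requirement reduced to a number on the same scale, the "rack and stack" step becomes a simple sort, which is the objectivity the team was after.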

Prior to ERET, the command used a manual, paper-based checks-and-balances process to gather and assess its C4I requirements. While it got the job done, there were several areas for improvement.

"(The process) wasn't widely visible to everybody, and a lot of people didn't really understand it," said Jan Van Horn, the AMC command and control systems functional manager, AMC directorate of operations. This lack of understanding, he said, often resulted in personnel providing insufficient information on requirements which made it difficult to effectively prioritize them. Sometimes requirements didn't even move forward because of missing information.
Additionally, with the variety of mission needs, it was often difficult to rack and stack requirements against each other.

"The problem with comparing requirements is that they're so different," said Baxter Swift, the chief information office support consultant, AMC directorate of communications and information. "How do you compare an aircraft requirement to an IT requirement to a training requirement ... they're all important but so incomparable."

Swift, Van Horn and others knew there had to be a better way to conduct this process. Their goal was to develop a tool that would somehow provide information and a score for each requirement, allowing them to be objectively racked and stacked. And the tool had to be simple, otherwise personnel would be reluctant to use it.

The team began by determining what criteria would be used to score requirements. They studied the Air Mobility Master Plan to understand and incorporate the strategic direction of the command, then worked with AMC's Strategic Planning division to determine capability gaps within the command's four mission areas: airlift, air refueling, aeromedical evacuation and mission support.

"With such varying requirements, the only way to really compare them is against the strategic direction of the organization and see what [the requirement] contributes to that,"  Swify said. "We went through and identified all of the capability gaps for all the (mission areas) ... and that was a big score contributor. If a requirement was going to fill a capability gap then it would typically score better than one that would not."

"It's very important for our senior leaders to be able to link requirements to the gaps and the focus areas," said Teri Alesch, a requirements analyst, AMC Directorate of Strategic Plans, Requirements and Programs. "It's often a matter of explaining (the requirement) better and that's what ERET helps us do." Alesch further clarified that ERET informs decision makers, but it does not make the decision; it simply provides a solid, objective starting point where military judgment can then be applied.

Since ERET has only been used in one requirements cycle so far, it's too early to determine metrics for how much time the tool will save. However, the team has already noticed such significant improvements in the process that the AMC director of strategic plans, requirements and programs recently directed them to expand the tool to all R&PCs for the next cycle.

"We saved (decision makers) a lot of time and gave them, I think, more confidence in the product that came out of it," Van Horn said. He noted one general officer level "rack and stack" meeting that traditionally took at least 90 minutes was finished in less than 20 minutes because of the solid presentation, scores, and information provided for the prioritized requirements.

"I think this year there will be a lot more time savings for those of us working at the Requirements Working Group action officer levels because when the call for requirements goes out, so will the link for ERET," said Heidi Kukowski, an AMC directorate of operations operations analyst. "We can pull inputs from last year and it will just be a re-verification."

Kukowski added that a huge benefit for stakeholders is that ERET gives them insight into the requirements process, helping them understand how to gain visibility for their requirement and why it did or didn't rank.

As the team helps the AMC R&PCs adapt and implement ERET, they also hope to share the tool with other MAJCOMs. Swift said he recently gave permissions to a counterpart at a system program office who was able to test out the tool and adjust it to fit his mission. The two shared lessons learned and questions, ultimately improving both of their processes.

"Anybody that works in requirements probably struggles with how to compare one requirement to the next ... at the end of the day it's always going to come back to money. What this tool is really trying to do is learn as much about a requirement as possible so if I have one more dollar to spend I can put it in the right place to get the most value for the organization," Swift said. "That's what everybody wants to do: invest wisely so they're building for the future."