and the Case of the Missing Interface Requirements

ArgonDigital - enterprise automation experts


Interface requirements have taken center stage with the launch of the federal government’s online marketplace for insurance. As eager enrollees rushed to access the new website, they saw first-hand what happens when a system developed by more than 50 different contractors lacks a detailed definition of the data passing between systems.

President Obama recently ordered a surge of IT personnel to help address problems with the site. What they discovered were foundational problems with the way data is exchanged between federal systems and the contractors’ systems. As described in a recent New York Times article, “One major problem slowing repairs, people close to the program say, is that the Centers for Medicare and Medicaid Services, the federal agency in charge of the exchange, is responsible for making sure that the separately designed databases and pieces of software from 55 contractors work together.”

While creating working interfaces would seem to be on the critical path for the health exchange’s requirements team, especially since the system they are creating is five times larger than a large international bank’s, there appear to be major gaps in how enrollment data is formatted, transmitted, and validated as it passes between systems. The most troubling issues that have cropped up concern enrollment data. Once the system determines a user’s eligibility for coverage, it sends that eligibility data, along with subsidy information, to insurers. “Insurers have found that the system provides them with incorrect information about some enrollees, repeatedly enrolls and cancels the enrollments of others, and simply loses the enrollments of still others.” Without adequately defining exactly how data will be formatted and validated, issues of incomplete or repeated enrollments cannot be addressed.
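To make the formatting-and-validation point concrete, here is a minimal sketch of what validating an enrollment record before handing it to an insurer might look like. The field names, formats, and rules are hypothetical illustrations, not taken from the actual exchange’s specification:

```python
# Hypothetical sketch: validate an enrollment record before it is sent
# downstream to an insurer. All field names and rules are illustrative.
import re

REQUIRED_FIELDS = {"enrollee_id", "plan_id", "subsidy_amount", "status"}
VALID_STATUSES = {"enrolled", "cancelled"}

def validate_enrollment(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record is valid."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
        return errors
    # Format check: assume a 10-character alphanumeric enrollee ID.
    if not re.fullmatch(r"[A-Z0-9]{10}", str(record["enrollee_id"])):
        errors.append("enrollee_id must be 10 uppercase alphanumeric characters")
    if record["status"] not in VALID_STATUSES:
        errors.append("unknown status: %r" % record["status"])
    if not isinstance(record["subsidy_amount"], (int, float)) or record["subsidy_amount"] < 0:
        errors.append("subsidy_amount must be a non-negative number")
    return errors
```

If both sides of an interface agree on checks like these, a malformed or duplicate enrollment can be rejected at the boundary instead of silently corrupting the insurer’s records.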

Here at ArgonDigital, we have found that when writing requirements for large interconnected systems, there are a few key questions that must be answered and coordinated across systems:

  • What is the source of the data being transmitted?
  • What is the destination of the data being transmitted?
  • How is data transformed as it is transmitted from system to system?
  • Does a system have the ability to create, update, view, or delete that data?
  • How is personally identifiable data secured?
  • What is the peak volume of concurrent users at launch, after six months, and after a year?
  • What is the growth factor for users of this system?

To help capture the answers to these questions, we commonly use RML System and Data models.

Over the last 10 months, hardware and software requirements for the federal exchange site have been modified seven times. With extensive rework underway, hopefully the eighth change request will include improved interface requirements and RML models.
