Obamacare rollout disaster is nothing new

By Alan Bean

“I’m going to try and download every movie ever made, and you’re going to try to sign up for Obamacare, and we’ll see which happens first.”

This faux challenge from Jon Stewart to Health and Human Services Secretary Kathleen Sebelius gets to the essence of public bafflement over the disastrous launch of the Affordable Care Act website.  We have traveled so far down the digital highway, people think, that there’s almost nothing computer technology can’t do . . . like downloading thousands of movies on demand.  So, why can’t a simple government website allow people to sign up for medical insurance?  How hard can that be?

Here’s our deep suspicion: the Obamacare website failed because everything the federal government touches turns to garbage.  While the IT wizards in private industry remake the world, this clunky old government website can’t accomplish simple tasks.  Wouldn’t you know it?

But this simplistic analysis is wrongheaded.

The government’s website is an amalgamation of dozens of private companies contributing individual pieces to the larger puzzle.  Prior to launch, the IT company reps told their congressional inquisitors, our little puzzle piece worked just fine.  But when they put the puzzle together, nothing worked.  The picture on the puzzle box showed a puppy rolling in a green meadow; but the completed puzzle looked like giant robots battling it out in a dystopian moonscape.  Our firm, each insisted, can’t be held responsible for a system failure.

This happens all the time.  When you download a movie, you are using relatively simple software designed to perform a simple task.  Impressive, to be sure; but straightforward.  When you create a computer program designed to link dozens of discrete systems into a working whole, things can quickly go south.

The difficulty of integrating a series of very different computer systems came home to me when I was researching the unjust treatment of the IRP-6, a group of IT pioneers from Colorado Springs working to create software for the federal government.  The scene is post-9-11 America.  Washington officialdom has realized that the attack on the twin towers exploited the fact that American law enforcement agencies lacked the technology to coordinate diverse databases and intelligence systems.  Dozens of huge household-name IT companies had been working to fix the problem without success.

IRP, a small, underfunded operation in Colorado, believed it could contribute the puzzle pieces that would make the giant robots look more like a puppy.   Some federal officials were interested until the FBI decided to treat the IRP professionals as common criminals and throw them in prison for over a decade.  You can read the whole tragic story here.

When I listen to the talking heads lament the egregious failures of the Obamacare rollout, I am reminded of the decade-long failure of the intelligence community to integrate intelligence systems that didn’t speak the same language.  It wasn’t just that the FBI couldn’t share intel with the NSA; even the various departments within the FBI functioned as communication silos.

Here is the section of my IRP-6 story that discusses the problem as it relates to the FBI.  

Too big, too complex, too complicated . . .

In 2005, Glenn A. Fine, the U.S. Department of Justice’s inspector general, released a report stating that four years after 9-11, the FBI was still generating paper reports and using fifty different databases that duplicated one another.  In September of 2000, Congress handed the IT giant SAIC $380 million to solve these problems.  In the wake of the 9-11 fiasco, the FBI requested an additional $70 million; Congress gave them $78 million, added DynCorp, another IT giant, to the payroll, and insisted that the project’s end date be accelerated.  By the summer of 2004 the expensive software, the Virtual Case File (VCF) system, was declared “unfit for use.”

In 2005, the FBI announced that its new Sentinel project would do what VCF had failed to do at an additional cost of $425 million.  This time the end date was 2009, and Lockheed Martin Corp. received the lucrative contract.  By 2011, analysts were calling Sentinel “a case study in federal IT projects gone awry” and complaining about “missed deadlines, budget overruns, and feature shortcomings.”  After Lockheed Martin had been paid $325 million without producing significant results, the government issued a stop-work order and started negotiating with smaller, more agile companies.

In a recent interview, former FBI Chief Technology Officer Jack Israel explained why industry giants like IBM, SAIC, DynCorp and Lockheed Martin have repeatedly failed to deliver usable investigative and intelligence software to government agencies.

I’ve been in IT development in government for over 10 years. I started at NSA, then 5 years at the FBI, and I finished about a year at DHS. I grew very frustrated working on large IT programs. Because, by and large, I came to believe that these programs just don’t work. It doesn’t matter who you are, because unless you can logically break them down into very small pieces – I think that’s the way to go – they normally fail.  They’re too big; they’re too complex, and too complicated.

Israel offered this diagnosis in the summer of 2012, just after Sentinel finally plugged some of the communication holes IRP could have addressed eight years earlier.  IRP’s Clinton Stewart isn’t claiming that CILC, the company’s investigative software, could have solved all the FBI’s problems, “but they were at 16% [of where they needed to be] when we talked to them, and our product would have taken them to 85%.”

Sound familiar?

People will eventually be able to buy medical insurance through the federal website; but if the issues Jack Israel describes represent a rough-and-ready parallel, don’t expect a quick fix.