Equifax Breach Caused by Lone Employee’s Error, Former CEO Says (nytimes.com)
94 points by DLay on Oct 4, 2017 | hide | past | favorite | 80 comments


http://web.mit.edu/2.75/resources/random/How%20Complex%20Sys...

7) Post-accident attribution of an accident to a ‘root cause’ is fundamentally wrong. Because overt failure requires multiple faults, there is no isolated ‘cause’ of an accident. There are multiple contributors to accidents. Each of these is necessarily insufficient in itself to create an accident. Only jointly are these causes sufficient to create an accident. Indeed, it is the linking of these causes together that creates the circumstances required for the accident. Thus, no isolation of the ‘root cause’ of an accident is possible. The evaluations based on such reasoning as ‘root cause’ do not reflect a technical understanding of the nature of failure but rather the social, cultural need to blame specific, localized forces or events for outcomes.


Damn right. Security forensics should operate more like the NTSB. There are policy, cultural, process, organizational, team and more factors to consider in the totality of MECE-like structured forensics with hopefully a report and recommendations at the end. Political or timid audits aren’t useful in correcting deficiencies wherever they may exist if they jump to a narrow conclusion too quickly.


That quote sounds good, but I don't think it's necessarily applicable to this situation. The author seems to be talking about complex systems that are designed and operated to be robust against failure, like the space shuttle. Saying that Challenger blew up because of an O-ring is technically correct but also horribly wrong, as an example. Equifax IT does not appear to be operating at a level to prevent a single failure from causing terrible damage all on its own.

That aside, it's hardly true that one person can bear all the blame for not patching their systems, even if they did successfully prevent patches from happening. For one thing, how the hell did they keep their job after doing that? Unless it was the CEO (well, now that they have a new CEO maybe they'd like to put all the blame on him), there was someone up the chain who could insist that the patch get applied. I think you definitely could apply root cause analysis techniques here, and I strongly suspect that such analysis would uncover numerous serious deficiencies in Equifax's IT operations. Of course, guessing that a large boring corporation has terrible IT practices is similar to guessing that a given duck quacks and has wings, so there's that.


> Equifax IT does not appear to be operating at a level to prevent a single failure from causing terrible damage all on its own.

They're operating at a level where over a hundred and thirty million people could have their ability to get a mortgage, open a bank account, or start a business harmed. If you think that such responsibility does not mandate the highest requirements for data safety, you should not work in this industry.


> That quote sounds good, but I don't think it's necessarily applicable to this situation. The author seems to be talking about complex systems ...

Companies, and the people, teams, and processes that make them up, are complex systems in the manner the paper is discussing.


Great quote. Always worth mentioning the book "The Field Guide to Understanding Human Error". A quote from that book:

> Throwing out the Bad Apples, lashing out at them, telling them you are not happy with their performance may seem like a quick, nice, rewarding fix. But it is like peeing in your pants. It gets nice and warm for a little while, and you feel relieved. But then it gets cold and uncomfortable and you look like a fool.

Explanation of why the bad apple theory doesn't work - https://goo.gl/LPKMns


Good to know.

And what about the person whose job was to make sure that one guy did his job?

And the guy who was in charge of that person?

And the department whose job was making sure nothing was insecure?

And the guy managing them?

Yep. All one guy's fault. Poor guy, ruining the American credit monitoring system for the rest of us.


Sounds like they're pointing fingers now.

It's funny. They should know that if a single employee can cause all that, it usually means they had some severe problems buried in their management.


Probably an engineer, definitely not someone in management. Just like the VW scandal, which I'm sure you've heard was all caused by a single rogue junior engineer. Blast those engineers, one day we'll figure out how to run the company without them.


As somebody with full IT administration access, I am all too aware of how easy it is to destroy an entire company with a single inadvertent command. And the risk of this happening increases swiftly with stress levels and work piling up and "just get this done quickly!", "why aren't you done yet?", "we need this tomorrow!" etc. And that work environment is the norm, not the exception.


I’ve been there, and I’ve seen someone come really close to ruining a company with that one mistake.

But even in a small company there were others who could patch things. There were people above me who kept an eye on if patches were applied (or at least reported to be applied).

It wasn’t just ‘we told guy X to patch and never followed up’.


It is also worth noting that the experience of being "responsible" for a company being destroyed could put the person at risk of suicide. I once accidentally failed to lock the office door with the key (the door had a defect that sometimes kept it from locking), and it made me realize how heavy a burden it would have been if the company had been robbed or something, which could very well have been its end. That was a very tough experience; luckily nothing happened.


Exactly! It's like saying a bug in production is caused by the programmer who wrote it, when actually it's caused by the lack of a proper QA process.


$3B/yr in revenue; 9500 employees

One person responsible for the security of the enterprise.

If there is truly one person responsible for a company this large, then he was set up to fail from the beginning. The management is negligent and incompetent for not creating a system for this. That's their job.

I think it's more likely that the CEO is full of shit and they're scapegoating some poor person. But even if that's not the case, this is a terrible thing for him to admit. If he's really that incompetent, he has no business in management. Hopefully he never works in management again. Kiddo needs to go back to school; he's clearly forgotten all of his training.


Well, this is what happens when people call for jail time. The moment someone goes to jail for something like this, it will change security issues forever: People will stop reporting breaches, and developers themselves will be at risk of going to jail.

Also, if you try to kill Equifax, companies will stop reporting breaches.

I don't know what the ultimate outcome of all of this will be, but it's important to keep perspective. People are out for blood, and it's both scary to watch and unsettling to think of the precedents it might set.


> Also, if you try to kill Equifax, companies will stop reporting breaches.

Equifax is a special case though. It isn't Equifax that needs to be killed, it's the concept of credit reporting agencies in general -- they inherently constitute systemic risk. The more private information is concentrated in one place, the more attractive a target it creates for attackers and the more severe the consequences of a breach.

We need to figure out a way to make data warehousing operations like this impractical so these dangerous targets no longer exist.

One good step would be to prohibit the use of social security numbers for anything other than social security.


> One good step would be to prohibit the use of social security numbers for anything other than social security.

That's already true, it's just ignored and not enforced.


> That's already true, it's just ignored and not enforced.

That should make it easier to start punishing people for doing it then.

Even if the penalty was only a $100 fine, multiply that by 320 million counts and it turns into real money.


> Also, if you try to kill Equifax, companies will stop reporting breaches.

The breach is not why I want the Equifax CEO (and everyone on the board) to go to prison. I want them to go to prison because of what they did and didn't do after the breach. The CEO is at best incompetent but that is a very generous reading of what took place.


> Also, if you try to kill Equifax, companies will stop reporting breaches.

The US government killed Arthur Andersen. Financial fraud is still reported. Equifax is not too big to dissolve.

https://en.wikipedia.org/wiki/Arthur_Andersen#Enron_scandal


Many reasonable people seem to believe that the backlash and horror in response to the US government's killing of Arthur Andersen, and the subsequent job losses, were what led to the DoJ's later toothless reactions to subsequent corporate scandals:

http://www.npr.org/2017/07/11/536642560/is-the-justice-depar...

http://www.slate.com/articles/podcasts/slate_money/2017/07/t...

https://www.amazon.com/Chickenshit-Club-Justice-Department-C...


I mean this entirely seriously: perhaps it's time we be less reasonable people.


Do you have any problem with the concept of credit bureaus, and is it possible that it's influencing your reaction?

edit: also, so this is topical, the CEO is clearly incorrect. Even if one employee were somehow responsible for the unpatched version of struts being present on a particular system, there is guilt by omission/neglect for at least 1) poor secure enclave design (attackers getting access to the system, degree of access that system had, lack of partitioning of sensitive data) 2) lack/deficiency of red team/pen testing 3) lack of process in static scanning of deployed code.. it goes on. As someone noted, seems like negligent process given the degree they were a target. Granted, I'm going off the publicly disclosed information, so who knows..


No problem with the concept, only the execution. If you can't secure the data, you don't get the data (by law). That simple.

This is Econ 101 (incentives matter). If you do not penalize negative behavior, there is no reason for it not to continue.

Those security vulnerability notifications should've gone into a tool to be actioned by a team (JIRA, PagerDuty, whatever) with follow-up and verification (audit logs from their CI/CD pipeline confirming a patched version had been deployed to all environments, dev through prod); that's an organizational and leadership failure, which should have consequences.
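To make that concrete, here's a minimal sketch of the kind of automated audit being described: compare deployment records against an advisory instead of trusting an email thread. All names (`Advisory`, `unpatched_hosts`, the host inventory) are hypothetical illustrations, not anything Equifax actually ran.

```python
# Illustrative sketch only: compare a fleet's deployment records against a
# security advisory, so a missed patch is surfaced by a report, not an email.
from dataclasses import dataclass

@dataclass
class Advisory:
    package: str        # affected component, e.g. "struts2-core"
    fixed_version: str  # first version containing the fix

def parse_version(v: str) -> tuple:
    """Turn '2.3.31' into (2, 3, 31) so versions compare numerically."""
    return tuple(int(x) for x in v.split("."))

def unpatched_hosts(advisory: Advisory, deployed: dict) -> list:
    """Return hosts still running a version older than the fix.

    `deployed` maps host -> {package: version}, e.g. pulled from
    CI/CD deployment logs.
    """
    return [
        host for host, pkgs in sorted(deployed.items())
        if advisory.package in pkgs
        and parse_version(pkgs[advisory.package]) < parse_version(advisory.fixed_version)
    ]

adv = Advisory("struts2-core", "2.3.32")
fleet = {
    "web-01": {"struts2-core": "2.3.31"},
    "web-02": {"struts2-core": "2.3.32"},
}
print(unpatched_hosts(adv, fleet))  # ['web-01']
```

The point isn't the code; it's that "was the patch applied?" becomes a query over audit data rather than a question of whether someone forwarded a message.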

Disclaimer: I work in the financial services industry in security, but not CRAs.

EDIT: Agree with your assessment edit; it's a total failure of risk management within the org. Again, you need dire consequences when that occurs.


I think someone else commenting on this story correctly noted that penalties create a reverse incentive for disclosure, and there could be negative externalities in that incentive structure (to riff on the Econ 101 theme). Since you are in the field, have you heard whether CVE-2017-5638 was the confirmed vector? I have heard so little confirmation/attribution here... it's really hard to attribute how much was negligence without knowing how long the attackers were present and how many systems were compromised. I added a comment (I think) before your reply that generally agrees that this had to be a systemic failure on several levels. I still struggle with the fact that on some level Equifax was the victim of a criminal action, though.

EDIT: hoping to do this quickly to avoid comment/edit race conditions, but I wish I knew the right answer in re. penalties. Think about the system that exists with doctors, malpractice insurance industry, medical liability, review boards and the benefits society gets from transparent disclosure from reviews of medical errors. Honestly, short of jailing people, it's hard to see how criminal liability in this case could be worse than potential civil liability.


Maybe, or maybe not. What if employees could blow the whistle on negligent handling of users' data, and even get some kind of whistle-blower reward for it? This would strongly encourage companies to keep their houses in order, I imagine.


BS. They already only report breaches when required by law.


*and caught red-handed.


...years later.


Jail would be a leap. Has a company ever had to pay a financial penalty for negligent handling of customer information?


>Kiddo needs to go back to school, he's clearly forgotten all of his training.

No he hasn't. He never had it. He was exposed as a fraud who didn't belong.

His golden parachute turns to lead. The investors take a bath. It's the only reasonable way any of this will ever get fixed.


That's an interesting point. My other comment suggested one way we could give employees a vested interest in reporting negligence in security and the handling of users' data. You're saying the shareholders (or at least the large ones) should realize this is a risk to their money and act accordingly. Something of a two-pronged approach to keeping a company honest.


Of course. Stock prices tank, dividends disappear to pay fines and restitution, and the board calls an emergency meeting and sends that half-wit packing amidst a cloud of words like "malus" and "clawback".

It goes entirely without saying that a CEO with a failure of that magnitude in his wake is lucky to get a job managing a pizza joint after this.


But now he's experienced in handling issues of this type. It's a bonus for his employment.


In my experience, a lot of corporate entities have bad rules like "30 days to review patches before they go live", or "no patches not reviewed by team X" that slow down changes. These sorts of caveats are both hard to change, and even harder to circumvent, because big companies make change difficult as they usually have more to lose than to gain.

If you look at the article, it matches this idea:

> ... Mr. Smith referred to an “individual” in Equifax’s technology department who had failed to heed security warnings and did not ensure the implementation of software fixes that would have prevented the breach.

I doubt one individual is responsible for every patch in the organisation, and I reckon that Equifax likely has many individuals each responsible for different systems, all of whom have to deal with a central security department before they can, well, patch their system. I further bet the internal politics are off the chart, and the security team is a "no, you can't do that" department who makes things worse.

I put money on there being plenty of "individuals" who are each responsible for patching different systems at Equifax, and while this particular breach was in system X, A-W might, at another time, have been the epicentre of a breach for similar reasons related to internal processes that make moving fast nigh on impossible.

Now, while that's no excuse, I think the fault is likely not the individual who missed the patch, but the interaction between departments with different goals (political and practical) combined with an internal structure that makes changes glacially slow, and this sort of breach inevitable.


>"The company sent out an internal email requesting that its technical staff fix the software, but “an individual did not ensure communication got to the right person to manually patch the application,” Mr. Smith told the subcommittee."

So someone forgot to forward an email? What else does ensuring email communication got to the right person mean?

When the security of hundreds of millions of people's data relies on a process of selective email forwarding, the "lone individual" in question is the CEO.


Relying on email alone for this is negligence.

At the very least, a ticketing system should have been used. How were they planning to check compliance with that email? Obviously there was no audit to confirm that the email was followed.


This was clearly a failure on the CSO’s part, for which the CEO should take responsibility (after all, he hired her).

One thing I don’t get, though. How did the CSO get hired? It seems obvious that she had no qualifications or skills whatsoever for the job. How do I get a seven-figure gig like that? (I’m kind of serious — how do these positions get filled by people who are so fundamentally incompetent, when many, many individuals could do a better job?)


That's even worse. In most cases there are failures at multiple levels leading to such a catastrophic event. If this hack is due to the error of a single employee, it means they have no safeguards or procedures set up to prevent such failures from happening. In other words, he and his CTO have failed miserably at their jobs. A company should never depend on a single employee for anything.

Also, we should expect to see more issues in the future.


The DESIGN of their whole infrastructure was terrible for years.

I work at a school district. If someone broke into our public web server, they'd realise the entire webapp points at a WebAPI interface that will still only let you make requests as a logged-in user. Meaning it does the same thing as the GUI, nothing less, nothing more. To get "full access" they have two different layers they have to break through.

But even worse for the attacker, in this case full access doesn't even get you full access. Our credit card processing, employee SSNs, and accounting system aren't part of our main database/WebAPI system, and have IP restrictions. In order to log into that you need username/password and a second factor provided by SMS.

A completely flat design where a single breakin gives you the keys to the kingdom is unacceptable for any organisation that holds sensitive information. The school district's system was only improved after an external security audit flagged our flat design as dangerous, and they were correct.
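As a toy illustration of the layered design described above (the helper names and the allowlisted subnet are entirely made up, not the district's actual setup), the key property is that every layer must pass independently, so compromising one layer doesn't grant access to the sensitive system:

```python
# Toy "defense in depth" sketch: network segmentation, session auth, and a
# second factor are checked independently; one broken layer is not enough.
import ipaddress

# Hypothetical internal subnet allowed to reach the sensitive system.
SENSITIVE_NET_ALLOWLIST = {"10.0.5.0/24"}

def ip_allowed(client_ip: str) -> bool:
    """Layer 1: only clients on an allowlisted internal subnet get through."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in ipaddress.ip_network(net) for net in SENSITIVE_NET_ALLOWLIST)

def can_access_sensitive(client_ip: str, session_valid: bool, second_factor_ok: bool) -> bool:
    """All three layers must hold: network, authenticated session, 2FA."""
    return ip_allowed(client_ip) and session_valid and second_factor_ok

# An attacker who steals valid credentials from the public webapp still
# fails the network-segmentation layer from an external address.
print(can_access_sensitive("203.0.113.9", True, True))  # False
print(can_access_sensitive("10.0.5.7", True, True))     # True
```

In a flat design, the equivalent function is effectively `return session_valid`, which is exactly the "keys to the kingdom" failure mode the commenter is describing.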

No, a single employee was definitely not responsible. This is a systemic issue likely starting at the top. A CEO who thinks a single employee COULD even be responsible is ignorant.


>The company sent out an internal email requesting that its technical staff fix the software, but “an individual did not ensure communication got to the right person to manually patch the application,” Mr. Smith told the subcommittee.

Why... why did you not just send this person the email, instead of having someone send the email to this person? This sounds like BS.

(as an aside, managers are now going to start constantly asking "did you get that email, bob?" to cover their asses)


That's not a valid excuse.

The job of a leader is in part to identify and mitigate risks, or to hire a competent person to do it, while still being responsible for it.

The fact that risks of this magnitude were being mitigated by a single lone guy is a leadership issue.

Then, the problem was not only in the risk mitigation but also in the handling of the incident as well. That's again on the leadership.

Then, the exfiltrated information is not secondary to Equifax's business. It is the core of their business. It's not as if they were Target, for instance, where the core of the business is retail... the proper handling of that information was Equifax's only goddamn job.


No it's not.

It may be the actions of a single employee that finally caused the breach to occur, but there was a series of failures that led up to this point. There should have been no way for a single employee's error to cause such a massive failure.


Lone employee, meet bus.

Talk about a failure to take responsibility. Maybe it's the CEO's error to allow a single employee to oversee a catastrophic security breach.


Absolute rubbish! For a company that needs to protect sensitive data, the breach can be traced back to the decision to put that much data on a publicly accessible web application without any defense in depth. I stand by what I wrote in the week after the breach: https://rietta.com/blog/2017/09/18/equifax-defense-in-depth.


Equifax is going to turn into a business school case study on what not to do... and what not to say.

You could read any PDF or Kindle eBook on leadership to realize that this headline will play very badly.

On a more technical note, how is it possible for a single person to ignore that they needed to upgrade Apache Struts and nobody else notices or cares?
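On that point: the fix for CVE-2017-5638 shipped in Struts 2.3.32 and 2.5.10.1, and detecting a vulnerable version in a build file is mechanically simple. A rough, purely illustrative sketch (the pom.xml contents are invented, and the version check naively covers only the 2.3.x line):

```python
# Illustrative sketch: find the struts2-core version declared in a Maven
# pom.xml and flag it if it predates the CVE-2017-5638 fix (2.3.32 on the
# 2.3.x line). The pom snippet below is hypothetical.
import re

POM = """
<dependency>
  <groupId>org.apache.struts</groupId>
  <artifactId>struts2-core</artifactId>
  <version>2.3.30</version>
</dependency>
"""

def struts_version(pom_text: str):
    """Extract the declared struts2-core version, or None if absent."""
    m = re.search(
        r"<artifactId>struts2-core</artifactId>\s*<version>([\d.]+)</version>",
        pom_text,
    )
    return m.group(1) if m else None

def vulnerable(version: str) -> bool:
    """Naive check: anything below 2.3.32 (2.3.x line only)."""
    return tuple(int(x) for x in version.split(".")) < (2, 3, 32)

v = struts_version(POM)
print(v, vulnerable(v))  # 2.3.30 True
```

Which makes the question sharper: a check this cheap could run on every build, so "one person didn't notice" shouldn't even be possible.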


The CEO may blame a lone underling, and congress may blame a lone CEO, but congress shares in the overall blame. Regulation and stiffer penalties are needed to balance incentives so that corporations are motivated to invest in security. As things stand, executives seem to rationalize skimping on security as a smart business decision.


Thanks alot, Bob. Over half the country's information got stolen and it's entirely and solely your fault.


Actually, if you only count adults with credit history, it's more like 80%+.


Thanks Bob!


Yeah, that's pretty bad blaming one employee when a single security hole on a single server resulted in the loss of personal information for 146 million people.


To summarize what is actually going on (and pretty much what has been repeatedly said in here): a lone employee's error did cause this, and that lone employee is the CEO himself.


OK, even if you accept that the error fell on one person, why did disclosure take two months? Why were incentives or policies not in place to correct the mistake?


Disclosure took two months because one employee forgot to send an email about the incident.

/s


Error, Singular?

One single employee developed the requirements for these errors, implemented these errors, tested the errors, documented the errors, and signed off to ship the errors.

What a piece of scum.


If you really think your security is dependent on the practices of one person, then that is the problem.


What's the best way to a) structure the credit system so this doesn't happen, and b) set up incentives to ensure compliance with that structure?

I suspect most people here find the CEO's explanation lacking (and most people who read the NYT, hence the headline): it's no use venting here.

I'm more curious about how to move forward, but I'm not a security expert. Let's assume credit bureaus are here to stay: we, as a society, have decided to lower the price of loans by reducing risk for lenders via easily available credit histories (with all the benefits and drawbacks).

How have some companies and agencies managed to keep data secure, and how can we encourage other companies and agencies to do so, via carrot and/or stick?


That lone employee would be the CEO...


right?


I would like this CEO to tell me this one employee's pay grade. How is this worker high enough on the org chart to be capable of this sort of impact?

A chief surgeon doesn't blame the lab tech when a patient dies; lead counsel doesn't blame the paralegal for botching a death penalty case. They would consider it a public humiliation to blame an underling, especially a paraprofessional.


> On multiple occasions, Mr. Smith referred to an “individual” in Equifax’s technology department who had failed to heed security warnings and did not ensure the implementation of software fixes that would have prevented the breach.

If an employee isn't heeding significant warnings (plural!) then it sounds like a management problem too.


Well, I mean, that one person must have been paid $40 million+ a year, right? If they were the sole source of protection for a multi-billion-dollar enterprise, surely they're worth more than the CEO... oh wait, they didn't get paid that... oh... never mind.


Yeah it was one guy, the CEO.


What is this guy going to say about not encrypting DOB, SSN, name, address, etc., and storing everything in plain text? Is that also a single employee's fault?


Equifax clearly never cared about security or protecting people, because we aren't their clients. But yeah, some of this is just laughably bad.


Culture of the company, not this one guy, is to blame.


If this falls on one employee's head, well, the CEO might not be stoked when he realizes who is actually responsible.


Former CEO. Who had to quit.


"Retired". With full pension, etc.


Firing someone for cause is difficult. Everyone likes to complain about uncontested golden parachutes, but the lawsuits that result in the few cases that are contested are never pretty.


This reminds me of the story about the intern that wiped the DB with a single command except it's worse.


The individual responsible is him.


Pin the tail on the scapegoat, House subcommittee edition


With great power comes no responsibility, apparently.


"human error" is not a root cause.


Sure, throw that one guy under the bus, CEO.


The one person is the CEO, right? Right?


Yeah that would be the CEO then!


Hiring this guy as CEO?


All of these revelations read like xkcd comics. I mean, really? One guy in a corp of 9000+? Critical email threads? This situation is a joke.



