When Will They Ever Learn?
Besides being the refrain from a 1960s Pete Seeger song, that question seems to be my most common response when I read the news each day.
Whether it is unprotected Amazon S3 buckets, MongoDB servers set up without passwords, or something else, human error seems to be at the root of way too many data exposures.
In the case of this story, it is Hadoop Distributed File Systems exposed on the Internet. Hadoop was inspired by Google’s file system work and is designed to process large amounts of data across a distributed set of computers – what is called “big data”.
Well, with big data come big problems.
A search using the Shodan search engine uncovered 4,500 servers running the Hadoop Distributed File System (HDFS) that were completely unprotected.
A similar search for Mongo database servers on the Internet found around 47,000 of them exposing 25 terabytes of data. But because Hadoop routinely processes mega quantities of data, the problem here is a tad bigger: those 4,500 Hadoop servers – only 10 percent of the number of exposed Mongo servers – exposed over 5,000 terabytes of data, 200 times as much.
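Finding these exposed clusters does not take nation-state tooling. Here is a minimal sketch using Shodan’s official Python library; the API key is a placeholder, and the exact query is my assumption – the researchers did not publish theirs, though keying on the NameNode’s default web port (50070) is a common approach.

```python
# A minimal sketch using Shodan's official Python library (pip install shodan).
# The API key is a placeholder and the query is an assumption, not the
# researchers' published search.
import shodan

SHODAN_API_KEY = "YOUR_API_KEY_HERE"  # hypothetical placeholder

api = shodan.Shodan(SHODAN_API_KEY)

# Look for hosts answering on the Hadoop NameNode web UI port.
results = api.search('port:50070 "Hadoop"')

print(f"Potentially exposed hosts: {results['total']}")
for match in results["matches"][:10]:
    print(match["ip_str"], match.get("org", "unknown org"))
```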
In some of the attacks against Hadoop clusters, the attackers erased all of the directories and left a single directory behind named NODATA4U_SECUREYOURSHIT.
While we don’t know (or at least I don’t know) what was on those servers, why is it that we don’t learn? A current Shodan search reports around 200 Hadoop clusters already hit by this attack.
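If you run Hadoop, checking whether your own cluster is in that crowd is straightforward. Here is a minimal self-check sketch, assuming WebHDFS is enabled on the default Hadoop 2.x NameNode web port (50070; Hadoop 3.x moved it to 9870); the host name is a hypothetical placeholder. If an anonymous LISTSTATUS request comes back with a directory listing, your data is one script away from becoming NODATA4U_SECUREYOURSHIT.

```python
# A minimal self-check sketch, assuming WebHDFS is enabled on the default
# Hadoop 2.x NameNode web port. The host name is a hypothetical placeholder.
import requests

NAMENODE = "namenode.example.com"  # placeholder -- use your own host

url = f"http://{NAMENODE}:50070/webhdfs/v1/?op=LISTSTATUS"

try:
    resp = requests.get(url, timeout=5)
    # An exposed cluster happily returns a JSON directory listing
    # ("FileStatuses") to this completely unauthenticated request.
    if resp.ok and "FileStatuses" in resp.text:
        print("WARNING: anonymous WebHDFS listing succeeded. Lock it down.")
    else:
        print(f"HTTP {resp.status_code}: anonymous listing does not appear open.")
except requests.RequestException as exc:
    print(f"Could not reach the NameNode web port: {exc}")
```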
Which brings us back to the title of this post.
Time and time again we do the same dumb crap, whether it is:
Unprotected Hadoop clusters.
Passwordless Mongo databases.
Booz Allen’s unprotected Amazon S3 buckets.
Dow Jones’ misconfigured Amazon S3 buckets.
And the list goes on.
Amazon itself did a search for unprotected S3 storage buckets and found roughly 1,950 of them. This was after all of the disclosures of compromised Amazon storage buckets.
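For your own AWS account, a check along those lines is only a few lines of boto3. This sketch flags any bucket whose ACL grants access to the AllUsers group (i.e., the whole Internet); it assumes your AWS credentials are already configured, and note that it inspects only ACLs, not bucket policies.

```python
# A sketch of a public-bucket check for your own AWS account, using boto3.
# Assumes credentials are already configured; covers ACLs only.
import boto3

# Grants to this group URI mean "anyone on the Internet".
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    # Collect any permissions granted to the AllUsers group.
    public = [g["Permission"] for g in acl["Grants"]
              if g["Grantee"].get("URI") == ALL_USERS]
    if public:
        print(f"PUBLIC bucket: {name} ({', '.join(public)})")
```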
We have to stop the self-inflicted wounds. Really.
Information for this post came from Security Week.