Snowden Used A Simple Web Crawler To Gather NSA Data

Edward Snowden is the man who helped start a global debate on digital surveillance by the NSA by leaking critical internal data from the agency. It has now been revealed that Snowden collected this data using a simple web crawler.

Normally, accessing such classified information requires sophisticated tools. Snowden, however, was stationed at a remote NSA station in Hawaii, where he had access to a treasure trove of the agency's data. This data was accessible to him primarily because he had been hired as a technology contractor for the Hawaii office.

It was part of his job to maintain the computer systems at an office that focused on intelligence matters concerning China and North Korea. It was probably here that Snowden realized that the NSA had put a massive digital surveillance net in place, not only internationally but also domestically.

He then used a web crawler to identify and gather this information. A web crawler can move through an internal network, hopping from one site to another by following links, and, within certain parameters, it can identify relevant documents and copy them. According to intelligence officials at a recent hearing, Snowden was able to access a total of 1.7 million files this way.
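The agency has not disclosed what tool Snowden actually ran, but the general technique the article describes is straightforward. The sketch below is purely illustrative: it walks a hypothetical in-memory "intranet" (a dict of page names to HTML, standing in for real HTTP fetches), follows every link it finds breadth-first, and copies any page matching a keyword filter, which plays the role of the "certain parameters" mentioned above.

```python
import re
from collections import deque

# Hypothetical in-memory "intranet": page name -> HTML content.
# A real crawler would fetch each page over the network instead.
SITE = {
    "index": '<a href="reports">Reports</a> <a href="about">About</a>',
    "reports": '<a href="report-china">China</a> <a href="index">Home</a>',
    "report-china": 'Intelligence summary concerning China.',
    "about": 'Office directory.',
}

LINK_RE = re.compile(r'href="([^"]+)"')

def crawl(start, keyword):
    """Breadth-first crawl from `start`, copying every page whose
    text matches `keyword` (the crawler's relevance parameter)."""
    seen, queue, copied = {start}, deque([start]), {}
    while queue:
        page = queue.popleft()
        html = SITE.get(page, "")
        if keyword in html.lower():          # relevance filter
            copied[page] = html              # "copy" the document
        for link in LINK_RE.findall(html):   # follow every link found
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return copied

print(sorted(crawl("index", "china")))  # -> ['report-china', 'reports']
```

Starting from a single seed page, the crawler discovers and filters the entire reachable network automatically, which is why a tool this simple can sweep up so many files without hands-on effort.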

Normally, anyone within the NSA trying to access such a huge amount of data would have raised red flags. But Snowden was lucky in that he was posted at the Hawaii bureau office, where many of the NSA's usual security checks weren't in place. When senior officials once questioned him about the matter, he gave a satisfactory explanation: as part of the tech support team, he was merely creating backups of the data.

According to NSA officials, had Snowden collected data in a similar manner at NSA headquarters in Fort Meade, a number of security checks would have instantly flagged his activity as suspicious. It is unclear whether Snowden was simply lucky or strategic, and the agency refuses to reveal whether the crawler he used was one he wrote himself.

If anything, this hints at a worrisome possibility: the NSA's internal checks are not as robust as the agency makes them out to be. That further reaffirms Snowden's assertion that the NSA often considered its actions above the law, given how easily its data could be accessed and used by any official in a fashion similar to Snowden's. While Snowden used this data to fulfill what he saw as a civic obligation to expose unconstitutional activity, others could abuse the same access for personal or other gain. And that is also one of the things Snowden hinted at in his leaks.

Courtesy: NYT



Salman Latif is a software engineer with a specific interest in social media, big data, and real-world solutions using the two. Other than that, he is a bit of a gypsy. He also writes on his own blog. You can find him on Google+ and Twitter.
