Edward Snowden Used a Simple Web Crawler To Gather NSA Data

To gather his controversial NSA data, Snowden used a tool similar to the one Google uses to index websites for its search engine.

It would be fun to imagine Edward Snowden speed-hacking his way into the NSA servers like Hugh Jackman in Swordfish, or physically stealing a briefcase full of hard drives and making a daring escape from NSA headquarters, but the truth is much simpler. Speaking to the New York Times, a senior intelligence official said that Snowden used nothing more than a simple web crawler to gather his controversial data.

Using the web crawler, Snowden “scraped data out of our systems” while he went about his day job, said the official. “We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” he said, adding that the process was “quite automated.” To automatically collect the info he wanted, Snowden only needed the right logins to bypass what internal defenses were in place.

What makes the breach so damning is that the NSA’s mission statement is to “protect the nation’s most sensitive military and intelligence computer systems from cyberattacks,” which is quite embarrassing given the simplicity of Snowden’s technique: investigators found that his methods were hardly sophisticated and should have been easily detected.

Agency officials insist that if Snowden had been working at NSA headquarters at Fort Meade, he would have been caught, but the Hawaii branch where he was employed lacked the activity monitors that would have found his bot.

Web crawlers are commonly used by search engines like Google to index websites.
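For those unfamiliar with how such a tool works, the sketch below is a minimal Python illustration of a basic crawler: it fetches a page, pulls out the links, and follows them to new pages until it hits a limit. The start URL, the same-host restriction, and the page cap are illustrative assumptions; this is a generic example of the technique, not a reconstruction of the tool Snowden actually used.

```python
# Minimal breadth-first web crawler using only Python's standard library.
# The start URL, depth behavior, and limits below are illustrative assumptions.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=50):
    """Fetch pages breadth-first, following links that stay on the same host."""
    host = urlparse(start_url).netloc
    queue = deque([start_url])
    seen = {start_url}
    pages = {}

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load

        pages[url] = html  # "index" the page by storing its contents

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # stay on the same host and avoid revisiting pages
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return pages


if __name__ == "__main__":
    results = crawl("https://example.com")  # placeholder start URL
    print(f"Fetched {len(results)} pages")
```

A production crawler of the kind search engines run adds politeness rules, authentication, and persistent storage, but the fetch-parse-follow loop above is the core of the technique.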

Whether you think Snowden was “right” or “wrong,” you have to admit the NSA is partly to blame for not protecting itself properly.

Source: New York Times via Engadget
