Networking Notes: Yes, We've Got Something to Hide

It's not paranoia: everyone does have something to hide, from Social Security numbers and ATM card PINs to passwords for the company VPN and electronically transmitted medical records.

Networking Notes

Once upon a time, it was conceivable that the old saying, "You've got nothing to be afraid of if you've got nothing to hide," made sense beyond its utility as a rhetorical bludgeon for surveillance-state apologists. These days, though, everyone does have something to hide: Social Security numbers, ATM card PINs, passwords for the company VPN, electronically transmitted medical records and more.

And we've got a proliferation of ways to keep all that information secret, too: encrypted file systems, secure mail certificates, encrypted data streams, wireless security standards and much more.

Most of it is pretty good, and when any of it proves to be bad, we often hear about it quickly. It took little time, for instance, for everyone to learn that using WEP outside its intended purpose (providing roughly the same level of security one could expect from a wired LAN) was dangerous and foolhardy. If you want an example of the Internet at its fascinating best, in fact, go take a peek at distributed.net, whose past projects have included harnessing the processing power of computers all over the 'net to crack encryption standards.

So, to judge from all those privacy-enhancing technologies on the market, it's safe to say we've collectively accepted the idea that it's OK to hide some things.

People who worry about how secure all this stuff is worry in particular about the presence of so-called "back doors": deliberate flaws or openings in a security protocol or technology that allow easy access to anyone who knows how to use them.

Security researcher Bruce Schneier, for instance, recently wrote about what appear to be deliberate flaws in a security protocol championed by the National Security Agency (NSA). Some security researchers, he says, showed that there could exist a set of secret numbers that, combined with only a tiny sample of the output of the standard's random number generator (the component that makes its encryption keys unpredictable), would let whoever holds them predict everything the generator produces next.
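
Why does a predictable random number generator matter so much? Here's a deliberately oversimplified toy sketch in Python (not the protocol Schneier describes, and the "secret" value is entirely made up): if an attacker knows the hidden value that determines the generator's state, every "random" key it produces can be reproduced at will.

```python
import random

# Toy illustration only: a deterministic generator whose hidden state is known
# to an attacker. Real back doors are subtler, but the consequence is the same.
HIDDEN_SECRET = 0xC0FFEE  # stands in for the secret numbers described above


def generate_session_key(state: int, nbytes: int = 16) -> bytes:
    """Derive a 'session key' from a generator seeded with `state`."""
    rng = random.Random(state)
    return bytes(rng.getrandbits(8) for _ in range(nbytes))


victim_key = generate_session_key(HIDDEN_SECRET)    # what the device produces
attacker_key = generate_session_key(HIDDEN_SECRET)  # what the secret-holder derives

assert victim_key == attacker_key  # the "randomness" offers no protection at all
print(victim_key.hex())
```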

It's not like law enforcement personnel have ever been shy about explicitly asking for back doors. One early debacle during the Clinton presidency was the introduction of the Clipper chip, a chipset meant to provide strong encryption with an added benefit for the government: something called "key escrow." Key escrow would allow government agencies to decipher any communications protected with Clipper cryptography provided they showed cause to a regulating body that held all the keys for assorted Clipper-enabled devices. Clipper went nowhere quickly, but it helped accelerate a rush for security techniques that worked independently of any third party, which we'll get back to.

Another case of applied back doors surfaced recently when it came to light that Hushmail, a secure e-mail service, had handed over customer e-mail in compliance with a U.S. federal investigation into illegal steroid sales. Wired's Threat Level explained the situation fairly well:

"The Canadian e-mail provider offers two options for its users. One method works nearly identically to typical Webmail, with the exception that the company's Encryption Engine, encrypts and decrpyts messages that go to or from other Hushmail users (or to people who use PGP or GPG running on their own computers). In that service, Hushmail's servers briefly see the passphrase that unlocks a user's emails, but normally does not store it.

"A second option sends the Encryption Engine to a user's browser as a Java applet. That method, where the encryption and decryption of email is done in the browser and the passphrase never leaves the user's computer, was widely presumed to be much safer than the webcentric version."

As it turns out, though, Hushmail's more secure offering has issues of its own: it can also be compromised by law enforcement personnel using "a rogue Java applet to targeted users that will then report the user's pass phrase back to Hushmail, thus giving the feds access to all stored emails and any future emails sent or received."

Reaction to the situation has been somewhat mild, all things considered. Hushmail has updated the information it provides users, and security researchers have lauded the company for dealing with the situation somewhat transparently. And there's an unspoken understanding running through the reaction: when you trust your privacy or security to a third party, you run a greater risk than if you engineer your own solution. Passing sensitive mail through a third-party broker was problematic from the start.

So what options do you have?

The Wired quote above made passing reference to PGP and GPG.

Part of the Clipper backlash I mentioned earlier came in the form of a rush to develop encryption techniques that didn't require a third-party intermediary. PGP, or "Pretty Good Privacy," provided one such method.

PGP uses "public key" cryptography. Briefly:

  • You create a private key only you know the password for.
  • You create a public key you distribute to others.
  • When you encrypt a message meant for someone else, you use their public key to encrypt it. They use their private key to decrypt it.
  • When you digitally sign a message, you use your private key to sign a hash of the message. If the message is altered in any way, attempts to verify the signature using your public key will fail, indicating that the message has been tampered with. (A short code sketch of this workflow appears below.)

There's nobody in the middle collecting passwords, storing unencrypted text or otherwise participating in the encryption phase of the data's preparation. Sure, ISPs and admins see the encrypted data as it passes through their networks and servers, but it's as opaque to them as to anyone else.
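
To make the workflow above concrete, here's a minimal sketch using the third-party python-gnupg wrapper around a local GnuPG (gpg) installation. The key name, passphrase and keyring directory are illustrative assumptions, not anything prescribed by PGP itself.

```python
import gnupg  # third-party "python-gnupg" package; needs a local GnuPG install

# A throwaway keyring directory for the demo; real keys live in ~/.gnupg.
gpg = gnupg.GPG(gnupghome="/tmp/pgp-demo")

# 1. Alice creates a key pair; the private half is protected by her passphrase.
alice = gpg.gen_key(gpg.gen_key_input(
    name_email="alice@example.com", passphrase="alice-secret",
    key_type="RSA", key_length=2048))

# 2. She exports the public half to hand to correspondents (mail, key server, etc.).
alice_public_key = gpg.export_keys(str(alice))

# 3. Someone encrypts a message *to* Alice with her public key; only her
#    private key can decrypt it. (always_trust skips the trust prompt here.)
encrypted = gpg.encrypt("Meet at noon.", recipients=[str(alice)], always_trust=True)

# 4. Alice decrypts with her private key, unlocked by her passphrase.
decrypted = gpg.decrypt(str(encrypted), passphrase="alice-secret")
print(decrypted.data)  # b'Meet at noon.'

# 5. Alice signs a message with her private key; anyone holding her public
#    key can verify it, and any alteration makes verification fail.
signed = gpg.sign("I wrote this.", keyid=str(alice), passphrase="alice-secret")
print(gpg.verify(str(signed)).valid)  # True
```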

The techniques PGP uses to encrypt data are quite secure: secure enough that anyone trying to read PGP-encrypted content will probably have to resort to direct coercion against the person holding the key. In 1991, when PGP was first introduced, its creator took steps to share the software with everyone on the Internet, ensuring that everybody who wanted it could have practically impregnable privacy. That spurred an investigation from the U.S. government, which invoked laws against sharing cryptographic systems of a certain complexity with foreign entities. The investigation was eventually dropped, and the PGP genie not only stayed out of the bottle but became the foundation of the PGP Corporation, which makes money baking PGP into a number of e-mail and desktop programs and appliances.

PGP can be used to encrypt not only e-mail traffic but any sort of electronic data. Desktop products, for instance, use it to encrypt sensitive files, and it is also used to protect instant messaging traffic.
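
Continuing with the hypothetical keyring from the sketch above (the same gpg object and alice key), protecting a file on disk looks much the same as protecting a message; the file names here are made up.

```python
# Encrypt an arbitrary file to Alice's public key; only her private key
# (plus passphrase) can recover it. python-gnupg's encrypt_file/decrypt_file
# take a file object opened in binary mode.
with open("medical-records.pdf", "rb") as f:
    gpg.encrypt_file(f, recipients=[str(alice)], always_trust=True,
                     output="medical-records.pdf.gpg")

with open("medical-records.pdf.gpg", "rb") as f:
    gpg.decrypt_file(f, passphrase="alice-secret",
                     output="medical-records.decrypted.pdf")
```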

Here's the problem with PGP (and its Free Software descendant, GNU Privacy Guard, or GPG): it's not the easiest technology in the world to use, it's not as immediately easy to grasp as "this program scrambles your data and you don't have to do anything but click a button," and to work at its best it requires a social element.

So having taken the time to cover what PGP is, next time we'll look at a few ways to use it. It takes a little effort, but it's one of the most secure options going for protecting your information from prying eyes.

Michael Hall has been using, maintaining and writing about networks for nearly 15 years. He's the managing editor of Enterprise Networking Planet and he blogs about Internet privacy and security at Open Networks Today.




For more help, don't forget to try one of our PracticallyNetworked Forums.


