Should All Web Traffic Be Encrypted?

The prevalence of free, open WiFi has made it rather easy for a WiFi eavesdropper to steal your identity cookie for the websites you visit while you're connected to that WiFi access point. This is something I talked about in Breaking the Web's Cookie Jar. It's difficult to fix without making major changes to the web's infrastructure.


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2012/02/should-all-web-traffic-be-encrypted.html

In my opinion, HTTP should be encrypted by default. Security on the Internet is not something to take lightly and should be as widespread as possible.

I have bookmarked https://tools.ietf.org/html/draft-mbelshe-httpbis-spdy-00 and hope to read through it tomorrow.

Lately I have been thinking a lot about how the infrastructure of the entire Internet is fairly unsustainable, especially the web. SPDY sounds like a step in the right direction for part of the problem, but that is just from reading the abstract and what you have written here.

Another complication with encrypting everything is images - if users can post images (like they can in questions and answers on StackExchange sites) then you run into browsers informing people that some elements on the page haven’t been transmitted securely.

Fixing this is possible, but it means that you can't just turn on HTTPS and be done. Here’s a post from GitHub: Sidejack Prevention Phase 3: SSL Proxied Assets
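GitHub’s fix (in the post linked above) boils down to routing user-posted images through an HTTPS asset proxy, so the page never references an insecure URL directly. Here is a minimal sketch of the URL-rewriting half of that idea; the proxy host and shared key are placeholders, not GitHub’s actual setup, and the HMAC is what lets the proxy refuse to fetch URLs the application never signed:

```java
// Sketch only: rewrite an http:// image URL to point at a hypothetical HTTPS asset proxy.
// The proxy host and shared secret below are placeholders, not real values.
import java.nio.charset.StandardCharsets;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class AssetProxyUrl {
    private static final String PROXY_BASE = "https://img-proxy.example.com/"; // hypothetical proxy
    private static final byte[] SHARED_KEY = "change-me".getBytes(StandardCharsets.UTF_8);

    private static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    // Returns an HTTPS proxy URL carrying an HMAC of the original URL, so the proxy
    // only serves images that this application actually signed.
    static String rewrite(String imageUrl) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(SHARED_KEY, "HmacSHA1"));
        String signature = toHex(mac.doFinal(imageUrl.getBytes(StandardCharsets.UTF_8)));
        String encodedUrl = toHex(imageUrl.getBytes(StandardCharsets.UTF_8));
        return PROXY_BASE + signature + "/" + encodedUrl;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(rewrite("http://example.com/cat.png"));
    }
}
```

The proxy itself then fetches the original http:// image server-side and re-serves it over HTTPS, so the browser only ever sees secure elements.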

Just a quick post to link people to this nice little add-on called “HTTPS Everywhere”: https://www.eff.org/https-everywhere

Stable for Firefox, alpha version for Chrome and Chromium.

From their site:
“Our code is partially based on the STS implementation from the groundbreaking NoScript project (there are other STS implementations out there, too). HTTPS Everywhere aims to have a simpler user experience than NoScript, and to support complex rewriting rules that allow services like Google Search and Wikipedia to be redirected to HTTPS without breaking anything.”

Being able to safely access Twitter without having your identity revealed is crucial in places like Syria where the government has been known to “disappear” those who speak out against it online. Most of us don’t worry if our online identity is known to those around us, but for people in many parts of the world, keeping their online identity secret is a life and death matter.

The problem with enabling SSL by default is that some (dysfunctional) “corporate IT” departments block HTTPS traffic through their proxy servers as a control mechanism. We have SSL forced on our clients, who are mostly government and semi-government agencies, and we often have to ask them to whitelist our sites.

I even asked one client “… so that means I cannot do internet banking while at work” and the response was “management has only authorised internet banking to the accounts team and they are the only ones allowed SSL”. Makes you wonder how they then establish an online presence via Facebook/Twitter.

I agree, and encryption should be a toggle on the web server, no certificate, red tape or other unrelated stuff required.

This assumes that the network operator is not a bad actor.

A problem with HTTPS is that it can give you a false sense of security. In an enterprise IT environment, you usually cannot have any confidence that your HTTPS session is terminated at the website you are visiting: the corporate proxy can terminate SSL itself and re-encrypt the traffic using a root certificate pre-installed on managed machines, so the browser never shows a warning.

In a coffeeshop, this is harder, as the snoop needs to have a trusted SSL certificate. But still possible.

“HTTPS means The Man can’t spy on your Internet” … yeah, not really. http://www.schneier.com/blog/archives/2010/04/man-in-the-midd_2.html

Remember that VeriSign sells interception tools to law enforcement: http://www.verisign.com/static/001927.pdf

A quote from: http://forum.icann.org/lists/net-rfp-verisign/msg00008.html

Verisign also operates a 'Lawful Intercept' service called NetDiscovery [2]. This service is provided to "... [assist] government agencies with lawful interception and subpoena requests for subscriber records [3]."

We believe that under such a service, VeriSign could be required
to issue false certificates, ones unauthorised by the nominal
owner. Such certificates could be employed in an attack on the
user’s traffic via the DNS services now under question. Further,
the design of the SSL browser system includes a ‘root list’ of
trusted issuers, and a breach of any of these means that the
protection afforded by SSL can now be bypassed.

We do not intend to pass comment on the legal issues surrounding
such intercepts. Rather, we wish to draw your attention to the fact
that VeriSign now operates under a conflict of interest. VeriSign
serves both the users of certificates as customers, and also the (legal)
interceptors of same. The certificate owner loses in this battle
due to straightforward economics, and is thus no longer represented.

Some payment gateways require that any forms taking user input be encrypted using SSL, verified by PCI compliance.

http://www.mcafeesecure.com/us/products/pciFeatures.jsp

I got a hardware load balancer for my https://clubcompy.com cluster and it offered hardware SSL encryption. So I thought, what the heck, I’ll encrypt all the things! It’s a site for kids after all, so I had better at least try to keep 'em safe from the bad guys.

HTTPS hasn’t slowed the site down one bit, although it took quite a bit of hacking to get every single page encrypted. I ended up using a servlet filter to force all requests to redirect to HTTPS rather than configuring anything at the web tier or on the LB.
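For anyone wanting to do the same thing, a redirect filter like that is only a few lines with the standard javax.servlet API. This is an illustrative sketch, not the poster’s actual code:

```java
// Minimal sketch of a servlet filter that bounces any plain-HTTP request to HTTPS.
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ForceHttpsFilter implements Filter {
    @Override
    public void init(FilterConfig config) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        if (!request.isSecure()) {
            // Rebuild the requested URL on the https scheme and redirect the client there.
            StringBuilder url = new StringBuilder("https://")
                    .append(request.getServerName())
                    .append(request.getRequestURI());
            if (request.getQueryString() != null) {
                url.append('?').append(request.getQueryString());
            }
            response.sendRedirect(url.toString());
            return;
        }
        chain.doFilter(req, res);
    }

    @Override
    public void destroy() { }
}
```

Map the filter to /* in web.xml and every plain-HTTP request gets bounced to its HTTPS equivalent before it reaches the application.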

@Carson: the CA system is far from perfect, but encryption without any kind of identity verification is almost completely useless, because anyone can still MITM you - how would your browser know whether it’s connecting to the real website or a fake?

@Porges: A simple way of eliminating that threat is to connect to the same website both directly and through a proxy (VPN, Tor, etc) and then compare the certificates you get each time. Unless “The Man” is MITMing that proxy too (which is improbable), the fingerprints won’t match.

Perspectives[1] and Convergence[2] are more or less automated versions of the same principle: you ask different computers - called Notaries - to tell you whether the certificate they get is the same one you got.

[1]: http://perspectives-project.org/
[2]: http://convergence.io/
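Checking a certificate fingerprint yourself takes only a few lines; a real Perspectives-style check would repeat the same lookup through a proxy or Tor and compare the two results. A rough sketch (the target host is just an example):

```java
// Sketch: fetch a site's leaf certificate and print its SHA-256 fingerprint.
// Comparing this value across different network paths is the Perspectives/Convergence idea.
import java.net.URL;
import java.security.MessageDigest;
import java.security.cert.Certificate;
import javax.net.ssl.HttpsURLConnection;

public class CertFingerprint {
    static String fingerprint(String host) throws Exception {
        HttpsURLConnection conn = (HttpsURLConnection) new URL("https://" + host + "/").openConnection();
        conn.connect();
        Certificate leaf = conn.getServerCertificates()[0]; // the server's own certificate
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(leaf.getEncoded());
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        conn.disconnect();
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Direct view; a second lookup routed through a proxy, VPN, or Tor would go here
        // so the two fingerprints can be compared.
        System.out.println(fingerprint("twitter.com"));
    }
}
```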

Oddly, can’t connect to https://www.codinghorror.com/blog/2012/02/should-all-web-traffic-be-encrypted.html

(BTW, about the notice at the bottom of your page, “(c)” is a nonentity as far as US copyright law goes. The law is explicit about what constitutes a copyright notice, and the three characters “(”, “c”, and “)” are not among them.)

…And https://stackoverflow.com still results in a certificate error.

@Duffbeer703, how would IT commit a man-in-the-middle attack without the user knowing about it? Install a compromised browser or computer spyware?

There’s a single point in SSL that makes me shiver, and that’s the CAs.

I just don’t trust the current CAs out there and I think that the model is broken. There should be a public and open CA to issue certificates in a secure and manageable way. Besides that, the current model inflates certificate prices.

Glad you finally came around to our way of seeing things :slight_smile:

I sort of like Amazon’s approach to this problem. They have a two-stage authentication setup which is a nice compromise between security, performance, and convenience.

Essentially, if you’re a registered user and you return to Amazon.com to do some shopping, then you’re half-way logged in by default. You can browse your wishlist, or look at Amazon’s recommendations for you. You can even look at reviews you may have given.

But if you decide to make changes, view past orders, or do anything that reveals sensitive information, then Amazon forces you to log in for real.

However, after you log in for real, Amazon doesn’t switch you over to HTTPS for the entire site. This might be a mistake, and is likely something they’ll change in the future.
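Expressed as code, the pattern is basically a session that carries an authentication level instead of a single logged-in flag. This is only an illustrative sketch of the idea, not Amazon’s actual implementation:

```java
// Sketch of a "half logged in" model: low-risk personalization needs only recognition
// (e.g. a long-lived cookie); anything sensitive requires a fresh, real login.
public class TwoStageAuth {
    enum AuthLevel { ANONYMOUS, RECOGNIZED, FULLY_AUTHENTICATED }

    // Recommendations, wishlist browsing, and similar personalization.
    static boolean canPersonalize(AuthLevel level) {
        return level != AuthLevel.ANONYMOUS;
    }

    // Viewing past orders, changing addresses or payment details, etc.
    static boolean canAccessSensitiveData(AuthLevel level) {
        return level == AuthLevel.FULLY_AUTHENTICATED;
    }
}
```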

Why can’t web apps just present a challenge on every page? Basically, imagine if every request was a POST and on the page there was a challenge, and you had to have the correct response in order for your IP to not get banned (or whatever).

Just a thought.
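That is essentially a per-request token, much like a CSRF token that rotates with every page. A rough sketch under that interpretation (names are illustrative):

```java
// Sketch: each rendered page embeds a one-time challenge tied to the session,
// and the next POST must echo it back or the request is rejected.
import java.security.SecureRandom;
import java.util.Base64;
import javax.servlet.http.HttpServletRequest;

public class PageChallenge {
    private static final SecureRandom RANDOM = new SecureRandom();

    // Called while rendering a page: mint a token, remember it in the session,
    // and embed it in the page's form as a hidden field.
    static String issueToken(HttpServletRequest request) {
        byte[] raw = new byte[32];
        RANDOM.nextBytes(raw);
        String token = Base64.getUrlEncoder().withoutPadding().encodeToString(raw);
        request.getSession().setAttribute("challenge", token);
        return token;
    }

    // Called when the next request arrives: the submitted token must match the stored one.
    static boolean verify(HttpServletRequest request) {
        Object expected = request.getSession().getAttribute("challenge");
        String submitted = request.getParameter("challenge");
        return expected != null && expected.equals(submitted);
    }
}
```

The catch is that over plain HTTP an eavesdropper who can steal your cookie can read the embedded challenge just as easily, so this raises the bar a little but doesn’t remove the need for encryption.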

A compelling argument, but what about caching?

Bandwidth has gotten cheaper and cheaper, true, but there are still many places where caching makes sense: second-world universities and large offices, for example, that may have a slower pipe for free browsing.