We Done Been ... Framed!

Seriously, frame-busting scripts again? What year is it?

The user gets to decide how to use your site. That’s the choice you make when you publish it on the internet. Why should you care about a frame? It’s the user’s choice to have a frame. Shockingly this page has a frame around it too. The frame is none of your business.

Or am I missing something?

@Robert: “We Done Been… Framed!
What???
Is that english? I can’t make sense of the title, am I missing some obscure reference?”

Yes you are

A Firefox extension would be great.

The thing is, clickjacking only needs CSS; it doesn’t need JavaScript. Any solution based on just disabling JavaScript isn’t going to get you anywhere. (Not sure if NoScript, to pick a specific example, does anything extra to prevent clickjacking; it might.)

If you disable JavaScript, you disable any anti-anti-clickjacking the malicious site might have… but you’ve also disabled the anti-clickjacking measures themselves anyway.
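To make the CSS-only point concrete, here is a minimal sketch of the usual setup, assuming a made-up victim URL and decoy button: the attacker’s page stacks a transparent iframe of the target site over a harmless-looking control, with no JavaScript involved at all.

    <!-- attacker page: only CSS positioning and opacity, no script -->
    <button style="position:absolute; top:100px; left:100px;">
      Click to claim your prize
    </button>

    <!-- Transparent iframe of the victim site layered over the decoy.
         The victim's real button sits exactly under the visible one,
         so the user's click lands on it instead. -->
    <iframe src="https://victim.example/settings"
            style="position:absolute; top:60px; left:40px;
                   width:400px; height:300px;
                   opacity:0; z-index:10; border:0;">
    </iframe>

Disabling JavaScript on the attacker’s page does nothing to stop this; it only kills the victim page’s frame-busting script.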

Shame on you, Jeff. Shame! You are writing a blog post on the evils of the web while you are propagating the greatest evil of all: scripts.

Without scripts there can be no XSS, and clickjacking becomes very hard. Yet I have to enable scripts just to post on this site. Shame on you!

Scripts are for those who either lack the knowledge to build Servlets (or some non-Sun variation) or who are too lazy to do so. You are a programmer, so I will assume that it is not a lack of knowledge behind your propagation of this evil; I must assume that you either truly do not care about security or are too lazy to enact it.

Mark

@Andrew
Yes, you’re missing that it allows clickjacking, which was the entire second part of the post…

@seanb, perhaps if you are a .NET shop that is true; I am not a .NET programmer, so I do not know. I do know that Java Servlets can do most anything that JavaScript can do (including the neat Google-type stuff), plus many things that scripts cannot do. Most of what they cannot do, reputable sites have no use for anyway (injecting code onto the client’s hard drive, etc.).

The one use I have seen for scripts that servlets cannot cover is displaying Swing components on a web page, but JavaScript cannot do that either; you have to use JavaFX for that. If you are a .NET shop, I’m sure Silverlight would be comparable.

The point is, turning scripts off breaks this page. That is a big no-no. One should never build a page that breaks from a lack of scripts. Many workplaces will not allow scripts to be run at all (e.g. DOE, DOD).

The thing about innovation is that if you are given freedom to do good work in your own field of expertise, you perform well.

Since this does not happen in the current setup of industry and the world in general, people who are good at framing, counter-framing, installing malware, and so on are encouraged by the system, while the straightforward, honest guys are prevented from doing what they like, even though doing it would let them anonymously help many more people on the web while remaining totally dispensable.

This is good for them as much as for the company. That is the good thing about people earning from open-source programming work, especially web development.
When “Elvis” programmers with good web-development skills and a wider vision of the good uses of web technologies are prevented from contributing, those programmers get frustrated, the web innovates more slowly, and innovation remains in the hands of a few, leading to monopolies or oligopolies.

This is elementary socio-economics, which the BOSSes seem not to know, or choose to ignore for their own perverse pleasure. In such a company, the Holy Grail or Promised Treasure of happy harmony is impossible.
And “Elvis” or “Einstein” programmers get discouraged and sometimes distraught. This is not a good way for the BOSS to run a company. It is agreeable that there is a queue in the company for every desirable post or task; without that queue it would be quite a chaotic and unfair system.

But if the queue uses bondage rather than open, acceptable rules for enforcement, it moves from an organised company to a terrible jail.
This only shows the BOSSes in a very poor light, and it is what the present-day web-development ecosystem looks like becoming.

Hence these exploits and these discussions. Any possibilities of change?

Sometimes good programmers decide to quit when they see such behaviour all over the company. It is good to be someone who has been close to the BOSS on earlier projects and has successfully completed projects of importance to the company. But to see such people behaving like predators towards newcomers is not good for the company.

What is true for a company is also true for the system and the industry as a whole. This is why the web is so badly broken. There is full freedom to do evil, and no freedom or purpose to repair the broken web. Maintaining standards is an activity that does not bring the BOSSes of the companies any revenue, after all.

[/rant]

icanhaspostcomment?

How many legitimate uses are there for onBeforeUnload (and onAfterUnload)? It seems to me that allowing a page to interfere with when the browser leaves it is inviting abuse.

It is presumably what enables sitemeter to tell me the outgoing link that a reader clicked, which again is something I am a bit uncomfortable with.
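For what it’s worth, the one widely cited legitimate use of onBeforeUnload is warning about unsaved changes. A minimal sketch, assuming a hypothetical hasUnsavedChanges flag maintained by the form code:

    // Prompt before leaving only if the user has unsaved work.
    // The browser shows its own confirmation dialog; the returned
    // string merely signals that a prompt should appear.
    var hasUnsavedChanges = false; // hypothetical flag set elsewhere

    window.onbeforeunload = function () {
      if (hasUnsavedChanges) {
        return 'You have unsaved changes.';
      }
      // Returning nothing lets navigation proceed silently.
    };

Anything beyond that, like delaying or logging navigation, starts to look like the abuse being complained about.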

As someone alluded to earlier, there is a very simple fix for this kind of problem.

Obviously you can’t ultimately block framing via JavaScript, but you can do it server-side by checking the HTTP Referer header.

You might not be able to completely break out of the frame but at least you can serve up a different page.

At least you can prevent clickjacking in this manner, by requiring that the person originated from your site and isn’t inside a frame.

Besides, if your JavaScript frame-busting fails, you can simply have the function blank the entire page and spit out a warning message, since cross-domain security applies in both directions.
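A rough sketch of that blank-and-warn fallback, assuming it runs at the end of the document body; the framing page cannot interfere with it, because cross-domain restrictions keep the parent out of the framed page’s DOM just as they keep the framed page out of the parent’s:

    // Frame-busting with a fallback: try to break out of the frame,
    // and if that is blocked, at least blank the page so there is
    // nothing useful left to clickjack.
    if (top !== self) {
      try {
        top.location = self.location; // classic frame-buster
      } catch (e) {
        // Cross-domain access to top may throw; ignore it.
      }

      // Fallback: replace our own content with a warning. The parent
      // page cannot prevent this.
      document.body.innerHTML =
        '<h1>This page has been framed by another site.</h1>';
    }

The server-side half is just comparing the incoming Referer header against your own domain and serving a different page when it does not match, bearing in mind that the Referer header is optional and easily stripped.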


Jeff - not sure on a couple of points and hope you can clear them up:

  1. You speak of shortened URLs as evil… but assuming we have our security on PCs and Servers tied up, how are they bad? I’ve read your articles and I’m still left wondering why you find them so bad when the rest of us deal with them so well.

  2. What do you have against framing? I like framing. You have explained how they do it and how it is hard to stop, but you haven’t explained why you don’t want it. Again, I’m left wondering why you find it so bad when the rest of us deal with it so well.

Is this a loss in advertising revenue? Is this altering your site’s statistics? Please explain.

Dude, you actually have a typo in your post… first one I’ve seen you make.

Shortened URLs are evil because you cannot see where you are going.

Frames are an old-fashioned, inconsistent, and generally bad way of doing what can be done with DIV/SPAN more easily, more consistently, and with more flexibility…

Like onBeforeUnload and onAfterUnload, framing now has very little use outside of being annoying and hijacking…
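For the record, the kind of two-pane layout framesets were typically used for can be reproduced with plain divs and CSS; a minimal sketch (sizes and content are made up):

    <!-- A fixed sidebar plus an independently scrolling content area,
         roughly what a <frameset cols="200,*"> used to provide. -->
    <div style="position:fixed; top:0; left:0; width:200px; height:100%;
                overflow:auto;">
      ... navigation links ...
    </div>
    <div style="margin-left:200px; height:100%; overflow:auto;">
      ... main content ...
    </div>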

@Jaster: “Shortened URLs are evil because you cannot see where you are going.”

So what?

Ever seen those new barcodes on posters that take you to a web page when you photograph them? I don’t know where those are taking me either!

Half the time I don’t know where taxi drivers are taking me - even when we communicate as to where I want to go.

Are either evil?

Could someone please explain how an unclear end location is evil, or even dangerous if client security is up to date?

Ta.

I’d still be quite happy to win a goat.

@Philip,

The hidden URLs are evil because they are extremely useful for phishing. Having your security up to date won’t protect you from a zero-day phishing site asking for your bank password.

@Practicality

Agreed, but does anyone fall for that any more? I don’t know anyone who has ever been silly enough to fall for one of those. Maybe my circle of friends is smarter than the average bear.

So is this only “evil” for those who are silly enough to get caught and those who are upset about the impact on their internet stats???

Sorry - but I still don’t get how it is bad.