We Done Been ... Framed!

The thing about innovation is that if you are given freedom to do good work in your own field of expertise, you perform well.

Since this does not happen in the current setup of industry, or the world in general, the system rewards people who are good at framing, counter-framing, installing malware and so on, while the straightforward types are prevented from doing the work they like: work through which they could anonymously help many more people on the web, while also remaining totally dispensable.

This benefits them as much as it benefits the company. That is the good thing about people earning a living from open-source programming work, especially web development.
When “Elvis” programmers with good web development skills and a wider vision of the good use of web technologies are prevented from contributing, those programmers get frustrated, the web innovates more slowly, and innovation remains in the hands of a few, leading to monopolies or oligopolies.

This is elementary socio-economics, which the BOSSes seem to not know or choose to ignore for their own perverse pleasures. In such a company, the Holy Grail or Promised Treasure of happy harmony is impossible.
And “Elvis” or “Einstein” programmers get discouraged and sometimes distraught. This is not a good way for the BOSS to run a company. It is reasonable that there is a queue in the company for every desirable post or task; without that queue it would be quite a chaotic and unfair system.

But if the queue uses bondage rather than open, acceptable rules for enforcement, it moves from an organised company to a terrible jail.
This only shows the BOSSes in very poor light. This is what the present-day web development ecosystem looks like it is becoming.

Hence these exploits and these discussions. Any possibilities of change?

Sometimes good programmers decide to quit when they see such behaviour all over the company. It is good to be someone who has been close to the BOSS in earlier projects and to have successfully completed projects of importance to the company. But to see such people behaving like predators with newcomers is not good for the company.

What is true for a company is also true for the system and the industry as a whole. This is the reason the web is so badly broken. There is full freedom to do evil and no freedom and purpose to repair the broken web. Maintaining standards is an activity that does not give the BOSSes of the companies any revenue, after all.

[/rant]

icanhaspostcomment?

How many legitimate uses are there for onBeforeUnload (and onUnload)? It seems to me that allowing a page to interfere with when the browser leaves it is inviting abuse.

It is presumably what enables sitemeter to tell me the outgoing link that a reader clicked, which again is something I am a bit uncomfortable with.
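For what it’s worth, the one widely accepted legitimate use of onBeforeUnload is warning users about unsaved work. A rough sketch (the dirty flag and handler names are made up for illustration):

```javascript
// Hypothetical "dirty" flag an app would set when the user edits a form.
let hasUnsavedChanges = false;

// Returns a warning string when there is unsaved work, undefined otherwise.
// Modern browsers show their own generic text, but returning a string (and
// setting event.returnValue for older ones) still triggers the dialog.
function beforeUnloadHandler(event) {
  if (!hasUnsavedChanges) return undefined;
  const message = "You have unsaved changes.";
  if (event) event.returnValue = message; // legacy browsers read this
  return message;
}

// Wire it up only when actually running in a browser.
if (typeof window !== "undefined") {
  window.onbeforeunload = beforeUnloadHandler;
}
```

Anything beyond that (blocking navigation outright, rewriting outgoing links) is exactly the kind of interference being complained about here.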

As someone alluded to earlier, there is a very simple fix for this kind of problem.

Obviously you can’t block framing via javascript ultimately, but you can server side by checking the HTTP Referer header.

You might not be able to completely break out of the frame but at least you can serve up a different page.

At least you can prevent clickjacking in this manner though, by requiring that the person has originated from your site and isn’t inside a frame.
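A sketch of that server-side check (the helper name is made up, and the Referer header is optional and spoofable, so treat it as a heuristic rather than a guarantee):

```javascript
// Heuristic: was this request referred from somewhere other than our own
// site? A missing Referer is common (privacy settings, direct visits), so
// we let those through rather than breaking legitimate readers.
function referredExternally(refererHeader, ownHost) {
  if (!refererHeader) return false; // no header: can't tell, allow it
  try {
    return new URL(refererHeader).hostname !== ownHost;
  } catch (e) {
    return true; // malformed Referer: be suspicious
  }
}

// In a (hypothetical) request handler you would branch on it, e.g.:
//   if (referredExternally(req.headers.referer, "example.com")) {
//     res.end("This page must be viewed directly at example.com");
//   }
```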

Besides, if your JavaScript frame-busting fails, you can simply have the function blank the entire page and spit out a warning message, since cross-domain security applies in both directions.
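That fallback could look something like this (a sketch; the detection is split into a plain function so the logic is testable, and the wording of the warning is made up):

```javascript
// True when this window is rendered inside another frame. Takes a
// window-like object as a parameter so the check can be exercised
// outside a browser.
function isFramed(win) {
  return win.top !== win.self;
}

if (typeof window !== "undefined" && isFramed(window)) {
  // Frame-busting may have been neutralized, but we can still refuse to
  // be useful: blank our own page and show a warning instead of content.
  document.body.innerHTML =
    "<h1>This page has been framed by another site.</h1>" +
    "<p>Please visit it directly for the real content.</p>";
}
```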


Jeff - not sure on a couple of points and hope you can clear them up:

  1. You speak of shortened URLs as evil… but assuming we have our security on PCs and Servers tied up, how are they bad? I’ve read your articles and I’m still left wondering why you find them so bad when the rest of us deal with them so well.

  2. What do you have against framing? I like framing. You have explained how they do it and how it is hard to stop, but you haven’t explained why you don’t want it. Again, I’m left wondering why you find them so bad when the rest of us deal with them so well.

Is this a loss in advertising revenue? Is this altering your site’s statistics? Please explain.

Dude, you actually have a typo in your post… First one I’ve seen you make.

Shortened URLs are evil because you cannot see where you are going.

Frames are an old-fashioned, inconsistent and generally bad way of doing what can be done with DIV/SPAN more easily, more consistently and with more flexibility …

Like onBeforeUnload and onUnload, framing now has very little use beyond being annoying and hijacking …

@Jaster: “Shortened URLs are evil because you cannot see where you are going”

So what?

Ever seen those new barcodes on posters that, when you take a photo of them, can take you to a web page? I don’t know where those are taking me either!

Half the time I don’t know where taxi drivers are taking me - even when we communicate as to where I want to go.

Are either evil?

Could someone please explain how an unclear end location is evil, or even dangerous if client security is up to date?

Ta.

I’d still be quite happy to win a goat.

@Philip,

The hidden URLs are evil because they are extremely useful for phishing. Having security up to date won’t protect you from a zero-day phishing site asking for your bank password.

@Practicality

Agreed - but does anyone fall for that any more? I don’t know anyone who has ever been silly enough to fall for one of those. Maybe my circle of friends is smarter than the average bear.

So is this only “evil” for those who are silly enough to get caught and those who are upset about the impact on their internet stats???

Sorry - but I still don’t get how it is bad.

@o.s. – Invisible iframes are often necessary in multi-vendor Ajaxian web sites for cross-domain communication… until cross-document messaging takes off, we’re stuck with the late-’90s hidden-iframe trick.
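For reference, cross-document messaging (postMessage) lets frames on different origins talk without the hidden-iframe trick, provided each side checks the sender’s origin. A minimal sketch (the trusted origin is made up):

```javascript
// Only accept messages from an origin we explicitly trust; keeping the
// check in a plain function makes it testable outside a browser.
const TRUSTED_ORIGIN = "https://partner.example"; // hypothetical partner

function acceptMessage(origin, data) {
  return origin === TRUSTED_ORIGIN ? data : null;
}

if (typeof window !== "undefined") {
  window.addEventListener("message", function (event) {
    const payload = acceptMessage(event.origin, event.data);
    if (payload !== null) {
      // handle the cross-document message here
    }
  });
  // The other frame would send with an explicit target origin, e.g.:
  //   otherWindow.postMessage("hello", "https://partner.example");
}
```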

@Mark – Let’s see what happens to the web-based economy of the past 8 years if we all turn off JavaScript. Ooh, that’s ugly. Yes Google, but no Google Maps, Gmail, or any of thousands of elegant web applications that make life better. Features that fall under the “Ajax” rubric aren’t just bells and whistles – they’re called “functionality.”

seanb said it best… if security against hacking is your only motivator, use pen and paper.

Finally, @Jeff – good post. Ignore the nay-sayers.

I think that the only one who should be worried is the user. Consider the scenario where someone wants to provide a “browser in browser” service. Think of it like Digg providing internet browsing through their site, with the bar acting as a browser window frame. Of course you shouldn’t bust it; it is like a “virtual browser” in a browser. If the user doesn’t want or like it, he/she should manually change domain.

URL Shorteners USERS UNITE!

  1. Thou shalt not FRAME my URL.

  2. Thou shalt not display any advertisements during the redirection of my URL.

  3. Thou shalt not promote additional information during the redirection of my URL.

http://www.url360.me/commandments.html

Any pop-ups would be more damaging to the innocent site than they would be for the disreputable frame-buster-busting site…

Therefore, I must agree with this post:

http://beerpla.net/2009/02/12/how-to-fight-clickjacking-using-the-recent-twitter-hijacking-as-an-example/

The solution does require the user to be a bit more savvy about browsing (i.e. paranoid).

As a user, have NoScript or something like it block by default, and make sure you only allow the sites you trust to run any code. A bit troublesome to start with, but it grows on you like a third arm (very handy once you get used to it, some would say).

This of course would prevent the frame busting code from being busted.

Not even close to being a silver bullet, but unless the user trusts a site he/she shouldn’t trust, the chances of the browsing experience being safe are high.

We should blame the nature of general web browsing for this. The initial idea of having a way to read information online has gone beyond what it was supposed to be. We all know this, but we still need the web to be more than what it can be. That is why, time and again, we all think the answer might be to move away from all that JavaScripting towards something less open to so much interference:

Flash (and/or Adobe AIR, Flex), or Silverlight/WPF, to provide any sort of responsible service on the internet, and leave the rest of the web content to just book reading.

Personally I would like to see a world where I don’t have to enable JavaScript to have a good browsing experience.

Ric

I’m not sure why everyone continues to comment after Sicovitol spoke.

Frames aren’t the issue, period. Smart DOM processing means that you can’t prevent what you are trying to prevent. Ever.

@pkchukiss and others suggesting a solution involving detecting being framed, and then serving a page with a “click here to proceed to the content” link – would the framing site (attacker) be able to respond by just positioning their transparent button over your “click here” link?

@Andrew:

What you’re missing is that what other people wrap your site in can affect your reputation and perhaps even land you in legal trouble.

Let’s say your site is maliciously wrapped in an attack and the victimized site user decides to sue your company for failing to take adequate security measures to prevent them from falling prey to the scam. Do you really trust your judicial system to understand that the inherent structure of the web makes this impossible? I can easily see a judge and jury handing down a multi-million dollar tort judgement against an innocent website owner whose only crime was being victimized by a malicious clickjacker.

I wrote about my experience from the other side (me trying to frame Stack Overflow): https://medium.com/@leeleepenkman/running-a-http-proxy-server-at-scale-7a6397dd5dce#.czim1nn6g

I used the iframe’s sandbox attribute to disable alerts and navigation, e.g.
<iframe sandbox="allow-same-origin allow-scripts allow-forms allow-pointer-lock"></iframe>

The way I framed you was a bit different, in that my product has an iframe which is a proxy server sharing the same origin, to allow communication between the frames and to serve things through.
Technically you could combat that attribute: because I used ‘allow-same-origin allow-scripts’ together and I’m running on the same origin, you could actually have a script that looked into the top page and disabled any sandbox attributes on your own iframe (if cross-origin access allows), and then do your window.location.replace breakout once you have granted yourself the permissions…
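If the frame really is same-origin, that breakout could look roughly like this (a sketch under that assumption, not tested against any particular proxy; the helper name is made up):

```javascript
// Decide whether a breakout is worth attempting: we are framed, and the
// parent document is readable (i.e. same-origin, as with a proxy that
// rewrites our pages onto its own host).
function canAttemptBreakout(win) {
  if (win.top === win.self) return false; // not framed at all
  try {
    // Touching a cross-origin parent throws; same-origin access succeeds.
    return typeof win.top.document !== "undefined";
  } catch (e) {
    return false;
  }
}

if (typeof window !== "undefined" && canAttemptBreakout(window)) {
  // replace() avoids leaving the framed page in the session history
  window.top.location.replace(window.self.location.href);
}
```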

Then I could combat that by locking down the communications: not using the same origin, and using postMessage between the frames. You could probably combat that by destroying the functionality of your own page when you detect you have been framed/proxied; I could then start to regex-strip that destructive code out in my proxy server that serves Stack Overflow, and yeah, what a mess.

On the subject of proxies, I have seen ones more malicious and annoying than my product (WebFiddle). I noticed stackoverfliow.com, a spammy Stack Overflow proxy which adds its own ads, and I guess there’s another anti-malicious-proxy dance that goes on in a similar tit-for-tat style. It would be awesome to hear about how that goes too :smiley: