Why Are Web Uploads So Painful?

I upload files to a server all the time via FTP and it supports resume as well.

Surely this can be utilised by browsers as well.

I uploaded a video to MSN’s SoapBox last night and the experience was much better than anything you described. While the service isn’t as popular as YouTube, I noticed 2 things:

  • the quality of the resulting video was much better than YouTube’s
  • the upload experience was actually pleasant. During the upload, I was allowed to view other videos and even upload multiple files!

If MSN can do it, I am sure YouTube and others can too. BTW, I wasn’t using Internet Explorer, so there were no ActiveX components involved.

Using HTTP for file transfer (and email and everything else under the sun) seems like a classic case of a developer armed with a hammer treating every problem as a nail.

What happened to the protocol stack? Tunneling everything over HTTP is a big step backwards - TCP provides much better tunnels; it has flow control, error checking, and so on. POP, IMAP, and FTP are all much better suited to their respective tasks. HTTP sucks for anything requiring state information.

This is where web 2.0 has got us. It seems to stem from the public perception in the mid-’90s that the web and the internet are the same thing (“teh interweb”). Perhaps it’s just the default mental model most people have of computers. WYSIWYG. The GUI is the program (anything else is just too confusing). We have folders instead of directories that we “paste” files between.

It’s bad enough when it’s Joe Public’s mental model of the net; it’s unforgivable when programmers view the internet the same way.

I just love that your screenshot of IE’s download dialog is downloading Firefox.

I totally disagree that ‘HTTP was never meant for uploads’ - this observation is completely baseless. HTTP is underdeveloped for uploads via the browser, which is what Jeff’s post is all about. Because it’s underdeveloped, we all invent little nonstandard hacks to work around this basic need that currently goes unfulfilled.

We could easily implement better (and stricter) standards for HTML and browsers to support better uploading capabilities. Unless you’re doing something like re-encoding video or resizing pictures, chances are that you don’t (or shouldn’t) need a specialized uploader app. What we’re talking about here is meeting the needs of the 80% easily and letting the other 20% do their own thing.

I also have to make a rebuttal against all of the ‘just use Flash/whatever’ crowd. Flash is not a standard part of HTTP/HTML. Things need to be folded into the standard so that compliant clients can be built without relying on third-party proprietary applications. Flash isn’t free, nor is Flash typically found in a web developer’s skill set. I don’t want to hire someone just to do a Flash-based upload applet and then have to support that as well.


A personal peeve is how in ASP.NET you have to set both an IIS setting and a .config setting for the max upload size, with the unfortunate effect that you can’t (or couldn’t) restrict it to just one page - it affects the entire web application. Per-page restrictions are necessary to prevent DOS-style attacks.
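For what it’s worth, the limit can apparently be scoped to a single page with a location element in web.config - a hedged sketch, where Upload.aspx and the numbers are placeholders and behaviour may vary by ASP.NET/IIS version:

    <!-- web.config: raise the request limit only for the upload page -->
    <location path="Upload.aspx">
      <system.web>
        <!-- maxRequestLength is in KB, so 204800 = 200 MB -->
        <httpRuntime maxRequestLength="204800" executionTimeout="3600" />
      </system.web>
    </location>

The rest of the application then keeps the stricter default, which is exactly the per-page restriction argued for above.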

I also hate how you can’t just drag and drop the ‘to be uploaded’ files onto browsers. They unfortunately want to display the file in the browser instead of putting the path into the file input control. Flickr without the Uploadr is painful at best, and usability suffers.

Sounds like a great candidate for a Firefox plugin.

I get thoroughly fed up with the way providers such as Comcast hawk their service and encourage us to share our photos and videos, etc., but don’t provide adequate upstream bandwidth to make it usable and reliable.

I agree that the browsers could do a much better job of providing upload support for progress and restart, but there won’t be much incentive until the ISPs open up the pipe.

I use a dynamic iframe which posts to my upload PHP script, a server-side CGI which monitors the upload in real time, and some ajaxy stuff to show progress. All links on the page should have a different target, and the window.onbeforeunload event should be trapped to allow the user to confirm they want to leave the page.

It’s actually pretty complicated to pull this off (e.g., iframe behaviour across browsers is a total PITA), but there’s no excuse not to add to the HTTP/JavaScript standards at least some manner of feedback for the form file upload field, so people don’t have to struggle so much with this crap.
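A rough sketch of the hidden-iframe half of that approach, in plain JavaScript (uploadForm, the target name, and the message are placeholders; the server-side monitor is a separate piece entirely):

    // Post the form into a hidden iframe so the page itself never navigates
    // away while the upload is running.
    var iframe = document.createElement('iframe');
    iframe.name = 'uploadTarget';
    iframe.style.display = 'none';
    document.body.appendChild(iframe);

    var form = document.getElementById('uploadForm'); // placeholder id
    form.target = 'uploadTarget';

    var uploading = false;
    form.onsubmit = function () { uploading = true; };

    // Trap navigation so the user can confirm before abandoning the upload.
    window.onbeforeunload = function () {
      if (uploading) {
        return 'A file upload is still in progress. Leave anyway?';
      }
    };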

There’s a whole other problem, too – getting old film TO digital format so it CAN be uploaded. Disclosure: I work for www.imemories.com. This solves the upload problem for Baby Boomer types and for those who don’t have the bandwidth.

I’ve been using http://transferbigfiles.com lately. Not sure what their upload progress bar is actually built on, but it’s a nice enough service for sending big files to one or more people quickly.

Our team has developed a web site that allows file uploads along with a fairly accurate progress indicator and an estimated upload time. Send me an email at mehus***NOS_PAM***ain[at]yahoo[dot]com if you’d like a preview…

I’d like to point out that Opera does have full transfer feedback, up and down. It does not appear to calculate the total upload/download size, so it cannot give you a percentage, but it does give upload speed and progress in bytes. This is a general setting for all webpages, and the interface covers both up and down. When I open a page, I can see it downloading and, by the same token, I can see how much has been uploaded when a web form uploads a file. This is extremely useful for email attachments and any other web-based uploading.

I don’t, however, use YouTube or any other such video site unless I want to spread the video to the general public. FTP (or rather SFTP) is still a far superior method of uploading files to a server, and BitTorrent is without a doubt the best way to upload to multiple computers. And yes, you can use BT to upload to a server network, not just peer clients.

Here’s an example screenshot showing the interface on a regular page load. Uploading looks the same; the upload is lumped in with the loading of the receiving page.

http://img456.imageshack.us/img456/8277/operaawesometh9.png

It’s also extremely useful for embedded content, as you can see its load progress in terms of speed and size, if not percentage.

While there is a percentage progress bar there, it is rather inaccurate, as the browser’s information about the page is updated during loading.

What an amazingly timely post, Jeff, for me at least. I’m currently dealing with this very situation in a web application I’m writing that needs to handle large (multi-gig) file uploads in a user-friendly way.

The problem, as others have mentioned, is that the current version of the much-hacked HTTP protocol doesn’t support any sort of file upload other than the whole damn thing. That’s how a file upload works – the browser sends a single POST request with the entire file embedded in a multipart/form-data body. There’s no way to get real feedback from the server, because the server doesn’t do anything until it receives the whole POST, and if the timeout isn’t long enough or the maximum acceptable file length isn’t large enough, the user ends up with a failed transmission after having waited for several minutes/hours. There’s no way to get around it without hackery.
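For the curious, here is roughly what goes over the wire – one multipart/form-data POST with the raw file bytes inline (the URL, boundary, field names, and sizes are all made up):

    POST /upload HTTP/1.1
    Host: example.com
    Content-Type: multipart/form-data; boundary=----boundary42
    Content-Length: 5243120

    ------boundary42
    Content-Disposition: form-data; name="video"; filename="holiday.avi"
    Content-Type: video/x-msvideo

    [5,242,880 bytes of raw file data]
    ------boundary42--

The server can’t sensibly respond until that closing boundary arrives, which is exactly why the user gets no feedback in the meantime.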

Some others have suggested FTP. Yes, FTP is designed for file transfers – but remember that it’s designed specifically for file transfers, and for no other purpose – and if you go the FTP route, you open up several cans of worms. Web applications are built on top of HTML/HTTP, with cookie- or URL-embedded session-based authentication. If you want to include FTP, you have to figure out a way to carry that authentication into the FTP session, a very non-trivial task. Even then, you’re stuck with whatever FTP client the browser provides, meaning the entire website layout disappears, creating a big usability nightmare.

There are third-party hacks out there that do a decent job of providing feedback… typically they take the form of an AJAX component that sends the file in the background, with a server-side component that monitors the progress of the upload, tied to a web service that the AJAX uploader calls repeatedly to obtain the percent uploaded.
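The client half of that pattern is straightforward; here’s a sketch, where /progress?id=… stands in for whatever web service the server-side monitor exposes (the endpoint and its response shape are invented for illustration):

    // Poll the hypothetical progress endpoint once a second and update a bar.
    function pollProgress(uploadId) {
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/progress?id=' + uploadId, true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState !== 4 || xhr.status !== 200) return;
        // Assumed response shape: {"received": 1048576, "total": 5242880}
        var info = JSON.parse(xhr.responseText);
        var pct = Math.round(100 * info.received / info.total);
        document.getElementById('progressBar').style.width = pct + '%';
        if (info.received < info.total) {
          setTimeout(function () { pollProgress(uploadId); }, 1000);
        }
      };
      xhr.send(null);
    }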

Although this helps solve the usability issue, it also opens the server up to denial of service attacks, as it requires very large maximum file upload and very long timeout settings for the server.

Alas, there isn’t a web application silver bullet. The best way to maintain usability and security in a pure web application would be to split the file into chunks the web server can handle (as suggested by transcriber) and reassemble them server-side. Unfortunately (for entirely different security reasons), browsers don’t typically allow client-side scripts to perform the necessary file I/O for splitting the file.
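The server-side reassembly would be the easy half; a purely illustrative sketch in server-side JavaScript (Node-style), assuming the client somehow managed to post numbered chunk files – which, as noted above, is exactly the part browsers won’t do:

    // Stitch numbered chunk files (chunk-0 ... chunk-N) back into one upload.
    var fs = require('fs');
    var path = require('path');

    function reassemble(chunkDir, chunkCount, outFile) {
      fs.writeFileSync(outFile, Buffer.alloc(0)); // start with an empty file
      for (var i = 0; i < chunkCount; i++) {
        var chunk = fs.readFileSync(path.join(chunkDir, 'chunk-' + i));
        fs.appendFileSync(outFile, chunk);
      }
    }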

That client-side limitation can be gotten around with an ActiveX control or a plugin, but that route brings up user-pain issues (another thing to install, etc.).

Still… having to install an ActiveX control (or Silverlight, or whatever) may be the path of least pain, especially given the feedback benefit.

I just had a discussion with some colleagues about this. They just got an upload progress bar set up for their folder-sharing site [1]. They said it involved “a deal with the devil” and “a world of darkness and terror” from a code perspective (.NET shop), but that once it was set up, it works really well. Just my (well, their) two cents.

-Erik

[1] - http://folders.freshlogicstudios.com/

I’m currently using FileUp from SoftArtisans in a project where we have 3GB+ uploads and downloads. To get all of the functionality, you need to purchase the Enterprise version.

http://fileup.softartisans.com/

And now I need to post a retraction. It turns out that IIS 7 / Windows Server 2008 will make FTP authentication integration with a web application much more straightforward:

http://weblogs.asp.net/scottgu/archive/2007/09/27/iis-7-0-hits-rc0-lots-of-cool-new-iis7-extensions-also-now-available.aspx

Of course, there’s still the whole separate client issue, but it’s a big jump in the right direction.

It’s true. I use software I wrote myself – a .NET ClickOnce app, a Java applet, or an ActiveX control – to upload. Real feedback, but you have to install software…

The most general uploading tool that I know of is Firefox Universal Uploader:

https://addons.mozilla.org/en-US/firefox/addon/4724

I have no idea how well it works, but the UI is painfully, painfully slow. It runs like molasses. If you don’t mind being tied to a specific upload type (e.g. photo sites), there are some good site-specific upload managers like Fotofox, which will not only manage uploads but also handle tagging, folders on the remote site, and so on, all on the local machine rather than having to do it via POSTs to the remote site.

I agree absolutely with your article. I do not understand why the Firefox developers do not take the chance to enhance their product by offering a decent file upload interface. I think browsers should offer a wider palette of tools; building upload forms in JavaScript or Flash looks like running around an obstacle in fancy ways instead of finding a simple solution. Enhancing the browser’s user interface for file upload would give an instant benefit to all traditional websites as well!