It’s pretty sad when your OS needs third-party software to copy files correctly.
User perception in simpler terms: static bar == slow, dynamic bar == fast. So why not push the bar forward gradually to take away the static impression? It's all relative.
LOL. I was backing up my “downloads” directory last night onto my second hard drive. Mostly small files, but a few CD/DVD images from MSDN:AA. The “estimated time remaining” kept fluctuating between 7 and 30 minutes (WinXP).
I understand some algorithms are difficult to estimate accurately, but file copying? Shouldn’t it just add up the size of all selected files and divide by # bytes/sec copied?
Copy time depends to varying degrees on OS cluster sizes, hard disk (or whatever media) cluster sizes, internal file fragmentation, hard disk fragmentation (some of which could be very natural fragmentation simply from moving disparate files that have no reason to be unfragmented with respect to each other), memory fragmentation, amount of memory, other hard disk characteristics, other I/O that competes with file transfers for bus bandwidth, thrashing from other processes, what other software is consuming resources, what is sitting in memory at the present time, the size of the swap file, the transfer error rate, the HDD cache, and (I kid you not) the ambient temperature, amongst many, many other things. The bytes/second copied for any given operation cannot be known beforehand. We can only try to improve our guesses.
If any computer operation were as obvious as the one you mentioned, everybody would be doing it that way and progress bars would never be inaccurate.
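One common way to tame the wild ETA swings described above is to smooth the measured throughput before dividing it into the remaining byte count. This is a hypothetical sketch using an exponentially weighted moving average, not what Explorer actually does; the `alpha` value is a tuning assumption:

```python
# Sketch of an ETA estimator that smooths observed throughput with an
# exponentially weighted moving average (EWMA). Purely illustrative;
# real copy dialogs are free to do something entirely different.

class EtaEstimator:
    def __init__(self, total_bytes, alpha=0.1):
        self.total = total_bytes
        self.copied = 0
        self.rate = None    # smoothed bytes/sec, unknown until first sample
        self.alpha = alpha  # lower alpha = smoother but slower to react

    def update(self, bytes_this_tick, seconds_this_tick):
        """Record one measurement interval and update the smoothed rate."""
        self.copied += bytes_this_tick
        sample = bytes_this_tick / seconds_this_tick
        if self.rate is None:
            self.rate = sample
        else:
            # Blend the new sample into the running estimate.
            self.rate = self.alpha * sample + (1 - self.alpha) * self.rate

    def seconds_left(self):
        """Remaining bytes divided by the smoothed rate (None if no data)."""
        if not self.rate:
            return None
        return (self.total - self.copied) / self.rate
```

With a small `alpha`, one slow CD/DVD image in the middle of a batch of small files nudges the estimate instead of whipsawing it between 7 and 30 minutes.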
[OS X Tiger] treats images just like any other file and gives me a small icon
Make sure “Icon view” is on, then right-click in the finder, click View Options and tick Show Icon Preview. Fixed!
Of course it's all much better in Leopard
You're right: if x were greater than 1 you would end up with a non-real result. However, as the graph measures from 0% to 100%, x will never be greater than 1 (1 is conventionally 100%, 0.5 is 50%, etc.).
Of course I’m no math wizard either so please correct me if I’m wrong.
Man, I never thought of that! I'll make a small piece of software with those varying progress bar styles and ask my friends to share their thoughts… I wonder whether I'll get the same results as the study.
Brazilian Cheers.
@Daniel Maloney
You’re spot on
I was so blinded by the “0% - 100%” used in the graph that I totally forgot about the 0-1 representation. Thanks for reminding me 
Riis
Just a couple of things:
The perception part is so important. Andrew Moyer’s point is exactly right: even increasing the load (and time) by providing feedback can result in better perception and user happiness. Been there, done that.
jaster's point about spinniness is also well taken. You have to be careful with just cycling something without providing actual feedback. Sometimes the Mac “shows” progress by using a spinning multi-colored ball. But it uses the same thing when an app hangs - so some call it “the beach ball of death”, and it creates the expectation of a hang. Quite the opposite of the intention, I think!
Thanks for the article. It provides an interesting aspect for determining the speed of a process, and I think is something that all software should take into account.
I think that the write-behind can be bad when the write fails.
Unfortunately MS won’t get any love from increasing reliability at the cost of performance.
People only value reliability after they’ve been hurt by the lack of it.
“I tend to use a pessimistic time progress, where if the task is 90% I report the time left multiplied by 1.05, and if 80%, multiplied by 1.075.”
Anyone else find this comment troubling?
As an electrical engineer I swore a professional oath, and I don’t recall any place in the oath where it said “lie to the user to make them feel better”. I’m sure software engineering’s oath also lacks that “lie to the user” phrase.
When I'm given a progress bar or time left estimate, I expect the programmer to TELL ME THE TRUTH, not lie about it. Otherwise, if it's a lie (estimated time multiplied by 1.075), there's no reason to believe what you're telling me. I don't know whether “5 minutes left” is the actual truth, or a fudge-factored number where the actual time is 4 minutes.
Be HONEST to your users.
Remember the ethical oath that you swore to uphold. I may not like hearing “5 minutes left”, but I'd rather hear the truth than wonder if the programmer is lying to me.
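For reference, the scheme being quoted above might look like this. The 1.05 and 1.075 factors and their breakpoints come straight from the quote; the padding used for earlier stages is an assumption added for illustration:

```python
# Sketch of the quoted "pessimistic ETA" scheme: pad the raw estimate
# by a fudge factor that shrinks as the task nears completion.
# The 80% -> 1.075 and 90% -> 1.05 breakpoints are from the quoted
# comment; the 1.1 factor for earlier stages is an assumed value.

def pessimistic_eta(raw_seconds_left, fraction_done):
    if fraction_done >= 0.9:
        factor = 1.05
    elif fraction_done >= 0.8:
        factor = 1.075
    else:
        factor = 1.1  # assumed padding when the task is under 80% done
    return raw_seconds_left * factor
```

The rationale usually offered for this kind of padding is that finishing early reads as a pleasant surprise, while overshooting the estimate reads as a failure - which is exactly the "lie to the user" trade-off being argued about here.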
Try launching simultaneous copies from a CD/DVD, that is also fun.
Somehow Perceived and Actual performance match: iffy.
SuperCopier is another tool that improves Windows' file copy.
( http://supercopier.sfxteam.org/modules/mydownloads/ )
The existence of such tools speaks loudly for the quality of the OS…
I remember earlier versions of IE: whenever you downloaded a web page (in the '90s, over dial-up), the progress bar would always speed towards the middle of the bar no matter what (trying to deceive us). When the first 20% of the page had REALLY downloaded, the progress bar would then MOVE BACKWARD to the 20% position.
It seems Microsoft staff already knew this human trick long ago, but it is hard to understand why the Vista team failed to learn from history.
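The behavior described above - racing toward the middle on elapsed time alone, then snapping back once real data arrives - could be sketched like this. This is purely illustrative and not IE's actual code; the `rush_rate` parameter is an invented knob:

```python
# Illustrative sketch of the old-IE-style "optimistic" progress bar
# described above: the displayed value creeps toward 50% on elapsed
# time alone, then gets overridden (possibly jumping backward) once
# real progress is known. Hypothetical; not IE's implementation.

def displayed_progress(elapsed_sec, real_fraction=None, rush_rate=0.1):
    # With no real data yet, advance toward the middle of the bar.
    optimistic = min(0.5, elapsed_sec * rush_rate)
    if real_fraction is None:
        return optimistic
    # Once real progress is known, show it - even if that means the
    # bar visibly moves backward, as the comment describes.
    return real_fraction
```

The backward jump is exactly the cost of this trick: the early motion buys perceived speed, but the correction makes the deception visible.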
This is totally bullshit.
I remember a number of times in Vista where copying large video files simply didn’t work, and cancelling them caused the whole OS to shit itself while it slowly tried to stop doing something it never fucking started.
Get yourself one of those fancy Scuzzy/RAID setups they had in the 80’s and all your file copy blues will be solved. Did I say that out loud?
Just to return to this old (and quite interesting) post, it’s of note that the latest Safari version (4.0) no longer gives any progress bar whatsoever – it’s just a spinning wheel. And indeed, I perceive that nothing is happening!
I’m still waiting for a skinable file copy so we can have creative people devise clever animations to replace the boring flying folders.
Perceived performance is the cornerstone of multi-tasking on computers, i.e. doing more things at once than you have processors available.
Two general points though:
- further evidence that over-optimising one aspect leads to a sub-optimal whole.
- users are fickle.
To me it doesn’t matter if the new copy is 2% faster, I just want to know what is actually happening. 10% would start to change things, but at that point, the problem is probably not progress reporting.
This reminds me of the story of people complaining that a new fancy elevator was slow. The designers proved it was fast, but complaints continued. The designers did research, and put a big mirror in the lobby.
Complaints stopped.
People could now preen in the few seconds before the elevator arrived, just like in most elevator lobbies.
(Sorry, I can’t remember who to attribute the story to - probably a ted.com presentation.)