Filesystem Paths: How Long is Too Long?

I don’t see what’s so outrageous about having long file names. If the user wants to name files something meaningful to themselves, who’s to say they shouldn’t use their computer that way? For that matter, for small enough notes, who’s to say the user can’t just put ALL the data in the file name? I can hear the shrieks now, but short of outdated technical reasons, is there any reason to prohibit users from doing this? Filename (and location) is just one of myriad pieces of metadata about a blob of data.

I tend to agree that hierarchical filesystems will start disappearing, because organizing everything along a single axis of taxonomy is inefficient for humans (the same goes for symlinks, which are possibly worse). But while they are still around, brain-damaged restrictions from the 1980s shouldn’t be limiting how users use their systems.

In PowerShell:
dir -Path "C:\" -Recurse | where { $_.FullName.Length -gt 150 } | sort { $_.FullName.Length } -Descending | select @{n='Length';e={$_.FullName.Length}}, FullName -First 1

Not sure if that where clause helps performance; it might not be necessary.
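For comparison, the same longest-path search can be sketched cross-platform in Python. This is a hypothetical helper, not from the thread; the demo walks a small throwaway tree rather than a whole drive:

```python
import os
import tempfile

def longest_path(root):
    """Walk root and return (length, path) of the longest full path found."""
    best = (len(root), root)
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)
            if len(full) > best[0]:
                best = (len(full), full)
    return best

# Demo against a small temporary tree instead of C:\
root = tempfile.mkdtemp()
deep = os.path.join(root, "a" * 20, "b" * 30)
os.makedirs(deep)
open(os.path.join(deep, "c" * 10), "w").close()
length, path = longest_path(root)
print(length, os.path.basename(path))
```

The filter-before-sort shape mirrors the PowerShell version: discarding short paths early means the sort only touches candidates.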

Ah, I see KL provided the PowerShell answer a while ago. I like mine better, though - the where clause seems to make it run more than twice as fast on my box (160 GB disk, dual 2.8 GHz, 3 GB RAM). My query took 4 minutes 47 seconds; KL’s was still running at 13 minutes.

I figured a baseline of 150 characters made sense since Jeff found that 152 was the longest path on a fresh install.

Here it is in a more terse format:
ls -r -fo C:\ | where {$_.FullName.Length -gt 150} | sort {$_.FullName.Length} | select -l 1 | select FullName | fl

System.IO.PathTooLongException

Yes I got it, from Jeff Atwood’s program.

I just hit this problem organising some MP3s. My hierarchy is only a few levels deep, but one or two albums had some very long album and track names. One track in particular went over the limit after the album was moved, and that left me with a file I could not even rename or move from Windows Explorer (Explorer just refused to do anything with it).

The workaround was to rename the lower-level folders so they weren’t so long, which is how it should have been from the start. However, the fact remains that I managed to blow the 260-character limit just by moving a few folders around, and the results were pretty unpredictable. For example, I couldn’t see the total running time for the album, and MS SyncToy simply skipped half the files it was meant to be backing up after hitting this file (which is how I found the culprit).

– Jason

I think the critical point is that there is no actual limit on the paths that can be created, just a limit in some Win32 APIs. Those APIs have to stay the same for backwards-compatibility purposes. OK; so old programs won’t be able to open files with over-long paths.

So the logical thing to do is create a new interface without the restriction, which new code can use. There is in fact a way to handle paths up to the filesystem’s limit, but it doesn’t support things like relative paths, and Microsoft neither promotes its use nor, in fact, uses it in its own software.

This problem could eventually go away if new API functions without the path limit were simply created, used throughout Microsoft’s software, and promoted for all new software. If that had happened around the year 2000, most people would be blissfully unaware today (honestly, WinNT4 would have been a good time to do it, but it’s never too late). If it happens tomorrow, maybe we’ll put this to rest by 2020. But it won’t happen tomorrow. When I turn 40 I’ll still be dealing with the path limit, and I won’t even have a flying car. Some future.

All these problems have one solution:

www.longPathTool.com

Path too long.

Error cannot delete file: cannot read from source file or disk

Cannot delete file: Access is denied

There has been a sharing violation.

Cannot delete file or folder The file name you specified
is not valid or too long. Specify a different file name.

The source or destination file may be in use.

The file is in use by another program or user.

Error Deleting File or Folder

Make sure the disk is not full or write-protected and that the file is not currently in use.

Path too deep.

I had a similar problem and finally I found a solution:
www.tooLongPath.com


Silverlight definitely needs long file paths:

C:\Documents and Settings\reallylongerusername\Local Settings\Application Data\Microsoft\Silverlight\is\qzm2cwh3.bk0\xvzol3cm.540\1\s\wbt3ggyazzgf42xkvmzkumxhseqylr2brzmpatqodms5hg2z1jaaahaa\f\888b19e95-cf6b-4b0c-9588-1a8ea8ce8ca4-c4b19e95-cf6b-4b0c-9588-1a8ea8ce8ca4.db

You’re left with around 50 characters once that completely OTT IsolatedStorage path is created.

Setting aside the massive amount of boring trolling by users of various operating systems…

The biggest problem is that a not-inconsiderable number of those 260 characters are taken from you by default paths. With Visual Studio, for instance, you start at “C:\Users\user.name.domain\Documents\Visual Studio 2010\Projects”, and THEN, if you have a sensible naming structure, you use something akin to the project namespace, say “yourapp.company.com”, which is then used twice: once for the ‘solution’-level folder and once for each project under it (whatever namespace each uses).

Only THEN do you actually get to the content folders of your project.
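To put numbers on that, a quick back-of-the-envelope check in Python, using the hypothetical user name and namespace from the example above:

```python
# Hypothetical Visual Studio defaults from the example above.
base = r"C:\Users\user.name.domain\Documents\Visual Studio 2010\Projects"
ns = "yourapp.company.com"

# The namespace appears twice: the solution folder and a project folder,
# each preceded by a backslash separator.
consumed = len(base) + 1 + len(ns) + 1 + len(ns)
print(consumed, "characters gone before any content folders;",
      260 - consumed, "left of MAX_PATH")
```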

To get anywhere near something useful you have to ignore all the Windows ‘friendliness’ anyway and create short paths from the drive root, like ‘C:\bill\projects’.

Bah!

~260 characters might be okay for most people, but anyone working with Node.js and/or npm on Windows is in for some hell with the node_modules directory. Once you have a project nested 8 directories deep and run npm install, you’ll wish you were on Linux or a Mac. Workarounds exist, but having to create a build process for a build process just to keep things from breaking is certainly something we shouldn’t need to do.
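As a rough illustration of why older npm versions (which nested every dependency’s dependencies) hit this so easily; the project root and the ~12-character average package name are made-up assumptions:

```python
# How many levels of nested node_modules fit before MAX_PATH is exceeded?
per_level = len("\\node_modules\\") + 12   # separator dirs plus ~12-char package name
base = len(r"C:\Users\me\projects\app")    # hypothetical project root
levels, length = 0, base
while length <= 260:
    levels += 1
    length += per_level
print(levels, "levels of nesting blow past 260 characters")
```

Each level costs about 26 characters, so even a modest dependency chain eats the budget fast; npm 3’s flattened installs were largely a response to exactly this.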


From what I’m seeing, there seems to be a slow but certain push by Microsoft to address this problem in their core libraries:

Refs:
https://github.com/dotnet/apireviews/tree/master/2015-08-26-long-path
https://github.com/dotnet/apireviews/tree/master/2015-07-28-long-paths
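There is also an opt-in at the OS level: Windows 10 version 1607 and later can lift the MAX_PATH limit for applications that declare themselves long-path-aware in their manifest, via the `LongPathsEnabled` registry value, roughly:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem]
"LongPathsEnabled"=dword:00000001
```

Note this alone isn’t enough; each application still has to opt in with the `longPathAware` manifest element, so legacy programs remain limited.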


The programmer in me hopes they ignore the 1% that actually care about this, and that there are at least some heavy API restrictions to stop them doing it. Even Linux, which can handle longer pathnames, runs into the restriction that you can’t transfer those long pathnames onto anything with FAT32 (like most USB drives). Last time I saw someone try to copy one of those stupidly deep Linux directory projects onto FAT32, it just truncated all the names and trashed the drive; I assume it’s been patched to at least warn that it can’t do the transfer now.

It seems to escape people’s grasp that the longer you make filenames, the slower disk access becomes, simply from having to parse through the stupidly long strings when finding and opening a file. So a change for the 1% who actually want this affects the 99% who never want it or use it.

I think a realistic alternative for people who want to do this rubbish is a cabinet structure that they mount and use. If you don’t know what that means: it’s like how you can open and browse inside zip files as if they were a normal disk drive, and the same with ISO files. The cabinet would contain the long names, with an API that supports them, and you don’t affect everyone else; all your long directory junk is just in a file that fits on a normal drive. It isn’t as if you’re going to be able to copy or move that directory structure anywhere else anyhow if you don’t do it in that format. So I have solved two problems: you get the stupidly deep and long filenames you want, and you can move your data onto a normal external USB drive for backup because it all looks like a single file.


Perhaps, but 260 is clearly too short a limit, based on old DOS values rather than any rational modern value. Something like 1024 would at least be a reasonable modern starting point.

Especially since Microsoft itself uses names like long_ass_name_and.{xxxx-xxxx-xxxx-xxxx}

I’m trying to copy a file with no extension (perfc) to the Windows folder on remote PCs. I’m using a robocopy command that works fine in other deployment packages, but with this one I get the following error in the log:

Result: Deployment ended: The directory cannot be removed. Stop(Failure). Credential: (halquist\perryh). ShareCredential: (lansweep). Command: robocopy \\server\Installers\perfc C:\Windows

I’m not sure whether the problem is the missing extension or a permissions issue writing to the Windows folder. I’ve tried installing both with ‘scanning credentials’ (an admin account) and with ‘Currently Logged In’, and get the same results.

Thoughts?