
Show all files/folders in Human-Readable mode #35

Closed

Prid13 opened this issue Sep 5, 2022 · 8 comments
@Prid13

Prid13 commented Sep 5, 2022

Just discovered this from StackOverflow and I'm impressed with its speed versus using dir or Get-ChildItem.

However, for my use case, I simply want to print out all the files and folders in a nice table with sizes in human-friendly values (kB, MB, GB, etc.).

The -hr switch works great, but it cuts off the results after a certain number of rows and only shows the last row after that. Looking at the code, it's after 48 rows.

Is there a planned feature to allow for all rows to be shown in -hr mode? It's a bit of a shame that the options are so few for what seems like a great tool :)

[screenshot: -hr output table, truncated after ~48 rows]

Side note: sometimes the first few letters are omitted from folders, like ode_modules instead of \node_modules. Am I doing something wrong?
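(Aside, for anyone reading along: the kind of kB/MB/GB conversion asked for above can be sketched in a few lines of Go. This is a generic sketch, not diskusage's actual implementation, and humanSize is just an illustrative helper name.)

```go
package main

import "fmt"

// humanSize converts a byte count into a human-friendly string using
// decimal units (kB, MB, GB, ...). Generic sketch, not diskusage's code.
func humanSize(bytes int64) string {
	const unit = 1000
	if bytes < unit {
		return fmt.Sprintf("%d B", bytes)
	}
	div, exp := int64(unit), 0
	for n := bytes / unit; n >= unit; n /= unit {
		div *= unit
		exp++
	}
	return fmt.Sprintf("%.2f %cB", float64(bytes)/float64(div), "kMGTPE"[exp])
}

func main() {
	fmt.Println(humanSize(532))           // "532 B"
	fmt.Println(humanSize(1_234_567))     // "1.23 MB"
	fmt.Println(humanSize(9_876_543_210)) // "9.88 GB"
}
```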

@Prid13
Author

Prid13 commented Sep 5, 2022

Upon closer inspection, why is the "Total size" showing 0? :(

@aleksaan
Owner

aleksaan commented Sep 6, 2022

The -hr switch works great, but it cuts off the results after a certain number of rows

50 rows is the limit of what will be printed in human-readable mode.

Is there a planned feature to allow for all rows to be shown in -hr mode?

And what will you do if you get 100,000 rows in the console, for example? :) I could add a no-limit flag, but that would be dangerous and unpredictable behavior. You can't look over all of those rows with your eyes, and you can't process them manually. But if you want to process these rows in another program, you shouldn't use -hr mode; get the full results in JSON instead.

Upon closer inspection, why is the "Total size" showing 0? :(
first few letters are omitted from folders

Yes, these are bugs. Thanks, I'll try to fix them in the coming days.
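(Picking up the JSON suggestion above, here is a minimal Go sketch of processing such output programmatically. The field names path and size, and the assumption that the output is a top-level JSON array, are hypothetical; check diskusage's actual JSON format before relying on them.)

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Entry mirrors a hypothetical record in diskusage's JSON output;
// the real field names may differ, so adjust the struct tags to match.
type Entry struct {
	Path string `json:"path"`
	Size int64  `json:"size"`
}

func main() {
	data, err := os.ReadFile("results.json") // file assumed to hold a diskusage JSON run
	if err != nil {
		panic(err)
	}
	var entries []Entry
	if err := json.Unmarshal(data, &entries); err != nil {
		panic(err)
	}
	// Filtering programmatically scales to 100,000+ rows, unlike eyeballing -hr output.
	for _, e := range entries {
		if e.Size > 1<<30 { // keep only entries larger than 1 GiB
			fmt.Printf("%s\t%d bytes\n", e.Path, e.Size)
		}
	}
}
```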

@Prid13
Author

Prid13 commented Sep 6, 2022

And what will you do if you get 100,000 rows in the console, for example? :) I could add a no-limit flag, but that would be dangerous and unpredictable behavior. You can't look over all of those rows with your eyes, and you can't process them manually. But if you want to process these rows in another program, you shouldn't use -hr mode; get the full results in JSON instead.

Maybe increase the upper limit to something like 10,000 files or 50,000 files? :)

I'll come clean and admit that my use case is different from the intended usage of this tool you've developed. I simply wish to list all the files and folders in a given directory and get a human-readable output with nice file sizes and metadata, so that I can repeatedly do this in a scheduled script to print out and save the contents of certain folders for backup and history logging purposes :)

You don't need to implement this feature. But maybe you can show me how to compile the source code after editing the limit myself for Windows? :)
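(For reference, diskusage is written in Go, so building a locally edited copy on Windows should come down to the standard Go toolchain steps. A sketch, assuming Go is installed and the main package sits at the repository root:)

```
git clone https://github.com/aleksaan/diskusage.git
cd diskusage
go build    # should produce diskusage.exe in the current directory
```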

@aleksaan
Owner

aleksaan commented Sep 7, 2022

@Prid13 please test new release https://github.com/aleksaan/diskusage/releases/tag/v2.9.0

@aleksaan aleksaan closed this as completed Sep 7, 2022
@Prid13
Author

Prid13 commented Sep 7, 2022

You're an absolute legend! This works wonderfully, and I truly appreciate you doing this so much ~⭐

I'm sorry if I caused you any inconvenience. I know this is beyond the scope of this tool, but such a feature is really handy for my use case 😇

Are there any plans to add more options, like showing creation/modification dates and sorting features? :)


Actually, I just realized that this tool is a bit slow when it comes to deeply-nested folders like Documents and Downloads. I tried limiting the search by adding a -depth 1 option, but the listing took the same amount of time regardless of whether I used that option. Take, for example, my usage of this on the Downloads folder:

[screenshots: diskusage runs on the Downloads folder with and without -depth 1, showing the same run time and totals]

Notice also how the total dirs, files, and size remain unchanged.

The reason I discovered this is that I tried using this tool on my Documents folder, where I really only want the top-level (or level-2 at most) files and folders, but it was taking so long that I thought perhaps an error was occurring for that folder. Testing it on the Downloads folder showed me that limiting the search with a depth filter still yields the same run time.

Is this the intended behavior? :O

@aleksaan
Owner

aleksaan commented Sep 8, 2022

@Prid13 Thanks for the thanks :)
Why are the times with and without -depth the same?
If you want to calculate the size of a folder, you have to calculate the sizes of all of its subfolders and files, down to the deepest level.
There is no precalculated folder size stored in the file system.

The -depth option is only for reducing the number of result rows (it is still a full scan).
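(To make the "full scan" point concrete, here is a generic Go sketch of recursive size calculation, not diskusage's actual code: the walk has to visit every file to sum the sizes, so a depth option can only trim what gets printed, not how much gets scanned.)

```go
package main

import (
	"fmt"
	"io/fs"
	"os"
	"path/filepath"
)

// dirSize walks every file under root and sums the sizes. There is no
// precomputed folder size in the file system, so the walk cannot stop early.
func dirSize(root string) (int64, error) {
	var total int64
	err := filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
		if err != nil {
			return err
		}
		if !d.IsDir() {
			info, err := d.Info()
			if err != nil {
				return err
			}
			total += info.Size()
		}
		return nil
	})
	return total, err
}

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: dirsize <folder>")
		os.Exit(1)
	}
	size, err := dirSize(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// A -depth-style option could trim which subfolders get reported,
	// but the total above still required visiting every single file.
	fmt.Println(size, "bytes")
}
```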

@aleksaan
Owner

aleksaan commented Sep 8, 2022

Are there any plans to add more options, like showing creation/modification dates and sorting features?

That should be a new issue. Please create it.

@aleksaan
Owner

aleksaan commented Sep 8, 2022

a bit slow when it comes to deeply-nested folders like Documents and Downloads

14,000+ files (and folders) per second is not slow, I think :) And here is an interesting fact: this program calculates sizes more accurately than FAR. You can check it on the C:\Users folder, for example, or on C:\ProgramData.
