Add fall-through for bytesToHuman for very large values (just print the bytes) #858

Closed · wants to merge 2 commits

Conversation
dvdplm (Contributor) commented Jan 2, 2013

We have seen an issue where an instance had a very large peak memory usage: used_memory_peak:18446744073706301432. The bytesToHuman() function can't stringify that properly (we get used_memory_peak_human:\x90\f\x83\xFB\x83\u007F), so here's a version that falls back to simply printing the byte count for too-large values.
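A minimal sketch of the idea (an illustration assuming the existing B/K/M/G branches, not the exact patch in this PR): values too large for the known units fall through to a plain byte count instead of garbage.

```c
#include <stdio.h>

/* Sketch: human-readable byte formatting with a fall-through for values
 * larger than the biggest unit handled, so huge or bogus inputs still
 * produce a well-formed string. */
void bytesToHuman(char *s, unsigned long long n) {
    if (n < 1024) {
        sprintf(s, "%lluB", n);                            /* plain bytes */
    } else if (n < 1024*1024) {
        sprintf(s, "%.2fK", (double)n/1024);               /* kilobytes */
    } else if (n < 1024LL*1024*1024) {
        sprintf(s, "%.2fM", (double)n/(1024*1024));        /* megabytes */
    } else if (n < 1024LL*1024*1024*1024) {
        sprintf(s, "%.2fG", (double)n/(1024LL*1024*1024)); /* gigabytes */
    } else {
        /* Fall-through: no unit fits, so just print the bytes. */
        sprintf(s, "%lluB", n);
    }
}
```

Under this sketch the field above would render as used_memory_peak_human:18446744073706301432B rather than garbage.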

charsyam (Contributor) commented Jan 2, 2013

@dvdplm I just have a question about your environment: do you really use more than 15 EB of memory?

I ask because 15 EB overflows a double's range, so I really wonder about your environment.

dvdplm (Contributor) commented Jan 2, 2013

It's an AWS m2.xlarge instance, so yeah, it has 17.1 GB. There are plenty of 64 GB servers out there, so it's not like this is a particularly exotic situation, right?

charsyam (Contributor) commented Jan 2, 2013

Hmm, it's somewhat weird, because "used_memory_peak" is more than 15 EB, not 15 GB, in your report.

dvdplm (Contributor) commented Jan 2, 2013

Hehe, you're right, it is a weirdly high value (and I have no explanation for it, unfortunately). Maybe that's another, unrelated bug?
Either way, I think bytesToHuman() should be able to deal with any kind of crazy value. It's simply a string-formatting helper; it should just work, and not encode or require any knowledge about what "valid" values are here.
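One plausible reading, not confirmed anywhere in this thread: the reported value sits only about 3 MB below 2^64, which is exactly what an unsigned 64-bit counter would show if a small negative quantity were stored into it. A quick check:

```c
#include <stdio.h>

int main(void) {
    /* The peak reported above; 2^64 is 18446744073709551616. */
    unsigned long long peak = 18446744073706301432ULL;

    /* Unsigned arithmetic wraps mod 2^64, so 0 - peak == 2^64 - peak. */
    unsigned long long diff = 0ULL - peak;

    /* Prints: 2^64 - peak = 3250184 bytes (~3.1 MB) */
    printf("2^64 - peak = %llu bytes (~%.1f MB)\n", diff, diff / 1048576.0);
    return 0;
}
```

In other words, the peak looks like a small negative value that wrapped around, which would indeed be a separate bug from the formatting one.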

charsyam (Contributor) commented Jan 2, 2013

@dvdplm Yes, I think the first step is to look into what makes it strange.

In my personal opinion, "used_memory_peak" already shows the raw, unencoded value.

So I think adding "TB", "PB", "EB", "ZB", and "YB" is better.

dvdplm (Contributor) commented Jan 2, 2013

@charsyam From PB onwards I'm getting compiler warnings ("comparison of unsigned expression < 0 is always false") and potential division-by-zero issues, so I stopped at TB and PB (and kept the final rescue, with a note saying it's an exceptional case). Do you like that better?
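For illustration, a loop-based sketch (explicitly not the patch in this PR, which extends the if/else ladder): dividing repeatedly by 1024 sidesteps the large threshold constants entirely. The warnings above are consistent with constant overflow; for example 1024^7 = 2^70 wraps to exactly 0 in 64-bit arithmetic, turning a threshold check into `n < 0` and a later divisor into zero.

```c
#include <stdio.h>

/* Hypothetical loop-based variant: no precomputed thresholds, so no
 * constant can overflow. With a 64-bit input, "E" is the largest unit
 * ever reached (2^64 bytes is only 16 EB), so ZB/YB are unreachable. */
void bytesToHumanLoop(char *s, unsigned long long n) {
    static const char units[] = "BKMGTPE";
    double d = (double)n;
    size_t i = 0;

    while (d >= 1024.0 && units[i+1] != '\0') {
        d /= 1024.0;
        i++;
    }
    if (i == 0)
        sprintf(s, "%lluB", n);        /* exact for plain bytes */
    else
        sprintf(s, "%.2f%c", d, units[i]);
}

int main(void) {
    char buf[64];
    bytesToHumanLoop(buf, 18446744073706301432ULL);
    printf("%s\n", buf);               /* prints 16.00E */
    return 0;
}
```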

charsyam (Contributor) commented Jan 2, 2013

@dvdplm There is probably no chance of anyone using more than a PB of memory in the next 10 years. :)
I just worry about what makes "used_memory_peak" weird.

dvdplm referenced this pull request in redis/redis-rb on Jan 23, 2013:

Downgrade INFO string to ASCII #306 (closed)

mattsta added a commit to mattsta/redis that referenced this pull request Aug 2, 2014

Extend range of bytesToHuman to TB and PB
Also adds a fallthrough case for when given
large values (like overflow numbers of 2^64 by mistake).

Closes #858


mattsta referenced this pull request on Aug 2, 2014:

ALL simple issue fixes #1906 (closed)


antirez closed this in 100c331 on Aug 25, 2014.
